laekov/fastmoe

A fast implementation of Mixture of Experts (MoE) for PyTorch, enabling efficient large-scale neural networks.
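The page itself shows no usage, but a minimal sketch of the kind of layer fastmoe provides may help orient readers. Class and argument names (FMoETransformerMLP, num_expert, d_model, d_hidden, top_k) follow the project's documented API; treat the exact signature as an assumption and verify against the installed version.

```python
# Minimal sketch: fastmoe's FMoETransformerMLP as a drop-in replacement
# for a Transformer feed-forward block. Argument names follow the
# project's README; check them against your installed version.
# fastmoe's expert kernels require a CUDA build, so tensors live on GPU.
import torch
from fmoe.transformer import FMoETransformerMLP

d_model = 1024
moe_mlp = FMoETransformerMLP(
    num_expert=16,         # experts per worker
    d_model=d_model,       # token embedding size
    d_hidden=4 * d_model,  # hidden size of each expert MLP
    top_k=2,               # route each token to its 2 best experts
).cuda()

x = torch.randn(8, 512, d_model, device="cuda")  # (batch, seq, d_model)
y = moe_mlp(x)                                   # output keeps the input shape
```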

Language: Python
Categories: AI & Machine Learning, LLM Frameworks
License: Apache-2.0

Stars: 1.8K
Forks: 202
Created: Jan 25, 2021
Last Updated: Feb 10, 2025

Project Analytics

Stars Growth (1 Month): +12 (+0.7%)
Avg Daily Growth (1 Month): +0.4 stars/day
Fork/Star Ratio (All Time): 11.0% (good engagement)
Lifetime Growth: 1.0 stars/day over 1.9K days

[Charts: stars, forks, open issues, pull requests, and commits over time]

AI-Generated Tags

mixture-of-experts
large-scale-neural-networks
pytorch-library
ai-modeling
distributed-training
cli-tool

