davidmrau/mixture-of-experts

PyTorch re-implementation of the Sparsely-Gated Mixture-of-Experts (MoE) layer, a conditional-computation architecture in which a gating network routes each input to a small subset of expert networks.
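The repository implements this layer in PyTorch; as a rough framework-free sketch of the core idea (NumPy here, with hypothetical `top_k_gate` and `moe_forward` helpers, not the repo's actual API), sparse gating keeps only the k largest gate logits so that just k experts run per input:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def top_k_gate(gate_logits, k):
    """Sparse gating: keep the top-k logits, mask the rest to -inf,
    then softmax so only k experts receive non-zero weight."""
    masked = np.full_like(gate_logits, -np.inf)
    top = np.argsort(gate_logits)[-k:]  # indices of the k largest logits
    masked[top] = gate_logits[top]
    return softmax(masked)

def moe_forward(x, expert_fns, w_gate, k=2):
    """Run only the k experts selected by the gate and combine their
    outputs weighted by the gate values (experts assumed to preserve
    x's shape)."""
    gates = top_k_gate(w_gate @ x, k)
    out = np.zeros_like(x)
    for i, g in enumerate(gates):
        if g > 0:  # unselected experts are skipped entirely
            out += g * expert_fns[i](x)
    return out
```

With, say, four toy experts that each scale the input by a constant, only two of them execute per call when `k=2`, which is the source of the layer's compute savings at scale.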

Python
AI & Machine Learning
LLM Frameworks
GPL-3.0

Stars: 1.2K
Forks: 110
Created: Jul 19, 2019
Last Updated: Apr 19, 2024

Project Analytics

Stars Growth (1 Month): +7 (+0.6%)
Avg Daily Growth (1 Month): +0.3 stars per day
Fork/Star Ratio (All Time): 8.9% (normal engagement)
Lifetime Growth: 0.5 stars/day over 2.4K days

[Charts: Stars, Forks, Open Issues, Pull Requests, and Commits Over Time]

AI-Generated Tags

mixture-of-experts
moe
re-implementation
sparsely-gated
ai-research
pytorch

