A fast implementation of Mixture of Experts (MoE) for PyTorch, enabling efficient large-scale neural networks.
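To illustrate what such a library provides, here is a minimal sketch of top-k expert routing in PyTorch. The layer sizes, the `TinyMoE` class, and the softmax-over-top-2 gating scheme are illustrative assumptions, not this repository's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    """Toy Mixture of Experts: route each token to its top-k experts."""

    def __init__(self, dim=16, num_experts=4, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(dim, num_experts)  # router producing expert scores
        self.experts = nn.ModuleList(
            nn.Linear(dim, dim) for _ in range(num_experts)
        )

    def forward(self, x):                         # x: (batch, dim)
        logits = self.gate(x)                     # (batch, num_experts)
        weights, idx = logits.topk(self.k, -1)    # keep the k best experts
        weights = F.softmax(weights, dim=-1)      # normalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):                # dispatch tokens per slot
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e          # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

moe = TinyMoE()
y = moe(torch.randn(8, 16))
print(tuple(y.shape))  # (8, 16)
```

Real implementations replace the Python loops with batched dispatch/combine kernels, which is where the speed of a dedicated library comes from.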
Stars: 1.8K
Forks: 202
Created: Jan 25, 2021
Last Updated: Feb 10, 2025
Growth: +0.7% change, good engagement (stars/day over 1.9K days)
Related projects:
- A free programming books repository with AI-powered features
- A comprehensive computer science study plan for software engineers, with a focus on coding interviews and AI-related topics
- A curated collection of awesome Python frameworks and resources
- A personal AI assistant for messaging platforms, self-hosted and multi-channel