A library for building Mixture-of-Experts (MoE) models from the LLaMA language model, followed by continual pre-training.
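The listing does not show the library's own API, but the underlying technique is to replace each dense feed-forward block in LLaMA with several smaller experts plus a learned router, then continue pre-training. Below is a minimal, generic sketch of that idea, assuming a LLaMA-style SwiGLU FFN whose hidden dimension is split evenly across experts with top-2 routing; all class, function, and parameter names here are illustrative, not the library's actual interface.

```python
# Generic MoE-from-dense-FFN sketch (illustrative names, not the library's API).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwiGLUExpert(nn.Module):
    """One expert: a LLaMA-style SwiGLU feed-forward with a reduced hidden size."""
    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.gate_proj = nn.Linear(d_model, d_hidden, bias=False)
        self.up_proj = nn.Linear(d_model, d_hidden, bias=False)
        self.down_proj = nn.Linear(d_hidden, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down_proj(F.silu(self.gate_proj(x)) * self.up_proj(x))

class MoEFeedForward(nn.Module):
    """Replaces a dense FFN: the original hidden dim is partitioned across
    n_experts smaller experts, and a linear router picks top_k per token."""
    def __init__(self, d_model: int, d_hidden: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(
            SwiGLUExpert(d_model, d_hidden // n_experts) for _ in range(n_experts)
        )
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten tokens for per-token routing
        tokens = x.reshape(-1, x.shape[-1])
        logits = self.router(tokens)                   # (n_tokens, n_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)              # normalize over selected experts
        out = torch.zeros_like(tokens)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e              # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k : k + 1] * expert(tokens[mask])
        return out.reshape(x.shape)
```

Because the experts are initialized from slices of the already-trained dense weights rather than from scratch, the converted model starts out degraded but coherent, which is why a phase of continual pre-training is used to recover quality.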