Explore Projects

Discover 1 open source project

Active filters (1):
Search: continual-pre-training

Showing 1-1 of 1 project

pjlab-sys4nlp/llama-moe

A library for building Mixture-of-Experts (MoE) models from the LLaMA language model with continual pre-training (a rough sketch of the expert-splitting idea follows this card).

1.0K stars
Archived
Python
LLM Frameworks
Continual Pre-training
PyTorch
#llama #moe #continual-pretraining
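
The sketch below is only an illustration of the general technique the project name refers to, not the llama-moe library's actual API: a dense LLaMA-style FFN (gate/up/down projections) is partitioned along its intermediate dimension into several smaller expert FFNs, and the resulting MoE model is then continually pre-trained. The function name and layer sizes are hypothetical.

import torch
import torch.nn as nn

def split_ffn_into_experts(gate_proj, up_proj, down_proj, num_experts):
    """Partition a dense LLaMA-style FFN into num_experts smaller expert FFNs."""
    inter = gate_proj.out_features          # intermediate size of the dense FFN
    assert inter % num_experts == 0, "intermediate size must divide evenly"
    chunk = inter // num_experts
    experts = []
    for i in range(num_experts):
        rows = slice(i * chunk, (i + 1) * chunk)
        gate = nn.Linear(gate_proj.in_features, chunk, bias=False)
        up = nn.Linear(up_proj.in_features, chunk, bias=False)
        down = nn.Linear(chunk, down_proj.out_features, bias=False)
        with torch.no_grad():
            gate.weight.copy_(gate_proj.weight[rows])     # slice output rows
            up.weight.copy_(up_proj.weight[rows])
            down.weight.copy_(down_proj.weight[:, rows])  # slice input columns
        experts.append({"gate_proj": gate, "up_proj": up, "down_proj": down})
    return experts

# Toy example: hidden size 64, intermediate size 256, split into 4 experts.
experts = split_ffn_into_experts(
    nn.Linear(64, 256, bias=False),
    nn.Linear(64, 256, bias=False),
    nn.Linear(256, 64, bias=False),
    num_experts=4,
)
print(len(experts), experts[0]["gate_proj"].weight.shape)  # 4 torch.Size([64, 64])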
