pjlab-sys4nlp/llama-moe

A library for building Mixture-of-Experts (MoE) models from the LLaMA language model with continual pre-training.
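If you just want to try the models rather than build them, a minimal loading sketch is shown below. It assumes the released checkpoints are published on the Hugging Face Hub and load through the standard transformers API with trust_remote_code; the checkpoint ID used here is an assumption, so check the repository's README for the actual released names.

```python
# Minimal inference sketch for a converted LLaMA-MoE checkpoint.
# The model ID below is an assumption, not confirmed by this listing.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llama-moe/LLaMA-MoE-v1-3_5B-2_8"  # assumed Hub checkpoint name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # MoE blocks live in the repo's custom modeling code
)
model.eval()

inputs = tokenizer("Suzhou is famous for", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```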

Language: Python
Categories: AI & Machine Learning, LLM Frameworks
License: Apache-2.0

Stars: 1.0K
Forks: 62
Created: Jul 24, 2023
Last Updated: Dec 6, 2024

Project Analytics

Stars Growth (1 Month): -3 (-0.3% change)
Avg Daily Growth (1 Month): -0.1 stars per day
Fork/Star Ratio (All Time): 6.2% (normal engagement)
Lifetime Growth: 1.0 stars/day over 957 days
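These figures are simple ratios of the counts listed above; a quick sketch of the arithmetic, treating the rounded "1.0K" as 1,000 stars and reusing the page's own 957-day window:

```python
# Recompute the listed analytics from the raw counts above.
stars, forks = 1_000, 62  # "1.0K" taken as roughly 1,000 stars

fork_star_ratio = forks / stars                     # 62 / 1000 = 0.062
print(f"Fork/Star Ratio: {fork_star_ratio:.1%}")    # -> 6.2%

# Lifetime growth over the 957 days the page reports (creation date to the
# time the analytics were rendered, which is later than the last-updated date).
days = 957
print(f"Lifetime Growth: {stars / days:.1f} stars/day over {days} days")  # -> ~1.0
```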

Charts (not reproduced here): Stars Over Time, Forks Over Time, Open Issues Over Time, Pull Requests Over Time, Commits Over Time

AI-Generated Tags

llama
moe
continual-pretraining
mixture-of-experts
emnlp-2024
