fla-org/flash-linear-attention

Efficient implementations of state-of-the-art linear attention models for large language models and NLP tasks.
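The core idea behind the models this repo implements is linear attention: replacing softmax attention's O(n²) pairwise score matrix with a feature-mapped recurrence whose state is a fixed-size d×d matrix, giving O(n) time in sequence length. The following is a minimal NumPy sketch of that idea, not this repository's actual API; the function name, the elu(x)+1 feature map, and the epsilon in the normalizer are illustrative choices.

```python
import numpy as np

def linear_attention(q, k, v):
    """Causal linear attention in recurrent form (illustrative sketch,
    not the fla-org/flash-linear-attention API).

    q, k: (n, d) arrays; v: (n, d_v) array. Runs in O(n * d * d_v)
    by carrying a running state instead of an (n, n) score matrix.
    """
    # Feature map phi(x) = elu(x) + 1, a common positivity-preserving choice.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))
    q, k = phi(q), phi(k)
    n, d = q.shape
    d_v = v.shape[1]
    S = np.zeros((d, d_v))   # running state: sum over past t of outer(k_t, v_t)
    z = np.zeros(d)          # running normalizer: sum over past t of k_t
    out = np.empty((n, d_v))
    for t in range(n):
        S += np.outer(k[t], v[t])
        z += k[t]
        # Small epsilon guards against a zero denominator.
        out[t] = (q[t] @ S) / (q[t] @ z + 1e-6)
    return out
```

Because the recurrence only updates a fixed-size state, the same function also describes constant-memory autoregressive decoding, which is the main practical draw of these models over softmax attention.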

Language: Python
Categories: AI & Machine Learning, LLM Frameworks
License: MIT

Stars: 4.5K
Forks: 423
Created: Dec 20, 2023
Last Updated: Mar 5, 2026

Project Analytics

Stars Growth (1 Month): +142 (+3.3%)
Avg Daily Growth (1 Month): +5.1 stars per day
Fork/Star Ratio (All Time): 9.4% (normal engagement)
Lifetime Growth: 5.6 stars/day over 808 days

Charts (not reproduced here): Stars, Forks, Open Issues, Pull Requests, and Commits over time.

AI-Generated Tags

large-language-models
machine-learning
natural-language-processing
efficiency
attention-models
