Dao-AILab/flash-attention

Fast, memory-efficient exact attention (FlashAttention) for deep learning

Python · AI & Machine Learning · Inference · BSD-3-Clause

Stars: 22.5K
Forks: 2.5K
Created: May 19, 2022
Last Updated: Mar 5, 2026

Project Analytics

Stars Growth (1 Month): +388 (+1.8%)
Avg Daily Growth (1 Month): +13.9 stars/day
Fork/Star Ratio (All Time): 10.9% (good engagement)
Lifetime Growth: 16.2 stars/day over 1.4K days

Charts tracked over time: Stars, Forks, Open Issues, Pull Requests, Commits.

AI-Generated Tags

flash-attention
deep-learning
pytorch
cuda
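
Given the pytorch and cuda tags above, below is a minimal usage sketch of the library's documented flash_attn_func entry point. It assumes the flash-attn package built from this repo is installed and a CUDA-capable GPU is available; the shapes, dtypes, and the causal flag follow the package's stated requirements, and everything else (tensor sizes, variable names) is illustrative.

```python
# Minimal sketch, not an official example from this repo: assumes the
# `flash-attn` package is installed and a CUDA GPU is present. The kernels
# require fp16/bf16 tensors on a CUDA device.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64

# Inputs are laid out as (batch, seqlen, nheads, headdim).
q = torch.randn(batch, seqlen, nheads, headdim,
                dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact (not approximate) attention, computed without materializing the
# full seqlen x seqlen score matrix; causal=True applies an autoregressive mask.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```

For comparison, recent PyTorch releases expose torch.nn.functional.scaled_dot_product_attention, which can dispatch to a FlashAttention backend without a separate dependency.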
