Separius/awesome-fast-attention

A curated list of efficient attention modules for transformers.
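The modules cataloged in this list target the quadratic cost of standard scaled dot-product attention. As a point of reference, here is a minimal NumPy sketch of that baseline (the function name and shapes are illustrative, not taken from any listed library):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Baseline softmax attention over a single head.

    q, k, v: arrays of shape (n, d). The (n, n) score matrix makes the
    cost quadratic in sequence length n, which is exactly what the
    efficient attention modules in this list try to reduce.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # (n, n) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)   # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to 1
    return weights @ v                             # (n, d) weighted values
```

Each output row is a convex combination of the value rows; efficient variants approximate or restructure the `(n, n)` weight matrix (e.g. with sparsity, low-rank factorization, or kernel feature maps) to avoid materializing it.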

Language: Python · License: GPL-3.0 · Topics: AI & Machine Learning, LLM Frameworks

- Stars: 1.0K
- Forks: 106
- Created: Jul 31, 2020
- Last updated: Aug 23, 2021

Project Analytics

- Stars growth (1 month): +1 (+0.1%)
- Avg daily growth (1 month): +0.0 stars/day
- Fork/star ratio (all time): 10.4% (good engagement)
- Lifetime growth: 0.5 stars/day over 2.0K days

Charts (not reproduced): stars, forks, open issues, and commits over time.

AI-Generated Tags: attention, transformer, self-attention, multihead-attention, awesome, efficient
