openai/sparse_attention

Examples of using sparse attention, a technique for efficient Transformer-based sequence modeling.
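The repository demonstrates sparse attention, where each query position attends to only a subset of key positions instead of all of them, cutting the quadratic cost of dense attention. As a rough illustration (not the repository's actual implementation, which uses blocksparse GPU kernels), here is a minimal NumPy sketch of one fixed pattern in the spirit of the Sparse Transformer paper: a causal mask combining a local window with strided connections. Function names are illustrative.

```python
import numpy as np

def strided_sparse_mask(n, stride):
    """Causal mask: query i attends to the last `stride` positions
    (local window) plus every stride-th earlier position."""
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(i + 1):  # causal: j <= i only
            if (i - j) < stride or (i - j) % stride == 0:
                mask[i, j] = True
    return mask

def sparse_attention(q, k, v, mask):
    """Scaled dot-product attention with masked positions set to -inf
    before the softmax, so they receive zero weight."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    scores = np.where(mask, scores, -np.inf)
    # numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

With this pattern each row of the mask has O(stride + n/stride) nonzeros rather than O(n), which is the source of the efficiency gain when the pattern is exploited by a sparse kernel.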

Language: Python
Categories: AI & Machine Learning, LLM Frameworks

Stars: 1.6K
Forks: 191
Created: Apr 12, 2019
Last Updated: Aug 12, 2020

Project Analytics

Stars Growth (1 Month): +4 (+0.2%)
Avg Daily Growth (1 Month): +0.1 stars per day
Fork/Star Ratio (All Time): 11.9% (good engagement)
Lifetime Growth: 0.6 stars/day over 2.5K days

[Charts: stars, forks, open issues, pull requests, and commits over time]

AI-Generated Tags

sparse-attention
transformers
sequence-modeling
large-language-models
open-source

