Explore Projects

Discover 116 open source projects

Search filter: attention

Showing 101-116 of 116 projects

tspeterkim/flash-attention-minimal

A minimal implementation of the Flash Attention algorithm in CUDA, intended as an educational reference for efficient attention kernels.

1.1K
Archived
Cuda
LLM Frameworks
Inference
#cuda#attention-mechanism#deep-learning
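The core trick behind Flash Attention can be illustrated without CUDA: process keys and values in blocks while maintaining a running softmax maximum and normalizer, so the full attention matrix is never materialized. A minimal NumPy sketch of that online-softmax idea, for a single query vector (illustrative only, not the repository's actual kernel):

```python
import numpy as np

def naive_attention(q, K, V):
    """Reference: full softmax(K q) weighted sum over V for one query."""
    s = K @ q                       # all scores at once, shape (N,)
    w = np.exp(s - s.max())
    w /= w.sum()
    return w @ V

def streaming_attention(q, K, V, block=4):
    """Online-softmax attention: visit K/V in blocks, keeping only a
    running max m, a running normalizer l, and a weighted accumulator."""
    m, l = -np.inf, 0.0
    acc = np.zeros(V.shape[1])
    for i in range(0, len(K), block):
        s = K[i:i + block] @ q              # scores for this block only
        m_new = max(m, s.max())
        scale = np.exp(m - m_new)           # rescale previous accumulator
        p = np.exp(s - m_new)
        l = l * scale + p.sum()
        acc = acc * scale + p @ V[i:i + block]
        m = m_new
    return acc / l
```

Both functions compute the same result; the streaming version never holds more than one block of scores in memory, which is what lets the CUDA kernel tile the computation through fast on-chip SRAM.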

emedvedev/attention-ocr

A Python library for OCR text recognition using a CNN-based seq2seq model with visual attention, compatible with Google Cloud ML Engine.

1.1K
Archived
Python
Computer Vision
API Frameworks
TensorFlow
#ocr#text-recognition#computer-vision

gordicaleksa/pytorch-original-transformer

A PyTorch implementation of the original Transformer model with interactive visualizations.

1.1K
Archived
Jupyter Notebook
LLM Frameworks
Computer Vision
PyTorch
#attention-mechanism#transformer#deep-learning
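The building block of the original Transformer is scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A small NumPy sketch (the helper name and mask convention are illustrative, not the repository's API):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V, mask=None):
    """softmax(Q K^T / sqrt(d_k)) V, the core op of the Transformer.
    mask is a boolean array; False positions may not be attended to."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (n_queries, n_keys)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # suppress masked keys
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```

With a lower-triangular mask this becomes the causal attention used in the decoder, where position i may only attend to positions ≤ i.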

richliao/textClassifier

An implementation of Hierarchical Attention Networks for document classification.

1.1K
Archived
Python
LLM Frameworks
API Frameworks
Python
#text-classification#attention-mechanism#hierarchical-attention-networks
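Hierarchical Attention Networks score each word against a learned context vector, attention-pool the words into sentence vectors, then repeat the same pooling at the sentence level to get a document vector. A toy NumPy sketch of that two-level pooling (shapes, the `attention_pool` helper, and the random context vectors are illustrative, not the repository's code):

```python
import numpy as np

def attention_pool(H, u):
    """Attention pooling: score each row of H against a learned context
    vector u, softmax the scores, and return the weighted sum of rows."""
    s = np.tanh(H) @ u
    a = np.exp(s - s.max())
    a /= a.sum()
    return a @ H

# Hypothetical document: 3 sentences x 5 words x 8-dim word encodings.
rng = np.random.default_rng(1)
doc = rng.normal(size=(3, 5, 8))
u_word, u_sent = rng.normal(size=8), rng.normal(size=8)

# Level 1: words -> sentence vectors.  Level 2: sentences -> document.
sent_vecs = np.stack([attention_pool(words, u_word) for words in doc])
doc_vec = attention_pool(sent_vecs, u_sent)
```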

chengtan9907/OpenSTL

A comprehensive benchmark for spatio-temporal predictive learning, with a focus on AI-powered weather forecasting and video prediction.

1.1K
Stable
Python
Computer Vision
Predictive Learning
PyTorch
#weather-forecasting#video-prediction#self-supervised-learning

cardwing/Codes-for-Lane-Detection

A deep learning-based library for efficient lane detection, using self-attention distillation.

1.1K
Archived
Lua
Computer Vision
PyTorch
#lane-detection#deep-learning#cnn

citusdata/pg_shard

An unmaintained database sharding extension for PostgreSQL, superseded by the Citus distributed database.

1.1K
Archived
C
API Frameworks
Databases
#database#sharding#postgres

metaskills/experts

Experts.js provides an easy way to create and deploy OpenAI assistants as modular AI agents with expanded memory and attention.

1.1K
Archived
JavaScript
Agents & Orchestration
AI Agents
React
#openai#multi-agent-systems#ai-assistants

basho/rebar

rebar is an Erlang build tool that provides dependency management and other build-related functionality.

1.1K
Archived
Erlang
API Frameworks
#erlang#build-tool#dependency-management

Separius/awesome-fast-attention

A curated list of efficient attention modules for transformers.

1.0K
Archived
Python
LLM Frameworks
API Frameworks
Python
#attention#transformer#self-attention

graphdeeplearning/graphtransformer

An implementation of the Graph Transformer architecture for building graph neural networks with attention mechanisms.

1.0K
Archived
Python
Graph Neural Networks
API Frameworks
React
#aaai#attention#graph-neural-networks
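A graph transformer layer is, at its core, self-attention in which node i may only attend to its neighbors. A dense NumPy sketch of adjacency-masked attention (illustrative only, not the repository's implementation, which works on sparse graphs):

```python
import numpy as np

def graph_attention(X, A, W):
    """One self-attention layer where node i may only attend to
    neighbors j with A[i, j] = 1 (plus itself).  Plain dense attention
    with the score matrix masked by the adjacency matrix."""
    H = X @ W
    scores = H @ H.T / np.sqrt(H.shape[1])
    allowed = (A + np.eye(len(A))) > 0         # neighbors + self-loop
    scores = np.where(allowed, scores, -np.inf)
    scores -= scores.max(axis=1, keepdims=True)
    w = np.exp(scores)                         # exp(-inf) -> exactly 0
    w /= w.sum(axis=1, keepdims=True)
    return w @ H
```

Because non-neighbor scores are -inf before the softmax, their weights are exactly zero, so a node's output depends only on its local neighborhood.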

DeepWism/DeepWism-R2

An experimental AGI system that frames intelligence as entropy reduction rather than attention modeling.

1.0K
Experimental
LLM Frameworks
Agents & Orchestration
Unknown
#AGI#entropy-reduction#T3CEDS

brain-research/self-attention-gan

An implementation of Self-Attention GAN, a generative adversarial network augmented with self-attention layers.

1.0K
Archived
Python
GANs
API Frameworks
PyTorch
#generative-adversarial-networks#self-attention#machine-learning

google-deepmind/neural-processes

This repository contains implementations of Neural Process variants for machine learning research.

1.0K
Archived
Jupyter Notebook
LLM Frameworks
Learning & Education
Jupyter Notebook
#neural-networks#machine-learning#research

kevinzakka/spatial-transformer-network

A Python library implementing Spatial Transformer Networks, differentiable modules that let neural networks learn spatial transformations of their input.

1.0K
Archived
Python
Computer Vision
ML Ops
TensorFlow
#affine-transformation#attention#convnet
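A Spatial Transformer builds a sampling grid from a predicted affine matrix and resamples the input image through that grid. A toy NumPy sketch using nearest-neighbour sampling (the real module uses differentiable bilinear sampling so gradients can flow through the transformation):

```python
import numpy as np

def affine_grid_sample(img, theta):
    """Core of a Spatial Transformer: build a sampling grid over
    normalized coordinates [-1, 1], warp it by the 2x3 affine matrix
    theta, and resample img at the warped locations."""
    H, W = img.shape
    ys, xs = np.meshgrid(np.linspace(-1, 1, H), np.linspace(-1, 1, W),
                         indexing="ij")
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(H * W)])
    src = theta @ coords                       # (2, H*W) source coords
    # Map normalized coords back to pixel indices (nearest neighbour).
    sx = np.clip(((src[0] + 1) / 2 * (W - 1)).round().astype(int), 0, W - 1)
    sy = np.clip(((src[1] + 1) / 2 * (H - 1)).round().astype(int), 0, H - 1)
    return img[sy, sx].reshape(H, W)
```

The identity matrix reproduces the input exactly, and negating the x row of theta produces a horizontal flip, which makes the function easy to sanity-check.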

Relaxed-System-Lab/Flash-Sparse-Attention

Efficient implementations of Native Sparse Attention, a sparse attention mechanism for large language models.

982
Stable
Python
LLM Frameworks
API Frameworks
Python
#machine-learning#attention-mechanism#large-language-models
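Block-sparse attention evaluates scores only for an allow-list of (query-block, key-block) pairs and skips everything else. A dense NumPy toy that masks rather than skips, to show the access pattern (illustrative only, not the repository's NSA kernels; the `keep` allow-list is a made-up example):

```python
import numpy as np

def block_sparse_attention(Q, K, V, block=4, keep=((0, 0), (1, 1), (1, 0))):
    """Attention restricted to an allow-list of (query-block, key-block)
    pairs; all other scores are masked to -inf.  Every query block must
    keep at least one key block, or its softmax is undefined."""
    n, d = Q.shape
    mask = np.zeros((n, n), dtype=bool)
    for qb, kb in keep:
        mask[qb * block:(qb + 1) * block, kb * block:(kb + 1) * block] = True
    scores = np.where(mask, Q @ K.T / np.sqrt(d), -np.inf)
    scores -= scores.max(axis=1, keepdims=True)
    w = np.exp(scores)                         # masked entries become 0
    w /= w.sum(axis=1, keepdims=True)
    return w @ V
```

A real kernel never computes the masked blocks at all, which is where the speedup comes from; the dense mask here only demonstrates which entries contribute.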
