Explore Projects

Discover 20 open source projects

Active filters (1):
Search: moe

Showing 1-20 of 20 projects

deepseek-ai/DeepSeek-V3

DeepSeek-V3 is a large-scale MoE language model with 671B parameters, optimized for efficiency and performance.

101.9K
Stable
Python
LLM Frameworks
Inference
DeepSeekMoE
#large-language-model#moe-architecture#deep-learning

vllm-project/vllm

High-throughput LLM inference engine for developers

72.1K
Active
Python
Inference
LLM Wrappers & SDKs
Hugging Face
#llm#inference#ai

hiyouga/LLaMA-Factory

Fine-tuning framework for 100+ LLMs & VLMs

67.9K
Active
Python
Fine-tuning
#llm#fine-tuning#ai

sgl-project/sglang

High-performance serving framework for large language and multimodal models

24.1K
Active
Python
Inference
LLM Frameworks
Python
#llm#inference#serving

NVIDIA/TensorRT-LLM

TensorRT-LLM provides a Python API and optimizations for running large language models efficiently on NVIDIA GPUs.

13.0K
Active
Python
LLM Frameworks
PyTorch
#cuda#llm-serving#moe

modelscope/ms-swift

A Python library for using and fine-tuning over 900 large language models and multimodal models for various AI tasks.

12.9K
Active
Python
LLM Frameworks
Python
#llm#multimodal#fine-tuning

czy0729/Bangumi

An unofficial Bangumi (similar to Douban) client app for Android and iOS, built with React Native.

5.3K
Active
TypeScript
Component Libraries (React)
Cross-Platform
React
#android#ios#react-native

flashinfer-ai/flashinfer

A kernel library for LLM serving, providing high-performance GPU kernels for attention and other inference operations.

5.1K
Active
Python
LLM Frameworks
Inference
PyTorch
#llm#inference#cuda

zai-org/GLM-4.5

A foundation model family built for agentic, reasoning, and coding capabilities.

4.2K
Active
Python
LLM Frameworks
Agents & Orchestration
Python
#agent#glm#llm

journey-ad/Moe-Counter

A simple JavaScript library for adding a customizable visitor counter badge to your website.

2.8K
Stable
JavaScript
Component Libraries (React)
React
#badge#counter#visitor-counter

deepseek-ai/DeepSeek-MoE

DeepSeekMoE is a Mixture-of-Experts language model architecture aimed at greater expert specialization.

1.9K
Archived
Python
LLM Frameworks
AI Coding Agents
Python
#language-model#mixture-of-experts#ai-coding-assistant

laekov/fastmoe

A fast implementation of Mixture of Experts (MoE) for PyTorch, enabling efficient large-scale neural networks.

1.8K
Experimental
Python
LLM Frameworks
API Frameworks
PyTorch
#mixture-of-experts#large-scale-neural-networks#pytorch-library

KartoffelToby/better_thermostat

Custom Home Assistant component for smart thermostat control with AI-powered features

1.4K
Active
Python
Home Assistant
#home-assistant#thermostat#ai-powered

YelpArchive/MOE

A global, black box optimization engine for real-world metric optimization.

1.3K
Archived
C++
Agents & Orchestration
#optimization#black-box#real-world-metrics

BICLab/SpikingBrain-7B

Spiking Brain-inspired Large Models with efficient attention, MoE, and spike encoding for AI and ML developers.

1.3K
Stable
Python
LLM Frameworks
ML Ops
Python
#ai#machine-learning#llm

davidmrau/mixture-of-experts

PyTorch re-implementation of the Sparsely-Gated Mixture-of-Experts layer.

1.2K
Archived
Python
LLM Frameworks
CLI Tools
PyTorch
#mixture-of-experts#moe#re-implementation
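Several entries in this listing (DeepSeek-V3, fastmoe, llama-moe, this re-implementation) are built around sparsely-gated Mixture-of-Experts routing. As a hedged illustration only, not code from any listed repository, a minimal top-k softmax gate for one token might look like this in plain Python (the function name and interface are made up for the sketch):

```python
import math

def top_k_gate(logits, k=2):
    """Select the top-k experts for one token and softmax-normalize their weights.

    logits: list of per-expert router scores for a single token.
    Returns a list of (expert_index, weight) pairs whose weights sum to 1.
    """
    # Pick the k experts with the highest router scores.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over only the selected logits (max-subtraction for numerical stability).
    m = max(logits[i] for i in top)
    exps = [math.exp(logits[i] - m) for i in top]
    z = sum(exps)
    return [(i, e / z) for i, e in zip(top, exps)]

# Router scores over 4 experts: experts 2 and 0 are selected.
print(top_k_gate([1.0, -0.5, 2.0, 0.3], k=2))
```

Real implementations additionally batch this over tokens, add load-balancing auxiliary losses, and dispatch each token only to its selected experts; the sketch shows just the routing math itself.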

uccl-project/uccl

Efficient communication library for GPUs, covering collectives, P2P, and EP for AI/ML workloads

1.2K
Active
C++
GPU Frameworks
API Frameworks
C++
#ai#gpu#hpc

rhymes-ai/Aria

Aria is an open multimodal-native Mixture-of-Experts model for vision and language tasks.

1.1K
Archived
Jupyter Notebook
Agents & Orchestration
Computer Vision
Jupyter Notebook
#multimodal#vision-and-language#mixture-of-experts

HITsz-TMG/Uni-MoE

Uni-MoE is a family of MoE-based unified large multimodal models, distributed as a Python library for model development and deployment.

1.1K
Stable
Python
LLM Frameworks
LLM Wrappers & SDKs
Python
#multimodal-models#large-language-models#ai-development

pjlab-sys4nlp/llama-moe

A library for building Mixture-of-Experts (MoE) models from the LLaMA language model with continual pre-training.

1.0K
Archived
Python
LLM Frameworks
Continual Pre-training
PyTorch
#llama#moe#continual-pretraining
