Explore Projects

Discover 12 open source projects

Active filters (1):
Search: ggml

Showing 1-12 of 12 projects

ggml-org/llama.cpp

Run LLMs locally in C/C++ with high performance

96.8K
Active
C++
Local Inference Engines
#llama.cpp #ggml #c++

ggml-org/whisper.cpp

High-performance C/C++ port of OpenAI's Whisper for speech recognition

47.2K
Active
C++
Inference
CLI Tools
#speech-to-text #c++ #inference

ggml-org/ggml

A tensor library for machine learning, with a focus on large language models and automatic differentiation.

14.2K
Active
C++
LLM Frameworks
#machine-learning #tensor-algebra #automatic-differentiation

xorbitsai/inference

Unified, production-ready inference API for running open-source language, speech, and multimodal models in the cloud, on-prem, or on your laptop.

9.1K
Active
Python
LLM Frameworks
Inference
PyTorch
#artificial-intelligence #llm #inference

leejet/stable-diffusion.cpp

A C/C++ implementation of Stable Diffusion and other diffusion models for image generation and processing.

5.5K
Active
C++
Computer Vision
Inference
#ai #image-generation #diffusion

guinmoon/LLMFarm

An iOS and macOS app for running large language models such as LLaMA, GPT-2, and RWKV offline, using the GGML library.

2.0K
Active
C
LLM Frameworks
iOS
Swift
#llama #ggml #gpt-2

ggml-org/llama.vim

A Vim plugin that provides LLM-assisted code and text completion to boost developer productivity.

1.9K
Active
Vim Script
AI Code Editors
IDE Extensions
Vim
#llm #code-completion #text-completion

marella/ctransformers

Python bindings for Transformer models implemented in C/C++ using the GGML library.

1.9K
Archived
C
LLM Frameworks
LLM Wrappers & SDKs
#transformers #llm #c++

sammcj/gollama

A Go-based command-line tool for managing Ollama LLM models on Linux and macOS.

1.7K
Stable
Go
LLM Frameworks
CLI Tools
#ai #llm #ggml

RWKV/rwkv.cpp

An efficient C++ implementation of the RWKV language model for fast CPU inference on various bit-width quantizations.

1.6K
Experimental
C++
LLM Frameworks
Inference
#language-model #llm #quantization

RahulSChand/gpu_poor

A JavaScript tool for estimating tokens/s and GPU memory requirements for large language models like LLaMA.

1.4K
Archived
JavaScript
LLM Frameworks
CLI Tools
#llm #gpu #quantization

ggml-org/llama.vscode

A VS Code extension that provides LLM-assisted code and text completion for developers.

1.2K
Active
TypeScript
AI Code Editors
IDE Extensions
VS Code
#llm #completion #vscode
