Run local LLMs on any device with GPT4All
Stars: 77.2K
Forks: 8.3K
Created: Mar 27, 2023
Last updated: May 27, 2025
Stars per day: +0.1% change — good engagement (stars/day over 1.1K days)
Related projects:
- Run LLMs locally in C/C++ with high performance
- Web UI for local AI with multiple backends and offline capabilities
- Run frontier AI models locally across devices using RDMA and tensor parallelism
- Open-source ChatGPT replacement with local AI model support