A self-hosted, open-source alternative to OpenAI offering local LLM inference, no GPU required.
An open-source client for KoboldAI language models that lets vibe coders build with AI tools.
A free, open-source Rust inference server compatible with the OpenAI API, suitable for vibe coders.
Provides GGUF quantization support for native ComfyUI models
A C++ library for building local AI inference platforms with support for ONNX models.
An open-source tutorial that helps developers deploy large language models (LLMs) on CPU for production use with tools like Ollama.
Run AI models like LLaMA locally on your machine with Node.js bindings for llama.cpp, and enforce a JSON schema on the output.
A Go-based command-line tool for managing Ollama LLM models on Linux and macOS.
A Python library that provides an interface to OuteTTS transformer-based text-to-speech models.
An open-source DevOps tool for packaging and versioning AI/ML models, datasets, code, and configuration into an OCI Artifact.
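Several of the projects above expose an OpenAI-compatible HTTP API, which means standard OpenAI client code works against a local server. A minimal sketch of the request shape, assuming a hypothetical server at `localhost:8080` (the endpoint URL and model name are placeholders, not taken from any specific project above):

```python
import json
import urllib.request

# Hypothetical local endpoint; any OpenAI-compatible server accepts this payload shape.
BASE_URL = "http://localhost:8080/v1/chat/completions"

payload = {
    "model": "llama-3-8b-instruct",  # placeholder model name
    "messages": [
        {"role": "user", "content": "Say hello in one word."}
    ],
    "temperature": 0.7,
}

# Build the POST request; urlopen(request) would send it once a server is running.
request = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

print(request.get_method())  # → POST
```

Because the wire format is shared, swapping between a local server and the hosted OpenAI API is usually just a matter of changing the base URL and API key.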