A free, open-source Rust inference server with an OpenAI-compatible API, suitable for vibe coders.
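"OpenAI-compatible" means the server accepts the same request shape as OpenAI's `/v1/chat/completions` endpoint, so existing clients work by pointing them at a local URL. A minimal sketch of such a request body, assuming a placeholder model name ("local-model") and a hypothetical local port:

```python
import json

# Request body in the OpenAI chat-completions format.
# "local-model" is a placeholder; an OpenAI-compatible server maps it
# to whichever model it has loaded.
payload = {
    "model": "local-model",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
}

# POST this JSON to http://localhost:<port>/v1/chat/completions with any
# HTTP client; a compatible server returns a response mirroring OpenAI's
# schema (choices[0].message.content, usage, etc.).
body = json.dumps(payload)
```

Because the wire format matches, official OpenAI SDKs can usually be reused as-is by overriding the base URL.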
RamaLama simplifies local serving of AI models and enables their use for inference in production via containers.
Turn any computer or edge device into a command center for your computer vision projects.
The simplest way to serve AI/ML models in production, with support for popular models like Stable Diffusion and Whisper.