containers/ramalama

RamaLama simplifies local serving of AI models and enables their use for inference in production via containers.

Language: Python
Categories: AI & Machine Learning, LLM Frameworks
License: MIT

Stars: 2.6K
Forks: 306
Created: Jul 24, 2024
Last Updated: Mar 4, 2026

Project Analytics

Stars Growth (1 Month): +58 (+2.3%)
Avg Daily Growth (1 Month): +2.1 stars per day
Fork/Star Ratio (All Time): 11.6% (good engagement)
Lifetime Growth: 4.5 stars/day over 590 days

Charts: stars, forks, open issues, pull requests, and commits over time.

AI-Generated Tags: ai, containers, inference-server, llm, vllm, hacktoberfest

