Project comparison
Compare adoption, momentum, maintenance health, and project basics before choosing which tool to evaluate in more depth.
vLLM: A high-throughput and memory-efficient inference and serving engine for LLMs.
Ollama: Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
Ollama has the larger GitHub footprint at 171.2K stars to vLLM's 79.7K.
vLLM is currently growing faster, adding +606 stars this week versus Ollama's +450.
Ollama has the slightly stronger health score, 95/100 to vLLM's 93.
| Signal | vLLM | Ollama |
|---|---|---|
| GitHub stars | 79.7K | 171.2K |
| Weekly growth | +606 | +450 |
| Health score | 93 | 95 |
| Contributors | 2.6K | 600 |
| Commits per week | 208.8 | 24.5 |
| Open issues | 4.9K | 3.2K |
| Language | Python | Go |
| License | Apache-2.0 | MIT |
| Last commit | 13h ago | 13h ago |
| Last release | v0.20.2 | v0.23.2 |