Project comparison
Compare adoption, momentum, maintenance health, and project basics before choosing which tool to evaluate deeper.
llama.cpp: LLM inference in C/C++
vLLM: A high-throughput, memory-efficient inference and serving engine for LLMs
- llama.cpp has the larger GitHub footprint, with 109.6K stars.
- llama.cpp is currently growing faster, at +1.2K stars this week.
- llama.cpp has the stronger health score, at 100/100.
| Signal | llama.cpp | vLLM |
|---|---|---|
| GitHub stars | 109.6K | 79.7K |
| Weekly growth | +1.2K | +606 |
| Health score | 100/100 | 93/100 |
| Contributors | 1.7K | 2.6K |
| Commits per week | 86.2 | 208.8 |
| Open issues | 1.6K | 4.9K |
| Language | C++ | Python |
| License | MIT | Apache-2.0 |
| Last commit | 14h ago | 13h ago |
| Last release | b9113 | v0.20.2 |
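Raw weekly star counts favor the larger project, so it can help to normalize growth against the current star count. The sketch below does that with the figures from the table above; the numbers are copied from the table, and the two-decimal rounding is my own choice, not part of the source data.

```python
# Relative weekly growth: weekly new stars as a percentage of total stars.
# Figures are taken from the comparison table above (109.6K / 79.7K stars,
# +1.2K / +606 stars this week).

def relative_growth(weekly_new: float, total: float) -> float:
    """Weekly star growth as a percentage of the current star count."""
    return round(weekly_new / total * 100, 2)

stats = {
    "llama.cpp": (1_200, 109_600),
    "vLLM": (606, 79_700),
}

for name, (weekly, total) in stats.items():
    print(f"{name}: {relative_growth(weekly, total)}% per week")
# llama.cpp: 1.09% per week
# vLLM: 0.76% per week
```

Even on this relative measure, llama.cpp is currently growing faster, though the gap is narrower than the raw weekly numbers suggest.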