Project comparison
Compare adoption, momentum, maintenance health, and project basics before choosing which tool to evaluate deeper.
Shimmy: ⚡ Python-free Rust inference server, OpenAI-API compatible. GGUF + SafeTensors, hot model swap, auto-discovery, single binary. FREE now, FREE forever.
llama.cpp: LLM inference in C/C++
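Shimmy advertises an OpenAI-compatible API, which means any OpenAI-style client code works against it unchanged. A minimal sketch of such a request using only the Python standard library; the port, URL path version, and model name below are assumptions for illustration, not documented Shimmy defaults:

```python
import json
from urllib import request

# Assumed local endpoint; check your server's actual host/port configuration.
BASE_URL = "http://localhost:8080/v1/chat/completions"

def build_chat_request(model, prompt):
    """Build the JSON body for an OpenAI-style /v1/chat/completions call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def send(body, url=BASE_URL):
    """POST the request. Because the wire format is the OpenAI schema,
    this same call works against any OpenAI-compatible server."""
    req = request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

# Build (but don't send) a request; sending requires a running server.
body = build_chat_request("my-local-model", "Say hello in one word.")
print(json.dumps(body, indent=2))
```

The same payload also works against llama.cpp's built-in `llama-server`, which exposes a compatible endpoint.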
- llama.cpp has the larger GitHub footprint with 109.6K stars.
- llama.cpp is currently growing faster, at +1.2K stars this week.
- llama.cpp has the stronger health score at 100/100.
| Signal | Shimmy | llama.cpp |
|---|---|---|
| GitHub stars | 4.8K | 109.6K |
| Weekly growth | 0 | +1.2K |
| Health score | 44 | 100 |
| Contributors | 4 | 1.7K |
| Commits per week | 0.2 | 86.2 |
| Open issues | 38 | 1.6K |
| Language | Rust | C++ |
| License | Apache-2.0 | MIT |
| Last commit | 1mo ago | 16h ago |
| Last release | v1.9.0 | b9113 |
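Most of the signals in the table (stars, open issues, last commit) come straight from GitHub repository metadata, so they can be reproduced via the public GitHub REST API. A minimal sketch, with a helper that matches the table's K-notation; the function names are mine, and the live fetch requires network access:

```python
import json
from urllib import request

def humanize(n):
    """Format a raw count the way the table does: 4800 -> '4.8K', 109600 -> '109.6K'."""
    if n >= 1000:
        return f"{n / 1000:.1f}K".replace(".0K", "K")
    return str(n)

def repo_signals(owner, name):
    """Fetch headline signals from the public GitHub REST API.
    Unauthenticated calls are rate-limited; note that GitHub's
    open_issues_count includes open pull requests."""
    url = f"https://api.github.com/repos/{owner}/{name}"
    with request.urlopen(url) as resp:
        data = json.load(resp)
    return {
        "stars": humanize(data["stargazers_count"]),
        "open_issues": humanize(data["open_issues_count"]),
        "last_push": data["pushed_at"],
    }

# Example (needs network): repo_signals("ggml-org", "llama.cpp")
print(humanize(109600), humanize(4800))
```

Derived metrics like weekly growth, commits per week, and the health score need history (repeated snapshots or the commits/contributors endpoints) rather than a single API call.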