Project comparison
Compare adoption, momentum, maintenance health, and project basics before choosing which tool to evaluate more deeply.
WebLLM: High-performance in-browser LLM inference engine.
vLLM: A high-throughput and memory-efficient inference and serving engine for LLMs.
vLLM has the larger GitHub footprint with 79.7K stars.
vLLM is currently growing faster at +606 stars this week.
vLLM has the stronger health score at 93/100.
| Signal | WebLLM | vLLM |
|---|---|---|
| GitHub stars | 18.0K | 79.7K |
| Weekly growth | +0 | +606 |
| Health score | 46 | 93 |
| Contributors | 52 | 2.6K |
| Commits per week | 1.8 | 208.8 |
| Open issues | 144 | 4.9K |
| Language | TypeScript | Python |
| License | Apache-2.0 | Apache-2.0 |
| Last commit | 6d ago | 14h ago |
| Last release | v0.2.83 | v0.20.2 |
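The health score in the table blends maintenance signals into a single 0–100 number. The exact formula is not published here, so the sketch below is a hypothetical illustration of how such a composite could be built from the signals shown (contributors, commit cadence, open-issue backlog, and commit recency); the weights and log scaling are assumptions, not the real metric.

```python
# Hypothetical composite health score -- an illustration only, NOT the
# actual formula behind the table above. Log scaling keeps very large
# projects from saturating the score linearly.
from dataclasses import dataclass
import math


@dataclass
class RepoSignals:
    contributors: int
    commits_per_week: float
    open_issues: int
    hours_since_last_commit: float


def health_score(s: RepoSignals) -> int:
    """Blend maintenance signals into an illustrative 0-100 score."""
    # Commit cadence, capped: ~200 commits/week counts as fully active.
    activity = min(1.0, math.log1p(s.commits_per_week) / math.log1p(200))
    # Contributor base, capped around 2,000 contributors.
    community = min(1.0, math.log1p(s.contributors) / math.log1p(2000))
    # A large backlog often tracks popularity, so penalize only mildly.
    backlog = 1.0 / (1.0 + math.log1p(s.open_issues) / 10)
    # Full credit for a commit now, decaying to zero over ~30 days.
    recency = max(0.0, 1.0 - s.hours_since_last_commit / (24 * 30))
    return round(100 * (0.35 * activity + 0.25 * community
                        + 0.15 * backlog + 0.25 * recency))


# Signals taken from the table above (last commit: 6d = 144h vs 14h).
web_llm = RepoSignals(contributors=52, commits_per_week=1.8,
                      open_issues=144, hours_since_last_commit=144)
vllm = RepoSignals(contributors=2600, commits_per_week=208.8,
                   open_issues=4900, hours_since_last_commit=14)
```

With these assumed weights, vLLM's far higher commit cadence, contributor count, and fresher last commit dominate, so it scores well above WebLLM, mirroring the ordering in the table.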