Project comparison
Compare adoption, momentum, maintenance health, and project basics before choosing which tool to evaluate in depth.
Ipex Llm: Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, DeepSeek, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discrete GPU such as Arc, Flex and Max); seamlessly integrate with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, vLLM, DeepSpeed, Axolotl, etc.
Andrej Karpathy Skills: A single CLAUDE.md file to improve Claude Code behavior, derived from Andrej Karpathy's observations on LLM coding pitfalls.
Andrej Karpathy Skills has the larger GitHub footprint with 125.3K stars.
Neither project gained stars this week, so near-term momentum is flat for both.
Ipex Llm has the stronger health score at 42/100.
| Signal | Ipex Llm | Andrej Karpathy Skills |
|---|---|---|
| GitHub stars | 8.8K | 125.3K |
| Weekly star growth | 0 | 0 |
| Health score (of 100) | 42 | 32 |
| Contributors | 124 | 7 |
| Commits per week | 0.0 | 0.6 |
| Open issues | 1.5K | 88 |
| Language | Python | N/A |
| License | Apache-2.0 | N/A |
| Last commit | 3mo ago | 23d ago |
| Last release | v2.2.0 | N/A |