by ShipItAndPray
MCP server for LLM quantization. Compress any HuggingFace model to GGUF, GPTQ, or AWQ format. 6 tools: info, check, recommend, quantize, evaluate, push. Self-contained Python server; no external CLI needed.
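A tool like `recommend` presumably maps a model's size and the available hardware to a quantization format and bit width. As a rough illustration only, here is a minimal sketch of such a heuristic; the function name, formats, thresholds, and overhead factor are all assumptions, not taken from the actual server:

```python
def recommend(params_b: float, vram_gb: float) -> dict:
    """Hypothetical sketch: suggest a quantization config for a model of
    `params_b` billion parameters on a GPU with `vram_gb` GiB of memory.
    Thresholds and format choices are illustrative, not the server's logic."""
    for bits in (8, 5, 4, 3, 2):
        # Rough footprint: one weight costs bits/8 bytes, plus ~20% overhead
        # for activations and KV cache.
        need_gb = params_b * bits / 8 * 1.2
        if need_gb <= vram_gb:
            # GGUF covers the odd bit widths; GPTQ for the common 4/8-bit cases.
            fmt = "GGUF" if bits in (5, 3, 2) else "GPTQ"
            return {"format": fmt, "bits": bits, "est_gb": round(need_gb, 1)}
    # Nothing fits: fall back to the most aggressive 2-bit GGUF quant.
    return {"format": "GGUF", "bits": 2, "est_gb": round(params_b * 0.3, 1)}
```

For example, a 7B-parameter model on a 16 GiB GPU would fit at 8 bits, while the same model on an 8 GiB GPU would drop to a 5-bit GGUF quant under this sketch.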