RWKV (pronounced RwaKuv) is an RNN with great LLM performance that can also be trained directly like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". It combines the best of RNNs and transformers: great performance, linear time, constant space (no KV cache), fast training, "infinite" ctx_len, and free sentence embedding.
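The "constant space (no KV cache)" property follows from the RNN formulation: each token updates a fixed-size recurrent state instead of appending to a cache that grows with sequence length. A minimal toy sketch (hypothetical, not the actual RWKV-7 kernel; `step`, `readout`, and the scalar `decay` are illustrative simplifications):

```python
import numpy as np

# Toy sketch of a linear-attention-style recurrence: the state is a
# fixed d x d matrix, so memory per generated token is constant.
# This is NOT the real RWKV-7 update rule, just an illustration.

def step(state, k, v, decay):
    """One recurrent update: state <- decay * state + outer(k, v)."""
    return decay * state + np.outer(k, v)

def readout(state, q):
    """Read the current token's output from the fixed-size state."""
    return q @ state

d = 4
rng = np.random.default_rng(0)
state = np.zeros((d, d))
decay = 0.9

for _ in range(1000):          # process 1000 tokens...
    k, v, q = rng.standard_normal((3, d))
    state = step(state, k, v, decay)
    y = readout(state, q)

print(state.shape)             # ...yet the state never grows: (4, 4)
```

Contrast this with a transformer's KV cache, which stores keys and values for every past token and therefore grows linearly with context length.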