On March 18, a model called "Hunter Alpha" appeared on OpenRouter with no company name attached. No press release. No announcement. Just a mystery trillion-parameter model that immediately climbed the usage charts.

Within days, developers traced it back to Xiaomi, a company best known for budget smartphones, not frontier AI. The model is MiMo-V2-Pro, and it's now one of the most-used models on OpenRouter.

What Makes It Different

MiMo-V2-Pro uses a Mixture-of-Experts architecture with roughly 1 trillion total parameters, activating around 42 billion per inference pass, meaning it runs at a fraction of what full-scale inference would cost. It supports a 1-million-token context window, placing it in the same class as the most capable models available.
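The cost advantage of the MoE design comes down to the active-parameter ratio. A back-of-envelope sketch using the figures reported above (approximate, not official specs):

```python
# Rough figures as reported; a back-of-envelope sketch, not official specs.
TOTAL_PARAMS = 1_000_000_000_000   # ~1 trillion total parameters
ACTIVE_PARAMS = 42_000_000_000     # ~42 billion activated per inference pass

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"Active per pass: {active_fraction:.1%}")  # only ~4% of the network works on each token
```

Since compute per token scales with active parameters rather than total parameters, this is roughly where the order-of-magnitude cost savings over a dense trillion-parameter model comes from.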

The pricing is where things get disruptive: at approximately $1 per million input tokens, it costs roughly a third as much as Anthropic's Claude Sonnet 4.6 (priced at ~$3 per million input tokens).
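At scale, that per-token gap compounds quickly. A quick comparison using the approximate rates above and a hypothetical monthly workload:

```python
# Approximate published rates; actual OpenRouter pricing may vary.
MIMO_RATE = 1.0     # $ per million input tokens (MiMo-V2-Pro)
SONNET_RATE = 3.0   # $ per million input tokens (Claude Sonnet 4.6)

monthly_tokens_m = 500  # hypothetical workload: 500M input tokens per month
mimo_cost = MIMO_RATE * monthly_tokens_m
sonnet_cost = SONNET_RATE * monthly_tokens_m
print(f"MiMo-V2-Pro: ${mimo_cost:,.0f}/mo  vs  Sonnet 4.6: ${sonnet_cost:,.0f}/mo")
```

For an input-heavy agentic workload, the difference is $500 versus $1,500 a month under these assumed rates.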

On agentic benchmarks, MiMo-V2-Pro performs close to Claude Opus 4.6, a significantly larger and pricier model. Coding evaluations show it outperforming Claude Sonnet 4.6 outright in several real-world tests.

The Stealth Drop

Xiaomi made no public announcement before the launch. The model simply appeared with the identifier "Hunter Alpha" and no attribution. Developer communities reverse-engineered the attribution over roughly four days before Xiaomi confirmed it.

The stealth approach is a sharp contrast to the marketing-heavy launches that typify Western AI labs. It's also a signal that Chinese consumer hardware companies, not just dedicated AI research labs, have quietly built the infrastructure to compete at the frontier.

What's Next

MiMo-V2-Pro is currently available via OpenRouter API. Xiaomi's MiMo model family also includes earlier open-source multimodal releases. Whether MiMo-V2-Pro will be open-sourced remains unclear, but its commercial availability through OpenRouter makes it accessible to developers today.
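Because OpenRouter exposes an OpenAI-compatible chat completions endpoint, calling the model is straightforward. A minimal sketch of building such a request; the model slug "xiaomi/mimo-v2-pro" is a guess and should be checked against OpenRouter's model list:

```python
import json
import os

# Hypothetical slug -- verify the actual identifier on OpenRouter's model page.
MODEL = "xiaomi/mimo-v2-pro"

def build_chat_request(prompt: str) -> tuple[dict, dict]:
    """Build headers and JSON body for OpenRouter's OpenAI-compatible
    POST /api/v1/chat/completions endpoint."""
    headers = {
        "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, body

headers, body = build_chat_request("Summarize the tradeoffs of MoE inference.")
print(json.dumps(body, indent=2))
# Send with any HTTP client to https://openrouter.ai/api/v1/chat/completions
```

The same request shape works for any model on the platform, so swapping MiMo-V2-Pro in for an existing OpenAI-style integration is typically a one-line change to the model field.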