Xiaomi Debuts MIT-Licensed Trillion-Parameter MiMo Models

Xiaomi Releases Trillion-Scale MiMo AI Models With 1M-Token Context And MoE Architecture

Xiaomi has open-sourced its MiMo-V2.5 model family under the MIT License, bringing trillion-scale MoE models, 1M-token context windows and agentic multimodal capabilities to developers.

Xiaomi has open-sourced its MiMo-V2.5 model family under the MIT License, releasing model weights, tokenizers and model cards on Hugging Face for commercial use, continued training and fine-tuning without additional authorisation.

The release includes MiMo-V2.5-Pro, a 1.02-trillion-parameter Mixture-of-Experts model with 42 billion active parameters, and MiMo-V2.5, a 310-billion-parameter omnimodal model with 15 billion active parameters. Both support 1-million-token context windows and use a sparse Mixture-of-Experts architecture with hybrid attention.
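The gap between total and active parameters comes from sparse expert routing: each token is sent to only a few experts per layer, so most weights sit idle on any given forward pass. The toy sketch below illustrates the top-k routing pattern generally used in sparse MoE layers; the shapes, names and k value are illustrative assumptions, not Xiaomi's actual implementation.

```python
import math

def topk_moe_forward(x, expert_weights, router_logits, k=2):
    """Toy sparse-MoE layer: route input x to the top-k experts only.

    Because only k of len(expert_weights) experts run per token, a model
    with 1.02T total parameters can activate only ~42B of them.
    Hypothetical sketch, not Xiaomi's code.
    """
    # Softmax over router logits gives per-expert gate probabilities.
    m = max(router_logits)
    exps = [math.exp(l - m) for l in router_logits]
    total = sum(exps)
    gates = [e / total for e in exps]

    # Keep only the k highest-gated experts and renormalise their gates.
    top = sorted(range(len(gates)), key=lambda i: gates[i], reverse=True)[:k]
    norm = sum(gates[i] for i in top)

    # Each "expert" is a scalar weight here; combine their weighted outputs.
    y = sum((gates[i] / norm) * expert_weights[i] * x for i in top)
    return y, top

y, active = topk_moe_forward(x=2.0,
                             expert_weights=[0.5, 1.0, 1.5, 2.0],
                             router_logits=[0.1, 2.0, 0.3, 1.5],
                             k=2)
# Only 2 of the 4 experts contribute to y.
```

In production MoE models the experts are full feed-forward networks and the router is learned, but the active-parameter accounting works the same way: compute cost scales with k, not with the expert count.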

MiMo-V2.5-Pro targets advanced coding, software engineering and long-horizon autonomous agents, sustaining sequences of 1,000-plus tool calls across multi-step agent operations. Xiaomi said the model has been evaluated on compiler construction, full application generation and circuit design optimisation.
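Long-horizon agent operation of this kind typically follows a simple loop: the model chooses a tool, the runtime executes it, and the result is fed back until the model signals completion or exhausts its call budget. The following is a minimal sketch of that pattern with a stand-in "model"; all names here are hypothetical and not part of Xiaomi's agent stack.

```python
def run_agent(model_step, tools, max_calls=1000):
    """Minimal long-horizon agent loop: the model repeatedly picks a tool
    until it signals completion or hits the call budget.
    Illustrative pattern only, not Xiaomi's implementation."""
    history = []
    for _ in range(max_calls):
        action = model_step(history)          # model decides the next action
        if action["tool"] == "finish":
            return action["args"], len(history)
        result = tools[action["tool"]](**action["args"])
        history.append((action, result))      # feed the result back in
    return None, len(history)                 # budget exhausted

# Toy stand-in "model": call an increment tool three times, then finish.
def fake_model(history):
    if len(history) < 3:
        return {"tool": "inc", "args": {"x": len(history)}}
    return {"tool": "finish", "args": {"answer": history[-1][1]}}

answer, calls = run_agent(fake_model, {"inc": lambda x: x + 1})
# answer == {"answer": 3}, reached after 3 tool calls
```

Sustaining 1,000-plus iterations of this loop is mostly a context-management problem, which is where the 1M-token window matters: the accumulating tool history has to stay within what the model can attend to.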

MiMo-V2.5 combines text, image, video and audio in a unified architecture, backed by a 48-trillion-token training corpus, a 729M-parameter Vision Transformer and an integrated audio encoder.

Across the model family, Xiaomi highlights a 6:1 ratio of sliding-window to global attention layers, an approximately 7x KV-cache reduction, Multi-Token Prediction for inference efficiency and FP8 mixed precision.
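The roughly 7x KV-cache figure is consistent with simple arithmetic: if six of every seven layers attend only within a fixed sliding window while one attends over the full context, the window layers cache a constant number of tokens regardless of sequence length. A back-of-the-envelope sketch, assuming a hypothetical 4k window (Xiaomi has not stated the window size):

```python
def kv_cache_ratio(context_len, window, swa_to_global=6):
    """Approximate KV-cache saving from mixing sliding-window and global
    attention layers at a given ratio (6:1 per Xiaomi's description).
    Illustrative arithmetic, not the model's exact layout."""
    layers = swa_to_global + 1                       # one repeating block
    full = layers * context_len                      # all layers cache everything
    mixed = swa_to_global * min(window, context_len) + context_len
    return full / mixed

# At a 1M-token context with an assumed 4k sliding window, the cache
# shrinks by close to the 7x Xiaomi claims.
print(round(kv_cache_ratio(1_000_000, 4_096), 2))
```

As the context grows, the window layers' cache cost becomes negligible next to the global layer's, so the ratio approaches the layer ratio itself (7x here); at short contexts the saving disappears.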

MiMo-V2.5-Pro was pre-trained on 27 trillion tokens and uses supervised fine-tuning, reinforcement learning, Multi-Teacher On-Policy Distillation and progressive context scaling to 1M tokens.

The models support deployment through SGLang and vLLM, extending Xiaomi’s push into open-source infrastructure for long-context agentic AI.
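Since both serving stacks expose standard launch commands, getting a local endpoint up is a one-liner. The sketch below uses vLLM's CLI; the Hugging Face repo id is a hypothetical placeholder, and flags such as tensor-parallel size would need tuning to the actual hardware.

```shell
# Serve a MiMo-V2.5 checkpoint with vLLM (repo id is a placeholder —
# check Xiaomi's Hugging Face organisation for the published name).
vllm serve XiaomiMiMo/MiMo-V2.5 \
    --tensor-parallel-size 8 \
    --max-model-len 1000000
```

This starts an OpenAI-compatible HTTP server on the default port, so existing client code can point at it without modification; SGLang offers an equivalent launch path via `python -m sglang.launch_server`.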
