
French Startup Mistral AI Expands Open Source With Magistral Small 1.2 Release

Mistral AI Unveils Open Source Magistral Small 1.2 With 24B Parameters and 128k Context

Mistral AI has released its open source Magistral Small 1.2 model, offering 24B parameters, 128k context, and plug-and-play developer tools.

Mistral AI, the French artificial intelligence company, has officially launched Magistral Small 1.2, its latest open source reasoning model. The model has 24 billion parameters and is released under the Apache 2.0 license, reinforcing Mistral's commitment to open source AI innovation.

The new version introduces several key upgrades. It supports a 128k context length, handles multiple languages, and adds a visual encoder so the model can work on combined text–image tasks. A dedicated [THINK] token has also been introduced to delimit the model's reasoning trace, making it easier for applications to separate intermediate reasoning from the final answer.
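As a rough illustration of what that delimiting enables, the Python sketch below splits a completion into its reasoning trace and final answer. The announcement mentions only a [THINK] token; the matching [/THINK] closing marker used here is an assumption for illustration, not something confirmed by Mistral.

```python
import re

# Assumed markers: the release notes mention a [THINK] token; a paired
# [/THINK] closing token is assumed here purely for illustration.
THINK_PATTERN = re.compile(r"\[THINK\](.*?)\[/THINK\]", re.DOTALL)

def split_reasoning(generated_text: str) -> tuple[str, str]:
    """Return (reasoning, answer) from a raw model completion."""
    match = THINK_PATTERN.search(generated_text)
    if match is None:
        # No reasoning block found: treat the whole output as the answer.
        return "", generated_text.strip()
    reasoning = match.group(1).strip()
    answer = generated_text[match.end():].strip()
    return reasoning, answer

example = "[THINK]2 + 2 = 4[/THINK]The answer is 4."
print(split_reasoning(example))  # ('2 + 2 = 4', 'The answer is 4.')
```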

To simplify adoption, Magistral Small 1.2 ships with inference templates compatible with vLLM, Transformers, and llama.cpp, enabling plug-and-play deployment. Developers also get GGUF-quantized weights and Unsloth fine-tuning examples, which lower the barrier to experimentation and customisation.
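To give a sense of that workflow, here is a minimal sketch of loading the model through vLLM's offline Python API. The Hugging Face model ID and the Mistral-specific tokenizer mode are assumptions for illustration; the exact identifiers should be taken from Mistral's model card.

```python
from vllm import LLM, SamplingParams

# Hypothetical repository name -- check Mistral's Hugging Face page for
# the actual model ID of Magistral Small 1.2.
MODEL_ID = "mistralai/Magistral-Small-1.2"

# tokenizer_mode="mistral" asks vLLM to use Mistral's native tokenizer
# format; this assumes the release is packaged that way.
llm = LLM(model=MODEL_ID, tokenizer_mode="mistral", max_model_len=131072)

sampling = SamplingParams(temperature=0.7, max_tokens=512)
messages = [
    {"role": "user", "content": "Summarise the Apache 2.0 license in one sentence."}
]

outputs = llm.chat(messages, sampling_params=sampling)
print(outputs[0].outputs[0].text)
```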

Alongside the open source model, Mistral upgraded its enterprise offering, Magistral Medium 1.2. This version powers conversational services on the Le Chat platform and is now also available via API on La Plateforme, extending its reach into business applications.
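For developers, that API access looks like a standard chat-completion call against La Plateforme. The snippet below is a sketch using Mistral's official Python client; the model alias "magistral-medium-latest" is an assumption and should be checked against Mistral's API documentation.

```python
import os
from mistralai import Mistral

# API key issued through La Plateforme.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="magistral-medium-latest",  # assumed alias for Magistral Medium 1.2
    messages=[
        {"role": "user", "content": "Outline a rollout plan for an internal AI assistant."}
    ],
)

print(response.choices[0].message.content)
```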

The dual release highlights Mistral’s strategy of advancing both community-driven open source AI and enterprise-grade solutions. By making its models widely accessible and developer-friendly, Mistral is positioning itself as a leading open source-first AI company, equipping researchers, businesses, and innovators with powerful tools to drive efficiency and scalability in the evolving AI market.
