
Sarvam AI has open-sourced two reasoning models built in India to power Indic-language AI applications, but early developer feedback highlights tooling and deployment gaps.
Sarvam AI has open-sourced two large reasoning models—Sarvam 30B and Sarvam 105B—in a move aimed at seeding an India-led open-source AI ecosystem and advancing sovereign AI capabilities.
The models are foundation models built and trained entirely in India using compute provided under the IndiaAI Mission. Designed for reasoning, coding, and agentic workflows, they also prioritise performance across Indic languages.
Sarvam 30B is positioned as an efficient reasoning model suited for real-time deployments and currently powers Samvaad, the company’s conversational agent platform. The larger Sarvam 105B targets complex reasoning and coding tasks and runs Indus, Sarvam’s multi-step AI assistant.
Both models use a mixture-of-experts architecture with 128 experts to scale reasoning capabilities while keeping active compute lower. Sarvam 30B incorporates Grouped Query Attention (GQA) to reduce memory consumption, while Sarvam 105B adds Multi-Head Latent Attention (MLA) to improve long-context inference efficiency. Training datasets include code, mathematics, multilingual content, and specialised knowledge corpora, with strong emphasis on Indian-language data across 22 languages.
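To make the memory benefit of Grouped Query Attention concrete, here is a minimal, illustrative sketch in NumPy: several query heads share each key/value head, so the KV cache holds far fewer heads. The head counts and dimensions below are toy values for illustration, not Sarvam's actual configuration.

```python
import numpy as np

def gqa(q, k, v):
    """Grouped Query Attention sketch.

    q: (n_q_heads, seq, d); k, v: (n_kv_heads, seq, d),
    where n_q_heads is a multiple of n_kv_heads.
    """
    n_q, seq, d = q.shape
    n_kv = k.shape[0]
    group = n_q // n_kv                       # query heads per KV head
    # Broadcast each KV head across its group of query heads.
    k = np.repeat(k, group, axis=0)           # (n_q_heads, seq, d)
    v = np.repeat(v, group, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True) # softmax over keys
    return weights @ v                        # (n_q_heads, seq, d)

rng = np.random.default_rng(0)
n_q_heads, n_kv_heads, seq, d = 8, 2, 4, 16  # toy sizes
q = rng.standard_normal((n_q_heads, seq, d))
k = rng.standard_normal((n_kv_heads, seq, d))
v = rng.standard_normal((n_kv_heads, seq, d))
out = gqa(q, k, v)
print(out.shape)  # (8, 4, 16)
# The KV cache stores 2 heads instead of 8: a 4x reduction in this toy setup.
```

The same grouping idea is why GQA cuts inference memory: only `n_kv_heads` key/value tensors are cached per layer, while full attention quality is approximated by letting each group of query heads attend through its shared KV head.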
However, early developer feedback highlights ecosystem gaps that could slow adoption. Missing GGUF deployment formats make it difficult to run the models locally using tools such as llama.cpp, while integration with common inference frameworks such as vLLM remains limited.
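For context, this is the kind of one-line deployment developers typically expect from a pre-packaged open-source release. The Hugging Face repo id below is a hypothetical placeholder, since the official artifact names may differ:

```shell
# Serve through vLLM's OpenAI-compatible server
# (repo id is a hypothetical placeholder, not a confirmed artifact)
vllm serve sarvamai/sarvam-30b --max-model-len 8192

# Local inference through llama.cpp would first need a GGUF conversion,
# which is exactly the artifact early adopters report is missing:
# ./llama-cli -m sarvam-30b.Q4_K_M.gguf -p "Hello"
```

Until such packaged formats ship, developers must convert weights and patch inference frameworks themselves, which is the friction the feedback points to.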
Industry experts say such friction often slows open-source adoption. Sudipta Biswas, co-founder of Floworks, noted, “I think with open source, while it is great for development and adoption, you really have to have things in a pre-packaged manner, so more people can use it.”
Despite the hurdles, the release marks a notable step toward India-native open AI infrastructure—provided Sarvam can cultivate a strong developer ecosystem before global players release competing multilingual models.