The AMW Read
Sarvam's release validates the sovereign-AI, open-weight strategy (Frame 2): state-level compute subsidies fund domestic infrastructure capable of challenging global frontier dominance.
Indian AI startup Sarvam has open-sourced two reasoning models (30B and 105B parameters) trained entirely on domestic infrastructure under the IndiaAI Mission, with weights now available on Hugging Face. The 105B model uses a Mixture-of-Experts architecture with just 10.3B active parameters, achieving compute efficiency while supporting 10 Indian languages and a 128K-token context window. Backed by Rs 98.68 crore in GPU subsidies covering 4,096 Nvidia H100 GPUs, this marks India's most significant sovereign AI release to date. These open models now power enterprise platforms like Samvaad and Indus, enabling developers to build India-specific AI applications without relying on foreign foundation models. This positions India alongside DeepSeek and Mistral in the global open-source AI race.
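The efficiency claim above rests on how Mixture-of-Experts routing works: only a small subset of expert sub-networks runs per token, so active parameters (10.3B) are far fewer than total parameters (105B). Below is a minimal toy sketch of top-k MoE routing; the dimensions, expert count, and function names are illustrative assumptions, not Sarvam's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions -- chosen for illustration only.
d_model, n_experts, top_k = 16, 8, 2

# Each "expert" is a small feed-forward weight matrix.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route token vector x to its top-k experts and mix their outputs."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]          # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # softmax over the selected experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.standard_normal(d_model)
y = moe_forward(x)

# Only top_k of n_experts run per token, so the active-parameter count
# is a fraction of the total -- the source of MoE's compute efficiency.
active_frac = top_k / n_experts
print(f"active expert fraction per token: {active_frac:.2f}")
```

In this toy setup, 2 of 8 experts fire per token, mirroring (at tiny scale) how a 105B-parameter model can run with only ~10% of its weights active on any given token.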

