
The AMW Read
Cohere updates its player map with a compact, open-weight model that explicitly targets the trade-off between model size and linguistic breadth (see §B) to capture emerging markets.
Foundation Models · Player Map · Scaling Laws
Cohere Labs released Tiny Aya, a 3.35B-parameter open-weight model supporting 70+ languages that runs offline on laptops and phones. Launched at the India AI Summit with regional variants for Africa, South Asia, and Asia-Pacific, the model democratizes AI access by removing cloud dependency while capturing cultural nuances missing from English- and Chinese-dominant models. The strategic move challenges the "bigger is better" paradigm, showing that compact models can unlock emerging markets largely overlooked by mainstream LLM development.
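To make "runs offline" concrete: a minimal sketch of fully local inference with Hugging Face Transformers, assuming the weights have already been downloaded once. The checkpoint ID `CohereLabs/tiny-aya` is a hypothetical placeholder for illustration; the published identifier may differ.

```python
# Minimal sketch: local, network-free inference once weights are cached.
# The model ID below is an assumption, not a confirmed Hugging Face path.
import os

os.environ["HF_HUB_OFFLINE"] = "1"  # fail fast if weights aren't already cached locally

from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "CohereLabs/tiny-aya"  # hypothetical checkpoint identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# A multilingual prompt of the kind the model is positioned for.
prompt = "Translate to Swahili: Good morning, friends."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Setting `HF_HUB_OFFLINE=1` is the point of the sketch: after one initial download, a 3.35B-parameter model fits comfortably on consumer hardware and never needs to call out to a cloud API.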



