
The AMW Read
The release of a high-parameter sparse MoE model by a small team validates the open-weight challenger frame (§B) and demonstrates significant training cost-efficiency relative to frontier labs.
Foundation Models · Player Map · Scaling Laws
Arcee AI has released Trinity, a 400B-parameter sparse Mixture-of-Experts model that challenges Meta’s Llama 4. Trained in six months for roughly $20M on 2,048 Blackwell B300 GPUs, it delivers frontier-class reasoning under a permanent Apache 2.0 license. The release signals a systemic shift: a roughly 30-person team can now break Big Tech’s hold on foundation models. With only 13B active parameters per token, Trinity offers a domestic, legally clean path for high-performance agentic workflows. 🚀
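
To make the "400B total, 13B active" figure concrete, here is a minimal sketch of how sparse MoE routing keeps per-token compute low: a router selects a small top-k subset of experts for each token, so only that subset's parameters participate in the forward pass. This is a generic illustration of the technique, not Trinity's actual architecture or code; all class and parameter names below are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Illustrative sparse Mixture-of-Experts layer: a router picks top-k
    experts per token, so only a small fraction of total parameters is
    active on any given forward pass (the "13B active of 400B total" idea)."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model)
        logits = self.router(x)                              # (tokens, experts)
        weights, indices = logits.topk(self.top_k, dim=-1)   # choose k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                  # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Tiny demo: 8 experts with 2 active per token -> only ~1/4 of expert
# parameters are exercised for any individual token.
layer = SparseMoELayer(d_model=64, d_ff=256, num_experts=8, top_k=2)
tokens = torch.randn(16, 64)
print(layer(tokens).shape)  # torch.Size([16, 64])
```

The same logic explains the cost story: training and inference FLOPs scale with the active parameter count rather than the full 400B, which is how a small team can afford frontier-scale capability.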
