
Latest news, updates, and announcements from Arcee.

Arcee has launched Trinity Large Thinking, a 400B-parameter sparse MoE LLM with only 13B parameters active per token, built by a 26-person team on a $20M budget. The model is Apache 2.0 licensed, runs 2-3× fast...
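For readers unfamiliar with sparse MoE, the "400B total / 13B active" framing comes from routing: a small gating network sends each token to only a few experts, so most parameters sit idle on any given forward pass. Below is a minimal, illustrative sketch of top-k expert routing in PyTorch; the class name, dimensions, and expert count are our own placeholders, not Trinity's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Top-k routed mixture-of-experts block (illustrative dimensions only)."""
    def __init__(self, d_model=512, d_ff=2048, n_experts=64, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
            )
            for _ in range(n_experts)
        )

    def forward(self, x):
        # x: (n_tokens, d_model). The router scores every expert, but each
        # token only runs through its top-k -- the rest stay idle, which is
        # why "active" parameters are far fewer than total parameters.
        scores = self.router(x)                         # (n_tokens, n_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # pick k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e in idx[:, slot].unique().tolist():
                mask = idx[:, slot] == e
                out[mask] += weights[mask, slot:slot + 1] * self.experts[e](x[mask])
        return out

layer = SparseMoELayer()
out = layer(torch.randn(8, 512))
print(out.shape)  # torch.Size([8, 512]); only 2 of 64 experts ran per token
```

With `top_k=2` of 64 experts, roughly 1/32 of the expert parameters are touched per token; that ratio, scaled up, is the mechanism behind headline figures like 400B total versus 13B active.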

Arcee AI has released Trinity, a 400B-parameter sparse Mixture-of-Experts model challenging Meta's Llama 4. Trained for $20M in 6 months using 2,048 Blackwell B300 GPUs, it delivers frontier reasoning under a p...
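As a rough sanity check on those headline numbers (a back-of-envelope estimate, not Arcee's accounting), the reported budget and cluster size imply a compute price in the low single dollars per GPU-hour:

```python
# Back-of-envelope: what $20M over 6 months on 2,048 GPUs implies per
# GPU-hour. Assumes the whole budget went to compute at full utilization --
# an illustration derived from the reported figures, not Arcee's actual costs.
gpus = 2048
hours = 6 * 30 * 24                     # ~4,320 hours in six months
gpu_hours = gpus * hours                # ~8.85M GPU-hours
print(f"~${20_000_000 / gpu_hours:.2f} per GPU-hour")  # ~$2.26
```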