Arcee AI Unveils 400B Parameter Open-Source Model
28 Jan
Summary
- Arcee AI released Trinity, a 400B parameter open-source foundation model.
- Trinity rivals models from Meta and Z.ai in benchmarks.
- The company trained Trinity and two smaller models in six months for $20 million.

Arcee AI, a startup with only 30 employees, has introduced Trinity, a 400B-parameter open-source foundation model released under the permissive Apache license. The release directly challenges the prevailing industry belief that Big Tech companies and their favored AI partners will monopolize the market.
According to benchmark tests, Trinity performs comparably to established large-scale models such as Meta's Llama 4 Maverick 400B and China's Z.ai GLM-4.5, especially in foundational tasks such as coding, math, common sense, and reasoning. Trinity is currently text-only, but multimodal capabilities, including vision and speech-to-text, are planned.
The company highlights its efficient development process: Trinity and two smaller models, Trinity Mini (26B parameters) and Trinity Nano (6B parameters), were trained within six months on 2,048 Nvidia Blackwell B300 GPUs for a total expenditure of $20 million, a fraction of the investment made by larger AI labs.