BWE News – USA, World, Tech, AI, Finance, Sports & Entertainment Updates
AI

Small startup Arcee AI built a 400B parameter open source LLM from scratch to best Meta’s Llama

By admin | January 28, 2026 | 6 min read


Many in the industry believe the winners of the AI model market have already been determined: big tech (Google, Meta, Microsoft, parts of Amazon) and the model makers they back (mostly OpenAI and Anthropic).

But Arcee AI, a small startup with 30 employees, disagrees. The company just released a general-purpose foundation model called Trinity under the permanently open Apache license, which Arcee claims is one of the largest open source foundation models ever trained and released by a U.S. company, at 400B parameters.

According to Arcee, benchmark tests conducted on the base model (with very little post-training) show that Trinity is comparable to Meta’s Llama 4 Maverick 400B and Z.ai’s GLM-4.5, a high-performance Chinese open source model (Z.ai spun out of Tsinghua University).

Arcee AI benchmark results for Trinity Large (preview version, base model). Image credit: Arcee AI

Like other state-of-the-art (SOTA) models, Trinity supports multi-step tasks such as coding and agentic workflows. Despite its size, though, it is not yet a true SOTA competitor, as it currently supports only text.

More modalities are on the way: a vision model is in development, and a speech-to-text version is also on the roadmap, CTO Lucas Atkins told TechCrunch. By comparison, Meta’s Llama 4 Maverick is already multimodal, supporting text and images.

But before adding more modalities to its roster, Arcee says it wanted a base LLM that would impress its primary target users: developers and academics. The team specifically wants to ensure that U.S. companies of all sizes don’t have to turn to Chinese open models.

“Ultimately, the winner in this game, and the only way to really win in usage, is to have the best open-weight model,” Atkins said. “To win the hearts and minds of developers, you have to give them the best.”


Benchmarks show that the Trinity base model, currently in preview and undergoing further post-training, holds its own against Llama, and in some cases slightly outperforms it, on coding, math, common-sense, knowledge, and reasoning tests.

The progress Arcee has made toward becoming a competitive AI lab is impressive. The larger Trinity model follows two smaller models released in December: the 26B-parameter Trinity Mini, a fully post-trained reasoning model for tasks ranging from web apps to agents, and the 6B-parameter Trinity Nano, an experimental model designed to push the limits of small but conversational models.

Amazingly, Arcee trained all three models in six months using 2,048 Nvidia Blackwell B300 GPUs at a total cost of $20 million. Founder and CEO Mark McQuade says this is a large share of the roughly $50 million the company has raised to date.
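As a back-of-the-envelope check of those figures (assuming, which the article does not state, that all 2,048 GPUs ran near-continuously for the full six months), the budget implies a per-GPU-hour cost roughly in line with current cloud rates for high-end accelerators:

```python
# Implied cost per GPU-hour for the reported Trinity training run.
# Assumption (not from the article): near-continuous use of all
# 2,048 GPUs over the full six months.
gpus = 2048
hours = 6 * 30.4 * 24            # ~six months in hours (~4,378)
gpu_hours = gpus * hours          # ~9 million GPU-hours
total_cost = 20_000_000           # reported $20M budget
cost_per_gpu_hour = total_cost / gpu_hours
print(f"~{gpu_hours / 1e6:.1f}M GPU-hours, ~${cost_per_gpu_hour:.2f}/GPU-hour")
```

Under those assumptions the run works out to roughly nine million GPU-hours at a little over $2 per GPU-hour; any idle time or reserved-capacity pricing would shift the figure.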

That kind of cash “was a lot for us,” said Atkins, who led the model-building effort. Still, he acknowledged that it pales in comparison to what large research institutions currently spend.

The six-month schedule was “very calculated,” Atkins said. Before turning to LLMs, Atkins built voice agents for automobiles. “We are a very hungry young startup. We have tremendous talent and bright young researchers. We believed that if given the opportunity to spend this amount of money and train a model of this scale, they would rise to the occasion. And they certainly did, despite the sleepless nights and long hours.”

McQuade, who was previously an early employee at open-source model marketplace Hugging Face, says Arcee didn’t set out to become America’s new AI lab. The company originally customized models for large corporate clients like SK Telecom.

“We were only doing post-training, so we took the great work of others. We took the Llama model, we took the Mistral model, we took the open source Qwen model, and we post-trained them to improve them,” he said, including doing reinforcement learning for the customer’s intended use.

But as the customer list grew, Atkins said, so did the need for a model of its own, and McQuade was wary of becoming dependent on other companies. At the same time, many of the best open models came from China, and U.S. companies were either wary of them or barred from using them.

It was a nerve-wracking decision. At the scale and level Arcee was aiming for, “I think there are fewer than 20 companies in the world that have ever pre-trained and released their own models,” McQuade said.

The company started small, with a 4.5B-parameter model created in partnership with training-data company DatologyAI. The project’s success encouraged even larger efforts.

But if the U.S. already has Llama, why does it need another open-weight model? Atkins said that by choosing the open source Apache license, the startup is committed to keeping its model open. This comes after Meta CEO Mark Zuckerberg suggested last year that the company might not necessarily open source all of its latest models.

“Llama can be considered not truly open source because it uses a Meta-managed license with commercial and usage caveats,” he said. This has led some open source organizations to argue that Llama is not open source at all.

“Arcee exists because America needs a permanently open, Apache-licensed, frontier-grade alternative that can actually compete on today’s frontier,” McQuade said.

All Trinity models, large and small, are available for free download. The largest version ships in three flavors. Trinity Large Preview is a lightly post-trained instruct model, meaning it is trained not only to predict the next word but also to follow human instructions, making it suitable for general chat use. Trinity Large Base is a base model with no post-training.

Additionally, TrueBase ships without any instruction data or post-training, so companies and researchers who want to customize the model don’t inherit data, rules, or assumptions they didn’t choose.

Arcee AI says it plans to eventually offer a hosted version of its general-release model at competitive API prices. That release is up to six weeks away as the startup continues to refine the model’s reasoning training.

Trinity Mini’s API is priced at $0.045/$0.15, with a rate-limited free tier available. Meanwhile, the company continues to sell post-training and customization services as well.
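Two-figure price quotes like this are conventionally per million input/output tokens, though the article does not state the unit. Under that assumption, a rough per-request cost estimate looks like:

```python
# Hypothetical cost estimate for Trinity Mini API usage.
# Assumption (not stated in the article): $0.045 / $0.15 are per
# MILLION input / output tokens, the common industry convention.
PRICE_IN = 0.045 / 1_000_000    # dollars per input token
PRICE_OUT = 0.15 / 1_000_000    # dollars per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one API call under the assumed per-million pricing."""
    return input_tokens * PRICE_IN + output_tokens * PRICE_OUT

# e.g. a 2,000-token prompt with a 500-token reply:
cost = request_cost(2_000, 500)
print(f"${cost:.6f}")  # prints $0.000165
```

At those assumed rates, even a million such requests would cost on the order of a few hundred dollars, which is the competitive positioning the quoted prices suggest.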


