In a surprising strategic reversal, artificial intelligence powerhouse OpenAI is developing its first open-source language model since GPT-2, acknowledging it has been "on the wrong side of history" regarding open-source AI development.
The announcement, made by CEO Sam Altman in March 2025, represents a dramatic shift for a company whose roughly $300 billion valuation was built on closed, proprietary systems. OpenAI has begun soliciting developer feedback through a form on its website, stating the open model will feature reasoning capabilities similar to its o3-mini model and is expected to launch "in the coming months."
This pivot comes as open-source AI models gain significant traction. Meta's Llama family surpassed one billion downloads in March 2025, with CEO Mark Zuckerberg asserting that "open-source AI is crucial to ensuring people everywhere have access to the benefits of AI." Meta has continued expanding its open-source offerings with Llama 4 models released in April 2025.
Perhaps most influential in OpenAI's decision was the January 2025 release of DeepSeek R1, an open-source reasoning model from China that reportedly matches OpenAI's performance at just 5-10% of the operating cost. DeepSeek's MIT license allows unrestricted commercial use, and its success prompted AI scholar Kai-Fu Lee to declare that "open-source has won" in the AI space.
The economics of AI development appear to be driving this industry-wide shift. OpenAI reportedly spends $7-8 billion annually on operations, a cost structure increasingly difficult to justify against efficient open-source alternatives. As Clement Delangue, CEO of Hugging Face, celebrated: "Everyone benefits from open-source AI!"
For enterprise customers, OpenAI's announcement creates uncertainty about long-term investment strategies. Those who have built systems atop GPT-4 or o1 APIs must now evaluate whether to maintain that approach or begin planning migrations to self-hosted alternatives.
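For teams weighing that trade-off, the practical migration surface can be smaller than it first appears, because many self-hosted inference servers (such as vLLM or Ollama) expose OpenAI-compatible endpoints. The sketch below is a minimal illustration of that pattern using the OpenAI Python SDK; the local URL, model names, and prompt are hypothetical placeholders, not details OpenAI has announced.

```python
# Minimal sketch: the same client code talking to the hosted API or to a
# self-hosted, OpenAI-compatible endpoint. Only the client configuration
# and model name change; the application logic stays the same.
from openai import OpenAI

# Hosted setup: the official OpenAI API (reads OPENAI_API_KEY from the environment).
hosted = OpenAI()

# Self-hosted setup: same SDK, pointed at a hypothetical local inference server.
self_hosted = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local OpenAI-compatible server
    api_key="not-needed-locally",         # many local servers ignore the key
)

def ask(client: OpenAI, model: str, prompt: str) -> str:
    """Send one chat completion request and return the reply text."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Swapping providers is a one-line change at the call site.
print(ask(hosted, "gpt-4o", "Summarize our Q1 support tickets."))
print(ask(self_hosted, "local-open-model", "Summarize our Q1 support tickets."))
```

In practice, the harder parts of such a migration tend to be evaluation and prompt tuning for the new model rather than the client code itself, but an API-compatible serving layer keeps the switching cost low.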
While OpenAI has twice delayed the release of its open model for additional safety testing, the company appears committed to this new direction. As base models become increasingly accessible, differentiation is happening at the application layer—creating opportunities for startups and established players to build domain-specific solutions atop foundation models.