Microsoft has officially added the Grok 3 family of AI models from Elon Musk's xAI to its Azure AI Foundry platform, marking a significant expansion of its AI model ecosystem beyond its primary partnership with OpenAI.
Announced at the Microsoft Build 2025 conference on May 19, the integration brings both Grok 3 and Grok 3 mini models to Microsoft's cloud platform. These models will be hosted in Microsoft's own data centers and billed directly by the company, ensuring they meet the same service-level agreements that Azure customers expect from any Microsoft product.
The Azure-hosted versions of Grok are more tightly controlled than those available on X (formerly Twitter), and offer data integration, customization, and governance capabilities beyond what xAI's standalone API currently provides.
"We're bringing Grok 3 and Grok 3 mini models from xAI to our ecosystem, hosted and billed directly by Microsoft," said Frank Shaw, chief communications officer at Microsoft. "Developers can now choose from more than 1,900 partner-hosted and Microsoft-hosted AI models, while managing secure data integration, model customization and enterprise-grade governance."
This move is part of Microsoft's broader strategy to diversify its AI model offerings. In addition to xAI's Grok models, Microsoft also announced it would offer Meta's Llama models and AI solutions from European startups Mistral and Black Forest Labs on its cloud services.
The integration comes despite ongoing legal tensions between Elon Musk and OpenAI, the company in which Microsoft has invested heavily. Musk is currently suing OpenAI, which he co-founded but later left, while OpenAI has filed a countersuit accusing him of unfair competition and harassment.
For Microsoft, this multi-model approach represents a strategic shift to reduce dependency on any single AI provider while offering customers a wider range of options for their AI applications.