Microsoft gears up to host Elon Musk’s Grok AI model: report

Grok's entry on Azure would broaden access to diverse AI tools.


News Desk May 02, 2025
Photo: Microsoft to power Musk’s Grok AI in potential cloud shake-up

WORLDWIDE:

Microsoft (MSFT.O) is preparing to host Elon Musk’s Grok artificial intelligence model on its Azure cloud platform, The Verge reported on Thursday, citing a source familiar with the discussions.

The tech company has reportedly been in talks with Musk’s AI startup xAI in recent weeks to make Grok available to both enterprise clients and Microsoft’s own product teams via its Azure AI cloud service.

Azure AI Foundry is a development platform offering tools to host, deploy and manage AI applications. The partnership would give developers access to Grok alongside other leading AI models.

The move comes amid rising tensions between Musk and Sam Altman, CEO of OpenAI — Microsoft’s key AI partner.

Elon Musk, a co-founder of OpenAI who left the firm in 2018, has accused it of abandoning its mission to develop AI for the benefit of humanity, citing corporate motives.

Last year, Musk filed a lawsuit against OpenAI and Altman. The company counter-sued in March.

Microsoft is reportedly offering only the infrastructure to host the Grok model, not the computational capacity to train future versions, The Verge added. It remains unclear whether the agreement would be exclusive or whether other cloud services, such as Amazon’s AWS (AMZN.O), would also host Grok.

The potential partnership signals Microsoft’s ongoing strategy to diversify beyond OpenAI. The company has been trialling models from xAI, Meta (META.O) and Chinese developer DeepSeek as alternatives for its Copilot AI assistant.

In a notable move, Microsoft recently made DeepSeek’s R1 model available via Azure and GitHub after its rapid rise in popularity.
