
OpenAI Taps Google's TPUs in Strategic Shift Beyond Nvidia

OpenAI has begun using Google's Tensor Processing Units (TPUs) to power ChatGPT and other products, marking its first significant adoption of non-Nvidia chips. This partnership follows OpenAI's earlier move to diversify beyond Microsoft's Azure cloud services and represents a major win for Google Cloud. The collaboration addresses OpenAI's growing computational demands while showcasing Google's success in commercializing its specialized AI hardware.

In a significant shift for AI infrastructure, OpenAI has started renting Google's specialized Tensor Processing Units (TPUs) to power ChatGPT and its other AI products, according to sources familiar with the arrangement.

This marks the first time OpenAI has meaningfully used non-Nvidia chips for its AI workloads. The company has historically been one of the world's largest purchasers of Nvidia's graphics processing units (GPUs), which dominate the AI chip market. OpenAI hopes the TPUs, which it accesses through Google Cloud, will help lower the cost of inference computing—the process in which a trained model applies what it has learned to generate predictions or responses.

The partnership represents another step in OpenAI's strategy to diversify its computing infrastructure. Earlier this year, Microsoft—OpenAI's largest investor and primary cloud provider—modified their exclusive arrangement, moving to a model where Microsoft has a "right of first refusal" on new OpenAI cloud computing capacity. This change enabled OpenAI to pursue additional partnerships, including this latest one with Google.

For Google, securing OpenAI as a customer demonstrates how the tech giant has successfully leveraged its in-house AI technology to grow its cloud business. Google's TPUs, which were historically reserved for internal use, offer advantages for certain AI workloads, including potentially better energy efficiency and lower cost for inference tasks than GPUs.

However, the collaboration has its limits. According to reports, Google is not renting its most powerful TPUs to OpenAI, maintaining some competitive boundaries between the two AI rivals. This selective approach highlights the complex dynamics in the AI sector, where companies often simultaneously compete and collaborate.

The deal comes amid intense competition for AI computing resources, with major tech companies investing billions in specialized hardware. Google's latest TPU generation, codenamed Trillium, offers significant performance improvements over previous versions and is designed to handle the massive computational demands of advanced AI models.
