NVIDIA is strategically repositioning itself in the global AI landscape, moving beyond its role as a hardware provider to become the backbone of worldwide AI infrastructure development.
At Computex 2025 in Taipei, CEO Jensen Huang unveiled NVLink Fusion, a technology that opens NVIDIA's previously closed ecosystem to competitors' chips. It allows cloud providers and enterprises to pair custom CPUs and AI accelerators from other companies with NVIDIA's GPUs, while keeping NVIDIA's interconnect technology as the foundation on which those systems are built.
"A tectonic shift is underway: for the first time in decades, data centers must be fundamentally rearchitected — AI is being fused into every computing platform," said Huang during his keynote. He repeatedly referred to modern data centers as "AI factories" rather than traditional computing facilities, emphasizing their role in producing valuable AI outputs.
The company has already secured partnerships with MediaTek, Marvell, Alchip, Astera Labs, Synopsys, and Cadence to develop custom AI chips compatible with NVIDIA systems. Additionally, Fujitsu and Qualcomm will create processors designed to work with NVIDIA accelerators, validating the company's new approach.
This strategic shift appears calculated to prevent major cloud providers like Microsoft and Amazon from bypassing NVIDIA entirely as they develop their own custom processors. By embracing a more open ecosystem while maintaining architectural control through technologies like NVLink Fusion, NVIDIA aims to remain central to AI development worldwide.
Beyond NVLink Fusion, NVIDIA announced several other infrastructure initiatives, including its next-generation GB300 systems coming in Q3 2025, a new RTX Pro Server system, and an expanded DGX Cloud Lepton platform connecting developers with GPU resources globally. The company also revealed a partnership with Taiwan's Foxconn to build an AI supercomputer, further cementing its international infrastructure strategy.