Amazon Web Services (AWS), the world's largest cloud computing provider, is considering using Advanced Micro Devices Inc.'s (AMD) new artificial intelligence (AI) chips, an AWS executive told Reuters. While no final decision has been made, AWS's interest was revealed at an AMD event where the company outlined its strategy for competing in the AI market against industry leader Nvidia Corp. (NVDA).
Although AMD showcased promising technical specifications for an upcoming AI chip that could outperform Nvidia's current offerings in some respects, the absence of a flagship customer announcement sent AMD's shares lower. AMD CEO Lisa Su emphasized the company's approach of offering a range of components for building customized systems, allowing cloud computing customers such as AWS to choose the specific components they need, linked by industry-standard connections.
While AWS has not publicly committed to adopting AMD's new MI300 chips for its cloud services, Dave Brown, Vice President of Elastic Compute Cloud at Amazon, confirmed that AWS is actively considering them. Brown said AWS and AMD are working closely to determine how best to integrate the chips into existing systems.
Nvidia offers a complete system called DGX Cloud, but AWS declined to partner with Nvidia on that venture. Brown explained that AWS's long experience building reliable servers and its established supply chain expertise made it more practical for the company to design its own servers from scratch. Although AWS has offered Nvidia's H100 chip since March, it has done so as part of its own server designs.
The potential integration of AMD's AI chips into AWS's cloud infrastructure underscores the ongoing competition and innovation in the cloud computing industry. As AWS weighs its chip options, the future landscape of AI in the cloud remains dynamic and promising.