AWS BUILDS NEW COOLING SYSTEMS FOR $NVDA AI CHIPS
Amazon $AMZN Web Services has developed its own cooling hardware to handle the heat from Nvidia's massive AI GPUs, since traditional liquid-cooling setups would have taken up too much floor space or consumed too much water. The new In-Row Heat Exchanger (IRHX) integrates into existing data centers, sliding in between server racks.
Amazon Web Services has designed its own cooling hardware, called the In-Row Heat Exchanger, to manage the heat generated by Nvidia’s next-generation GB200 NVL72 graphics processing units. The IRHX modules slide between server racks inside existing data centers, enabling AWS to deploy dense clusters of 72 Blackwell GPUs per rack without resorting to large-scale liquid-cooling systems that would increase floor space or water consumption, according to Dave Brown, vice president of compute and machine-learning services.

The new infrastructure underpins AWS’s P6e compute instances, now available to customers training and serving generative-AI models. By building proprietary cooling systems, AWS aims to speed deployment of high-performance chips while controlling operating costs; the cloud division produced record operating margins in the first quarter of 2025. Rivals are following a similar path: Microsoft last year introduced its Sidekicks setup to cool internally designed Maia AI processors.

Interest in alternative cooling technologies is broadening beyond the major cloud providers. Startup Yplasma this week raised a US$2.5 million seed round to refine plasma actuators that move air without mechanical fans, pitching the approach as a low-energy option for data-center and semiconductor applications. The funding underscores growing investor attention to hardware that can curb the rising power and water demands of AI workloads.