CoreWeave on 3 July became the first cloud provider to place Nvidia’s newest Blackwell Ultra processors in commercial service, after taking delivery of Dell-built GB300 NVL72 systems. The liquid-cooled racks integrate 72 Blackwell Ultra GPUs with 36 Grace CPUs and 36 BlueField-3 DPUs, delivering about 1.1 exaflops of FP4 inference performance, roughly 50% more than the previous GB200 design. Dell assembled and tested the systems in the United States and installed them with data-centre partners Switch and Vertiv.

The rollout underscores CoreWeave’s strategy of offering the most recent Nvidia hardware ahead of larger rivals, and it lifted the company’s shares about 6% on the day of the announcement; Dell rose roughly 2% and Nvidia edged higher.

Building on that momentum, CoreWeave on 9 July launched general-availability cloud instances based on Nvidia’s RTX Pro 6000 Blackwell Server Edition, becoming the first provider to offer the GPUs at scale. The company says the cards cut large-language-model inference times by up to 5.6× and speed text-to-video generation 3.5× versus the prior-generation L40S.

The rapid adoption highlights surging demand for ever more powerful, but heat-intensive, AI silicon. Amazon Web Services, for example, has designed in-row heat exchangers to cool forthcoming Nvidia Blackwell deployments. With early access to both training-class and professional Blackwell GPUs, CoreWeave is positioning itself as a nimble alternative to the industry’s three dominant hyperscalers.
#NVIDIABlackwell is now generally available on @awscloud. The NVIDIA GB200 NVL72 platform with NVIDIA NVLink, now available as Amazon EC2 P6e-GB200 instances, enables training and inference at scale to power breakthroughs across drug discovery, software development, and more. 🎥 See now to learn more ⬇️ https://t.co/nD4WcnYXla
AWS BUILDS NEW COOLING SYSTEMS FOR $NVDA AI CHIPS Amazon $AMZN Web Services has developed its own cooling hardware to handle Nvidia's massive AI GPUs, as traditional liquid-cooling setups took up too much space or water. The new In-Row Heat Exchanger (IRHX) integrates into AWS's existing data centres.