Aug 13, 07:05 AM
OpenAI Picks NVIDIA’s GB200 Hardware to Train and Run GPT-5
AI Modeling
Tech
AI

Authors
  • Rohan Paul
  • The AI Investor
  • NVIDIA Data Center
OpenAI’s forthcoming GPT-5 large-language model was trained on NVIDIA’s current-generation H100 and H200 graphics processors and will be deployed on the new GB200 NVL72 artificial-intelligence server, according to information shared by NVIDIA’s data-center division.

The GB200 NVL72 clusters 72 Blackwell-architecture GPUs and 36 Grace server CPUs in a single rack, interconnected by NVIDIA’s high-speed NVLink and NVLink Switch fabrics. The configuration is designed to accelerate trillion-parameter models, offering what NVIDIA says is a significant jump in throughput and energy efficiency over earlier Hopper-based systems.

Amazon Web Services plans to make similar multi-GPU configurations available through its SageMaker HyperPod service, giving customers access to the same infrastructure used for GPT-5. NVIDIA also says its CUDA software stack has now been downloaded more than 450 million times, underscoring the software ecosystem that reinforces the company’s dominance in AI computing silicon.
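For a rough sense of what the rack composition above implies at scale, here is a minimal back-of-the-envelope sketch (not from the article): the per-rack GPU and CPU counts are the figures quoted for the GB200 NVL72, and everything else is illustrative arithmetic.

```python
# Illustrative arithmetic only: how many GB200 NVL72 racks a given GPU budget
# implies, using the per-rack composition quoted in the story.
GPUS_PER_RACK = 72   # Blackwell GPUs per NVL72 rack (figure from the story)
CPUS_PER_RACK = 36   # Grace CPUs per NVL72 rack (figure from the story)

def racks_needed(total_gpus: int) -> int:
    """Smallest number of NVL72 racks providing at least `total_gpus` GPUs."""
    return -(-total_gpus // GPUS_PER_RACK)  # ceiling division

if __name__ == "__main__":
    for gpus in (72, 1_000, 8_192):
        racks = racks_needed(gpus)
        print(f"{gpus:>5} GPUs -> {racks:>4} rack(s), "
              f"{racks * CPUS_PER_RACK} Grace CPUs alongside them")
```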

Written with ChatGPT.
