Global technology companies are accelerating investments in artificial intelligence infrastructure to meet surging demand. OpenAI, SoftBank, Nvidia, Oracle, and G42 are collaborating on Stargate, a $500 billion initiative that includes a 1.2-gigawatt campus in Abilene, Texas, and the Stargate UAE project in Abu Dhabi. Stargate UAE, a planned 1-gigawatt cluster within a larger 5-gigawatt AI campus, starts with a $1 billion initial investment, and its first 200 megawatts are expected to be operational in 2026. Partners include Cisco and Foxconn, and the project will provide free ChatGPT access across the UAE.

Nvidia is supplying its Grace Blackwell GB300 chips to these new data centers and has opened its AI ecosystem to rival chipmakers. The company plans to launch the GB300 in Q3 2025 and is partnering with Foxconn to build a 100-megawatt AI supercomputer in Taiwan, as Foxconn's capex for AI servers rises 35.6% to NT$273.6 billion. Foxconn is a major supplier of Nvidia's GB200 and GB300 AI servers.

Saudi Arabia is developing the 1.9-gigawatt Humain AI project, backed by Nvidia, AMD, and AWS, to compete with the UAE's ambitions. In Europe, Mistral AI, Nvidia, and partners are investing €8.5 billion in a new AI campus near Paris, which will feature exascale computing and low-carbon data centers.

Nvidia's Data Center segment has grown from less than $10 billion in 2020 to roughly $100 billion in 2024, with projected five-year sales growth of 189%. Major customers include Microsoft, Meta, Amazon, and Google. Nvidia is also launching the DGX Spark mini PC, priced at $3,999 and capable of 1,000 TOPS, and the DGX Station, offering up to 20 petaflops, to bring advanced AI capabilities to desktops. PC manufacturers such as Dell, Asus, HP, and Lenovo are releasing their own versions of these devices.

Dell has announced the Pro Max AI PC, featuring an enterprise-grade discrete NPU, and introduced the PowerCool eRDHx cooling system, which it says can cut cooling-related energy costs by up to 60%. Dell's Project Lightning aims to double AI training throughput, and the PowerEdge XE9785/9785L servers will support AMD Instinct MI350 GPUs. Dell is also expanding its AI partner ecosystem and enhancing security and risk management for AI infrastructure.

The rapid expansion of AI data centers is raising concerns about energy consumption and sustainability. AI already accounts for about 20% of global data center electricity use and could reach 50% by the end of 2025. Google projects more than 500 kilowatts per server rack by 2030, and US data centers could consume up to 12% of national electricity by 2028, driven largely by AI. Companies are responding with innovations in cooling and energy efficiency, while figures such as former Google CEO Eric Schmidt are exploring orbital data centers to address future power constraints.
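To put these power figures in perspective, here is a minimal back-of-envelope sketch in Python of what a gigawatt-scale campus implies in rack counts and annual energy. Only the 1.2-gigawatt Abilene capacity, the 200-megawatt first UAE phase, and Google's projected 500 kilowatts per rack come from the items above; the PUE of 1.3 and the 90% average utilization are illustrative assumptions, not figures from any of the announcements.

```python
# Back-of-envelope arithmetic on the power figures cited above.
# PUE and utilization values are illustrative assumptions, not reported numbers.

HOURS_PER_YEAR = 8760

def racks_supported(campus_mw: float, kw_per_rack: float, pue: float = 1.3) -> int:
    """Rough rack count a campus can power, assuming a fixed PUE overhead."""
    it_power_kw = campus_mw * 1000 / pue  # power left for IT after cooling/overhead
    return int(it_power_kw / kw_per_rack)

def annual_energy_twh(campus_mw: float, utilization: float = 0.9) -> float:
    """Annual energy draw in terawatt-hours at a given average utilization."""
    return campus_mw * utilization * HOURS_PER_YEAR / 1e6

# Figures from the section: 1.2 GW Abilene campus, 200 MW first Stargate UAE phase,
# and Google's projection of 500 kW per rack by 2030.
for name, mw in [("Abilene campus", 1200), ("Stargate UAE phase 1", 200)]:
    print(f"{name}: ~{racks_supported(mw, 500):,} racks at 500 kW/rack, "
          f"~{annual_energy_twh(mw):.1f} TWh/year")
```

Under these assumptions a single 1.2-gigawatt campus draws close to 10 TWh a year, roughly the annual output of a large baseload power plant, which is the context for the grid and sustainability concerns in the closing paragraph above.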
🔴ANALYSIS | Nvidia reports results this Wednesday, the last of the Magnificent Seven to post quarterly earnings; the key will be the clues it offers about demand for artificial intelligence chips. https://t.co/zvIoeYvBhc
WHEN AI BURNS TOO MUCH POWER... YOU SEND IT TO SPACE?! Former Google CEO Eric Schmidt wants to launch AI data centers into orbit because Earth’s grid can’t keep up. He says AI is on track to eat 99% of our electricity, and some companies are already dreaming of 10-gigawatt https://t.co/shnvmnyHVl https://t.co/Me6lWbzQ2d
Dell unveils new Pro Max AI PC & innovations for data centres https://t.co/94g5iaCS4s #datascience #ds