OpenAI’s next-generation language model, GPT-5, has run into technical and cost headwinds that are slowing its development, according to an Aug. 1 investigation by The Information. People familiar with internal testing told the publication that the new system shows stronger performance in coding, mathematics and multi-step tasks, yet the improvements are incremental compared with the leap from GPT-3 to GPT-4. Engineers are banking on advances in reinforcement learning to narrow the gap, the report said, while outside testers cited in separate posts have clocked GPT-5 at 45 percent on a demanding lateral-reasoning benchmark, outperforming leading competitors. Despite the remaining hurdles, OpenAI is still aiming to introduce the model in early August, with speculation centring on an Aug. 5 debut.

The tempered progress highlights broader challenges facing the sector. Meta Platforms disclosed plans to offload roughly $2 billion in data-center assets to outside partners to contain ballooning infrastructure bills, and industry-wide capital expenditure on artificial-intelligence projects has already topped $155 billion this year, according to recent Guardian estimates. Rising power, chip and data-center costs are adding pressure just as frontier models demand ever larger computational budgets.
Capacity planning a rising concern for datacenter operators as AI grows https://t.co/Y058itf4Gh
The AGI Infrastructure Arms Race Is Already Here. A Substack exposé reveals how companies like OpenAI, Meta, and Amazon are quietly battling over compute, chips, and influence. https://t.co/BGteKaEZfn Is the future of AI being decided in secret supply deals? #AI #News2025 #AGI
A reasonable take on what a crash in the AI market would actually mean for the wider economy, as CapEx for data centers continues to grow. (To be clear, there are no particular warning signs that this is a danger right now, but downside cases are always important to consider). https://t.co/qE5HHKdtPF