Apple is testing a sweeping overhaul of Siri that would let the voice assistant perform complex actions inside apps without any screen taps, according to reporting by Bloomberg’s Mark Gurman. The work relies on an expanded “App Intents” framework that underpins the company’s Apple Intelligence initiative and is being trialed across the iPhone, iPad, Mac and Vision Pro. In internal tests, the upgraded Siri can navigate third-party services such as Uber, Temu, Amazon, YouTube, WhatsApp and Facebook—as well as Apple’s own apps—to carry out tasks like locating, editing and sending a photo, posting to social media or adding items to an online shopping cart, all through voice commands. Apple is evaluating limits or safeguards for banking and other sensitive applications to avoid high-stakes errors. The company plans to release the new capabilities in the United States in spring 2026 as part of iOS 26.4, iPadOS 26.4, macOS 26.4 and corresponding visionOS updates, the report said. Wider international rollout would follow. Apple views the feature as critical to future smart-home hardware and to keeping pace with conversational AI rivals.
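App Intents is the existing framework third-party developers already use to expose in-app actions to Siri and Shortcuts; the report describes Apple expanding it so Siri can chain such actions by voice. As a minimal sketch of what an app-side intent looks like today, here is a hypothetical shopping-app example — the type, parameter names and cart logic are illustrative assumptions, not Apple's or any vendor's actual implementation:

```swift
import AppIntents

// Hypothetical intent a shopping app might expose so the system
// (Siri/Shortcuts via App Intents) can add an item to the cart by voice.
struct AddToCartIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Item to Cart"
    static var description = IntentDescription("Adds a product to the shopping cart.")

    // Siri can prompt for these values conversationally if they are missing.
    @Parameter(title: "Product Name")
    var productName: String

    @Parameter(title: "Quantity", default: 1)
    var quantity: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific cart logic would go here.
        return .result(dialog: "Added \(quantity) \(productName) to your cart.")
    }
}
```

The app only declares what it can do; deciding when to invoke the intent, and stringing several such intents together across apps, is the system-level capability Gurman's reporting says Apple is now testing.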
yes, I am using GPT-5 now for backend, falling back to Sonnet / Opus for frontend stuff. Best combo so far. @cursor_ai makes it easy for me to switch between these 2 models https://t.co/s2qJSKh4Sx
“We hear you all on 4o”: Sam Altman brings GPT-4o back after GPT-5 outrage, but most ChatGPT users won’t get it. #GPT5 https://t.co/dMR5us43Eb
GPT-5 is evidence that we are nowhere close to AGI. All hype and BS peddled by an industry that never learns. Excited for the next current thing.