Apple used its Worldwide Developers Conference 2025 to detail a major expansion of on-device artificial-intelligence capabilities, headlined by iOS 26 and a new suite branded “Apple Intelligence.” At the heart of the rollout is the Foundation Models framework, a family of roughly 3-billion-parameter language and vision models tuned for Apple Silicon. The framework lets developers call the models entirely on device with only a few lines of Swift code, eliminating cloud costs and preserving user privacy.

iOS 26 introduces system-wide Live Translation that interprets calls, video chats and messages in real time without sending data off the phone. Visual Intelligence turns screenshots into actionable items, and the Camera and Photos apps gain AI-driven tools aimed at content creators and easier search. Apple is also deepening its collaboration with OpenAI: Image Playground and Genmoji now employ ChatGPT to generate new art styles, including anime-inspired graphics, addressing earlier complaints about image quality. The company will additionally use its own models to tag apps inside the App Store, improving discoverability.

Software chief Craig Federighi described AI as a technological wave comparable to the internet, stressing that Apple’s strategy is seamless system integration rather than a standalone chatbot. Internal research questioning the sustainability of ever-larger models underscores Apple’s deliberate, privacy-centric approach to artificial intelligence.
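The “few lines of Swift code” claim can be illustrated with a minimal sketch. This assumes the session-based API shape Apple showed for the framework at WWDC 2025 (`LanguageModelSession` and an async `respond(to:)` call); the exact names and signatures in the shipping SDK may differ:

```swift
// Hypothetical sketch of prompting Apple's on-device foundation model.
// Assumes the WWDC 2025 API shape; not a verified SDK signature.
import FoundationModels

func summarize(_ text: String) async throws -> String {
    // Create a session backed by the on-device ~3B-parameter model;
    // no network request leaves the phone.
    let session = LanguageModelSession()

    // Send a prompt and await the generated reply.
    let response = try await session.respond(to: "Summarize: \(text)")
    return response.content
}
```

Because inference runs locally, there are no per-request cloud costs and the prompt text never leaves the device, which is the privacy argument Apple emphasized.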
A former OpenAI employee showed that ChatGPT has a “survival instinct” that is dangerous for users https://t.co/mRTvcP1T49
Apple’s new Foundation Models framework lets developers access offline AI models, enabling smarter apps with privacy protection and no cloud costs. 📱🔒 #AI #ArtificialIntelligence #AINews https://t.co/p8w1GVDmaP
Here’s everything new for Apple Notes in iOS 26 https://t.co/Xg3QeEuZM8 by @iryantldr