
Apple is poised to unveil significant advances at its Worldwide Developers Conference (WWDC24) on June 10, 2024. The focus will be on visionOS advancements for the Apple Vision Pro headset, emphasizing spatial computing. Apple also plans to lay out its artificial intelligence (AI) strategy, including a new AI model named ReALM (Reference Resolution As Language Modeling). ReALM is designed to understand entities displayed on a user's screen, potentially enhancing interactions with Siri and other Apple services. Apple's researchers claim the model is "better than GPT-4" at resolving ambiguous references to on-screen entities while remaining small enough to run on-device. The introduction of ReALM, the emphasis on spatial computing with the Apple Vision Pro headset, and the expected AI boost in macOS 15 all signal Apple's commitment to advancing immersive technology and AI capabilities.
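To make "reference resolution" concrete: the core idea reported for ReALM is that on-screen entities are serialized into plain text so a language model can decide which one a request like "call that number" refers to. The sketch below is a simplified illustration of that idea only; the entity format and the toy keyword matcher are invented for this example and are not Apple's API or implementation.

```python
# Illustrative sketch of "reference resolution as language modeling":
# on-screen entities are rendered as text so a model can pick the one
# a user's request refers to. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class ScreenEntity:
    kind: str   # e.g. "phone_number", "address", "business_name"
    text: str   # the literal text shown on screen

def serialize_screen(entities):
    """Render tagged on-screen entities as numbered lines of text,
    roughly how ReALM is described as encoding the screen."""
    return "\n".join(
        f"[{i}] ({e.kind}) {e.text}" for i, e in enumerate(entities)
    )

def resolve_reference(query, entities):
    """Toy stand-in for the language model: match the entity kind
    implied by the query. A real system would feed the serialized
    screen plus the query to the model instead of keyword rules."""
    kind_keywords = {"number": "phone_number", "address": "address"}
    for word, kind in kind_keywords.items():
        if word in query.lower():
            for e in entities:
                if e.kind == kind:
                    return e
    return None

entities = [
    ScreenEntity("business_name", "Joe's Pizza"),
    ScreenEntity("address", "123 Main St"),
    ScreenEntity("phone_number", "555-0142"),
]

print(serialize_screen(entities))
resolved = resolve_reference("call that number", entities)
print(resolved.text)  # resolves "that number" to the on-screen phone number
```

The point of the serialization step is that a screen, once flattened to text, becomes an ordinary language-modeling context, which is why a relatively small on-device model can reportedly handle the task.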
My roundup of the most important #AR #VR news of the week is out! Read about Oculus's acquisition anniversary, Apple Vision Pro launching in China, and more! https://t.co/1Q9JD3oudg #VirtualReality #Apple #Meta
#Apple says its #ReALM language model is better than #OpenAI's #GPT4 at "reference resolution." But what does that mean, and why is it important? @theSethu writes. https://t.co/bzTpx9bVpO
Apple AI research: ReALM is smaller, faster than GPT-4 when parsing contextual data https://t.co/aYfk6UsCsJ #Apple