At TED2025, Palmer Luckey, founder of Oculus and a prominent figure in extended reality (XR), appeared to deliver his talk without a teleprompter, but later revealed on a podcast that he had used smart glasses displaying his notes inside the lenses. The glasses, identified as Even Realities G1 smart glasses, integrate AI at the hardware level, which allowed Luckey to access his script discreetly despite TED's prohibition on teleprompters.

The incident highlights the emerging role of AI-powered wearables, particularly smart glasses, in presentations and information access. Industry observers see smart glasses as a practical form factor for AI assistants, and companies such as Apple are reportedly developing similar devices. Rumors suggest Apple plans to launch an Apple Vision Air and smart/AI glasses by 2026, possibly with spatial video capture.

The Apple Vision Pro, powered by the M2 chip, supports immersive environments and runs Unreal Engine for applications such as ProjJumpScare; anticipated improvements include eye-tracked foveated rendering and a rumored M5 chip. The integration of AI with facial recognition in wearables is seen as a future trend that could enhance user interaction and information retrieval.
As for the iPhone 15 and 16 Pro, we do not have 360° real-time environments yet, but do you reckon the iPhone could eventually gain 360° photo and video capabilities? The Vision Pro covers maybe 270°. https://t.co/mEotDXHnRq
So first the Apple Vision Air, then Apple smart/AI glasses, launching in 2026. Maybe the glasses will shoot spatial video after all.
Here's what the rumors say about future generations of the Apple Vision Pro https://t.co/PH42E3jbCB by @mbrkhrdt