Smart glasses are having a moment. Meta’s latest Ray-Ban specs are already available, Oakley’s newest models are on the horizon, and Google and Samsung are expected to join the race next year with Android XR. With all this momentum, it seemed only a matter of time before Apple revealed its own smart glasses—and recent signs suggest that time is near.
Reports indicate that Apple has paused development of its rumored Vision Air, a smaller, lighter successor to the Vision Pro mixed-reality headset, in favor of focusing on smart glasses. The pivot is clearly aimed at competing with the wave of AI-powered glasses emerging from companies like Meta, Samsung, Google, Snap, Amazon, Xreal, Rokid, and even OpenAI, all of which are either selling, developing, or exploring this tech.
As I test various smart glasses this fall, the pieces of Apple’s potential puzzle are becoming clear. The company already has the product line and wearable tech ecosystem to make a splash, and it’s further along than many realize. Here’s how Apple’s current headphones, watches, phones, and software could shape the debut of its smart glasses.
Audio Technology Powered by AirPods
Apple has been preparing for wearable audio on our faces for nearly a decade. When the first AirPods launched in 2016, they drew flak for their unconventional look, and the product felt like Apple testing the waters. That gamble paid off: Today, AirPods are ubiquitous and widely embraced.
Since then, Apple has developed computational audio features perfectly suited for smart glasses. These include live translation in AirPods firmware, head-nodding gestures to quickly reply to messages, heart rate monitoring, ambient noise filtering to enhance focus or assist hearing, 3D spatial audio, and even open-ear noise cancellation in AirPods 4. Apple has also earned FDA clearance for hearing assistance, a feature already appearing in smart glasses such as EssilorLuxottica's Nuance Audio.
Because smart AR glasses have tiny open-ear speakers built into their frames, these audio innovations could transition seamlessly from AirPods to Apple’s glasses—and AirPods might just be the beginning.
Control and Interaction via Apple Watch
Meta’s newest Display glasses include the Neural Band, which uses electrodes to read faint muscle impulses, enabling in-air gesture control. Apple already has a foothold in this space with the Apple Watch’s wrist-based gesture controls.
The Apple Watch supports double-tap and wrist-flick gestures to quickly reply to messages, answer calls, dismiss notifications, or stop timers. From the start, I could see how naturally these wrist gestures could carry over to AR and VR headset controls.
Apple's smart glasses might connect directly to the Watch, offering quick access to glanceable readouts, or even letting Apple skip a built-in display on the glasses altogether. Imagine the Watch serving as a viewfinder for camera-equipped glasses, or as a wearable touchscreen for selecting apps. Meta has hinted that its Neural Band might evolve into a watch-like device, and Google is also exploring close interplay between glasses and watches.
Camera Innovation via iPhone Air and Vision Pro
Apple has long excelled at miniaturizing high-performance cameras. The iPhone Air, released this fall, packs an impressively capable camera system into one of the slimmest phone designs in the industry, perfect practice for the ultra-compact cameras glasses demand.
Apple also has experience incorporating complex camera arrays in headsets like the Vision Pro, which likely features far more advanced imaging than upcoming glasses will require.
Furthermore, Apple could carry over existing iPhone camera controls, such as the capacitive touch surface on the Camera Control button, which hints at how users might swipe and tap along the glasses' frame arms to navigate.
It’s possible Apple will enable stereo 3D video recording on the glasses, allowing wearers to capture spatial footage to revisit later on a Vision headset—an evolution of the Vision Pro’s in-headset video recording capabilities.
Advancing Visual AI for Smart Glasses
Apple’s glasses will need powerful camera-aware AI capabilities, akin to the iPhone’s Visual Intelligence. Although Apple still has some ground to cover to rival Google Gemini and Meta AI, smart glasses could be the perfect vehicle to introduce and enhance these technologies. The glasses may even help train AI models based on what they capture, evolving smarter experiences over time.
Much like Meta aims to leverage AI through its glasses, Apple could develop AI for glasses that boosts its other future projects, including autonomous vehicles.
Apple Stores: Prime Venues for Glasses Demos
While Meta is building dedicated retail experiences for its Display glasses, Apple benefits from an existing global network of stores—already used for immersive tech demos during the Vision Pro launch. Apple Stores would naturally serve as ideal spots for smart glasses fittings, with prescription lenses handled through an online partner, just as Vision Pro does with Zeiss.
Superior Connectivity with the iOS Ecosystem
Current smart glasses often struggle with seamless integration into phones and app ecosystems—a problem Apple is uniquely positioned to solve. Controlling both iOS and watchOS, Apple can build glasses that integrate effortlessly with iPhones, Apple Watches, and apps.
Meta's glasses rely on companion phone apps and lack deep hooks into the phone's native assistant, whether Siri or Gemini. Google's Android XR should improve that connectivity on Android, and Apple's entry could do the same on iOS, setting a new standard. Apple's move might also encourage iOS developers to start designing apps with glasses in mind, or to improve support for third-party smart glasses.
Looking Ahead
We likely won’t know definitive details about Apple’s smart glasses debut until next year, but putting all the clues together reveals exciting potential for truly innovative specs. Apple has the ecosystem, the technology, and the retail presence to deliver something special. Now it just needs to put the glasses on our faces.