Apple’s smart glasses will support gestures, according to multiple sources.
A recurring rumor is that Apple’s smart glasses will rely on gesture-based input, with the device featuring two built-in cameras, Siri, and not much else.
Claims that Apple is working on smart glasses date back to 2015, with analysts predicting a 2026 or 2027 release even before the Apple Vision Pro debuted at WWDC 2023. Since then, we’ve continued to see new rumors about smart glasses development efforts, hardware, and features.
One report effectively reiterated the previously discussed hardware claims while also suggesting that the device would support hand gestures. The claim regarding gestures matches what analyst Ming-Chi Kuo said in June 2025.
The report suggests that Apple’s smart glasses would also feature two cameras: a high-resolution camera for photos and videos, and a low-resolution camera for gestures and Siri-related visual inputs.
Neither 3D cameras nor LiDAR sensors will reportedly be available on these smart glasses, allegedly because such hardware is too power-hungry for the small built-in battery Apple plans to use to keep the glasses thin and light.
Tim Cook was even said to see the product as a top priority, part of a broader push toward AI-enabled wearables and visual intelligence.
An April 2026 report suggested that Apple’s AI-powered smart glasses would be able to take photos and videos and would offer access to Siri. Rumors from October 2025 and February 2026 suggested that the initial version of the device would not feature a screen.
As for how users will interact with the product, gestures are the obvious choice for several reasons. Rumors about the device, as well as Apple patents and patent applications filed over the years, all point in this direction.
Why gesture support is obvious and necessary for Apple smart glasses
Given that the idea of using gestures to control smart glasses appeared in a 2020 Apple patent, it’s not hard to imagine this approach becoming a reality, even without an “internal source.”
Gesture controls for smart glasses, as depicted in a 2020 Apple patent.
Hand gestures, in particular, are the subject of a 2024 Apple patent application, which explains how users could make purchases or get information about an object or landmark simply by pointing at it.
Apple Vision Pro’s operating system, known as visionOS, already relies on gestures for navigation. A 2024 Apple patent suggested that these navigation gestures could extend to other devices in the company’s product line.
More recently, a 2025 Apple patent explored, among other things, how gestures could be used to control the long-rumored camera-equipped AirPods.
That said, Apple has been studying motion tracking since at least 2009, so references to gesture-controlled devices can be found in even more patents and patent applications.
Given that Apple already has products that rely on gestures, that gesture-centric navigation has appeared in several patents and patent applications, and that the smart glasses will come with cameras, there is only one conclusion to draw.
Apple’s smart glasses will rely on gestures; this is obvious, even without multiple rumors clearly stating it. What is less clear, however, is when the product will reach end users.
It has been said that Apple is targeting a late 2026 release for its smart glasses, perhaps around Christmas. However, a release in 2027 is not ruled out either.
Apple will likely pitch its new product as an AI-driven iPhone companion, so it might appeal to some users who are already part of the Apple ecosystem. Still, not everyone is a fan of smart glasses, regardless of the manufacturer.