When I reviewed the Meta Ray-Ban Display glasses, one of my main frustrations was that there were only a handful of apps that could take advantage of the frames’ impressive display and they were all made by Meta. Now that is finally changing.
Meta is opening up the glasses to third-party developers, who can now experiment with apps that use the display as well as the device’s Neural Band controller. The platform will support experiences paired with an iOS or Android companion app, and it will also work with web apps, the company said in an update.
With a 20-degree field of view, the glasses’ display isn’t fully immersive like that of standalone AR glasses, and Meta seems to be looking for applications well-suited to a monocular display, like “information overlays.” For example, Andrew Bosworth, Meta’s CTO, shared a video of an early application called “Darkroom Buddy,” an interactive guide to developing film that could serve as a “viewable” reference.
The gap between idea and prototype has never been smaller. Add in glasses and inputs like the Neural Band, and it feels like the early days of building in a way we haven’t seen in over a decade.
We’re bringing web apps and a mobile SDK to Meta Ray-Ban Display. Developer Preview… pic.twitter.com/OlDayAkozd
-Boz (@boztank) May 14, 2026
The company also suggests that developers can create experiences for streaming media, “displaying real-time data, like scores or status updates” and other “micro-app” purposes. Meta is apparently considering mini-games as well; a video it shared includes a few such examples, including chess, snake and a brick-breaking-style game. (The glasses already come with a Meta-made puzzle game, although I didn’t find it particularly compelling.)
Adding third-party apps could open up a lot more functionality on the $800 glasses the company launched last fall. Since launch, Meta has added a handful of new features, like a built-in teleprompter and handwriting input, but the glasses still feel somewhat limited. For example, I was looking forward to using the glasses while cooking, but was frustrated to find that my only option for seeing a recipe on the screen was to request one from Meta AI. I doubt the major recipe app developers are thinking about this device right now, but I like knowing that it’s at least possible.
It’s unclear when any of these new third-party apps might actually be available. Meta announced support for third-party apps for its screenless smart glasses last year, but most are still not available. One thing I’ll be watching closely is how these new experiences affect battery life, as I’ve found that display-intensive apps can drain the glasses’ built-in battery quite quickly.
The timing of Meta’s announcement is also notable. The company just announced the dates for its next Connect event, where we’ll likely get a lot more updates on all of its smart glasses projects. CEO Mark Zuckerberg also recently teased a new pair of glasses that could be the next iteration of the Meta Ray-Ban Display frames.
For people who already own a pair, Meta is also rolling out some notable updates. The previously mentioned “neural writing” feature, which lets you respond to messages by tracing letters with your fingers, is now available to everyone. The company is also adding a display recording feature that lets you capture what’s happening on the glasses’ screen alongside your own view and share it with others. I’ve personally found it difficult to describe to people exactly what it’s like to use the heads-up display, so I’m looking forward to trying that out. And finally, Meta is adding live captioning for calls made via Messenger, WhatsApp and Instagram.