Meta Connect 2025: THE FUTURE IS NOW

I’m back from Meta Connect 2025--an event that turned out to be one of the most consequential in the history of Augmented Reality. As someone who has been working in Augmented Reality for nearly 15 years, it’s clear to me that we’ve reached an inflection point. We’re going mainstream.

Zuck introducing the new line of Meta glasses

Usually Connect has two full days packed with programming. This time the first day featured only the keynote, with panels and talks reserved for the second day. There were a few exclusive events in the days leading up to Connect, but overall this was roughly half the content of previous years.

Although there were some sessions about VR, the focus was clearly on Horizon Worlds and Meta’s line of smart glasses. Connect seemed sparsely attended as a result, but the excitement about the new category of AR smart glasses was off the charts. Some say this represents a move away from VR--Zuckerberg barely mentioned it during the keynote--but I think Meta is waiting to announce a new Quest before really talking about VR again.

So let’s talk about the new wearables. An entire suite of glasses was announced: second-generation Ray-Bans, Oakley Meta Vanguards, and Ray-Ban Display glasses.

The second-generation Ray-Bans are basically the original Meta Ray-Bans with the upgraded storage, camera, and battery life of the recently released Meta Oakley HSTNs, plus features like HDR 60fps recording and some nice new styles. (Let’s face it, Oakleys can be rather…extreme. I prefer the Ray-Ban Wayfarer look.)

The Oakley Meta Vanguards are similar in features to the Ray-Bans, but add a centrally mounted, wider-FOV camera built into the Oakley Vanguard style, suited for sports and extreme environments. The new camera placement more accurately captures what the wearer sees, making for exciting first-person sports action videos. The Vanguards also include an internal LED that notifies the user of events such as lap completion. They’re great for sports ranging from running to skiing. I’d really like this internal notification LED brought to the Ray-Bans; after all, Snap had one in their Spectacles almost a decade ago, so it should be standard by now.

However, the device everyone is talking about is the long-rumored Ray-Ban Display. I really think they should have come up with a catchier name, because these glasses are absolutely incredible. They have most of the features of the existing Ray-Bans, with the addition of a small, full-color, high-resolution waveguide display in the right eye.

I got the opportunity to try these out at Connect, so let’s get down to details.

While in line for my demo, the Meta employee apologized for the wait. I told him I’ve been waiting 10 years for this, what’s another 20 minutes? The future is finally here.

The first thing I noticed when I put them on is that there’s definitely something going on in the right eye. With the display off, the right lens appears slightly cloudy--the waveguide is not totally invisible. Still, it’s barely noticeable and a minor issue compared to other AR glasses. There were absolutely no rainbow artifacts from incoming sunlight refracting inside the lens, making these the first AR glasses I could wear as actual, practical glasses.

Another thing to note: when looking at someone wearing these glasses, you can’t tell whether the display is on or off. You have to look at the glasses off-angle to see the screen; during a normal face-to-face interaction with the wearer it’s very difficult to see. It’s truly a private display, for the user only.

A truly private display: you can't see the user's screen while they're using it.

The glasses themselves are clearly thicker and bulkier than the Meta Ray-Ban frames, which is to be expected. They still look like regular, albeit somewhat conspicuous, glasses. They are slightly heavier, but it’s really not noticeable.

The glasses are controlled by voice and the new Neural Band. EMG bands have been around for a while, but I had never tried one that worked very well--until now. Meta’s acquisition of CTRL-labs has produced the Meta Neural Band, which allows effortless interaction with the Ray-Ban Display glasses.

The Neural Band accessory is essential for using the display. The glasses do have the standard tap-and-swipe interface on the side, but unfortunately it only accesses the standard set of audio Ray-Ban features. I hope Meta comes up with a way to use the tap-and-swipe interface with the display, just in case you forget your Neural Band.

The Neural Band is required to access the AR interface

Simply touch your ring finger to your thumb to wake the display. The image appears as a small, high-res, semi-transparent additive screen floating in your right eye only. A display in only one eye can be a bit hard to focus on at first, but I got used to it quickly.

Slide your thumb against your index finger to move the selection cursor around: left, right, up, and down. Pinch your thumb and index finger together to make a selection. It’s that easy!

This small screen has a very efficient UI--holding your ring finger and thumb together brings you to the home screen, where you can select from a series of built-in apps, including a suite of existing Meta apps such as WhatsApp, Instagram, and Messenger.
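
For fellow developers, here’s a rough sketch of the interaction model as I experienced it. To be clear, every type and event name below is hypothetical--this isn’t from any Meta SDK--it just maps the pinches and thumb swipes described above onto display actions.

```typescript
// Hypothetical sketch of the gesture-to-action mapping described above.
// None of these names come from a real Meta API; they are illustrative only.

type Gesture =
  | "ring_pinch"        // ring finger + thumb: wake display / hold for home
  | "index_pinch"       // index finger + thumb: select
  | "thumb_swipe_left"  // thumb sliding against index finger moves the cursor
  | "thumb_swipe_right"
  | "thumb_swipe_up"
  | "thumb_swipe_down";

interface GlassesDisplay {
  wake(): void;
  goHome(): void;
  moveCursor(direction: "left" | "right" | "up" | "down"): void;
  selectCurrentItem(): void;
}

// Route an EMG gesture event from the wristband to a display action.
function handleGesture(display: GlassesDisplay, gesture: Gesture, heldMs = 0): void {
  switch (gesture) {
    case "ring_pinch":
      // A quick ring-finger pinch wakes the display; holding it opens home.
      if (heldMs > 500) {
        display.goHome();
      } else {
        display.wake();
      }
      break;
    case "index_pinch":
      display.selectCurrentItem();
      break;
    case "thumb_swipe_left":
      display.moveCursor("left");
      break;
    case "thumb_swipe_right":
      display.moveCursor("right");
      break;
    case "thumb_swipe_up":
      display.moveCursor("up");
      break;
    case "thumb_swipe_down":
      display.moveCursor("down");
      break;
  }
}
```

The nice thing about this model is that the whole vocabulary fits on one hand: two pinches and four swipe directions cover wake, home, navigation, and selection.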

To get you used to the controls, Meta has developed a simple little puzzle game: a maze you must slide a puck through using gestures. It’s actually quite fun, and I played through about a dozen levels before I was ushered off to try other demos.

I kept catching myself raising my hand in front of the glasses--years of optical hand-tracked interfaces have programmed me to keep my fingers in view of a device’s cameras. But no cameras are used to detect these gestures--just the EMG band reading electrical impulses from your muscle fibers. You can keep your hand in your pocket and still control the glasses. This makes interacting with them feel completely natural and inconspicuous, eliminating the awkward, visible gestures often associated with other AR interfaces.

Other unique features include a navigation app that visually shows you directions in real time as you walk around. This is FAR preferable to walking around staring at your phone in the streets of a major city. We’ve all seen demos of this before on Spectacles and Android XR, but this is the most practical version I’ve ever experienced.

The display is also used to show subtitles--either via real-time translation or simply as an aid to help you follow conversations in loud environments or if you’re hearing impaired.

The biggest problem with these glasses is that they are stuck in the Meta ecosystem. This isn’t Meta’s fault: Apple won’t allow Meta access to iMessage, phone calls, and other fundamental phone functions. I really don’t use WhatsApp at all and rarely use Messenger, so honestly a lot of the built-in apps are of limited use to me.

My suggestion to Meta would be to form more partnerships with the top apps, allowing them to use a linked Meta account to make voice/AI queries against the same profile the user has on their smartphone. That way the device doesn’t need to access the phone in order to, say, figure out which restaurants in your Yelp bookmarks are open nearby. Meta could end up building a voice AI assistant layer over all the top apps that dramatically lessens the need for deep phone access. I’m sure Meta is working on this, as you can see with the Strava integration on the Oakley Vanguard glasses.
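
To make that suggestion concrete, here’s a minimal sketch of what such a partner-app layer could look like. Everything here--the connector interface, the linked-account token, the Yelp example--is hypothetical and purely illustrative of my proposal, not anything Meta or its partners have announced.

```typescript
// Hypothetical partner-app layer: the assistant answers a voice query by
// calling a partner service with the account the user has already linked,
// so the glasses never need deep access to the phone itself.

interface PartnerConnector {
  appName: string; // e.g. "Yelp" or "Strava" (illustrative connectors)
  query(profileToken: string, request: string): Promise<string>;
}

async function answerWithPartnerApp(
  connectors: PartnerConnector[],
  appName: string,
  profileToken: string,
  request: string
): Promise<string> {
  // Find the connector for the app the user has linked to their Meta account.
  const connector = connectors.find((c) => c.appName === appName);
  if (!connector) {
    return `No linked ${appName} account found.`;
  }
  return connector.query(profileToken, request);
}

// Example voice query: "Which of my bookmarked restaurants are open nearby?"
// answerWithPartnerApp(connectors, "Yelp", userToken,
//   "bookmarked restaurants open near my current location");
```
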

A lot of people said it couldn’t happen--but the future is now. As someone who has worked in Augmented Reality for nearly 15 years, this is a day I always knew was coming. Stay tuned, we’re just getting started--next stop, Orion!