
Meta Missed Out on Smartphones. Can Smart Glasses Make Up for It?

Meta has dominated online social connections for the past 20 years, but it missed out on making the smartphones that primarily delivered those connections. Now, in a multiyear, multibillion-dollar effort to position itself at the forefront of connected hardware, Meta is going all in on computers for your face.

At its annual Connect developer event today in Menlo Park, California, Meta showed off its new, more affordable Oculus Quest 3S virtual reality headset and its improved, AI-powered Ray-Ban Meta smart glasses. But the headliner was Orion, a prototype pair of holographic display glasses that chief executive Mark Zuckerberg said have been in the works for 10 years.

Zuckerberg emphasized that the Orion glasses—which are available only to developers for now—aren’t your typical smart display. And he made the case that these kinds of glasses will be so interactive that they’ll usurp the smartphone for many needs.

“Building this display is different from every other screen you’ve ever used,” Zuckerberg said on stage at Meta Connect. Meta chief technology officer Andrew Bosworth had previously described this tech as “the most advanced thing that we’ve ever produced as a species.”

The Orion glasses, like a lot of heads-up displays, look like the fever dream of techno-utopians who have been toiling away for the past several years inside Meta's secretive Reality Labs division. One WIRED reporter noted that the thick black glasses looked “chunky” on Zuckerberg.

As part of the on-stage demo, Zuckerberg showed how the Orion glasses can project multiple virtual displays in front of the wearer, fire off quick replies to messages, handle video calls, and run games. In the messaging example, Zuckerberg noted that users won’t even have to take out their phones. They’ll navigate these interfaces by talking, tapping their fingers together, or simply looking at virtual objects.

There will also be a built-in “neural interface”: a wrist-worn device, first teased by Meta three years ago, that reads the neural signals the brain sends to the hand and turns them into input. Zuckerberg didn’t elaborate on how any of this will actually work or when a consumer version might materialize. (He also didn’t get into the various privacy complications of connecting this rig and its visual AI to one of the world’s biggest repositories of personal data.)

He did say that the imagery that appears through the Orion glasses isn’t pass-through technology, in which external cameras show wearers the real world, nor is it a conventional display or screen showing a virtual world. It’s a “new kind of display architecture,” he said, that uses tiny projectors in the arms of the glasses to beam light into waveguides in the lenses, which then direct that light into the wearer’s eyes to create volumetric imagery in front of them. Meta designed this technology itself, he said.

The idea is that the images don’t appear as flat, 2D graphics in front of your eyes but that the virtual images now have shape and depth. “The big innovation with Orion is the field of view,” says Anshel Sag, principal analyst at Moor Insights & Strategy, who was in attendance at Meta Connect. “The field of view is 72 degrees, which makes it much more engaging and useful for most applications, whether gaming, social media, or just content consumption. Most headsets are in the 30- to 50-degree range.”
