
Ray-Ban Meta Smart Glasses are more of an AI device than ever with new updates

If you do a quick online search for Ray-Ban Meta Smart Glasses right now, you’ll find that the wearable is mostly marketed for its quick photo capturing and livestreaming capabilities.

However, at the Meta Connect 2024 event on Wednesday, Meta founder and CEO Mark Zuckerberg didn’t have too much to say about photos and videos during the Ray-Ban Meta Smart Glasses section of the presentation.


In fact, Zuckerberg introduced the Ray-Ban Meta Smart Glasses primarily as an AI device.

“Glasses are a new AI device category,” Zuckerberg said, noting that his company has only just caught up with consumer demand for its smart glasses after sales took off faster than he expected.

Aside from a new limited-edition Ray-Ban Meta Smart Glasses model with transparent frames, Meta didn't announce any new smart glasses hardware.

Clear, transparent Ray-Ban Meta Smart Glasses. Credit: Meta

However, Zuckerberg did share several new features that he said are coming to the Meta smart glasses in a set of updates rolling out over the next couple of months, all of them AI-related.

Meta AI is already integrated into Ray-Ban Meta Smart Glasses in much the same way other companies' voice assistants are integrated into their devices. But, according to Zuckerberg, new updates will make these interactions “more natural and conversational.”


“Hey Meta” instead of “Look and tell me.”

For example, users currently have to prompt their Ray-Ban Meta Smart Glasses with the phrase “look and tell me” when they have a question about what they're seeing. Zuckerberg's demo showed that users will no longer have to do that: they will just need to activate the feature with the “Hey Meta” prompt and then ask their question, and Meta AI will automatically understand that the question is about whatever the user is looking at through the glasses.

Furthermore, after the initial “Hey Meta,” users will no longer need to start each prompt with that phrase; Meta AI will be able to carry on a continuous back-and-forth conversation.
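To make the change concrete, here's a minimal sketch of the difference between the two interaction models. Meta hasn't published an API for the glasses, so every name here (listen, answer, the session loop) is a hypothetical stand-in, not Meta's implementation:

WAKE_WORD = "hey meta"

def listen() -> str:
    # Stand-in for the glasses' microphone; reads a line of text instead.
    return input("> ").strip().lower()

def answer(prompt: str) -> str:
    # Stand-in for the assistant; a real system would send the prompt
    # plus the current camera frame to a multimodal model.
    return f"(answering: {prompt!r})"

def session() -> None:
    # Old flow: every question needed a trigger phrase like "look and
    # tell me". New flow, sketched here: one "Hey Meta" opens a
    # conversation, and follow-ups are handled without repeating it.
    in_conversation = False
    while True:
        utterance = listen()
        if utterance in ("", "quit"):
            break
        if utterance.startswith(WAKE_WORD):
            in_conversation = True
            utterance = utterance.removeprefix(WAKE_WORD).strip()
        if in_conversation and utterance:
            print(answer(utterance))

if __name__ == "__main__":
    session()

The only state that changes between the old and new flow is the in_conversation flag: once the wake word sets it, follow-up questions go straight through.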

Live Translation on Ray-Ban Meta Smart Glasses 

Live translation is similar to what other smart glasses have offered: a user can hear real-time audio translations of another language through the glasses while conversing with someone. The demo at Meta Connect seemed to work nearly perfectly, translating between Spanish and English in both directions.
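A live translation feature like this is typically a three-stage pipeline: speech recognition, translation, then speech synthesis. Meta hasn't documented the glasses' internals, so the sketch below uses placeholder stubs (transcribe, translate, speak) purely to illustrate the loop:

def transcribe(audio_chunk: bytes, lang: str) -> str:
    # Stub: a real pipeline would run streaming speech-to-text here.
    return "hola, ¿cómo estás?"

def translate(text: str, source: str, target: str) -> str:
    # Stub: a real pipeline would call a translation model here.
    return "hello, how are you?"

def speak(text: str) -> None:
    # Stub: the glasses would play this through their open-ear speakers.
    print(f"[speaker] {text}")

def live_translate(mic_chunks, source: str = "es", target: str = "en") -> None:
    # Working in short audio chunks is what keeps the translation close
    # to real time instead of waiting for a full sentence to finish.
    for chunk in mic_chunks:
        heard = transcribe(chunk, source)
        speak(translate(heard, source, target))

if __name__ == "__main__":
    live_translate([b"<simulated Spanish audio>"])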

Meta multimodal video AI. Credit: Meta

Multimodal AI prompts

Zuckerberg explained the multimodal video AI feature through a demo of a user trying on outfits. Through this feature, Meta AI was able to offer fashion advice and suggestions based on the user's outfit and their specific question about it.

Ray-Ban Meta Smart Glasses will also soon be able to automatically remember things for users. The example showcased at Meta Connect involved Meta AI recalling the parking space number where the user parked their car. The user did not have to prompt Meta AI to do that; the assistant appeared to remember the number simply because the user had viewed it through the glasses.
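One plausible way to model that passive memory, purely as an illustration: log a caption for everything the wearer views, then answer recall questions by searching the log. Meta hasn't described its actual approach; observe and recall below are hypothetical names:

from datetime import datetime

memory: list[tuple[datetime, str]] = []

def observe(caption: str) -> None:
    # Everything the wearer views gets captioned and logged, so no
    # explicit "remember this" prompt is ever needed.
    memory.append((datetime.now(), caption))

def recall(question: str) -> str:
    # Naive keyword match over past observations, newest first;
    # a real assistant would use semantic retrieval instead.
    keywords = [w for w in question.lower().strip("?").split() if len(w) > 2]
    for when, caption in reversed(memory):
        if any(word in caption.lower() for word in keywords):
            return f"{caption} (seen at {when:%H:%M})"
    return "I don't remember seeing that."

observe("parked in space 312, level 2")
print(recall("where did I park?"))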

Meta AI remembers. Credit: Meta

Building on that feature, Ray-Ban Meta Smart Glasses users will soon be able to look at a flier or advertisement and ask the glasses to call the phone number or scan the relevant QR code on it. The glasses can automatically remember those details too, if a user wants to return later to something they previously viewed through the glasses.
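As a rough illustration of the flier example: extract text from the camera frame, find a phone number, and hand it to the dialer. The ocr stub and the phone-number pattern below are assumptions made for the sketch; the glasses' real pipeline isn't public:

import re

def ocr(frame: bytes) -> str:
    # Stub: the glasses would run text recognition on the camera frame.
    return "Yard sale Saturday! Call 555-012-3456 or scan the QR code."

PHONE = re.compile(r"\b\d{3}-\d{3}-\d{4}\b")

def act_on_flier(frame: bytes) -> None:
    text = ocr(frame)
    match = PHONE.search(text)
    if match:
        # Hand the extracted number off to the phone's dialer.
        print(f"[dialer] calling {match.group()}")
    else:
        print("[dialer] no phone number found")

act_on_flier(b"<simulated camera frame>")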

Other updates coming to Ray-Ban Meta Smart Glasses include the ability to voice control Spotify and Amazon Music through the device as well as new integrations with apps like Audible and iHeartRadio.

Partnership with Be My Eyes for blind and low vision users

Meta + Be My Eyes. Credit: Meta

Meta also announced a partnership with Be My Eyes, a mobile app that connects blind and low-vision people with volunteers via live video to talk through what's in front of them. The app will work directly through Ray-Ban Meta Smart Glasses, and volunteers will be able to see through the user's glasses in order to provide assistance.
