Meta enhances Ray-Ban smart glasses with AI video, live translation features

Meta's update lets Ray-Ban glasses use AI to analyze what the user sees, process questions, and respond in real time


REUTERS December 17, 2024
Ray-Ban Meta sunglasses are displayed at the Meta Connect annual event at the company's headquarters in Menlo Park, California, US, September 24, 2024. PHOTO: REUTERS

Meta Platforms said on Monday it has updated the Ray-Ban Meta smart glasses with AI video capability and real-time language translation functionality.

The Facebook parent, which first announced the features at its annual Connect conference in September, said the update is available to users who are part of its "Early Access Program".

The features are included in the v11 software update, which will begin rolling out on Monday.

The latest update adds video to Meta's AI chatbot assistant, allowing the Ray-Ban smart glasses to process what the user is seeing and respond to questions in real time.

The smart glasses will now be able to translate speech in real time between English and Spanish, French or Italian.

"When you're talking to someone speaking one of those three languages, you'll hear what they say in English through the glasses' open-ear speakers or viewed as transcripts on your phone, and vice versa," Meta said in a blog.

Meta also added Shazam, the song-identification app, to the smart glasses; the feature will be available in the US and Canada.

In September, Meta said it would update the Ray-Ban smart glasses with several new AI features, including tools for setting reminders and the ability to scan QR codes and phone numbers using voice commands.
