Meta integrates AI into its Ray-Ban smart glasses

Members of Meta's early access program in the United States and Canada can now download firmware version 11, which includes Live AI and other new features.

Meta is introducing AI features to its Ray-Ban smart glasses, including the ability to hold voice conversations with the Meta AI chatbot, translate speech between several languages, and identify songs that are currently playing with Shazam.

Conversations with Meta AI

Users can now hold a continuous conversation with the Meta AI chatbot without repeating the wake phrase "Hey Meta." They can interrupt the bot, ask follow-up questions, or change the subject. With Live AI, users can interact with the assistant without explicitly invoking it, as the glasses continuously observe the wearer's surroundings and what they are doing.

Live translation feature

The live translation feature lets the glasses translate speech in real time between English and Spanish, French, or Italian. Users have two options: listen to the translated speech through the glasses' speakers or read a transcript on their phone.

To get started, users need to download the relevant language pairs and specify which language they speak and which their conversation partner speaks.

Shazam integration

With Shazam integration, Meta's smart glasses can identify any song the user hears. All the user needs to do is ask "Meta, what song is this?" or "Meta, Shazam this song," and the glasses' microphones will pick up the audio and identify the track, just as Shazam does on a smartphone.

Firmware version 11, which includes Live AI and the other new features, is rolling out to early access program members in the United States and Canada. Shazam integration, however, is already available to all users.

Users can apply to join the Meta early access program on Meta's website.