Meta’s Ray-Ban branded smart glasses are getting AI-powered reminders and translation features
Meta’s AI assistant has always been the most intriguing feature of its second-generation Ray-Ban smart glasses. While the generative assistant was fairly limited when the glasses launched last fall, the subsequent addition of real-time information and multimodal capabilities opened up a range of new possibilities for the accessory.
Now, Meta is significantly upgrading the Ray-Ban Meta smart glasses’ AI powers. The company showed off a number of new abilities for the year-old frames onstage at its Connect event, including reminders and live translations.
With reminders, you’ll be able to look at an item in your surroundings and ask Meta AI to send you a reminder about it. For example: “Hey Meta, remind me to buy that book next Monday.” The glasses will also be able to scan QR codes and call a phone number written in front of you.
In addition, Meta is adding video support to Meta AI, so the glasses will be better able to scan your surroundings and respond to queries about what’s around you. There are other, more subtle improvements, too. Previously, you had to start a command with “Hey Meta, look and tell me” to get the glasses to respond based on what you were looking at. With the update, though, Meta AI can handle more natural requests about what’s in front of you. In a demo with Meta, I was able to ask several questions and follow-ups, like “Hey Meta, what am I looking at?” or “Hey Meta, tell me about what I’m looking at.”
When I tried out Meta AI’s multimodal capabilities on the glasses last year, I found that it could translate short snippets of text but struggled with anything more than a few words. Now, Meta AI should be able to translate longer chunks of text. And later this year, the company is adding live translation for English, French, Italian and Spanish, which could make the glasses even more useful as a travel accessory.
And while I haven’t fully tested Meta AI’s new capabilities on the smart glasses yet, it already seems to have a better grasp of real-time information than it did last year. During a demo with Meta, I asked Meta AI who the Speaker of the House of Representatives is, a question it repeatedly got wrong last year, and it answered correctly the first time.