Ray-Ban Meta smart glasses do the AI thing without a projector or subscription


The Ray-Ban Meta smart glasses have been something of a pleasant surprise. They shoot video, take photos, livestream and serve as an adequate replacement for headphones, all while looking like a normal pair of sunglasses. The feature everyone’s been waiting for, though, is multimodal AI, which entered early-access testing back in January. Now it’s here.

What is multimodal AI? Simply put, it’s a toolset that allows an AI assistant to process multiple types of information, including photos, videos, text and audio. It’s an AI that can view and understand the world around you in real time. This is the underlying concept behind Humane’s maligned AI Pin. Meta’s version is more conservative with its promises and, honestly, we came away impressed during our initial hands-on.
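
To make the idea concrete, here’s a minimal sketch of the shape such a request might take under the hood, with image, audio and text bundled into a single query. The structure and field names are our invention for illustration; Meta hasn’t published an API for the glasses.

```python
# Hypothetical shape of a multimodal query. The point is that several
# content types travel together in one request, and the model reasons
# over all of them at once instead of routing each to a separate system.
query = {
    "inputs": [
        {"type": "image", "data": "<frame from the glasses' camera>"},
        {"type": "audio", "data": "<clip from the microphones>"},
        {"type": "text", "data": "What am I looking at?"},
    ]
}
```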

“Multimodal Meta AI is rolling out widely on Ray-Ban Meta starting today! It's a huge advancement for wearables & makes using AI more interactive & intuitive. Excited to share more on our multimodal work w/ Meta AI (& Llama 3), stay tuned for more updates coming soon. pic.twitter.com/DLiCVriMfk” (Ahmad Al-Dahle, @Ahmad_Al_Dahle, April 23, 2024)

Here’s how it works. The glasses have a camera and five microphones, which act as the AI’s eyes and ears. That means you can ask the glasses to describe anything you’re looking at. Want to know a dog’s breed before you walk up and give it a good pet? Just ask. Meta says the assistant can also read signs in different languages, which is great for traveling. We enjoyed exclaiming “Hey Meta, look at this and tell me what it says” and listening as it did just that. There’s even a landmark identification feature, though that wasn’t available to test.
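
As a rough illustration of that loop (wake word, photo capture, joint reasoning, spoken reply), here’s a runnable toy sketch. Every name in it, from capture_frame to StubModel, is a stand-in of our own; Meta hasn’t documented the glasses’ internals.

```python
from dataclasses import dataclass

@dataclass
class Query:
    text: str    # the transcribed request, e.g. "tell me what it says"
    image: bytes # a frame from the camera, the assistant's "eyes"

class StubModel:
    """Stand-in for the multimodal model, which would reason over
    the pixels and the question together."""
    def generate(self, query: Query) -> str:
        return f"A made-up answer about {len(query.image)} bytes of image, re: {query.text!r}"

def capture_frame() -> bytes:
    # Placeholder; real firmware would pull a frame from the camera.
    return b"\x89PNG\r\n..."

def handle_request(model: StubModel, spoken_request: str) -> str:
    # 1. "Hey Meta, look at this..." triggers a photo capture.
    frame = capture_frame()
    # 2. The photo and the transcribed question go to the model as one
    #    multimodal query, like the payload sketched earlier.
    answer = model.generate(Query(text=spoken_request, image=frame))
    # 3. On the glasses, the reply would be read back through the
    #    open-ear speakers; here we just return it.
    return answer

print(handle_request(StubModel(), "look at this and tell me what it says"))
```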

There are some other potential use cases, like staring at loose ingredients on a kitchen counter and asking the AI to whip up a relevant recipe. However, we’ll need a few weeks of real people running the tech through its paces to gauge what it’s actually good at. Real-time translation could be something of a killer app, particularly for tourists, though here’s hoping it keeps the hallucinations to a minimum. Mark Zuckerberg has shown the AI picking out clothes for him to wear but, come on, that’s about as pie in the sky as it gets.

Multimodal AI wasn’t the only update for the smart glasses announced today. Meta revealed hands-free video call integration with WhatsApp and Messenger. There are also some new frame designs for the fashion-conscious. The new styles can be fitted with prescription lenses and are available for preorder right now. The Ray-Ban Meta smart glasses start at $300, which isn’t chump change but is certainly better than $700 for a clunky pin.

This article originally appeared on Engadget at https://www.engadget.com/ray-ban-meta-smart-glasses-do-the-ai-thing-without-a-projector-or-subscription-175403559.html?src=rss
