Meta's smart glasses can now tell you where you parked your car


Oct 3, 2024 - 01:30

Meta is rolling out some previously announced features to its AI-powered Ray-Ban smart glasses for users in the US and Canada. CTO Andrew Bosworth posted on Threads that today's update to the glasses includes more natural language recognition, meaning the stilted "Hey Meta, look and tell me" commands should be gone. Users will be able to engage the AI assistant without the "look and" portion of the invocation.

Most of the other AI tools shown off during last month's Connect event are also arriving on the frames today. That includes voice messages, timers and reminders. The glasses can also be used to have Meta AI call a phone number or scan a QR code. CEO Mark Zuckerberg demonstrated the new reminders feature in an Instagram reel as a way to find your car in a parking garage. One notable omission from this update is the live translation feature, and Bosworth didn't share a timeline for when it will be ready.

Meta's smart glasses already made headlines earlier today after two Harvard University students used them to essentially dox total strangers. Their combination of facial recognition technology and a large language model was able to reveal addresses, phone numbers, family member details and partial Social Security numbers.

This article originally appeared on Engadget at https://www.engadget.com/wearables/metas-smart-glasses-can-now-tell-you-where-you-parked-your-car-195200826.html?src=rss

