Google’s accessibility app Lookout can use your phone’s camera to find and recognize objects
Google has updated some of its accessibility apps with capabilities that make them easier to use for the people who rely on them. It has rolled out a new version of the Lookout app, which can read text, and even lengthy documents, out loud for people who are blind or have low vision. The app can also read food labels, recognize currency and describe what it sees through the camera or in an image. The latest version adds a new "Find" mode that lets users choose from seven item categories, including seating, tables, vehicles, utensils and bathrooms.
When a user picks a category, the app recognizes objects belonging to it as they move their camera around a room, then tells them the direction of, or distance to, each object, making it easier to interact with their surroundings. Google has also added an in-app capture button, so users can take photos and quickly get AI-generated descriptions of them.
The company has updated its Look to Speak app as well. Look to Speak lets users communicate by using eye gestures to select phrases from a list, which the app then speaks out loud. Google has now added a text-free mode that can trigger speech from a photo book of emojis, symbols and photos instead, and users can personalize what each symbol or image means to them.
Google has also expanded screen reader support for Lens in Maps, so it can tell users the names and categories of the places it sees, such as ATMs and restaurants, along with how far away a particular location is. In addition, it's rolling out improvements to detailed voice guidance, which provides audio prompts telling users where to go.
Finally, Google has brought Maps' wheelchair information to desktop, four years after it launched on Android and iOS. The Accessible Places feature lets users see whether the place they're visiting can accommodate their needs: businesses and public venues with an accessible entrance, for example, will show a wheelchair icon. They can also use the feature to check whether a location has accessible washrooms, seating and parking. The company says Maps currently has accessibility information for over 50 million places. Those looking up wheelchair information on Android and iOS can now also filter reviews to focus on wheelchair access.
Google made all of these announcements at this year's I/O developer conference, where it also revealed that it has open-sourced more code for the Project Gameface hands-free "mouse," allowing Android developers to use it in their apps. The tool lets users control the cursor with head movements and facial gestures so they can more easily use their computers and phones.
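Project Gameface is built on MediaPipe's face landmark detection, and developers curious what mapping a facial gesture to an app action might look like can get a feel for it from the public MediaPipe Tasks Face Landmarker API. The Kotlin sketch below is illustrative only, not Google's actual Gameface code: the model asset path, the jawOpen threshold and the onGesture callback are all assumptions made for the example.

```kotlin
import android.content.Context
import com.google.mediapipe.framework.image.MPImage
import com.google.mediapipe.tasks.core.BaseOptions
import com.google.mediapipe.tasks.vision.core.RunningMode
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarker
import com.google.mediapipe.tasks.vision.facelandmarker.FaceLandmarkerResult

// Illustrative threshold: how strongly the "jawOpen" blendshape must fire to count as a gesture.
private const val JAW_OPEN_THRESHOLD = 0.6f

// Builds a FaceLandmarker that watches camera frames and invokes onGesture when the user
// opens their mouth. onGesture is a hypothetical callback the host app would supply.
fun createGestureDetector(context: Context, onGesture: (String) -> Unit): FaceLandmarker {
    val options = FaceLandmarker.FaceLandmarkerOptions.builder()
        .setBaseOptions(
            BaseOptions.builder()
                .setModelAssetPath("face_landmarker.task") // assumed bundled model asset
                .build()
        )
        .setOutputFaceBlendshapes(true)          // blendshape scores drive gesture detection
        .setRunningMode(RunningMode.LIVE_STREAM) // process camera frames as they arrive
        .setResultListener { result: FaceLandmarkerResult, _: MPImage ->
            // Each blendshape category (e.g. "jawOpen") carries a 0..1 score per frame.
            result.faceBlendshapes().orElse(emptyList()).firstOrNull()?.let { shapes ->
                val jawOpen = shapes.firstOrNull { it.categoryName() == "jawOpen" }?.score() ?: 0f
                if (jawOpen > JAW_OPEN_THRESHOLD) {
                    onGesture("jawOpen") // e.g. translate this gesture into a click or tap
                }
            }
        }
        .build()
    return FaceLandmarker.createFromOptions(context, options)
}
```

In a real app, camera frames would be converted to MPImage objects and fed to the landmarker with detectAsync(), and detected gestures would be mapped to cursor movement or other accessibility actions, which is the role Gameface's open-sourced code plays.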
Catch up on all the news from Google I/O 2024 right here!