The Pixel Screenshots app uses AI to scour the screengrabs I can't remember why I saved

About 50 percent of my photo album is receipts. That is, screenshots of everything I consider even mildly interesting. Whether it’s Uber drivers who never seem to be getting closer, hot tea from my friend’s Instagram Stories or unfathomable email threads, my gallery is full of unexplainable internet detritus. Best of all, I can never tell from the thumbnails exactly where a specific image is, because walls of text all look the same from afar. So when Google announced its new Pixel Screenshots app at its Made By Google event today, I was excessively excited.

The Screenshots app launches alongside the Pixel 9, Pixel 9 Pro and Pixel 9 Pro Fold, and uses Gemini AI to help locate specific images. After you grant the app access to your photos, the AI will not only ingest files it thinks are screenshots, but also start identifying what’s within each picture.
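Google hasn’t said exactly how the app decides which files count as screenshots, but the gist is a filtering pass before any analysis happens. Here’s a rough sketch of what such a filter could look like; the folder path, file-name patterns and rules are my own assumptions for illustration, not how Pixel Screenshots actually works.

```kotlin
import java.io.File

// Hypothetical heuristic for picking screenshots out of a photo folder.
// The path, file-name patterns and directory check below are assumptions
// for illustration only -- Google hasn't detailed its actual filtering.
fun looksLikeScreenshot(file: File): Boolean {
    val name = file.name.lowercase()
    val isImage = name.endsWith(".png") || name.endsWith(".jpg") || name.endsWith(".jpeg")
    val namedLikeCapture = name.startsWith("screenshot") || name.contains("screen_capture")
    val inScreenshotsDir = file.parentFile?.name?.equals("Screenshots", ignoreCase = true) == true
    return isImage && (namedLikeCapture || inScreenshotsDir)
}

fun main() {
    val gallery = File("/storage/emulated/0/Pictures") // assumed gallery path
    gallery.walkTopDown()
        .filter { it.isFile && looksLikeScreenshot(it) }
        .forEach { println("Would ingest: ${it.path}") }
}
```

In practice the app presumably leans on Android’s media metadata rather than file names, but the idea is the same: narrow the gallery down to screenshots before Gemini starts reading them.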

On the home page, you’ll see a row at the top called “Collections,” with a series of pre-organized snaps like “Gift Ideas,” “Boots” or “Places to visit.” These can be curated by you or suggested by the system.

Below this row is a grid of your most recent captures, and at the bottom is a search bar with a Plus symbol next to it. Pressing that symbol lets you either launch the camera or import a photo from your album. This is helpful for pictures you’ve taken of real-world signs that contain information you want Gemini AI to help remember.

Tapping a screenshot in the app expands the image and brings up a title, a summary and buttons based on its contents. These are all AI-generated, so if you’re looking at a picture of a music festival’s Instagram post about upcoming dates, the title might say “Lollapalooza headline acts,” with buttons to add specific events from that picture to your calendar. If you’ve pulled up an image of a restaurant’s website, Screenshots might offer shortcuts to call the shop or navigate to the business address via Maps.
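To make that concrete, here’s a hypothetical sketch of the kind of record the app could be building for each screenshot: a title, a summary and a list of shortcut buttons derived from whatever was detected in the image. The field names and mapping rules are mine, not Google’s.

```kotlin
import java.time.LocalDate

// Hypothetical model of what the app surfaces for one screenshot.
// Field names and the action-mapping rules are illustrative guesses,
// not Google's actual schema.
data class ScreenshotCard(
    val title: String,        // e.g. "Lollapalooza headline acts"
    val summary: String,      // short AI-generated summary
    val actions: List<String> // shortcut buttons shown under the image
)

data class ExtractedEntities(
    val eventDates: List<LocalDate> = emptyList(),
    val phoneNumber: String? = null,
    val streetAddress: String? = null
)

// Map whatever was detected in the image to the kinds of shortcuts the
// article describes: calendar entries, a call button, a Maps button.
fun suggestActions(entities: ExtractedEntities): List<String> = buildList {
    entities.eventDates.forEach { add("Add to calendar: $it") }
    entities.phoneNumber?.let { add("Call $it") }
    entities.streetAddress?.let { add("Navigate to $it") }
}

fun main() {
    val festivalPost = ExtractedEntities(eventDates = listOf(LocalDate.of(2024, 8, 5)))
    val card = ScreenshotCard(
        title = "Folk festival ticket sale",
        summary = "Tickets for the festival go on sale on August 5th.",
        actions = suggestActions(festivalPost)
    )
    println(card)
}
```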

From the home page, you can either type into the search bar or tap the microphone icon in it and ask Google for things like “What was Sam’s WiFi password?” or “How much do I owe Cherlynn?” The app will scour your gallery and not only return images with possibly relevant info, but also attempt to answer your question up top. In the demo I saw at a recent hands-on event, a Google rep asked the app “When do the tickets for the festival go on sale?”

Screenshots responded almost instantly by pulling up a picture of a folk festival’s Instagram post, and seconds later showed the words “The tickets for the festival go on sale on August 5th.” This example was particularly impressive because the screenshot contained multiple dates: one for when ticket sales start and one for when the festival itself kicks off. From the same interface, the company’s rep was able to get the Pixel 9 to set a reminder to buy the tickets in time.
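Under the hood this is a retrieve-then-answer flow: find the most relevant screenshot, then pull the answer out of its text. The toy sketch below shows that idea with plain keyword matching; the real app uses Gemini Nano rather than anything this crude, and the sample data and matching rules here are entirely made up for illustration.

```kotlin
// Toy illustration of answering a question from saved screenshot text.
// Pixel Screenshots runs Gemini Nano on-device; this keyword scorer is only
// a stand-in for the retrieve-then-answer idea, not Google's method.
data class SavedScreenshot(val id: String, val ocrText: String)

fun tokens(s: String): Set<String> =
    Regex("[a-z0-9]+").findAll(s.lowercase()).map { it.value }.toSet()

fun answer(question: String, library: List<SavedScreenshot>): Pair<SavedScreenshot, String>? {
    val q = tokens(question)
    // 1. Retrieve: pick the screenshot whose text overlaps the question most.
    val best = library.maxByOrNull { (tokens(it.ocrText) intersect q).size } ?: return null
    // 2. "Answer": pick the single sentence in that screenshot with the most overlap.
    val sentence = best.ocrText.split(Regex("[.!?\\n]"))
        .map { it.trim() }
        .filter { it.isNotEmpty() }
        .maxByOrNull { (tokens(it) intersect q).size } ?: return null
    return best to sentence
}

fun main() {
    val library = listOf(
        SavedScreenshot("folk_fest", "Folk Fest 2024 kicks off September 20. Tickets go on sale August 5."),
        SavedScreenshot("wifi", "Sam's WiFi password is correct-horse-battery-staple")
    )
    val (hit, line) = answer("When do the tickets for the festival go on sale?", library) ?: return
    println("Found in ${hit.id}: $line")
}
```

Even this crude version returns the ticket-sale sentence rather than the festival’s start date for the two fake screenshots above, which is essentially the distinction the demo nailed.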

It’s kind of a coincidence that Google is launching this app today, considering Apple’s redesign of its Photos app also pays extra attention to organizing and filtering out screenshots. My experience of both approaches is extremely limited at the moment, but so far I slightly prefer Google’s Screenshots app. It feels like a more focused and deliberate way to look for information and get help from AI, rather than getting distracted by my million selfies in the Photos app on my iPhone when I’m trying to find a bank statement.

Using AI to make sense of our screenshots feels like a smart idea, though there are of course privacy concerns. Microsoft already had to hit pause on the rollout of its Recall feature, which was supposed to remember everything you were doing on your computer by taking screenshots every few seconds. Google’s Screenshots app uses Gemini Nano, its on-device AI model for local processing, and the company says the feature won’t send your screenshots off your device (beyond the backups you might already have opted in to via Google Photos).

The Pixel Screenshots app will be on the Pixel 9 family at launch, and the company has nothing to share on wider availability at the moment. But based on how Google has launched and rolled out apps like Recorder in the past, it’s likely that older Pixel devices will get Screenshots in time, as long as it’s received well by users.

This article originally appeared on Engadget at https://www.engadget.com/apps/the-pixel-screenshots-app-uses-ai-to-scour-the-screengrabs-i-cant-remember-why-i-saved-170043423.html?src=rss
