
Google Photos’ Gemini AI-Powered Ask Photos Feature Reportedly Rolling Out to Some Users

Google Photos is reportedly getting the long-awaited Ask Photos feature in a limited US rollout. The artificial intelligence (AI) feature, powered by Gemini, was first unveiled at Google I/O in May. Last month, the company confirmed that the feature would be released in early access, and users have reportedly already started seeing it on their devices. The feature allows users to find specific photos in Google Photos by sending natural-language queries to Gemini.

Ask Photos in Google Photos

According to Wired, the Ask Photos feature is rolling out to more Android users in the US via a server-side update. It appears in the lower-right corner of Google Photos and replaces the Search tab. The tech giant opened a waiting list last month for users to request early access to the feature, and those on the waiting list are now receiving it.

The Google Photos feature, available on Android and iOS devices, allows users to ask questions in everyday language to find photos stored in their cloud storage. If the AI did not find the right photo on the first try, users can also ask follow-up questions.

As per Google, the AI feature also focuses on user privacy. As previously stated by the company, user data, including the queries made to Ask Photos, will not be used for ads. The prompts might be reviewed by humans, but only after they have been disconnected from the user's account.

The publication also shared a screenshot of the feature's overview page, which highlighted how the tech giant processes the data in Google Photos to let users run natural-language queries on it. As per the company, it generates text descriptions for images and videos, uses facial recognition, and compiles the data with location and time stamps to add context (such as whether the user went on a vacation to Goa between October 1 and 5). It also estimates the relationships between the user and other people in the gallery based on their images.
