Google tweaks Search to help hide explicit deepfakes

Google is rolling out new online safety features that make it easier to remove explicit deepfakes from Search at scale and prevent them from appearing high up in search results in the first place.

When users successfully request the removal of explicit nonconsensual fake content that depicts them from Search, Google’s systems will now also aim to filter out all explicit results on similar searches about them and remove any duplicate images.

“These protections have already proven to be successful in addressing other types of non-consensual imagery, and we’ve now built the same capabilities for fake explicit images as well,” Google product manager Emma Higham said in the announcement. “These efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them popping up in the future.”

Google Search rankings are also being adjusted to better handle queries that carry a higher risk of surfacing explicit fake content. For example, searches that intentionally seek deepfake images of a real person (such as the sexually explicit AI-generated images of Taylor Swift that circulated earlier this year) should instead surface “high-quality, non-explicit content” like relevant news stories. Sites that receive a substantial number of removals for fake explicit imagery will also be demoted in Google Search rankings.

Google says that previous updates have reduced exposure to explicit image results by over 70 percent this year on searches that specifically seek such deepfake content. The company is also working on ways to distinguish real explicit content, such as an actor’s consensual nude scenes, from explicit fake content, so that legitimate images can still surface while deepfakes are demoted.

These updates follow similar changes that Google has made to tackle how dangerous and / or explicit content appears online. In May, Google started banning advertisers from promoting deepfake porn services. Google also expanded the types of “doxxing” information that can be removed from Search in 2022 and started blurring sexually explicit imagery by default in August 2023.

Source: The Verge