Google cracks down on explicit deepfakes in search results
Google has rolled out an update to Search aimed at making explicit deepfakes more difficult to find. The move is part of the tech giant's ongoing efforts to combat the spread of realistic-looking manipulated images. As part of the update, the company is also strengthening the process that lets individuals request the removal of non-consensual fake imagery of themselves from Search results.
Enhanced removal process for explicit content
The update enhances the removal process for explicit deepfakes. Previously, users could request the removal of such images under Google's policies. Now, when a removal request is approved, Google will also filter all explicit results on similar searches about that individual. The company's systems will scan for any duplicates of the offending image and remove them as well.
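Google has not said how its duplicate scanning works under the hood. Purely as an illustration, one common technique for spotting near-identical copies of an image is perceptual hashing; the sketch below uses the Python Pillow and ImageHash libraries, with hypothetical file names and an assumed similarity threshold, and is not a description of Google's actual system.

```python
# Illustrative sketch only: flagging near-duplicate images with perceptual
# hashing. This is NOT Google's system; the file paths and the similarity
# threshold below are assumptions made for the example.
from PIL import Image
import imagehash

# Perceptual hash of the image that was approved for removal (hypothetical file).
reported_hash = imagehash.phash(Image.open("reported_deepfake.jpg"))

# Candidate images found elsewhere (hypothetical paths).
candidates = ["result_a.jpg", "result_b.jpg", "result_c.jpg"]

# Hashes of visually similar images differ in only a few bits, so a small
# Hamming distance suggests a duplicate or a lightly edited copy.
THRESHOLD = 8  # assumed cutoff for this example

for path in candidates:
    candidate_hash = imagehash.phash(Image.open(path))
    distance = reported_hash - candidate_hash  # Hamming distance in bits
    if distance <= THRESHOLD:
        print(f"{path}: likely duplicate (distance {distance})")
    else:
        print(f"{path}: not flagged (distance {distance})")
```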
Google's ranking systems updated to combat deepfakes
Google has also modified its ranking systems. If a user searches for explicit deepfakes of a named person, the results will now surface "high-quality, non-explicit content" instead; if there are news articles about that person, for example, those will be featured in the results. The change aims to steer users away from explicit content and toward more informative sources.
Google's efforts to distinguish between legitimate and fake content
Google has clarified that, in its bid to banish deepfakes from its results page, it will not remove results for legitimate content, such as actors' nude scenes. The company acknowledges that distinguishing between legitimate and fake explicit images remains a work in progress. To address this challenge, websites that frequently host fake explicit imagery and are repeatedly reported for it will see their positions in Google's search rankings downgraded.