How to ask Google to remove deepfake porn results from Google Search

© 2024 TechCrunch. All rights reserved. For personal use only.

The internet is full of deepfakes — and most of them are nudes.

According to a report from Home Security Heroes, deepfake porn makes up 98% of all deepfake videos online. Thanks to easy-to-use and freely available generative AI tools, the number of deepfakes online — many of which aren’t consensual — skyrocketed 550% from 2019 to 2023.

While laws against nonconsensual deepfakes are lagging behind, at least in the U.S., it’s becoming a little bit easier to get deepfakes removed, thanks to new tools in Google Search.

Google recently introduced changes to Search to combat deepfake porn, including adjustments to the Search ranking algorithm designed to lower deepfake content in results. The company also rolled out an expedited way to process requests to remove nonconsensual deepfake porn results from Search.

Here’s how to use it.

The easiest way to request that a nonconsensual deepfake porn result — a webpage, image or video — be removed from Google Search is to use this web form. Note that there’s a separate form for child sexual abuse imagery, and the targeted content has to meet Google’s criteria for removal, as follows:

It’s nude, intimate, or sexually explicit (for example, images or videos of you) and is distributed without permission; OR
It’s fake or falsely depicts you as nude or in a sexually explicit situation; OR
It incorrectly associates you or your name with sex work.

Click the “Content contains nudity or sexual material” option, then proceed to the next page.

Image Credits: Google

At this stage, select “Content falsely portrays me in a sexual act, or in an intimate state. (This is sometimes known as a ‘deep fake’ or ‘fake pornography.’)”

Image Credits: Google

On the final page of the form, after entering your name, country of residence and contact email, you’ll have to indicate whether it’s you or someone else depicted in the deepfake content to be removed. Google allows others to request removal on someone’s behalf, but only if that person is an “authorized representative” who explains how they have that authority.

Image Credits: Google

Next is the content information section. Here, you’ll need to provide the URLs to the deepfake results to be removed (up to a maximum of 1,000), the URLs to the Google Search results where the content appears (again, up to a maximum of 1,000) and search terms that return the deepfakes. Lastly, you’ll have to upload one or more screenshots of the content you’re reporting and any additional info that might help explain the situation.

After submitting a request, you’ll get an automated email confirmation. The request will be reviewed, after which Google may request more information (like additional URLs). You’ll get a notification of any action taken, and, if the request didn’t meet Google’s requirements for removal, a follow-up message explaining why.

Requests that are denied can be resubmitted with new supporting materials.

Google says that when someone successfully requests the removal of nonconsensual deepfake porn results from Search, the company’s systems will also aim to filter explicit results on all similar searches about that person. In addition, Google says, when an image is removed from Search under Google’s policies, its systems will scan for — and remove — any duplicates of that image that they find.

“These protections have already proven to be successful in addressing other types of non-consensual imagery, and we’ve now built the same capabilities for fake explicit images as well,” Google writes in a blog post. “These efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them popping up in the future.”
