Facebook’s New Plan May Curb Revenge Porn, But Won't Kill It

The only way to eradicate revenge porn is to catch it before it's posted. Still, Facebook's new policy is a good first step.

Facebook needed to move against nonconsensual porn. The scandal surrounding Marines United, a secret Facebook group of 30,000 servicemen who shared dozens of women's private images without permission, proved that. Now the social media giant is finally shuffling in the right direction.

On Wednesday, Facebook released new guidelines for how it plans to curb the sharing of nonconsensual porn, which some call "revenge porn" whether or not revenge was the motive. Under Facebook's new bylaws, if revenge porn pops up in your newsfeed and you report it, a team of (unfortunate) Facebook employees will vet the image and apply photo-matching technology to make sure it doesn't spread any further. The protocol works across Facebook Messenger and Instagram, too. But while this policy is a great start—and will give Facebook some much-needed legal cover if federal law ever criminalizes nonconsensual porn—the only way to kill revenge porn is to stop it from being posted in the first place.

Facebook's photo-matching technology should be a huge boon to revenge porn victims. "The constant challenge for the victim is reporting each post that shares their photo," says Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School, and also serves as the tech and legislative policy advisor for the Cyber Civil Rights Initiative. "So we're really excited about this. It will alleviate some of that burden."

Once someone reports an image, Facebook can be reasonably confident photo-matching technology will catch the rest. (Even better, a pop-up will notify the would-be poster that the photo they're about to share is revenge porn.) It's basically the same hashing technology that powers Google's image search, and Facebook already uses something similar to help identify child pornography. Circumventing it is difficult: a person would need to make significant visual changes to the original image, like adding stickers or filters or pasting the subject onto a new background, to bamboozle the tech. "If it's just some dude uploading a photo from his phone, this should work really well," says Jen Golbeck, a computer scientist at the University of Maryland.
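To see how that kind of matching works under the hood, here's a minimal sketch of a perceptual "difference hash" in Python, using the Pillow imaging library. A fingerprint like this survives re-encoding and small tweaks but not major visual edits, which is exactly the weakness described above. The filenames, the distance threshold, and the block_upload helper are hypothetical; Facebook hasn't published the details of its own system.

```python
from PIL import Image

def dhash(image_path, hash_size=8):
    """Perceptual "difference hash": shrink the image, grayscale it,
    and record whether each pixel is brighter than its right neighbor."""
    img = Image.open(image_path).convert("L").resize(
        (hash_size + 1, hash_size), Image.LANCZOS
    )
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits  # a 64-bit fingerprint at the default hash_size

def hamming_distance(h1, h2):
    """Count the bits where two fingerprints disagree."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: compare a new upload against a reported image.
# A small distance means "almost certainly the same photo."
# if hamming_distance(dhash("reported.jpg"), dhash("upload.jpg")) <= 5:
#     block_upload()  # hypothetical moderation hook
```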

Legally speaking, it's a good move for Facebook too. If sharing revenge pornography becomes a federal crime—which Franks and congressional representative Jackie Speier (D-California) are working on—Facebook is going to need to find some shelter. As with criminalized content like child porn or terrorist videos, online intermediaries like Facebook would be legally obligated to report revenge porn to the powers that be, retain evidence, and take good-faith measures to stop its spread. The photo-matching technology would help the company do that. "This is one of the ways Facebook could signal they are trying to address this problem in the same way as child pornography," Franks says.

The key word there, though, is "trying." Like child porn or doxxing, revenge porn inflicts its damage the first time it's shared, so removing an image after it's already been posted is a second-best solution. And this measure wouldn't even catch the nonconsensual porn shared within a closed ecosystem like the Marines United group. "We have to work preemptively. We've got a real problem with people sharing these images in a likeminded group," Franks says. "In that situation, a woman might not find out her photos had been shared for eight or nine months." Or ever.

Nor does reporting nonconsensual porn on Facebook stop an image from spreading elsewhere on the internet. When one Marines United member reported that group for its behavior, other members just moved the revenge porn party over to Google Drive. Franks expects other major tech platforms like Google and Twitter to announce similar policies soon, but it will take those systems being interoperable—or at least communicating with each other—to make a dent in the problem.

The only thing that could really stop a group like Marines United is an AI that scans images before they're posted. According to a Facebook spokesperson familiar with the efforts, the company is heading in that direction, and the only thing holding it back is teaching the AI to understand context—the thing that makes a photo revenge porn instead of, say, the "napalm girl" photo or a work of modern art. "Even if it's right 90 percent of the time, you really don't want to catch stuff that's legitimate," Golbeck says. "It's probably going to take more fine-tuning to avoid those user concerns."

The question of context also brings up another user concern: censorship creep. In the past, groups like the ACLU have opposed revenge porn laws not because they condone the behavior, but because those laws were written so broadly that they would have also criminalized consensual porn, or even photos of Holocaust victims. And if every major social media platform takes down the same harmless image at the same time, it could be a public-relations disaster.

To avoid those fiascos, the path forward requires both cooperation and codification. "With mushy categories like 'extremism,' you run the risk of censoring political speech or dissent," says Danielle Citron, who teaches law at the University of Maryland. "But if the definition of nonconsensual porn is narrow enough, we could have a shared industry database that avoids the pitfalls." Beyond agreeing to a specific definition of revenge porn, Citron says, companies need to educate their content moderators about the possibility of unintentional censorship, so that missteps don't happen.
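To make that concrete, here is a rough sketch of what a shared industry database could look like, assuming platforms agree on a single perceptual-hash format like the one above. Every name here is hypothetical; no such cross-platform system exists yet.

```python
# Hypothetical shared hash registry, assuming a common fingerprint format.
class SharedHashDatabase:
    def __init__(self, max_distance=5):
        self.known_hashes = set()   # fingerprints of human-vetted images
        self.max_distance = max_distance

    def register(self, image_hash):
        """A participating platform adds a vetted fingerprint."""
        self.known_hashes.add(image_hash)

    def matches(self, image_hash):
        """Check an upload's fingerprint against every registered one."""
        return any(
            bin(image_hash ^ known).count("1") <= self.max_distance
            for known in self.known_hashes
        )

# One platform vets and registers an image; another can then block it
# before it ever appears in a feed.
db = SharedHashDatabase()
db.register(0x9F3A6C01D24E88B7)        # fingerprint from a vetted report
print(db.matches(0x9F3A6C01D24E88B5))  # True: differs by only one bit
```

The narrow definition Citron describes is what would keep a registry like this from becoming a general-purpose takedown engine; only content that clearly fits it would ever be registered.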

Facebook has created a template for others to expand and iterate on. That's a great first step. Now comes the rest of the journey. And finding consensus on limiting internet speech in Silicon Valley? That might be even harder than hunting down revenge porn.