
Facebook Turns to Image Recognition to Thwart Revenge Porn

Looking to address the pernicious effect of revenge porn, Facebook is launching image-recognition tools that will prevent intimate content from being shared on Facebook, Messenger and Instagram without permission.

Revenge porn is the province of the jilted and the jealous, the malicious and the envious. Typically it happens when two people in a relationship share intimate or sexual pictures or videos via text or email; post-break-up, or in the hands of “frenemies,” this content may be posted publicly as payback for heartbreak or other perceived transgressions. It can be enormously damaging for victims, especially younger teen girls.

“According to a study of US victims of non-consensual intimate images, 93% report significant emotional distress and 82% report significant impairment in social, occupational or other important areas of their life,” said Antigone Davis, head of global safety at Facebook, in a post. “These tools, developed in partnership with safety experts, are one example of the potential technology has to help keep people safe. Facebook is in a unique position to prevent harm, one of our five areas of focus as we help build a global community.”

The new tools involve community moderation as well as technology. First and foremost, if a Facebook member sees an intimate image on Facebook that looks like it was shared without permission, they can report it using the “Report” link that appears in the drop-down menu next to the post.

“Specially trained representatives from our Community Operations team review the image and remove it if it violates our Community Standards,” Davis explained. “In most cases, we will also disable the account for sharing intimate images without permission. We offer an appeals process if someone believes an image was taken down in error.”

The social network then uses photo-matching technologies to help thwart further attempts to share the image on Facebook, Messenger and Instagram after it’s been reported and removed.
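Facebook has not published the details of its photo-matching system, but systems of this kind typically rely on perceptual hashing: a near-duplicate image produces a near-identical fingerprint, so a re-upload can be flagged by comparing compact hashes rather than raw pixels. The sketch below is purely illustrative, assuming a simple “average hash” over a grayscale pixel grid; it is not Facebook’s actual technology.

```python
# Illustrative perceptual "average hash" sketch (an assumption, not
# Facebook's actual method): each bit of the fingerprint records whether
# a pixel is brighter than the image's mean. Slightly altered copies of
# an image produce fingerprints only a few bits apart.

def average_hash(pixels):
    """Hash a square grayscale image (list of rows of 0-255 ints)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# A reported image, a slightly altered re-upload, and an unrelated image
original  = [[10, 200], [30, 220]]
reupload  = [[12, 200], [30, 220]]  # one pixel tweaked
unrelated = [[250, 5], [240, 10]]

h_orig = average_hash(original)
assert hamming_distance(h_orig, average_hash(reupload)) <= 1   # near match: block
assert hamming_distance(h_orig, average_hash(unrelated)) > 1   # different image
```

A real system would hash a much larger, normalized grid and tolerate cropping and re-encoding; the point is only that matching happens on fingerprints of the reported image, so the original content need not be redistributed to enforce the block.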

“It is great to see Facebook is taking steps to employ new technologies to root out revenge porn, and match the current technology it uses when identifying child-abuse imagery and terrorist material,” said Claire Stead, online safety ambassador at Smoothwall, via email. “Facebook should be a safe environment for all, but unfortunately there will always be those that abuse it by posting inappropriate content. While it’s not possible to completely eradicate problems, by using the right tools to smartly filter and monitor the web in action, companies such as Facebook will have a solid foundation from which they can protect its users from all kinds of nasty threats online.”

She added, “Facebook’s measure is a step in the right direction, but there is still a lot more work that network providers, social media companies and even academic institutions can do to protect those most vulnerable.”

The National Network to End Domestic Violence, the Center for Social Research, the Revenge Porn Helpline (UK) and the Cyber Civil Rights Initiative provided input and feedback throughout the tool suite’s development, and Facebook is also partnering with safety organizations to offer resources and support to victims of this behavior. It has additionally worked with the Cyber Civil Rights Initiative and other companies to create a one-stop destination where victims and others can report this content to the major technology companies.
