Facebook readies AI tech to combat 'revenge porn'

The new technology is in addition to a pilot program that required a trained team to review offending images

Facebook said on Friday it would use artificial intelligence to combat the spread of intimate photos shared without people’s permission, sometimes called “revenge porn,” on its social networks.

The new technology is in addition to a pilot program that required trained representatives to review offending images.

"By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission," the social networking giant said in a blog post. "This means we can find this content before anyone reports it."

A member of Facebook’s community operations team will review content flagged by the new technology and, if it is found to violate the company’s rules, remove the image or disable the account responsible for sharing it, the company added.

“Revenge porn” refers to the sharing of sexually explicit images on the internet without the consent of the people depicted, in order to extort or humiliate them. The practice disproportionately affects women, who are sometimes targeted by former partners.

Facebook will also launch a support hub called “Not Without My Consent” on its safety center page for people whose intimate images have been shared without their consent.

The Menlo Park, California-based company works with at least five outsourcing vendors in at least eight countries on content review, a Reuters tally shows. As of December, it had about 15,000 people, a mix of contractors and employees, working on content review.