Social media goliath Facebook identifies about 500,000 instances of revenge porn per month, according to a report. That's more than 16,000 per day!
But this number seems low considering Facebook currently has around 2.6 billion monthly active users.
Earlier this year, Facebook launched artificial intelligence (AI) tools that can spot revenge porn, also known as non-consensual intimate images, before users report it, NBC News reported.
Among those 500,000 reports are instances of nude images posted as revenge against another person, as well as what's known as 'sextortion,' in which nude images are posted to the site in an attempt to blackmail someone.
In 2017, the company launched a pilot project that let users submit intimate pictures to the platform as a means of training its AI tool to identify and remove such pictures if they appeared on the platform.
“In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports,” said Radha Plumb, head of product policy research at Facebook, as quoted by NBC News.
Facebook has a team of around 25 people — excluding content moderators — that works full-time combating revenge porn. Its goal is not only to quickly remove pictures or videos once they have been reported, but also to use AI to detect the images the moment they are uploaded, preventing them from being shared in the first place.