Facebook says its new AI technology can detect ‘revenge porn’

Facebook on Friday announced a new artificial intelligence-powered tool that it says will help the social network detect revenge porn — the nonconsensually shared intimate images that, when posted online, can have devastating consequences for those who appear in the photos. The technology uses machine learning to proactively detect near-nude images or videos that are shared without permission across Facebook and Instagram.

The announcement follows Facebook’s earlier pilot of a photo-matching technology, which had people submit their intimate photos and videos directly to Facebook. The program, run in partnership with victim advocate organizations, would then create a digital fingerprint of each image so Facebook could stop it from being shared across its platforms. This is similar to how companies today prevent child abuse images from being posted to their sites.
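The fingerprint-matching idea can be illustrated with a minimal sketch. The names (`fingerprint`, `UploadFilter`) are hypothetical, and a cryptographic hash stands in for the perceptual hashes (such as Microsoft’s PhotoDNA) that production systems use to match images even after resizing or re-encoding:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint: an exact cryptographic hash of the bytes.
    # Real systems use perceptual hashes that survive resizing/re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

class UploadFilter:
    """Blocks re-uploads of images that were reported to the platform."""

    def __init__(self) -> None:
        self.blocked: set[str] = set()

    def report(self, image_bytes: bytes) -> None:
        # Store only the fingerprint, not the image itself.
        self.blocked.add(fingerprint(image_bytes))

    def allow_upload(self, image_bytes: bytes) -> bool:
        # Reject any upload whose fingerprint matches a reported image.
        return fingerprint(image_bytes) not in self.blocked
```

The key design point the article alludes to: only the fingerprint needs to be retained, so the platform can block future uploads without keeping a copy of the reported image.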

The new AI technology for revenge porn, however, doesn’t require the victim’s involvement. This is important, Facebook explains, because victims are sometimes too afraid of retribution to report the content themselves. Other times, they’re simply unaware that the photos or videos are being shared.

While the company was short on details about how the new system itself works, it did note that it goes beyond simply “detecting nudity.”

After the system flags an image or video, a specially trained member of Facebook’s Community Operations team will review it, then remove it if it violates Facebook’s Community Standards. In most cases, the company will also disable the account that shared the content. An appeals process is available if the person believes Facebook has made a mistake.

In addition to the technology and existing pilot program, Facebook says it also reviewed how its other procedures around revenge porn reporting could be improved. It found, for instance, that victims wanted faster responses following their reports, and that they didn’t want a robotic reply. Other victims didn’t know how to use the reporting tools, or even that they existed.

Facebook noted that addressing revenge porn is critical as it can lead to mental health consequences like anxiety, depression, suicidal thoughts and sometimes even PTSD. There can also be professional consequences, like lost jobs and damaged relationships with colleagues. Plus, those in more traditional communities around the world may be shunned or exiled, persecuted or even physically harmed.

Facebook admits that it wasn’t finding a way to “acknowledge the trauma that the victims endure” when responding to their reports. It says it’s now re-evaluating the reporting tools and process to make sure they’re more “straightforward, clear and empathetic.”

It’s also launching “Not Without My Consent,” a victim-support hub in the Facebook Safety Center that was developed in partnership with experts. The hub will offer victims access to organizations and resources that can support them, and will detail the steps to take to report the content to Facebook.

In the months ahead, Facebook says it also will build victim support toolkits with more locally and culturally relevant info by working with partners, including the Revenge Porn Helpline (U.K.), Cyber Civil Rights Initiative (U.S.), Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-yeon (South Korea).

Revenge porn is one of the many issues that result from offering the world a platform for public sharing. Facebook today is beginning to own up to the failures of social media across many fronts — which also include things like data privacy violations, the spread of misinformation, and online harassment and abuse.

CEO Mark Zuckerberg recently announced a pivot to privacy, where Facebook’s products will be joined together as an encrypted, interoperable messaging network — but the move has shaken Facebook internally, causing it to lose top execs along the way.

While these changes are in line with what the public wants, many have already lost trust in Facebook. For the first time in 10 years, Edison Research noted a decline in Facebook usage in the U.S., from 67 to 62 percent of Americans 12 and older. Even so, Facebook remains a massive platform, with more than 2 billion users. And even users who opt out of Facebook aren’t protected from becoming victims of revenge porn or other online abuse by those who continue to use the social network.