Facebook AI-powered tech detects revenge porn automatically: Here's how
Facebook is ramping up efforts to prevent revenge porn from appearing on its web or mobile-based platforms. The social network has introduced a new AI-backed tool that automatically detects near-nude images and videos shared without consent. This can come in very handy in sparing people the physical and emotional trauma stemming from such content. Here's how the new tech works.
AI tool to detect revenge porn
Previously, Facebook relied on community reports, mostly from the victims themselves, to flag and remove non-consensual intimate content. With the new tool, such content will be flagged before anyone reports it. Following this, the social network's Community Operations team will review the content in question and decide whether it should be taken down.
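The flag-then-review pipeline described above can be illustrated with a minimal sketch. The `nudity_score` field, the `triage` function, and the threshold value are all hypothetical stand-ins; Facebook has not disclosed how its classifier scores content or where its review thresholds sit.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: int
    nudity_score: float  # hypothetical classifier confidence in [0, 1]

# Assumed threshold for illustration only; real values are not public.
REVIEW_THRESHOLD = 0.8

def triage(posts):
    # Posts the model flags are queued for human review --
    # they are not removed automatically at this stage.
    return [p for p in posts if p.nudity_score >= REVIEW_THRESHOLD]

review_queue = triage([Post(1, 0.95), Post(2, 0.10)])
```

The key design point mirrored here is that the AI only surfaces candidates; a human moderator still makes the final call.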
Most victims don't even know their content has been shared
"Often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared," Antigone Davis, Facebook's Global Head of Safety, said, explaining the importance of the new tool for better protecting victims.
Action on the confirmation of revenge porn
If the content flagged by the AI tool is found to be in violation of Facebook's community standards, the moderators will take it down and disable the account that posted it. Notably, the tool will work in tandem with Facebook's existing pilot program, which gives potential victims an emergency option to submit their intimate images to the company so they can be blocked from being shared.
Pilot program has received positive response
The pilot program, which revolves around creating a digital fingerprint of an image to prevent it from being shared, initially drew flak. However, Facebook says it has received positive feedback from victims and support organizations and will expand it to more people in the future.
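The "digital fingerprint" idea behind the pilot can be sketched in a few lines. Note the simplification: this sketch uses an exact cryptographic hash, whereas real systems of this kind rely on robust perceptual hashing (PhotoDNA-style) that still matches an image after resizing or re-encoding. The function and variable names are illustrative, not Facebook's.

```python
import hashlib

# Set of fingerprints of images reported as non-consensual (illustrative).
blocklist = set()

def fingerprint(image_bytes: bytes) -> str:
    # Exact-match fingerprint for illustration; production systems use
    # perceptual hashes that survive minor edits to the image.
    return hashlib.sha256(image_bytes).hexdigest()

def register_reported_image(image_bytes: bytes) -> None:
    # Only the fingerprint is stored, not the image itself.
    blocklist.add(fingerprint(image_bytes))

def is_blocked(image_bytes: bytes) -> bool:
    # Uploads whose fingerprint matches the blocklist are stopped.
    return fingerprint(image_bytes) in blocklist
```

Storing only a fingerprint, rather than the image, is what lets the system block re-uploads without retaining the sensitive content.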
Also, Facebook is launching a support hub for victims
Along with these tools, Facebook is also introducing a support hub for victims called 'Not Without My Consent'. With this hub, accessible through the Safety Center, the social network aims to help targets of revenge porn find organizations and resources for support. It will also walk them through the steps they need to take to get the content removed and prevent it from being shared further.
Still, the tech needs to evolve
Facebook's resources will prove useful for victim support, but to prevent exploitation from occurring in the first place, its tech needs to evolve further. Simply put, the tech should be improved to the point where it not only detects revenge porn but also takes it down automatically, much as Facebook's algorithms already handle some other categories of violating content. This would make the whole detection-and-protection process swifter.