Facebook introduced a new feature this week that will use photo-matching technology to help prevent so-called “revenge porn” from being shared.
When a Facebook user shares intimate photos of someone else without their permission, other users will be able to report the post. Representatives from Facebook’s Community Operations team will then review it and remove it if it’s found to violate the site’s community standards. Facebook also said it will often disable the account of the user who shared the images.
Most notably, Facebook said it will now also use photo-matching technology to prevent inappropriate photos that make it onto the site from being shared further on Facebook, Messenger and Instagram. If someone tries to share an image that has been reported and removed, Facebook will alert them that they have been blocked from sharing it.
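Facebook hasn’t published the details of its photo-matching system, but the general idea behind this kind of technology is perceptual hashing: a reported image gets a compact “fingerprint” that survives small edits like recompression or brightness changes, and new uploads are checked against a blocklist of those fingerprints. The sketch below is purely illustrative, using a toy average-hash over tiny grayscale grids; the function names, grid sizes, and threshold are assumptions, not Facebook’s actual method.

```python
# Illustrative sketch of perceptual-hash matching (NOT Facebook's actual
# implementation). A reported image's fingerprint is stored, and later
# uploads whose fingerprints are close enough are blocked.

def average_hash(pixels):
    """Hash a 2D grid of grayscale values: one bit per pixel,
    set if that pixel is brighter than the image's average."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return tuple(p > avg for p in flat)

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

# Tiny 4x4 "images": the re-upload is a slightly brightened copy
# of the reported image, so its hash comes out identical.
reported = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [12, 205, 35, 225],
            [18, 215, 28, 235]]
reupload = [[row_val + 2 for row_val in row] for row in reported]

blocklist = {average_hash(reported)}

def is_blocked(pixels, threshold=2):
    """Block an upload if its hash is within `threshold` bits
    of any previously reported image's hash."""
    h = average_hash(pixels)
    return any(hamming_distance(h, b) <= threshold for b in blocklist)

print(is_blocked(reupload))  # the near-duplicate is caught: True
```

An exact cryptographic hash would miss the brightened copy entirely; the point of a perceptual hash is that visually similar images land within a small Hamming distance of each other, which is what lets a platform catch re-uploads of an already-reported photo.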
“We’ve focused in on this because of the unique harm that this kind of sharing has on its victims,” Facebook Global Head of Safety Antigone Davis told TechCrunch. “In the newsroom post we refer to a specific piece of research around the unique harm this has for victims. I think that’s where the focus was for this moving forward.”
In terms of protecting users, this is a huge step forward for Facebook. Still, the solution seems so simple to implement that one has to wonder why the site didn’t introduce something like it long ago.