Facebook Is Asking Its Users To Send It Their Nude Photos
Facebook is making a somewhat odd request of its users: send in your nude photos so those same photos can be kept from being posted without your consent
It used to be that if a significant other posted a nude photo of you on Facebook without your permission, the fallout would be severe. You could have your reputation damaged, lose your job, and be ostracized from your entire community. If you're in high school, the fallout from having your nudes shared with multiple people is even worse, especially for girls: you could lose your scholarship, get suspended or expelled, and get grounded for life (depending on the power dynamics in your household). Now Facebook, the embattled social media platform, is asking users to send it their risqué nudes… for a good cause.
As reported by Deadline, Meta (formerly Facebook) has partnered with a UK-based non-profit called Revenge Porn Helpline to build a tool that allows people to prevent intimate images from being posted on Facebook, Instagram, and other participating platforms without their consent. The tool launched on Thursday, and it builds on a pilot program the company started in Australia in 2017.
The tool itself requires users to submit their nude or otherwise risqué images through a central, global website called StopNCII.org, which stands for Stop Non-Consensual Intimate Images. “It’s a massive step forward,” Sophie Mortimer, manager of Revenge Porn Helpline, told NBC News. “The key for me is about putting this control over content back into the hands of people directly affected by this issue so they are not just left at the whims of a perpetrator threatening to share it.” This is the first step in helping Facebook combat this issue.
Here’s how the submission process works: StopNCII.org first gets consent from users and asks them to confirm that they appear in an image. People can then select the photos and/or videos on their devices, including manipulated ones, that depict them nude or nearly nude. The images are then converted into unique digital fingerprints called hashes, which are passed on to participating companies, starting with Facebook and Instagram.
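To make the "digital fingerprint" idea concrete, here is a minimal sketch of what turning an image into a hash might look like, assuming a perceptual-hashing approach similar to the open-source Python `imagehash` library. The article does not say which hashing algorithm StopNCII.org actually uses, and the function name and file path below are purely illustrative.

```python
# Illustrative sketch only; not StopNCII.org's actual code or algorithm.
# A perceptual hash is assumed so that minor edits (resizing, re-compression)
# still produce a near-identical fingerprint.
from PIL import Image
import imagehash


def fingerprint(path: str) -> str:
    """Return a perceptual hash of the image; the photo itself stays on the device."""
    img = Image.open(path)
    return str(imagehash.phash(img))  # 64-bit perceptual hash, rendered as hex


if __name__ == "__main__":
    # Only this short string would be shared with participating platforms.
    print(fingerprint("my_photo.jpg"))
```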
StopNCII.org, which was developed in consultation with 50 international partners specializing in image-based abuse, online safety, and women’s rights, won’t have access to or store copies of the original images. Instead, the images are converted into hashes in users’ browsers, and StopNCII.org receives only those hashed copies. Sophie Mortimer said that other large companies, including social media platforms, adult sites, and message boards, have expressed interest in joining the program to stop revenge porn, but their participation has not yet been announced.
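On the receiving side, a participating platform could compare the hash of any new upload against the list of reported hashes. The sketch below continues the hypothetical `imagehash`-based fingerprints from above; the sample hash value, the `is_reported` helper, and the distance threshold are all invented for illustration, since the article does not describe the actual matching logic used by Facebook and Instagram.

```python
# Hypothetical matching check on a participating platform; illustrative only.
from PIL import Image
import imagehash

# Fingerprints received from StopNCII.org (example value, not real data).
reported_hashes = {imagehash.hex_to_hash("f0e1d2c3b4a59687")}


def is_reported(upload_path: str, max_distance: int = 8) -> bool:
    """Flag an upload whose perceptual hash is within a small Hamming distance
    of any reported hash, so lightly edited copies still match."""
    candidate = imagehash.phash(Image.open(upload_path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= max_distance for known in reported_hashes)
```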
Obtaining nudes from revenge porn victims (90% of whom are women) to stop those same images from being circulated on social media by scorned ex-partners can be risky business. During Facebook’s 2017 pilot, the images were reviewed by human moderators at the point of submission before being converted into hashes. That raised serious privacy concerns, such as the fact that the moderation team reviewing the nude images was composed mostly of men. That’s saying something for a company that violates users’ privacy 24/7.
Fortunately, in the last four years Facebook has developed additional systems to combat revenge porn on its platform. However, in 2019 the company said that collaboration with other companies was needed, because people bent on sharing intimate photos and videos without the subjects’ consent can simply move to other platforms to spread them. Thankfully, those who submit intimate material can track their cases in real time and withdraw their participation at any time.