
A New Privacy Feature For Instagram & Facebook Gives Users More Control Of Personal Images


As social media becomes more ubiquitous in daily life, it's important that platforms build technology to protect people from non-consensual sharing of personal content that violates their privacy. To that end, Facebook is building on its work to support people affected by online abuse with a new community feature. The "Not Without My Consent" feature for Instagram and Facebook is a step forward for online privacy, providing tools that let people affected by online abuse take action.

Facebook's Not Without My Consent is a new addition to the website's Safety Center as of Friday, March 15. With a goal of responding "to intimate images shared without permission," the feature was developed with the assistance of experts to help victims find organizations and resources to support them. Additionally, the feature helps remove the sensitive content from Facebook, Facebook Messenger, and Instagram, with the goal of preventing the content from being distributed or shared again.

Hopefully, you're never in a situation where your privacy is violated like that, but if it does happen, this new feature is there to help you every step of the way.

When users first navigate to the webpage, they will find four expert recommendations for what to do when someone has shared an intimate image without consent or is threatening to do so. These recommendations include:

  • Contact law enforcement if there's concern about physical safety.
  • Surround oneself with a support system to help navigate the process.
  • Take screenshots or print out images before taking steps to delete the content.
  • Seek additional support and guidance.

The webpage also has clear directions for how users can report content when they're ready. Facebook, Instagram, and Messenger users can report non-consensually posted intimate images using the "Report" link that pops up after clicking the downward arrow on the post. Users can also fill out a form called "Report Blackmail, Intimate Images or Threats to Share Intimate Images." Facebook also provides resources through its NCII Pilot for people who are concerned that someone may share their intimate images online, even if they haven't done so yet. Finally, people impacted by this kind of online abuse can access Cyber Civil Rights Initiative's "Online Removal Guide" through the Not Without My Consent webpage to report non-consensually posted images elsewhere on the internet.


Facebook's push for heightened privacy and victim support extends beyond the Not Without My Consent online support hub. The social media company is implementing new detection technology to proactively find near-nude images or videos that are shared without permission on Facebook and Instagram, even before the images or videos are reported. When an image has been detected, Facebook will remove it and may disable the account that shared it. According to the press release, proactive detection of this kind is important for two reasons: victims may be afraid of retribution and decide not to report the content themselves, and victims may not even be aware that the content has been shared online.

According to a 2017 Facebook press release, among people in the U.S. who are victims of non-consensually shared intimate images, almost 100 percent report emotional distress and over 80 percent report "significant impairment in social, occupational or other important areas of their life." Facebook's new privacy and safety policies will hopefully position the social media giant to better protect its users from these kinds of harms.