With Section 230 in the way, Meta stepped up to build and fund new tech for the Revenge Porn Helpline's Stop Nonconsensual Intimate Image Abuse program and for Take It Down
Meta, formerly known as the Facebook company, has lent a much-needed helping hand to millions of victims, funding the Revenge Porn Helpline, run by U.K.-based tech policy nonprofit SWGfL, as well as a companion service called "Take It Down," which is open to people under the age of eighteen or their guardians.
Revenge porn is a scourge on the internet, as jilted lovers, people with grudges, and blackmailing hackers post nude photos or videos online for the world to view. More formally called nonconsensual image or video sharing, it occurs when someone posts or sends nude or nearly nude photos or videos of another person without that person's consent.
If you were to think revenge porn is limited to adults, you'd be very wrong. One in three teenagers reports having seen nonconsensual nudes of minors online, material that under the law constitutes child sexual abuse imagery. The Revenge Porn Helpline reports a recent forty percent increase in reported cases involving minors.
For adults, the statistics are sobering: one in twenty-five Americans has been a victim of revenge porn, according to the Data & Society Research Institute.
Despite 48 states and the District of Columbia enacting laws against nonconsensual pornographic images and videos, Section 230 of the federal Communications Decency Act blocks authorities from forcing websites to take this user-generated content down.
Section 230 of the Communications Decency Act gives technology companies legal immunity for user-created content posted on their websites, acting as a federal loophole that allows nonconsensual nude or nearly nude content to be posted and shared.
The states with statutes making nonconsensual images and videos illegal face a daunting enforcement task because of Section 230's wording: websites can simply ignore removal requests and decline to respond to any state's legal injunctions, since the statute offers them immunity.
Technology firms that discover sexually explicit images of minors on their sites are required by federal law to report and remove them, but adults experiencing nonconsensual image sharing have no equivalent federal protection.
The "Stop Nonconsensual Intimate Image Abuse" program run by the Revenge Porn Helpline offers adult victims a way to submit their cases, while "Take It Down" offers the same help to minors. Meta built the underlying technology, which lets victims flag their own images and videos so that participating sites can find copies and remove them.
"Take It Down" is run by the nonprofit National Center for Missing & Exploited Children. When young people submit a report, "Take It Down" generates a hash, a unique digital fingerprint, of each image or video; the image itself never leaves the victim's device. Tech firms can then use these hashes to find copies of the nonconsensual images and videos on their platforms and remove them.
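The hash-matching idea described above can be sketched in a few lines of Python. This is a simplified illustration, not the actual Take It Down implementation: real systems use robust perceptual hashing (such as Meta's open-source PDQ algorithm) so that resized or re-encoded copies still match, whereas the plain SHA-256 digest below only catches exact byte-for-byte copies. The privacy property is the same either way: only the fingerprint is shared, never the image.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Compute a fixed-length digest of the image bytes.
    # Only this hex string leaves the device, never the image itself.
    return hashlib.sha256(image_bytes).hexdigest()

# The victim's device hashes the image locally and reports only the hash.
reported_hashes = {fingerprint(b"private-image-bytes")}

# A participating platform hashes each upload and checks for a match.
def is_reported(upload: bytes, reported: set[str]) -> bool:
    return fingerprint(upload) in reported

print(is_reported(b"private-image-bytes", reported_hashes))    # True: exact copy detected
print(is_reported(b"different-image-bytes", reported_hashes))  # False: no match
```

Because the hash is one-way, a platform holding the reported fingerprint cannot reconstruct the original image from it, which is what makes this design workable for such sensitive material.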
Facebook and Instagram (both Meta platforms), Pornhub, and other sites are working with "Take It Down," using these hashes to remove nonconsensual images and videos.
Meta's global head of safety, Antigone Davis, released a statement about the new technology created to help victims of nonconsensual sharing of images online. She said that young people seeing themselves abused in this way is frightening and damaging.
“Having a personal intimate image shared with others can be scary and overwhelming, especially for young people,” said Davis. “It can feel even worse when someone tries to use those images as a threat for additional images, sexual contact or money - a crime known as sextortion.”
Sextortion is becoming a commonplace form of blackmail. The Department of Homeland Security reports it received more than 3,000 sextortion tips in 2022.
According to Homeland Security, a typical sextortion scenario unfolds when a young teenager answers an online request to send a nude photo or webcam video to a new digital "friend." Once the teen complies, the "friend" demands more explicit material, threatening to publicize the images already sent. Sometimes the fake "friend" demands money as well.