After Leaking Internal Documents to WSJ, Facebook Insider Tells Senate Committee to Dig Deeper into Legal Liability for Algorithm-Driven Content
Frances Haugen, a former Facebook product manager turned whistleblower, testified at a Senate hearing convened to “protect kids online.” The discussion focused on social media’s impact on everyday teenagers, who have been exposed to hateful posts and misinformation on platforms including Facebook and Instagram.
Haugen gave the committee specific details about Facebook’s inner workings, alleging that booming profits drive the platform’s decisions about which posts to promote. Armed with internal documents as proof, the whistleblower said Facebook’s misinformation problems are rooted in specific algorithms and in the company’s own choices about what content gets distributed.
Her recommendation to the Senate committee was clear: impose much stricter transparency requirements on the social media giant.
“We can afford nothing less than full transparency,” said Haugen. “As long as Facebook is operating in the shadows and hiding its research from public scrutiny, it is unaccountable.”
One legal connection Haugen made for the committee concerned Section 230 of the Communications Decency Act, which shields social media platforms from liability for content posted by their users. Haugen said she believes Section 230 should be amended so that platforms’ algorithmic decisions no longer enjoy those liability protections. If that exemption were adopted, social media platforms such as Facebook could be held liable for the posts their algorithms choose to amplify.
In her statement to the Senate committee, Haugen said Facebook “wants you to believe … the problems we’re talking about are unsolvable. They want you to believe in false choices. That to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. They want you to believe this is just part of the deal.”
Mark Zuckerberg, Facebook’s chief executive, chairman, and largest shareholder, has the final say in how the platform acts on misinformation and anger-driven content.
Currently, Facebook uses engagement-based ranking, meaning its algorithms place certain posts at the top of users’ feeds. These are most often the posts that generate the most shares, likes, and comments, no matter their content.
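As a rough illustration of that mechanism, here is a minimal, hypothetical sketch of engagement-based ranking in Python. The Post fields, the weights, and the engagement_score function are illustrative assumptions, not Facebook’s actual system; the point is only that a score built purely from engagement counts rewards whatever draws the most reactions, regardless of what a post says.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        shares: int
        likes: int
        comments: int

    def engagement_score(post: Post) -> float:
        # Hypothetical weights: shares and comments count more than likes,
        # since they signal stronger engagement. These numbers are
        # illustrative, not Facebook's actual values.
        return 5.0 * post.shares + 3.0 * post.comments + 1.0 * post.likes

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Engagement-based ranking orders the feed purely by the score;
        # the substance of each post never enters the calculation.
        return sorted(posts, key=engagement_score, reverse=True)

    if __name__ == "__main__":
        feed = [
            Post("calm news summary", shares=2, likes=40, comments=3),
            Post("outrage-bait rant", shares=90, likes=200, comments=150),
        ]
        for post in rank_feed(feed):
            print(f"{engagement_score(post):8.1f}  {post.text}")

In this toy feed, the anger-driven post outranks the calm one on engagement alone, which is precisely the dynamic Haugen says pushes harmful content to the top.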
In response to Haugen’s testimony and her sharing of internal documents with the Senate committee, Facebook weighed in on Twitter, stating that Haugen did “not work with child safety.”
Before testifying, Haugen supplied the internal documents behind a series of Wall Street Journal articles on Facebook’s inner workings, which detailed the platform’s harm to teenagers and its use by human traffickers and others.