
After Leaking Internal Documents to WSJ, Facebook Insider Tells Senate Committee to Dig Deeper into Legal Liability for Algorithm-Driven Content

by Diane Lilli | Oct 06, 2021
Frances Haugen testified at a Senate hearing on how Facebook’s algorithms promote harmful content. Photo Source: file photo, October 5, 2021 (Matt McClain/Pool/Getty Images via Slate)

Frances Haugen, a former Facebook product manager and insider turned whistleblower, testified at a Senate hearing convened to “protect kids online.” The discussion focused on social media and its impact on teenagers, who have been exposed to hateful posts and misinformation across platforms including Facebook and Instagram.

Haugen gave the committee specific details about Facebook’s inner workings, alleging that booming profits drive the platform’s decisions about which posts to publish. Armed with internal documents as proof, the whistleblower said Facebook’s misinformation problems are rooted in specific algorithms and in the company’s own choices about what gets published.

Her recommendation to the Senate committee was clear: enforce much stricter transparency laws on the social media giant.

“We can afford nothing less than full transparency,” said Haugen. “As long as Facebook is operating in the shadows and hiding its research from public scrutiny, it is unaccountable.”

One legal connection Haugen made for the committee concerned Section 230 of the Communications Decency Act, the law that shields social media platforms from liability for content posted by their users. Haugen said she believes Section 230 should be amended so that decisions made by platforms’ algorithms no longer enjoy that liability protection. If such an exemption were adopted, social media companies like Facebook could be held liable for the posts their algorithms promote.

In her statement to the Senate committee, Haugen said Facebook “wants you to believe … the problems we’re talking about are unsolvable. They want you to believe in false choices. That to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. They want you to believe this is just part of the deal.”

Mark Zuckerberg, Facebook’s chief executive, chairman, and largest shareholder, has the final say in how the platform acts on misinformation and anger-driven content.

Currently, Facebook relies on engagement-based ranking, meaning its algorithms give the highest placement to certain types of posts. These are most often the most popular posts, the ones that generate the most shares, likes, and comments, no matter their content.
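To make the idea concrete, here is a minimal, hypothetical sketch of engagement-based ranking in Python. It is not Facebook’s actual system; the weights, field names, and scoring formula are illustrative assumptions meant only to show that posts are ordered by reactions they generate rather than by what they say.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    shares: int
    likes: int
    comments: int

# Hypothetical weights, not Facebook's actual values; chosen only to
# illustrate that re-shares and comments are often treated as stronger
# engagement signals than likes.
WEIGHTS = {"shares": 3.0, "comments": 2.0, "likes": 1.0}

def engagement_score(post: Post) -> float:
    """Score a post purely on engagement, ignoring what the content says."""
    return (WEIGHTS["shares"] * post.shares
            + WEIGHTS["comments"] * post.comments
            + WEIGHTS["likes"] * post.likes)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed so the most-engaged-with posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("calm-update", shares=2, likes=40, comments=5),
        Post("outrage-bait", shares=90, likes=300, comments=120),
    ]
    for post in rank_feed(feed):
        print(post.post_id, engagement_score(post))
```

Because a score like this never examines the substance of a post, divisive or misleading content that provokes reactions rises just as readily as benign content, which is the dynamic Haugen described to the committee.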

In response to Haugen’s testimony and her sharing of internal documents with the Senate committee, Facebook weighed in on Twitter, stating that Haugen did “not work with child safety.”

Haugen, who is working with the Senate, also supplied the internal documents behind a series of Wall Street Journal articles on Facebook’s inner workings, which examined posts harmful to teenagers and how human traffickers and others use the platform.

Diane Lilli
Diane Lilli is an award-winning Journalist, Editor, and Author with over 18 years of experience contributing to New Jersey news outlets, both in print and online. Notably, she played a pivotal role in launching the first daily digital newspaper, Jersey Tomato Press, in 2005. Her work has been featured in various newspapers, journals, magazines, and literary publications across the nation. Diane is the proud recipient of the Shirley Chisholm Journalism Award.