Facebook whistleblower Frances Haugen returned to Congress to discuss legislative proposals to reform Section 230 of the Communications Decency Act. That's the liability shield that protects social media companies from being sued by users for their content moderation decisions, like removing some posts and keeping others up.
"Facebook's profit-optimizing machine is generating self-harm and self-hate — especially for vulnerable groups," Haugen said.
Haugen last month released a trove of internal Facebook documents and research showing, among other things, that Facebook executives knew the company's hate-speech problem was bigger than they publicly disclosed, and that its Instagram platform was worsening body image issues for roughly one-third of teen girls.
Lawmakers think these documents could open a path to legislation overhauling Section 230, giving users a legal tool to hold social media companies accountable when those companies serve them harmful content, or when their platforms cause real-world harm.
Haugen and other Facebook critics told the House Energy and Commerce Committee that Section 230 currently benefits Facebook by shielding it from accountability for its content-serving algorithms, which have been shown to prioritize angry content because it draws more engagement, even though such content is more likely to contain misinformation.
"Facebook does not have safety by design, and it chooses to run the engine hot because it maximizes their profit," Haugen said. "The result is a system that amplifies division, extremism and polarization."
"We have to go to Facebook and ask for benevolence in dealing with their harms," said Rashad Robinson, President of Color of Change. "Congress has done this with other industries. We need to create rules to hold them accountable. Whether it's product design, or in advertising, Facebook is completely not accountable."
Energy and Commerce Committee members have released several proposals to change the law. One bill would limit Section 230 liability protections when a platform knowingly or recklessly promotes harmful content through its algorithms. Another would prevent platforms from using Section 230 as a defense in civil rights and terrorism cases.
Across the board, witnesses at the hearing stressed that action needs to be taken now.
"We have literally been over a decade without major reforms for these companies," said Jim Steyer, Founder and CEO of Common Sense Media. "And we've assumed that in some cases that they would self-police or self-regulate. Well, that's not true, and the record is clear."
Facebook CEO Mark Zuckerberg has said he supports some changes to Section 230 but wants any new regulations to require platforms to demonstrate that they have systems in place to identify and remove unlawful content.