Senate bill would update how liable online companies are for content on their platforms


If approved, the PACT Act would limit online companies’ liability protections

Under a new bipartisan Senate bill, Section 230 of the Communications Decency Act -- which acts as a liability shield for online companies -- would be modernized. Online platforms like Facebook and Google would be required to disclose their content moderation practices, providing greater transparency to consumers. 

The Platform Accountability and Consumer Transparency Act (PACT Act), introduced by Democratic Sen. Brian Schatz and Republican Sen. John Thune, would hold internet companies responsible for content that violates their own policies or is illegal.

“Section 230 was created to help jumpstart the internet economy, while giving internet companies the responsibility to set and enforce reasonable rules on content,” Sen. Schatz said in a statement. “But today, it has become clear that some companies have not taken that responsibility seriously enough.” 

“Our bill updates Section 230 by making platforms more accountable for their content moderation policies and providing more tools to protect consumers,” he said. 

Mandatory disclosures

Calling Section 230 “ripe for reform,” Sen. Thune said the revisions would require tech companies to establish policies that inform users about the content that is allowed and “provide notice to users that there is a process to dispute content moderation decisions.” 

“The internet has thrived because of the light touch approach by which it’s been governed in its relatively short history,” Sen. Thune added. “By using that same approach when it comes to Section 230 reform, we can ensure platform users are protected, while also holding companies accountable.”

The changes would require internet platforms to explain their moderation practices in an acceptable use policy that could be viewed at any time by users. Companies would also be required to provide quarterly, “disaggregated” reports on content removed and other actions they’ve taken. 

A “defined complaint system” would be put in place to handle reports and notify users of moderation actions within two weeks, and an appeals process would be established. The National Institute of Standards and Technology would create a “voluntary framework” for guidelines and best practices. 

The proposed act would also limit Section 230’s ability to protect companies facing action from federal regulators and state attorneys general. 

“This is not designed to attract people who want to bully tech companies into political submission,” Sen. Schatz told TechCrunch. “It’s designed to improve federal law.”

Modernizing Section 230

Section 230 has come under greater scrutiny lately after President Trump, reacting to Twitter’s fact-checking of two of his tweets, announced that he would sign an executive order taking aim at the liability protections given to social media companies under the law. 

FCC Commissioner Geoffrey Starks said recently that there are policymakers who believe it’s time for Section 230 to be updated, but it’s not the president’s place to make that decision. 

“The broader debate about section 230 long predates President Trump’s conflict with Twitter in particular, and there are so many smart people who believe the law here should be updated,” Starks said in an interview with the Information Technology and Innovation Foundation. “But ultimately that debate belongs to Congress.” 
