Gretchen A. Peck | for Editor & Publisher
In the simplest terms, Section 230 – or, more formally, Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act in 1996 – distinguishes platforms from publishers. Publishers can be held liable for the content they produce and deliver, but big tech platforms have successfully argued that they should be exempt from this legal risk, as they are merely pipelines of information and should not be held responsible for the content users create and share.
Colloquially referred to as “the law that made the internet possible,” Section 230 has undoubtedly allowed platforms to grow and thrive to the epic proportions they enjoy today. Meta, for example, earned $39.37 billion in profit in 2021 alone, according to Statista.
Section 230 has come under scrutiny more recently, including by members of Congress and a certain ousted former president.
It seems that regardless of political affiliation, everyone wants to sue the platforms, or at least have the ability to sue them. Some critics want platforms to be held accountable for censorship and deplatforming. They argue that the rules adopted by Twitter, Facebook and others are subjective, politically biased, and amount to censorship in the digital “public square.”
Others want to be able to sue platforms for content they do allow on their apps and sites: content deemed or proven to be harmful, such as disinformation about the global COVID-19 pandemic and vaccines, or disinformation campaigns intended to undermine the U.S. election.
You can imagine the lawsuits that would accrue if the protections of Section 230 were repealed. It would force platforms to moderate content, to become fact-checkers – editors, in effect – and to censor their users more, not less. A full repeal would seem contrary to what some members of Congress say they advocate. A year ago, Congressional Republicans on the House Energy and Commerce Committee said they had a plan to revamp Section 230. They wanted the platforms to make a more concerted effort to detect and remove criminal activity, such as the sale of drugs or the exploitation of children, but they also sought to prevent platforms from censoring “political speech” – itself a broad and subjective term.
In July 2021, House Republicans Cathy McMorris Rodgers and Jim Jordan had in hand a bill that would require tech companies to disclose to the Federal Trade Commission, on a quarterly basis, their content-moderation rules and how they are applied.
Former President Donald J. Trump was a strong supporter of scrapping Section 230, but that was before he got into the social media business himself with the launch of his TRUTH Social platform, with an interface that looks a lot like Twitter. In an op-ed published by The Week in February 2022, authors Nicole Saad Bembridge and Trevor Burrus wrote about the rock-and-a-hard-place position TRUTH Social found itself in almost immediately. Users are promised a “family-friendly” experience, requiring moderation, removal of “hate speech, spam, pornography and bullying,” and even – gulp – banning users who don’t comply with the rules, just like the platforms Trump criticized for deplatforming.
Although the former president is now busy flirting with a future run and his new business venture, which he put former congressman Devin Nunes in charge of, lawmakers back in DC continue their quest to overhaul Section 230.
For an idea of how laser-focused DC lawmakers are, consider these statistics, courtesy of Quinta Jurecic, Brookings Institution Fellow in Governance Studies, who wrote for Lawfareblog.com on March 15, 2022: “In the 116th Congress, lawmakers officially introduced more than 25 bills to amend or repeal the law. The 117th Congress has already seen nearly 30 such proposals. Arguments over technology policy and internet regulation pull in a number of different directions, but almost everyone seems to agree that something has to be done about [Section] 230 – even though no one can agree on what that is,” she wrote.
And that is, of course, the most important question: how can it be amended so that the platforms can still exist but also be held more accountable?
David Chavern, President and CEO of the News Media Alliance
As President and CEO of the News Media Alliance (NMA), David Chavern represents the interests of the group’s more than 2,000 member organizations. He has been a proponent of retooling Section 230 and wrote an op-ed on the subject for Wired magazine in 2020 titled “Section 230 is a government license to build rage machines.” In it, he argued: “It is time for strong amendments.”
E&P recently spoke with Chavern to learn more about how news publishers could benefit if Section 230 gets a smart congressional overhaul.
“At a very macro level, you have what is a pretty extraordinary disclaimer of liability that started out as a pretty simple idea that platforms are not responsible for what their users post on their services,” he explained.
“But what it really means is that even if you do bad things [as a platform], you are not responsible,” he added.
Chavern acknowledged that Section 230 was conceived in a different time, when the internet was still in its infancy. But the way the platforms operate now no longer resembles a utility or a “dumb series of passive pipes,” he said. Now they are “extremely active pickers and choosers of content.” He was referring to the notorious proprietary algorithms the platforms have designed to deliver personalized experiences to users and large but targeted audiences to advertisers.
“The reason news publishers care is that our content ends up in competitive environments on those platforms – Google and Facebook, in particular,” he said. “The incentives for the platforms are really skewed by the fact that there’s no accountability. Of the billions of pieces of content out there, Facebook decides what you’re exposed to on the platform. They make some very editorial decisions.
“Their motivations are to choose the things that grab your attention, not the things that have quality. The problem for news publishers is that we invest a lot in quality – and expensive – content. It’s actually much cheaper to make things up. … But at the end of the day, if Section 230 allows platforms not to care about the quality of information, we are in a bad position, because our first value is quality,” Chavern pointed out. “Now we end up competing for people’s attention on these platforms against made-up garbage.”
Chavern suggested a good starting point for amendment: regulating proprietary algorithms, forcing more public transparency into how the platforms operate, and allowing users to directly choose the content they experience online.
“Suddenly they would care more about quality,” he said, while acknowledging that it would impact the platforms’ relationship with advertisers. “Facebook chooses, individually, for people what they will see, and they are rewarded for it, because it is the key to their advertising business.”
What if the power of content curation were entrusted to users instead? “You could see a world in which there is now a premium placed on quality content from news publishers,” Chavern said.
At least one Supreme Court justice appears ready to take on cases challenging Section 230. In early March, Justice Thomas wrote an opinion on the denial of certiorari in Doe v. Facebook, Inc. He concurred in the denial but concluded: “We should, however, address the appropriate scope of immunity under Section 230 in an appropriate case.”
Gretchen A. Peck is editor-in-chief of Editor & Publisher. She has been reporting for E&P since 2010 and welcomes comments on [email protected].