Do Social Media Platforms Have Too Much Power Under the Protection of Section 230?

KEY TAKEAWAYS

  • Section 230 protects internet service providers and interactive computer service providers from liability based on content posted by third parties and users, while allowing them to choose what to moderate.
  • Supporters of the law say it gives social media platforms and Big Tech powerful tools to combat hate, disinformation, and abuse online.
  • Democratic and Republican lawmakers alike fear Section 230 grants the platforms too much power and want them held more accountable, potentially through a revamp of the law.

Let’s travel back to 1995 in America. NASA’s Galileo spacecraft reaches Jupiter after a six-year journey. Timothy McVeigh carries out the devastating Oklahoma City bombing. Michael Jordan returns to the NBA. 

It’s also the year the modern internet took the world by storm, and our lives would never be the same. I remember getting impatient with the tune and hum of dial-up, casting aside the encyclopedias I had been so proud of the year before, and thrilling at Minesweeper and Links. “I think he hit the tree, Jim!” If you know, you know. 

Java debuts, Microsoft launches Windows 95, and Amazon, Yahoo, and eBay open for business as the World Wide Web goes mainstream. People with a little knowledge of how to navigate this new tool could, and did, put anything and everything at our fingertips. But, unfortunately, new technology often means new regulations, and the internet was no exception. 

In 1996, Congress passed the Communications Decency Act (CDA), a law aimed largely at regulating online pornography, with Section 230 tucked inside it. Today, Section 230 is coming under fire because of the power it gives technology companies over political discussion. As a result, lawmakers, social media platforms, and Big Tech CEOs are at odds over whether to repeal or revamp the law, a decision that could have far-reaching consequences.

WHAT IS SECTION 230? 

The Telecommunications Act of 1996, passed in response to the internet boom, was a rewrite of the Communications Act of 1934. Occasional rewrites like this prove lawmakers can revamp and update the laws that define our culture as telecommunications evolve, and it may be time for the CDA to get some of that attention. 


Section 230 is essentially the only part of the CDA that survived after the courts struck down the law’s anti-indecency provisions for violating First Amendment-protected free speech. Specifically, Section 230 says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This protects websites from lawsuits if a user posts something illegal, although there are exceptions for copyright violations, sex trafficking-related material, and violations of federal criminal law.

Under Section 230, the protected intermediaries include regular Internet Service Providers (ISPs) and a range of “interactive computer service providers.” As a result, online platforms like Facebook, Twitter, and YouTube that publish third-party content can’t be held liable for what people post, and they can moderate user content without being treated as publishers.

As tech companies have grown and become more powerful, Congress and other lawmakers have questioned the modern efficacy of the law. As a result, they are exploring what an update might require, if one is needed at all. 

WHY SHOULD IT STAND?

Often touted as “Big Tech’s favorite law,” Section 230 is considered by the Electronic Frontier Foundation (EFF) to be one of the most valuable tools for protecting freedom of expression and innovation on the internet. Without these protections, providers could be deterred from allowing user content of any kind for fear of litigation, or would need to heavily censor everything we see, say, and do online. 

This protection has allowed for unfettered growth among social media companies and other sites and services and has shaped the way we interact with these platforms and each other. 

Fans of the law say it gives platforms powerful tools to combat hate, disinformation, and abuse online. For example, you may recall a 2018 case in which Twitter successfully invoked Section 230 to defend its decision to ban Jared Taylor from the site for openly advocating for a “majority-white nation.” 

WHAT SHOULD CHANGE?

Critics say tech companies abuse the power Section 230 provides and need to be held more accountable. Hate speech and politically biased material can be checked or left unchecked at the discretion of site moderators, exerting undue influence on content consumers and even causing harm in some cases. 

[Photo: Facebook CEO Mark Zuckerberg testifies remotely during the Senate Judiciary Committee hearing “Breaking the News: Censorship, Suppression, and the 2020 Election” on Capitol Hill on November 17, 2020, in Washington, DC; Twitter CEO Jack Dorsey also testified remotely. Hannah McKay-Pool/Getty Images]

The Jared Taylor case is also helpful for considering the other side of the Section 230 debate. After his ban, Taylor tried (unsuccessfully) to sue Twitter, with his lawyers citing “viewpoint discrimination” and Taylor claiming that Twitter’s free-speech pledge was fraudulent. His lawyer Noah Peters wrote that “Twitter censorship should terrify everyone,” adding, “Our lawsuit is not about whether Taylor is right or wrong. It’s about whether Twitter and other technology companies have the right to ban individuals from using their services based on their perceived viewpoints and affiliations.”

According to a transcript of his remarks, former Attorney General William Barr stated, “Section 230 has been interpreted quite broadly by the courts. Today, many are concerned that Section 230 immunity has been extended far beyond what Congress originally intended. Ironically, Section 230 has enabled platforms to absolve themselves completely of responsibility for policing their platforms, while blocking or removing third-party speech — including political speech — selectively, and with impunity.”

Some Democrats say it allows tech companies to get away with not moderating content enough, while some Republicans say it enables them to moderate too much.

WHY THIS MATTERS

When Section 230 was drafted, regulators couldn’t possibly know what the internet would someday become. Though they disagree about why it should be reformed, both political parties agree action needs to be taken to modernize the law. Free speech is essential, but so is limiting the influence unchecked content providers have over our society and culture. 

In the wake of the March 2021 congressional hearing on tech companies’ role in spreading misinformation, it’s clear this debate is not going away. Unfortunately, even the Big Tech CEOs couldn’t agree on whether or how the law should be reformed. 

We get most of our information about how to live from the internet and social media platforms. They can influence how we vote, what we put in our bodies, how we think, even who we choose to interact with (hello, algorithms). Controlling the content we consume is both easy and hard: the platforms control what appears in front of us (hard for us to get around), but we control what we actually take in (easy to read or scroll on by). 

As purveyors, consumers, and creators of internet content, we need to take more responsibility for what we consume and share while Big Tech and lawmakers decide what to do next with Section 230. Do you find three credible sources to back up that article you just posted? Are you actively seeking out a viewpoint that challenges your own to ensure you’ve explored both sides of an issue before tweeting? Do you read a celebrity’s opinion on social media and assume it’s the whole truth? 

There’s not a lot we can do to determine what will happen with Section 230, but we can be more mindful of how we use social media. We can practice critical thinking by taking responsibility for the quality and variety of the content we consume and post. 

Author: Samantha DeTurk

