
What is Section 230? Section 230 Explained

If you don’t fully understand what Section 230 is, don’t worry. You’re not alone. In fact, most of your Twitter and Facebook friends probably don’t understand it either — even if they talk about it a lot. This is an explanation of Section 230 written in simple terms that anyone can understand. What is Section 230 and why is everyone talking about it right now?

Basically, Section 230 is a law that protects website owners from being sued for any content that is posted by users of their websites. It means that the website owner will not be held legally responsible for content posted by someone who uses their website.

Think of Section 230 as legal immunity for user-generated content websites.

Section 230 is a vital piece of legislation that largely shaped how the internet works today. Without it, many of the online platforms you use every day might not exist. That is why everyone should have a basic understanding of what Section 230 is and what it does and does not do.

There are a lot of misunderstandings about Section 230. I hope to clear up any confusion you have and help you to understand why Section 230 is so important.

What is Section 230?

Section 230 in simple terms is a law that protects websites from being held legally responsible for content posted by their users. It was passed in 1996 as part of the Communications Decency Act and it has been controversial ever since. Basically, it protects websites like Facebook, Twitter, and YouTube from lawsuits over user-generated content.

That means if someone posts something offensive or illegal on a social media website like YouTube or Twitter, the website itself can’t be sued — only the person who posted it can be taken to court. Without Section 230, websites would have to constantly monitor what every single user posts or risk being sued out of existence.

The controversy around Section 230 has grown more heated recently and many lawmakers want to repeal or amend it. There have been dozens of bills introduced in Congress to reform, repeal, or change Section 230 — and the Supreme Court is about to rule on a case that could affect everyone who uses the internet.

The Controversy around Section 230 Explained

The 26 words that people are debating are contained in Section 230(c)(1) of the Communications Decency Act (CDA), which protects an “interactive computer service” from lawsuits. Section 230(c)(1) says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230(c)(2) extends these liability protections, stating that a provider cannot be held liable for “any action voluntarily taken in good faith to restrict access to or availability of the material that the provider considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

This means that an interactive computer service (notice it does not say publisher or platform) will not be considered the publisher of the content posted by its users. It also means that a provider will not be held liable if it chooses to moderate its website’s user-generated content.

Or more simply put, a website owner is only legally responsible for the content they publish, not the content that their users post. 

This is the cause of the controversy. Some people believe that websites that edit their users’ content should not qualify for Section 230 immunity. However, they misunderstand the purpose of Section 230. Section 230 was created so that platforms could moderate their content without being held liable for what users of their websites posted.

The History of Section 230

In the early days of the internet, there were no laws governing the content of websites, which left courts to figure out how to handle lawsuits over user-generated content. Two early cases involving user-generated content set the stage: one against CompuServe (Cubby, Inc. v. CompuServe Inc.) and one against Prodigy (Stratton Oakmont, Inc. v. Prodigy Servs. Co.).

Cubby, Inc. v. CompuServe Inc

CompuServe hosted a community for journalists that published a daily newsletter called Rumorville USA. Rumorville was accused of posting a false and defamatory comment about one of its competitors. The competitor sued both Rumorville for posting the content and CompuServe for distributing it.

CompuServe argued that it had no knowledge of the post and no reason to know about it, given the large volume of content posted on its forums every day. This was the first court case to address the legal responsibilities of online services that host user-generated content.

The court did not hold CompuServe liable for Rumorville’s content because it was merely a distributor of the content, not the party that actually posted it. Since CompuServe did not moderate the content on its forums, it had no knowledge of the defamatory post, and because it did not know about it, it could not be held responsible for it.

Stratton Oakmont, Inc. v. Prodigy Servs. Co.

The second case about how to handle user-generated content involved Prodigy. Prodigy also hosted an online community. But unlike CompuServe, Prodigy moderated the content posted on its forums based on its community guidelines.

An anonymous user posted defamatory comments on Prodigy’s Money Talk forum stating that the president of a large securities brokerage had committed criminal and fraudulent acts. The brokerage sued Prodigy over these comments, even though it wasn’t Prodigy who posted them, arguing that Prodigy was the publisher of the anonymous statements and should be legally responsible for them.

Under the law at the time, if Prodigy was the publisher, it could be held liable for comments posted on its forum. In this case, the court determined that Prodigy was not just a distributor of the content in question but also its publisher, simply because it moderated its community. Because Prodigy exercised significant editorial control, the court held, it was liable for the defamatory content.

Although the two cases were similar, Prodigy was found liable and CompuServe was not. Essentially, CompuServe was deemed a mere distributor of content, while Prodigy was deemed a publisher simply because it moderated its forums. Lawmakers were concerned that if platforms could be sued for any content posted on their forums, these websites would choose not to moderate at all. So they proposed a solution: Section 230.

The Purpose of Section 230 

In 1996, Congress enacted the Communications Decency Act, aimed at protecting minors from accessing obscene or pornographic content on the internet. It made it illegal for anyone to intentionally send or show minors obscene or indecent content. A bipartisan amendment attached to this act became what we now know as Section 230.

Section 230 was crafted by Representatives Ron Wyden (D) and Chris Cox (R) to promote free speech while also encouraging platforms to create and implement community standards. As Jeff Kosseff explains in his book, The Twenty-Six Words That Created the Internet, the two lawmakers were interested in writing legislation that encouraged platform owners to develop community standards. Without Section 230, there would be no incentive to do this.

“We really were interested in protecting the platforms from being held liable for the content posted on their sites and being sued out of existence,” Wyden, the co-author of Section 230, explained. “And we were interested in allowing the platforms to take down some content that they believe shouldn’t be on their site without being held liable for all the content on the site, so that you could really encourage responsible behavior.”

Which websites are protected by Section 230?

You’ve probably heard someone mention the common misconception that only platforms are protected under Section 230, not publishers. Contrary to popular belief, Section 230 extends protection to both platforms and publishers (at least for now).

Section 230 does not distinguish between platform and publisher. Instead, it uses the term “interactive computer service.” This has been broadly accepted to mean any website that publishes user-generated or third-party content. This includes blogs that allow users to post comments, websites like Amazon, Yelp, and Google that allow users to post reviews, and discussion boards like Discord or Slack.

How Section 230 Protects Free Speech

Section 230 has been incredibly important in allowing online communities to flourish, and it has played a huge role in the development of the internet as we know it today. Without Section 230, websites would be held liable for the content their users post. If this law is changed, it could lead to widespread online censorship and the end of platforms like Facebook, Twitter, and YouTube.

One of the chief arguments for reforming Section 230 is that some people believe that it enables big tech companies to censor conservative views. But, Section 230 actually protects everyone’s free speech, including conservatives. Conservatives should think carefully about the unintended consequences of any changes made to Section 230.

Section 230 allows for competition in the marketplace. If you don’t like the way Twitter or Facebook moderates user content, you can use another platform. This is why websites like Rumble are taking off. Consumers are using the free market to find communities that they like better. 

Without this law, smaller platforms might never be able to get off the ground. Stripping away the protections from Section 230 would be extremely harmful to smaller platforms that do not have the means to battle against expensive lawsuits in court, leaving us with only the tech giants to choose from.

While some think Section 230 shields the big tech companies by allowing them to censor free speech, it actually protects free speech. If we remove or reform Section 230, the internet as we know it may be gone. 

Why Republicans Dislike Section 230

In recent years, Republicans have become increasingly vocal about their opposition to Section 230. One reason is that they believe the section gives the tech giants too much power. They accuse these companies of censoring and silencing conservative voices, pointing to examples like the permanent suspension of President Trump’s Twitter account in January 2021.

Some want to repeal Section 230, while others want to amend it. Republicans are concerned about what they see as a lack of transparency from tech companies regarding their moderation policies. Some argue that Section 230 should be amended so that tech companies would have to publicly disclose how they decide which content is acceptable and which isn’t.

They also argue that social media platforms can’t be both a publisher and a platform. Many insist that social media companies should allow all users to post any type of content freely on their platform, including hate speech or misinformation. 

Like it or not, these platforms are businesses, and their objective is to make money. If they aren’t allowed to have editorial control of their content, they may choose not to provide these services at all or moderate their content even more heavily than they do now to avoid lawsuits.

Why Democrats Dislike Section 230

Democrats have also been vocal in their opposition to Section 230. They feel that the law provides a shield for tech companies that knowingly host dangerous content. They believe that the immunity provided to big tech companies in Section 230 contributed to the January 6 riot at the Capitol and the spread of misinformation.

Democrats also point out that Section 230’s broad protections can enable tech companies to be completely unaccountable for other questionable behavior such as promoting health misinformation, allowing discrimination and hate speech, or letting dangerous groups use their platform as a recruiting tool.

Democrats say Section 230 has been used to prevent victims of online abuse and harassment from taking legal action. They believe that tech companies should be held more accountable for their content and activities, especially when they profit from them. They have introduced bills that would control the larger platforms’ use of algorithms, essentially allowing the government to decide how posts should be filtered or recommended with these algorithms.

Ultimately, both Republicans and Democrats agree that Section 230 needs updating but disagree on how much and what type of reform is necessary. Libertarians, on the other hand, support Section 230.

Why Libertarians Support Section 230

Libertarians are passionate advocates of the First Amendment, which guarantees Americans the right to freedom of speech. Section 230 protects this right by providing immunity from liability to communities that provide a place for public discourse.

Online platforms, not just the big ones, provide everyone who uses the internet a place to communicate more freely. Admittedly, in today’s world, people can use online platforms to spread hate, defamatory speech, or misinformation. However, if we remove Section 230 or hand control over to the government, the government will be in charge of how big tech companies moderate, filter, or censor their content.

The free speech and civil discourse that we experience right now will disappear. Not only that, but this could destroy an entire sector of the economy. Neither Republicans nor Democrats have considered the effect that repealing or reforming Section 230 will have on business owners, content creators, and the majority of people who own a website.

For example, removing Section 230 protections would mean that if you post a defamatory or fraudulent comment on my blog, I could be held accountable even if I never knew it was there. Section 230 is extremely important for content creators. It allows creators to produce content that will reach a broader audience without having to use their own websites or platforms. It allows users to collaborate and share information.

It provides all of us with a better internet experience. While Republicans are worried about censorship and Democrats are worried about controlling the narrative, Libertarians want you to have the freedom to express yourself any way you choose!

What do you think?

I fear that Section 230 will soon be repealed. Nobody knows what the internet will look like if this happens, but I believe reform of any kind will reduce our right to freedom of speech. What do you think about Section 230? Are you in favor of keeping Section 230 as is or do you think Section 230 needs to be reformed?

Help me spread the message of liberty to more people.


2 thoughts on “What is Section 230? Section 230 Explained”

  1. Patty
    This is a great piece and your libertarian thoughts are well stated. However, 230 has been corrupted by the courts and now by government officials’ interference in content moderation practices. Now a handful of men control what citizens are allowed to see, hear, or say, which is troublesome to say the least.

    1. Patty for Liberty

      I don’t disagree with you on that but I don’t think repealing Section 230 is the solution. You don’t fix one problem by creating another. And, I definitely understand the concerns.
