This is a free excerpt from one of Shortform’s Articles. We give you all the important information you need to know about current events and more.
What is Section 230 in terms of social media companies? Why is it being challenged?
U.S. Code Section 230 is a law stating that companies that provide access to information via the internet are not liable for content posted by third parties. If the Supreme Court rules that Section 230 doesn’t apply to social media companies, they will likely block all controversial content to limit their liability.
Read on to learn about Section 230 and how these changes may transform social media as we know it.
Section 230 & Social Media Explained
This year, SCOTUS will likely rule on three cases that could reshape social media as we know it by clarifying the boundaries of social media companies’ legal immunity and powers of suppression. Until now, under U.S. Code Section 230, social media companies have enjoyed legal immunity for content posted on their platforms by third parties.
In this article, we’ll explain how Section 230 applies to social media companies, why it’s being challenged, and how social media companies are responding.
What the Laws Currently Say
The reason that these boundaries are unclear is that they are specified by laws which were written before the rise of social media. The court’s rulings will focus on how U.S. Code Section 230 and the First Amendment of the U.S. Constitution apply to social media companies.
Ratified in 1791, the First Amendment of the U.S. Constitution prohibits the government from suppressing free speech, including the speech of newspapers and other publishers. However, it doesn’t prevent publishers from being held accountable for what they publish. For example, within certain limits, if a newspaper prints defamatory articles about you, you can sue it for libel. And a publisher who calls for riots or acts of terror could be charged with incitement.
Passed in 1996 as part of the Communications Decency Act, Section 230 of the U.S. Code encourages companies that provide access to information via the internet to voluntarily restrict violent, obscene, or otherwise objectionable content. It does this mainly by establishing that they cannot be held liable for content that people post on their platforms, because they are not the ones publishing or “speaking” that content. Instead, the person who creates or uploads content is solely responsible for what she says online, even when platforms actively filter what their users post.
Challenging the Scope of Section 230
This month, SCOTUS will likely hear Gonzalez v. Google, which challenges the blanket immunity from liability that social media companies have enjoyed under Section 230.
The Gonzalez family sued Google over the death of their daughter Nohemi in a terrorist attack that was allegedly incited by a YouTube video. They argue that Section 230 shouldn’t apply in this case, and that Google (which operates YouTube) should be held accountable when content on its platform contributes to terrorist attacks.
The U.S. Justice Department filed a brief in support of Gonzalez, arguing that social media companies like YouTube shouldn’t qualify for Section 230 immunity because they go beyond merely removing objectionable content: They use their algorithms to promote content to users. Promoting content should make them accountable for it, especially in cases like this one, where their algorithms appear to have amplified the terrorists’ message.
What Are Social Media Companies Saying?
A number of social media companies, including Meta (Facebook), Microsoft (which runs LinkedIn), and Twitter filed briefs in support of Google, saying that a narrower definition of immunity under Section 230 would not only undermine their business model, but also hinder public discussion of relevant issues, especially controversial issues.
They argue that their content recommendation algorithms are essential because they make the vast amount of information stored on their platforms accessible: Without algorithms to selectively promote content to individual users, nobody would be able to find what they’re looking for on social media sites. And without user engagement, the platforms would die. But if social media companies become liable for what their users post, they would likely remove all content that’s even remotely controversial to protect themselves from liability.