
Op-Ed: The Supreme Court could upend the internet. How?

A Supreme Court case that tackles online content moderation stems from claims against YouTube and its owner Google. (Jeff Chiu / Associated Press)

This month the Supreme Court marked a turning point in the history of the internet. The court agreed to consider Gonzalez vs. Google, its first case interpreting Section 230 — a once-obscure statute that is now widely credited with having “created the internet” and is debated by politicians on both sides of the aisle.

Section 230 states that online companies will not be “treated as the publisher” of any content provided by a third party, such as someone posting on the companies’ websites. Enacted by Congress in 1996 as part of the otherwise ill-fated Communications Decency Act, the law provides a degree of legal immunity to actors such as Google, Twitter and Facebook for the content shared on their platforms by users.

The law protects companies that provide a platform for other people’s speech from the constant threat of defamation suits — while still empowering them to remove content thought to be objectionable. This enabled the robust, often discordant discourse that defines the internet today. What might the Supreme Court’s intervention mean for its future?



The Gonzalez case now in the court’s hands arose after a young woman, Nohemi Gonzalez, was killed in an Islamic State attack in Paris. Her estate and family members contend that Google violated the Anti-Terrorism Act by allowing the terrorist organization to post content that furthered its mission on YouTube (which Google owns). They also claim that Google’s algorithms promoted Islamic State by recommending its content to users.

The two courts that have considered the case to date held that Section 230 immunity covers alleged violations of the Anti-Terrorism Act. But when considering different statutes in other decisions related to 230, the Court of Appeals for the 9th Circuit, with jurisdiction over West Coast cases, has more narrowly interpreted Section 230’s protections than other courts have. The possibility that this same statute might mean different things based on where someone lives in the U.S. contravenes the rule of law. Reconciling such inconsistencies is a common motivation for the Supreme Court taking a case and may explain the current court’s interest in Gonzalez, as might the novel questions around algorithmic recommendations. Justice Clarence Thomas also signaled an interest in taking up 230 in past dissents.

The court could simply endorse the broad view of Section 230 protection for platforms, which reduces their incentive to review the content they carry. If the court adopts a narrower view, however, the greater exposure to liability would push platforms toward more content moderation.


Supporters of the narrow position might argue that, while broad liability protection was appropriate when the industry first emerged, it is less justifiable now that internet companies are large and dominant. Stricter regulation could place greater responsibility on companies to exercise discretion over the content they host and deliver to potentially millions of people.

On the other hand, those in favor of preserving extensive immunity under Section 230 argue that limiting protections to certain types of content will cause companies to remove everything remotely troublesome rather than undertake the difficult, controversial task of deciding on which side of the line a piece of content falls. The result would be the loss of a significant amount of online discourse, including anything with even the most tenuous possibility of creating liability.


History provides good reason to worry that narrowing immunity may erode or stifle speech. Congress enacted an amendment in 2018 stipulating that Section 230 does not apply to content that violates laws prohibiting sex trafficking. Two days after Congress passed that law, Craigslist took down its personals section rather than determine which content in fact related to prostitution. Other companies followed suit, applying similarly sweeping approaches. This experience suggests that restricting immunity may reduce the amount of available speech. It may even lead content curators to abandon current efforts to strengthen their oversight, because the more they moderate their content, the more likely they are to be scrutinized for it.


But the Supreme Court could approach the Gonzalez case in a completely different manner, focusing less on content moderation than on how platforms are designed. Section 230 clearly permits companies to remove certain types of objectionable content. What is less clear is whether the statute provides similar protection for algorithmic decisions to promote illegal content, which is the issue at hand in the Gonzalez plaintiffs’ objection to YouTube’s algorithms. Any online curator must decide how to serve content to its users. The justices could restrict platforms’ ability to use algorithms to recommend content, a strategy currently central to these companies’ business models and one on which all users depend.

The Supreme Court’s resolution of the Gonzalez case will likely represent the most consequential update for Section 230 in the foreseeable future. Last year’s congressional hearings on issues raised by the statute reflected a partisan divide between Democrats calling for more content removal and Republicans calling for less — suggesting legislative consensus isn’t likely anytime soon. If the Supreme Court follows its expected schedule, we will know by the end of June whether or not it decides to remake the future of the internet.

Christopher S. Yoo is a professor of law and the founding director of the Center for Technology, Innovation & Competition at the University of Pennsylvania.
