By: Jessica Newman
On February 21, the Supreme Court heard arguments in a landmark technology case, Gonzalez v. Google, which raised the question of the continued viability of Section 230. On February 22, the Court heard Twitter, Inc. v. Taamneh, which addressed whether social media companies such as Twitter must take more aggressive action to prevent terrorists from accessing and proliferating on their platforms. These decisions could reshape the protections that Section 230 currently affords large online platforms. But is it the role of the Supreme Court to make this decision, or is it the role of Congress? There is a strong chance that the Supreme Court will refrain from making such a monumental decision; if so, what is the likelihood that bipartisan congressional action will come to fruition? Before pondering these questions, I have summarized the crux of the two cases below.
Gonzalez v. Google:
Facts: Petitioners in this case are family members of Nohemi Gonzalez, a young woman killed in the November 2015 terrorist attacks in Paris, for which ISIS claimed responsibility. ISIS has used YouTube to spread its message and attract followers to its terrorist organization.
Issue: “Whether Section 230(c)(1) of the Communications Decency Act immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information.” In addition, did ISIS’s use of YouTube for recruitment violate the Anti-Terrorism Act (ATA) and the Justice Against Sponsors of Terrorism Act (JASTA)?
Twitter v. Taamneh:
Facts: Nawras Alassaf was killed in a 2017 terrorist attack orchestrated by ISIS in Istanbul. Section 230 is not at issue on appeal, as the defense was not reached by the lower courts and was not addressed by the Ninth Circuit.
Issue: By allowing the distribution of content produced by ISIS “without editorial supervision,” did Twitter and the other defendant social media companies violate the ATA and JASTA?
At the crux of these cases is a struggle between the branches of government. Which body of government should make such monumental decisions about technology? To a degree, neither Congress nor the Supreme Court seems like the ideal candidate.
The Telecommunications Act of 1996 included the provision that established Section 230, which was designed to shield online platforms from liability for content published by third-party users. At that early stage of the internet, the provision incentivized growth and the spread of information online. However, as these platforms and the world wide web have grown beyond expectation, the question of Section 230 reform has resurfaced time and time again. The rise of AI-powered algorithms and machine learning has made the internet unrecognizable from the platforms that existed in the 1990s. Furthermore, many of these platforms, including Google, Twitter, and Facebook, wield quasi-governmental power over people and society.
Additionally, amicus briefs to the Supreme Court raised a range of other concerns, from economic disruption and the lack of meaningful redress rights to child safety, harm to innovation, and limits on free speech and expression.
Through the years, Section 230 has been a topic of contention on both sides of the political aisle, and representatives from both parties have pushed for reform. On February 28, 2023, several senators, including Warner, Hirono, Klobuchar, Kaine, and Blumenthal, along with Representatives Castor and Levin, introduced legislation to reform Section 230 in order to “allow social media companies to be held accountable for enabling cyber-stalking, online harassment, and discrimination on social media platforms.” The Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act is just one example of an attempt to add more nuance to Section 230.
Tied to the February cases are two cases decided by the 11th and 5th Circuit Courts of Appeals in 2022. In both, NetChoice, an association of online platforms including Amazon, Meta, Lyft, and Etsy, challenged similar laws passed in Florida and Texas that restricted how online platforms may moderate content concerning political candidates and officials. The 11th Circuit largely struck down Florida’s law, whereas the 5th Circuit upheld Texas’s. There is a high probability that the Supreme Court will hear these cases in the fall.
The Supreme Court Justices are not experts on technology. During oral arguments, many of the Justices appeared frustrated and concerned with the solutions offered by attorneys for both parties. If these two cases provoked such reactions, the same conflict can be expected in the NetChoice cases as well. This raises the question of whether the Supreme Court is the appropriate actor to answer these questions. Many amicus briefs in the February cases argued that Section 230 reform is a job for Congress and that such action by the Supreme Court would be an overreach of judicial power.
One example of bipartisan Section 230 reform, the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), created an exception to Section 230 protections for “providers and users of interactive computer services of Federal and State criminal and civil law relating to sexual exploitation of children or sex trafficking.” While the Act did garner bipartisan support, notable downsides included increased risk for sex workers. In addition, FOSTA did not help federal prosecutors increase the prosecution rate of human and sex trafficking. Bipartisan legislation will be necessary and should be encouraged in Section 230 reform, as the internet and these platforms have an undeniable impact on people in the United States and across the globe. However, beyond political consensus, Section 230 and liability for online platforms affect a wide range of stakeholders.
Notable academics and practitioners in the Section 230 space support empowering users to decide what kind of content moderation they want. However, plenty of online content is legal even if it is disturbing, which makes this balance difficult to strike, and the First Amendment constrains any solution. Placing power in the hands of users is known as “middleware,” in which users choose which speech rules govern what they see online. However, plans for this form of content moderation still need further development and broader political support.
All in all, the government, technology companies, and users have a stake in these four cases. Next year’s decisions will be important to watch.