Abstract

Excerpted From: Spencer Overton and Catherine Powell, The Implications of Section 230 for Black Communities, 66 William and Mary Law Review 107 (October 2024) (365 Footnotes)

 

The internet presents both opportunities and challenges to Black communities. Online platforms have allowed Black activists to build political movements, Black creators to find audiences, and Black businesses to reach customers. Unfortunately, online platforms have also facilitated white supremacist organizing and domestic terrorism, steered housing opportunities toward white users and away from Black users, and allowed Black communities to be targeted with disinformation about voting, health, and other critical issues.

These opportunities and challenges affect Black Americans broadly. Approximately 77% of Black adults use social media. Black adults are more likely than white adults to use Instagram, X (formerly known as Twitter), YouTube, WhatsApp, and TikTok. Black Americans are increasingly likely to rely on online platforms such as Google, Apartments.com, LinkedIn, and ZipRecruiter to access such essential services as housing and employment.

The internet's benefits and costs to Black communities are shaped by Section 230 of the Communications Decency Act (commonly known as “Section 230”), which was enacted as part of the Telecommunications Act of 1996. Section 230 has been recognized as the key legal provision that has facilitated the growth of the web because it has allowed platforms to freely host and remove third-party user content (for example, comments, posts, and videos) without fear of liability for the content posted or for moderating that content. Although privacy regulations and many other legal provisions also shape the experiences of Black people online and warrant examination, Section 230 is a primary provision and the focus of this Article.

Section 230(c) provides:

(c) Protection for “Good Samaritan” blocking and screening of offensive material.

(1) Treatment of publisher or speaker

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability

No provider or user of an interactive computer service shall be held liable on account of--

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material.

Courts have interpreted Section 230 to immunize website operators such as Facebook, X, and YouTube from legal liability for decisions to leave up or take down content created by third-party users, as well as decisions to make posts more likely or less likely to be viewed (that is, upranking or downranking). Thus, in a hypothetical police union's defamation lawsuit claiming that a Black Lives Matter activist's Facebook post was false and malicious, Meta (which owns Facebook, Instagram, and WhatsApp, all of which are protected by Section 230) would likely face no liability and would be immediately dismissed as a defendant. Similarly, Facebook would not be legally liable for removing a post by a user falsely claiming that voter fraud is rampant in Black communities.

Section 230 contains several carve-outs. The section does not give platforms immunity for violations of federal criminal law, intellectual property law, the federal Electronic Communications Privacy Act of 1986 and similar state laws, or federal sex-trafficking law and related state laws. Further, states may enforce state laws that are consistent with Section 230 but not those that are inconsistent with the section.

As nonstate actors, social media companies have free speech rights to engage in content moderation. While the First Amendment of the U.S. Constitution protects the right of platforms to publish and remove content (as recently underscored by the Supreme Court in the consolidated NetChoice litigation), judicial interpretations of Section 230 give platforms additional substantive and procedural tools. For example, unlike the First Amendment, Section 230 has been interpreted by courts to allow platforms to evade liability for publishing defamation, deceptive trade practices, false advertising, and commercial speech and transactions, as well as to prohibit state and local governments from enacting laws that would restrict platforms from publishing such material. Also unlike the First Amendment, Section 230 has been interpreted to enable early dismissal of lawsuits, avoiding extensive discovery and allowing more predictable outcomes for litigants. These expanded protections reduce platforms' litigation costs, but the same judicial interpretations also reduce plaintiffs' ability to recover when they suffer harms from content and conduct that the First Amendment does not protect, such as deceptive trade practices and discrimination.

Many federal and state courts have construed Section 230 broadly--extending immunity even when platforms “republished content knowing it might violate the law, encouraged users to post illegal content, changed their design and policies to enable illegal activity, or sold dangerous products.”

For years, various experts and advocates have called for Congress to reform Section 230, for courts and the Federal Communications Commission to reinterpret Section 230, and for states to regulate tech platforms in ways that some argue conflict with Section 230. Unfortunately, despite the significant opportunities and challenges the law presents for Black communities in the United States, no entity has comprehensively examined the implications of Section 230 and proposed reforms for Black communities in particular. This Article fills that void.

Granted, many of the Section 230 issues that confront Black communities also affect other communities, and an understanding of these common challenges is essential to policymaking. For example, both Black and Latino adults experience online harassment due to their race at more than two times the rate of their white counterparts. White supremacists post comments, manifestos, and videos on platforms protected by Section 230 to inspire each other to commit mass shootings, whether against mosques in Christchurch, New Zealand; a Walmart in heavily Latino El Paso; the Tree of Life synagogue in Pittsburgh; the LGBTQ+ Pulse nightclub in Orlando; or Mother Emanuel African Methodist Episcopal Church in Charleston. Black and transgender users' social media accounts are removed at disproportionately high rates, and such removals often occur when the users follow site policies or when their content falls into gray areas of content moderation. Section 230 also shields platforms from liability for technology-facilitated discrimination, harassment, and violence against women and members of LGBTQ+ communities, including individuals at intersections of various identities.

Although these connections are important, and this Article has benefited from studies that examine various demographic groups, an analysis of Section 230 that centers on Black communities in the United States adds unique value. Focusing on Black communities in a more targeted way illuminates the full costs and benefits of the immunity Section 230 provides for both third-party content and content moderation, as well as the costs and benefits of reform proposals. This is particularly important because Black communities are often underrepresented in large tech companies that enjoy Section 230 protections and in legal and policy debates surrounding Section 230. This Article also contributes to the capacity of civil rights organizations to develop independent perspectives and exercise more agency while participating in Section 230 reform debates.

This Article is thus a tool for Section 230 debates that are better informed and for tailoring proposed reforms to account for the interests of all communities. Rather than purport to identify a single Section 230 proposal that will completely and permanently resolve all challenges facing Black communities, this Article reveals the most significant benefits and challenges that Section 230 and a few popular reforms present for Black communities. Many of these insights supply factors for analyzing and improving Section 230 reforms generally, such as the effectiveness of a proposal in addressing primary challenges to Black communities, the scope of the challenges that will remain unaddressed, the likelihood that a proposal will exacerbate or create challenges to Black communities, and the potential for a proposal to result in overmoderation of Black users or to curtail existing opportunities that Black communities enjoy.

Part I of this Article examines the opportunities and challenges to Black communities that stem from Section 230's insulation of tech companies from liability for pure third-party content. On one hand, the immunity Section 230 provides has arguably incentivized tech platforms to create virtual spaces that Black communities have used to build community; organize and engage in social activism; amplify issues underreported in traditional media; scale businesses; and build careers in music, video, and other creative endeavors. On the other hand, interpretations of Section 230--sometimes overly broad and unsupported by statutory text or judicial decisions--allow platforms to justify their hosting of unlawful activities that present challenges to Black communities, including the organization of white supremacist violence, housing and employment discrimination, and illegal election interference. Indeed, as economic and social activity increasingly moves online, platforms' overly broad interpretations of Section 230 in designing their practices can expand opportunities for anti-Black discrimination and increase civil rights violations.

Part II of this Article details how Section 230 buttresses platforms' freedom to moderate third-party content and analyzes the benefits and challenges of content moderation for Black communities. On one hand, Section 230 benefits Black communities by supporting platforms' ability to address, downrank, and remove--without fear of legal liability--a broad range of unsavory but lawful content that the First Amendment generally prevents government from regulating. Such content includes disinformation about voting, impersonation of Black people online, hate speech, and white supremacist organizing. On the other hand, Section 230 presents challenges to Black communities by incentivizing platforms to overenforce platform guidelines against Black users and to profit from anti-Black content that violates those guidelines.

Part III analyzes several proposed reforms, including notice-and-takedown proposals, disclosure requirements, and carve-outs to immunity for civil rights laws, algorithmic recommendations, advertisements, and larger platforms. Many of these reforms address some but not all of the challenges faced by Black communities. Further, certain reforms could result in unintended harms, such as overmoderation of Black user content, and should be carefully tailored to minimize those harms. Part III also examines "content neutrality" proposals that purport to advance user "free speech"; these proposals would generally harm Black communities by discouraging platforms from removing disinformation, discrimination, hate, and other harmful content. Despite technology's potential to liberate by ushering in a postracial future, our digital society reflects, and in some instances amplifies, racial inequality and intolerance.

 

[. . .]

 

As detailed above, Section 230's immunity to platforms for third-party content and for content moderation presents distinct opportunities and challenges for Black communities. An understanding of these opportunities and challenges is helpful in the context of debates to reform Section 230. In considering reforms to Section 230, policymakers and advocates should analyze the effectiveness of a proposal in addressing primary challenges faced by Black communities due to Section 230, the scope of challenges that will remain unaddressed, the likelihood that the proposal will exacerbate existing challenges or create new challenges, and the potential for the proposal to result in overmoderation of Black users or curtail existing opportunities that Black communities enjoy. Reforms should be carefully tailored to address these issues and should explicitly clarify that platforms lack immunity when their algorithms, datasets, or platform designs materially contribute to unlawful discrimination and other illegality.


Spencer Overton is the Patricia Roberts Harris Research Professor at George Washington University Law School. Catherine Powell is the Eunice Hunton Carter Distinguished Research Scholar Professor at Fordham Law School.