
Section 230
A piece of US internet legislation that shields website platforms from liability for third-party content. Its application to algorithmic content promotion is currently being challenged in court.
Created: 8/22/2025, 1:48:58 AM
Last updated: 8/22/2025, 1:50:34 AM
Research retrieved: 8/22/2025, 1:50:34 AM
Summary
Section 230, codified at 47 U.S.C. § 230 and enacted as part of the Communications Decency Act of 1996, is a pivotal U.S. law that grants interactive computer services immunity from liability for content posted by third-party users. This immunity, designed to treat online platforms as distributors rather than publishers, was a direct response to early-1990s lawsuits that produced conflicting precedents on platform liability. The law also includes a "Good Samaritan" provision, enabling platforms to moderate objectionable content in good faith without legal repercussions. Although its parent act, the Communications Decency Act, was largely struck down as unconstitutional by the Supreme Court in *Reno v. American Civil Liberties Union* (1997), Section 230 was deemed severable and has withstood subsequent challenges. Its protections are not absolute, however: there are explicit carve-outs for federal criminal law, intellectual property, and human trafficking, the last added by FOSTA-SESTA in 2018. In recent years, Section 230 has become contentious in political discourse, particularly over content moderation, hate speech, and alleged ideological bias, especially during the 2020 US presidential election. A recent court ruling concerning TikTok's algorithms in a "Blackout Challenge" lawsuit suggests a potential narrowing of the law's scope: algorithmic content curation may not always fall within its immunity.
Referenced in 1 Document
Research Data
Extracted Attributes
Authors
Representatives Christopher Cox and Ron Wyden
Exceptions
Federal criminal law, intellectual property law, human trafficking laws
Legal Code
47 U.S.C. § 230
Core Principle
Treats online platforms as distributors, not publishers
Primary Purpose
Provides immunity to interactive computer services for third-party content
Secondary Purpose
Good Samaritan protection for content moderation
Enacting Legislation
Communications Decency Act of 1996 (Title V of the Telecommunications Act of 1996)
Timeline
- Lawsuits against online discussion platforms, such as *Stratton Oakmont, Inc. v. Prodigy Services Co.* and *Cubby, Inc. v. CompuServe Inc.*, result in differing interpretations of platform liability, prompting the need for Section 230. (Source: Summary, Wikipedia, DBPedia)
1990s
- Section 230 is enacted as Section 509 of the Communications Decency Act, which is Title V of the Telecommunications Act of 1996. (Source: Summary, Wikipedia, DBPedia, Web Search)
1996-02-08
- The Supreme Court rules the Communications Decency Act unconstitutional in *Reno v. American Civil Liberties Union*, but Section 230 is determined to be severable and remains in place. (Source: Summary, Wikipedia, DBPedia, Web Search)
1997-06-26
- Section 230 is amended by the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA-SESTA) to require the removal of material violating federal and state sex trafficking laws. (Source: Summary, Wikipedia, DBPedia)
2018-04-11
- Section 230 becomes a major issue in the United States presidential election, facing increased scrutiny regarding hate speech and ideological biases. (Source: Summary, Wikipedia, DBPedia)
2020
- A court rules in *Doe v. TikTok* that Section 230 immunity does not protect TikTok's algorithms in a lawsuit related to the 'Blackout Challenge', suggesting algorithmic content curation might constitute editorial judgment. (Source: Summary, Related Documents)
2023-05-17
Wikipedia
Section 230
In the United States, Section 230 is a section of the Communications Act of 1934, enacted as part of the Communications Decency Act of 1996 (Title V of the Telecommunications Act of 1996), that generally provides immunity for online computer services with respect to third-party content generated by their users. At its core, Section 230(c)(1) provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Section 230(c)(2) further provides "Good Samaritan" protection from civil liability for operators of interactive computer services in the voluntary good faith removal or moderation of third-party material the operator "considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."

Section 230 was developed in response to a pair of lawsuits against online discussion platforms in the early 1990s that resulted in different interpretations of whether the service providers should be treated as publishers (*Stratton Oakmont, Inc. v. Prodigy Services Co.*) or, alternatively, as distributors of content created by their users (*Cubby, Inc. v. CompuServe Inc.*). The section's authors, Representatives Christopher Cox and Ron Wyden, believed interactive computer services should be treated as distributors, not liable for the content they distributed, as a means to protect the growing Internet at the time.

Section 230 was enacted as section 509 of the Communications Decency Act (CDA) of 1996 (a common name for Title V of the Telecommunications Act of 1996). After passage of the Telecommunications Act, the CDA was challenged in courts and was ruled by the Supreme Court in *Reno v. American Civil Liberties Union* (1997) to be unconstitutional, though Section 230 was determined to be severable from the rest of the legislation and remained in place. Since then, several legal challenges have upheld the constitutionality of Section 230.

Section 230's protections are not limitless: they do not extend to material that violates federal criminal law, intellectual property law, or human trafficking law. In 2018, Section 230 was amended by the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA-SESTA) to require the removal of material violating federal and state sex trafficking laws. In the following years, Section 230's protections have come under increased scrutiny over hate speech and alleged ideological bias, given the power technology companies hold over political discussion, and the law became a major issue during the 2020 United States presidential election, especially with regard to alleged censorship of conservative viewpoints on social media. Passed when Internet use was just starting to expand in both breadth of services and range of consumers in the United States, Section 230 has frequently been called a key law that allowed the Internet to develop.
Web Search Results
- Section 230 Protections
Section 230 refers to Section 230 of Title 47 of the United States Code (47 U.S.C. § 230). It was passed as part of the much-maligned Communications Decency Act of 1996. Many aspects of the CDA were unconstitutional restrictions of freedom of speech (and, with EFF's help, struck down by the Supreme Court), but this section survived and has been a valuable defense for Internet intermediaries ever since. For more, check out EFF's CDA 230 issue page. What protection does Section 230 provide? [...] Section 230 says that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This federal law preempts any state laws to the contrary: "[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section." The courts have repeatedly rejected attempts to limit the reach of Section 230 to "traditional" Internet [...] Courts have held that Section 230 prevents you from being held liable even if you exercise the usual prerogative of publishers to edit the material you publish. You may also delete entire posts. However, you may still be held responsible for information you provide in commentary or through editing. For example, if you edit the statement, "Fred is not a criminal" to remove the word "not," a court might find that you have sufficiently contributed to the content to take it as your own.
- Section 230
Section 230 was enacted as section 509 of the Communications Decency Act (CDA) of 1996 (a common name for Title V of the Telecommunications Act of 1996). After passage of the Telecommunications Act, the CDA was challenged in courts and was ruled by the Supreme Court in _Reno v. American Civil Liberties Union_ (1997) to be unconstitutional, though Section 230 was determined to be severable from the rest of the legislation and remained in place. Since then, several legal challenges have [...] Section 230 was developed in response to a pair of lawsuits against online discussion platforms in the early 1990s that resulted in different interpretations of whether the service providers should be treated as publishers, _Stratton Oakmont, Inc. v. Prodigy Services Co._, or alternatively, as distributors of content created by their users, _Cubby, Inc. v. CompuServe Inc._ The section's authors, Representatives Christopher Cox and Ron Wyden, believed interactive computer services should be [...] Section 230 has two primary parts both listed under §230(c) as the "Good Samaritan" portion of the law. Under section 230(c)(1), as identified above, an information service provider shall not be treated as a "publisher or speaker" of information from another provider. Section 230(c)(2) provides immunity from civil liabilities for information service providers that remove or restrict content from their services they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or
- Interpreting the Ambiguities of Section 230
Although the law is commonly described as “Section 230 of the Communications Decency Act,” it was actually Section 509 of the Telecommunications Act of 1996, of which Title V (covering sections 501-09) was the Communications Decency Act. Section 509 created a new section 230 in the Communications Act of 1934, codified at 47 U.S.C. § 230. Thus, the proper name of Section 230 should be “Section 230 of the Communications Act of 1934, as amended” or “Section 509 of the Telecommunications [...] When Section 230 was enacted in 1996, it was as part of a broader congressional response to the perceived dangers of the internet—specifically, the problem of children being exposed to inappropriate content, especially pornography. In fact, Section 230, although it has come to assume a central role in internet law, was only one small, and relatively obscure, part of a much broader legislative package, the Communications Decency Act (CDA) of 1996, itself part of the Telecommunications Act of [...] The first two sections of Section 230 set out various findings and policy statements and illustrate Section 230’s multiple—and potentially conflicting—goals. Section 230 was intended to accomplish the goal of protecting children but through a different mechanism, the removal of “disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material.”
- Section 230: A Juridical History | Stanford Law School
Section 230 of the Communications Decency Act of 1996 is the most important law in the history of the internet. It is also one of the most flawed. Under Section 230, online entities are absolutely immune from lawsuits related to content authored by third parties. The law has been essential to the internet’s development over the last twenty years, but it has not kept pace with the times and is now a source of deep consternation to courts and legislatures. Lawmakers and legal scholars from across
- Section 230
That’s why the U.S. Congress passed a law, Section 230 (originally part of the Communications Decency Act), that protects Americans’ freedom of expression online by protecting the intermediaries we all rely on. It states: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (47 U.S.C. § 230(c)(1)). [...] Section 230 allows web operators, large and small, to moderate user speech and content as they see fit. This reinforces the First Amendment’s protections for publishers to decide what content they will distribute. Different approaches to moderating users’ speech allow users to find the places online that they like, and avoid places they don’t. [...] Section 230 embodies the principle that we should all be responsible for our own actions and statements online, but generally not those of others. The law prevents most civil suits against users or services that are based on what others say. Congress passed this bipartisan legislation because it recognized that promoting more user speech online outweighed potential harms. When harmful speech takes place, it’s the speaker that should be held responsible, not the service that hosts the speech.
DBPedia
Section 230 is a section of Title 47 of the United States Code that was enacted as part of the United States Communications Decency Act and generally provides immunity for website platforms with respect to third-party content. At its core, Section 230(c)(1) provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users: No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. Section 230(c)(2) further provides "Good Samaritan" protection from civil liability for operators of interactive computer services in the good faith removal or moderation of third-party material they deem "obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." Section 230 was developed in response to a pair of lawsuits against Internet service providers (ISPs) in the early 1990s that resulted in different interpretations of whether the service providers should be treated as publishers or, alternatively, as distributors of content created by their users. It was enacted as part of the Communications Decency Act (CDA) of 1996 (a common name for Title V of the Telecommunications Act of 1996), formally codified as part of the Communications Act of 1934 at 47 U.S.C. § 230. After passage of the Telecommunications Act, the CDA was challenged in courts and was ruled by the Supreme Court in Reno v. American Civil Liberties Union (1997) to be unconstitutional, though Section 230 was determined to be severable from the rest of the legislation and remained in place. Since then, several legal challenges have validated the constitutionality of Section 230.
Section 230 protections are not limitless and require providers to remove material illegal on a federal level, such as in copyright infringement cases. In 2018, Section 230 was amended by the Stop Enabling Sex Traffickers Act (FOSTA-SESTA) to require the removal of material violating federal and state sex trafficking laws. In the following years, protections from Section 230 have come under more scrutiny on issues related to hate speech and ideological biases in relation to the power that technology companies can hold on political discussions, and became a major issue during the 2020 United States presidential election. Passed when Internet use was just starting to expand in both breadth of services and range of consumers in the United States, Section 230 has frequently been referred to as a key law that allowed the Internet to develop.
