NetChoice Content Moderation Case
A Supreme Court case concerning laws from Florida and Texas that aimed to regulate content moderation on large social media platforms. The Court held that platforms' content moderation is editorial judgment protected by the First Amendment, and it vacated and remanded the lower-court rulings on both laws.
First Mentioned
9/21/2025, 4:07:00 AM
Last Updated
9/21/2025, 4:07:50 AM
Research Retrieved
9/21/2025, 4:07:50 AM
Summary
The NetChoice Content Moderation Case, formally known as Moody v. NetChoice, LLC (decided together with NetChoice v. Paxton), is a pivotal United States Supreme Court decision affirming the editorial rights of social media platforms. The case concerned Texas HB 20 and Florida S.B. 7072, state laws that sought to regulate content moderation practices on large platforms. The Court held that content moderation is an exercise of First Amendment-protected editorial judgment, and it vacated the lower-court judgments and remanded both cases for a proper facial-challenge analysis. The case sits within the broader U.S. content-moderation framework shaped by Section 230 of the Communications Decency Act. The implications of the decision, alongside other Supreme Court rulings on presidential immunity and the Chevron doctrine, were extensively analyzed on the All-In Podcast, where hosts discussed their impact on the administrative state and political landscape.
Referenced in 1 Document
Research Data
Extracted Attributes
Court
United States Supreme Court
Outcome
Held that content moderation is protected editorial judgment; vacated and remanded lower-court rulings on Florida S.B. 7072 and Texas HB 20
Case Name
Moody v. NetChoice, LLC
Key Issue
First Amendment rights of social media companies regarding editorial judgment
Also Known As
NetChoice Content Moderation Case
Subject Matter
Content moderation on social media platforms
Legal Precedent/Context
Section 230 of the Communications Decency Act
Timeline
- 2021: Florida legislature enacted S.B. 7072, a law regulating social media companies. (Source: web_search_results)
- 2021: NetChoice, LLC and the Computer & Communications Industry Association (CCIA) sued the Attorney General of Florida, arguing S.B. 7072 violated the First Amendment. (Source: web_search_results)
- 2021-08: Texas legislature enacted HB 20, a law aimed at protecting the free exchange of ideas and information on social media platforms. (Source: web_search_results)
- 2021: NetChoice, LLC and CCIA sued Texas, challenging HB 20. (Source: web_search_results)
- 2024-07-01: The United States Supreme Court issued its opinion in Moody v. NetChoice, LLC, affirming platforms' editorial rights and vacating and remanding the lower-court rulings on the Texas and Florida social media laws. (Source: web_search_results, related_documents, summary)
- Recent: The Supreme Court decisions, including the NetChoice Content Moderation Case, were discussed on the All-In Podcast. (Source: related_documents)
Wikipedia
Content moderation
Content moderation, in the context of websites that facilitate user-generated content, is the systematic process of identifying, reducing, or removing user contributions that are irrelevant, obscene, illegal, harmful, or insulting. This process may involve direct removal of problematic content or the application of warning labels to flagged material. As an alternative approach, platforms may enable users to independently block and filter content based on their preferences. The practice operates within the broader domain of trust and safety frameworks.

Many types of Internet sites permit user-generated content such as posts, comments, and videos, including Internet forums, blogs, wikis, and news sites powered by scripts such as phpBB and PHP-Nuke. Depending on a site's content and intended audience, its administrators decide what kinds of user comments are appropriate, then delegate the responsibility of sifting through comments to lesser moderators. Most often, they will attempt to eliminate trolling, spamming, or flaming, although this varies widely from site to site. Major platforms use a combination of algorithmic tools, user reporting, and human review. Social media sites may also employ content moderators to manually review or remove material flagged for hate speech, incivility, or other objectionable content. Other content issues include revenge porn, graphic content, child abuse material, and propaganda. Some websites must also make their content hospitable to advertisements.

In the United States, content moderation is governed by Section 230 of the Communications Decency Act, and several cases concerning the issue have reached the United States Supreme Court, such as Moody v. NetChoice, LLC. Content moderation can result in a range of outcomes, including blocking and visibility moderation such as shadow banning. Content moderation together with parental controls can help parents filter the age-appropriateness of content for their children.
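The excerpt above describes the typical pipeline: automated flagging, user reports, and escalation to human review. As a purely illustrative sketch (not any platform's actual implementation; the keyword list, threshold, and function names are all hypothetical), the following Python fragment shows how those three signals might be combined into a single moderation outcome:

```python
from dataclasses import dataclass

# Hypothetical stand-ins: real platforms use ML classifiers and tuned
# thresholds, not a hard-coded keyword list.
FLAGGED_TERMS = {"spam-link", "scam-offer"}
REPORT_THRESHOLD = 3  # assumed number of user reports before escalation

@dataclass
class Post:
    text: str
    user_reports: int = 0

def algorithmic_flag(post: Post) -> bool:
    """Crude placeholder for an automated content classifier."""
    return any(term in post.text.lower() for term in FLAGGED_TERMS)

def moderate(post: Post, review_queue: list) -> str:
    """Map the excerpt's signals to an outcome: direct removal,
    escalation to human review, or no action."""
    if algorithmic_flag(post):
        return "removed"                  # direct removal
    if post.user_reports >= REPORT_THRESHOLD:
        review_queue.append(post)         # queue for human review
        return "pending-human-review"
    return "published"                    # no action taken

queue: list = []
print(moderate(Post("limited scam-offer inside!"), queue))   # removed
print(moderate(Post("hello world", user_reports=5), queue))  # pending-human-review
print(moderate(Post("hello world"), queue))                  # published
```

A real system would also support the softer outcomes the excerpt mentions, such as warning labels and visibility reduction (shadow banning), rather than a binary remove/publish decision.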
Web Search Results
- NetChoice v. Paxton / Moody v. NetChoice - Epic.org
NetChoice challenged each of these laws in separate cases. One, NetChoice v. Paxton, came up through the Fifth Circuit. The other, NetChoice v. Moody, came up through the Eleventh Circuit. In both cases, NetChoice claimed that the laws violated social media companies’ First Amendment rights. NetChoice argues that when social media companies moderate content, they are exercising editorial judgment like a newspaper editor does. The laws thus violate the First Amendment by interfering with that [...] The Case: In this case, social media companies represented by NetChoice argue that two state social media regulations violate their free speech rights. The laws were passed by the Texas and Florida legislatures, motivated by their concerns that large social media companies were unfairly censoring conservative viewpoints on their platforms. To fight this perceived threat, the legislatures passed bills that regulate social media companies in a variety of ways. [...] The two courts came to opposite conclusions. The Eleventh Circuit ruled that the Florida law likely violated the First Amendment. It wrote that social media companies’ content moderation practices are analogous to other private entities’ use of editorial judgment in choosing which third-party speech to disseminate and how to rank it. In the Eleventh Circuit’s opinion, it explains that it broadly agrees with NetChoice’s argument: when social media companies publish content guidelines and enforce [...]
- Content Moderation - NetChoice
NetChoice v. Ellison (Minnesota): Minnesota’s SF 4097 is a clear attempt to pressure social media companies into curating and delivering speech in line with government preferences—a direct violation of the First Amendment. The law […] NetChoice Sues Minnesota to Prevent Government from Pressuring Social Media: MINNEAPOLIS, Minn. — Last night, NetChoice sued Minnesota over SF 4097 because this [...] Related cases: NetChoice v. Fitch (Mississippi); NetChoice v. Yost (Ohio); NetChoice v. Skrmetti (Tennessee); NetChoice & CCIA v. Paxton (Texas, 2021); CCIA & NetChoice v. Paxton (Texas, 2024); NetChoice v. Reyes (Utah).
- NetChoice v. Griffin (Arkansas, 2025)
NetChoice v. Griffin (Arkansas, 2025). Posted 06/27/2025. Paul Taske, Co-Director of the NetChoice Litigation Center. [...] Related headlines: Arkansas Doubles Down on Censorship, NetChoice Sues Again; NetChoice Wins: Court Declares Arkansas Age Verification Law UNCONSTITUTIONAL, Permanently Halts It to Protect Arkansans’ Rights & Privacy Online; NetChoice Secures Key Victory for Free Speech: Court Permanently Blocks Arkansas Age Verification Law; District Court Halts Unconstitutional Arkansas Law in NetChoice v. Griffin. [...] Case Brief: Case Status: Complaint Filed. Latest Update: June 27, 2025. Attorneys: Paul Clement, Erin Murphy, James Xi, Joseph DeMott. Firm: Clement Murphy. Timeline: U.S. District Court for the Western District of Arkansas, June 27, 2025: Complaint Filed.
- [PDF] 22-277 Moody v. NetChoice, LLC (07/01/2024) - Supreme Court
So far in these cases, no one has paid much attention to that issue. In the lower courts, NetChoice and the States alike treated the laws as having certain heartland applications, and mostly confined their battle to that terrain. More specifically, the focus was on how the laws applied to the content-moderation practices that giant social-media platforms use on their best-known services to filter, alter, or label their users’ posts. Or more [...] For example, the law prohibits a platform from taking those actions against “a journalistic enterprise based on the content of its publication or broadcast.” §501.2041(2)(j). Similarly, the law prevents deprioritizing posts by or about political candidates. See §501.2041(2)(h). And the law requires platforms to apply their content-moderation practices to users “in a consistent manner.” §501.2041(2)(b).
- [PDF] EXAMINING NETCHOICE AND MURTHY: CONTENT ...
i. District Court Findings: In 2021, NetChoice, LLC and the Computer & Communications Industry Association (collectively “NetChoice”), trade associations whose members include most of the social media platforms within the scope of S.B. 7072, sued the Attorney General of Florida, arguing that S.B. 7072 violates the First Amendment by infringing on social media platforms’ editorial discretion rights. The district court [...] b. NetChoice, LLC v. Paxton: In August of 2021, the Texas legislature enacted HB 20, a law aimed at serving the fundamental interests of both citizens and the state by “protecting the free exchange of ideas and information.” HB 20 notes that social media platforms are “central public forums for public debate, and have enjoyed governmental support in [...] did not violate the respondent’s First Amendment rights. II. Background of Social Media Platform Content Moderation: The NetChoice and Murthy cases focus on large social media platforms. While the most influential platforms (e.g., YouTube, Facebook, TikTok, and Twitter) are the focus of the current litigation, the outcome of the NetChoice and Murthy cases could impact any platform where “users can upload messages, videos, and