Online Safety Act (OSA)
A UK law that imposes content moderation and age-gating obligations on online platforms, targeting content that is illegal under UK law or deemed harmful to children — standards that differ significantly from those in the US.
First Mentioned
1/23/2026, 6:34:55 AM
Last Updated
1/23/2026, 6:35:21 AM
Research Retrieved
1/23/2026, 6:35:21 AM
Summary
The Online Safety Act (OSA) 2023 is a comprehensive United Kingdom law designed to regulate online content and establish a statutory duty of care for internet platforms. Enacted on October 26, 2023, the legislation empowers the Office of Communications (Ofcom) to oversee social media companies, search engines, and other services hosting user-generated content. Its primary objectives are to mandate the removal of illegal material — including terrorism, child sexual exploitation, and fraud — and to protect children from harmful but legal content. Platforms that fail to comply face severe penalties, including fines of up to £18 million or 10% of their global annual turnover, whichever is higher. While proponents argue the Act is essential for child protection, it has faced significant international criticism from figures such as Sarah B. Rogers and tech leaders who view it as a threat to end-to-end encryption and free speech. Critics have characterized the Act as a 'Censorship Tariff' on American technology companies and linked it to a broader 'Censorship Industrial Complex' that allegedly suppresses disfavored viewpoints.
Referenced in 1 Document
Research Data
Extracted Attributes
Status
Current legislation
Full Name
Online Safety Act 2023
Regulator
Ofcom (Office of Communications)
Jurisdiction
United Kingdom
Maximum Penalty
£18 million or 10% of annual global turnover, whichever is higher
Statute Citation
2023 c. 50
Primary Objectives
Child safety, removal of illegal content, user-driven content filtering
Timeline
- 2023-10-26: The Online Safety Act receives Royal Assent and becomes law in the United Kingdom. (Source: Wikipedia)
- 2024-01-31: New criminal offences for individual wrongdoers introduced by the OSA come into force. (Source: Taylor & Francis)
- 2025-03-17: Phase 1 of the OSA roll-out roadmap is completed; platforms are legally required to protect users from illegal harm. (Source: CMS LawNow)
- 2025-12-01: Ofcom issues a fine of over £1 million, the largest penalty under the Act since its enforcement began. (Source: CMS LawNow)
Wikipedia
Online Safety Act 2023
The Online Safety Act 2023 (OSA) (c. 50) is an act of the Parliament of the United Kingdom to regulate online content. It was passed on 26 October 2023 and gives the relevant secretary of state the power to designate, suppress, and record a wide range of online content that they deem illegal or harmful to children. The act creates a new duty of care for online platforms, requiring them to take action against illegal content, or legal content that could be harmful to children where children are likely to access it. Platforms failing this duty would be liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher. It also empowers Ofcom to block access to particular websites. However, it obliges large social media platforms not to remove, and to preserve access to, journalistic or "democratically important" content such as user comments on political parties and issues. The act also requires platforms – including end-to-end encrypted message providers – to scan for child pornography and terrorism content, which experts say is not possible to implement without undermining users' privacy. The government has said it does not intend to enforce this provision of the act until it becomes "technically feasible" to do so. The act also obliges technology platforms to introduce systems that will allow users to better filter out the harmful content they do not want to see. The legislation has drawn criticism both within the UK and overseas from politicians, academics, journalists and human rights organisations, who say that it poses a threat to the right to privacy and freedom of speech and expression. Supporters of the act say it is necessary for child protection.
Web Search Results
- Online Safety Act 2023 - Wikipedia
The Online Safety Act 2023 (OSA) (c. 50) is an act of the Parliament of the United Kingdom to regulate online content. It was passed on 26 October 2023 and gives the relevant secretary of state the power to designate, suppress, and record a wide range of online content that they deem illegal or harmful to children. [...]
  - Long title: An Act to make provision for and in connection with the regulation by Ofcom of certain internet services; for and in connection with communications offences; and for connected purposes.
  - Citation: 2023 c. 50
  - Introduced by: Michelle Donelan, Secretary of State for Science, Innovation and Technology (Commons); Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State for Arts and Heritage (Lords)
  - Territorial extent: United Kingdom (most parts); Great Britain (certain parts)
  - Royal assent: 26 October 2023
  - Commencement: On royal assent and by regulations
  - Status: Current legislation
[...] Alan Woodward, a cybersecurity expert at the University of Surrey, described the act's surveillance provisions as "technically dangerous and ethically questionable", stating that the government's approach could make the internet less safe, not more. He added that the act makes mass surveillance "almost an inevitability", as security forces would be liable to mission creep, using the justification of "exceptional circumstances" to extend searches beyond their original remit. Elena Abrusci, a scholar at Brunel Law School, suggests that the OSA provides an adequate legal basis for service providers to remove illegal content, but does not adequately protect users from disinformation and online harassment, which is necessary for ensuring political [...]
- A New Era in Online Safety: What Global Companies Need to Know ...
By: Lucy Blake, Joanna Ludlam, Will Jones, Karam Jardaneh. Over the course of 2025, the United Kingdom's Online Safety Act (OSA) has been gradually coming into force, reshaping the online safety landscape globally. The OSA requires in-scope companies to identify, mitigate, and manage the risks of harm from illegal content, as well as content that is harmful to children. While its stated intention is to make the United Kingdom one of the safest places in the world to use the internet, the OSA will impact companies globally. This is despite the United States' opposition to such rules, in light of their impact on US-headquartered tech companies.[1] [...] The OSA is a complex piece of legislation, introducing extensive and novel obligations for companies. As a result, there is some uncertainty in how the OSA will be interpreted and enforced. This is likely to generate litigation, particularly in the public law realm, as companies challenge regulatory decisions and seek clarity on their obligations. Below, we break down this complexity into the Act's basic building blocks and set out the four key things companies need to know about the OSA. 1. Who needs to comply with the OSA? The OSA applies to providers of: [...] The OSA marks a turning point not only in how the online landscape is regulated in the United Kingdom but in how it is regulated globally. Taking proactive steps now will help providers avoid regulatory enforcement in the United Kingdom and also prepare them to face other regulatory requirements globally in the future. However, given the extensive and novel obligations under the OSA, well-intentioned companies might still find themselves in Ofcom's crosshairs. Our experienced team of lawyers at Jenner & Block is ready to help you navigate and comply with the OSA and face any governmental scrutiny that may arise.
- Full article: The online safety act 2023 - Taylor & Francis
The primary purpose of the OSA is to make the Internet a safe place for children and adults by placing certain duties and responsibilities on social media companies and search services to ensure the safety of the users of their platforms.[5] It is aimed at tackling growing concerns over illegal content (including priority illegal content) and content that is harmful to children, including misinformation, sexual images, hate speech, and exploitation. It seeks to do so by delineating comprehensive measures, including the establishment of proactive duties such as risk assessments, to prevent the uploading and spreading of harmful content by holding online platforms accountable for the safety of their users. The expectation is that with its focus on empowering users to challenge [...] The OSA not only represents a significant step for the UK in regulating digital spaces, but it also introduces and enhances user protection in an increasingly interconnected world through a regulatory duty of care imposed on businesses operating online, such as introducing age verification, more stringent content moderation, and transparency.[4] The OSA was granted royal assent in October 2023, and work has been underway over the last year by Ofcom to put in place codes of practice and guidance pertinent to the implementation of the Act. Equally, the new criminal offences directed at individual wrongdoers that the OSA brought in came into force on 31 January 2024. [...] 3 See for instance Jodie, 'Opinion - I received an anonymous email – the contents horrified me' Metro (12 December 2024) (last visited 11 January 2025)
4 L Ghazal, 'Introduced to tackle growing concerns over the safety of internet users – particularly children and vulnerable groups, the Online Safety Act (OSA) marks a significant shift in the regulatory landscape for businesses operating online platforms in the UK' (BusinessMatters, 27 October 2024) (last visited 11 January 2025).
5 Online Safety Act, section 1, and 'Guidance Online Safety Act: Explainer' 8 May 2024 (last visited 11 January 2025).
- Online Safety Act: explainer
The Online Safety Act is a new set of laws that protects children and adults online. It puts a range of new duties on social media companies and search services, making them more responsible for their users' safety on their platforms. The Act will give providers new duties to implement systems and processes to reduce the risk their services are used for illegal activity, and to take down illegal content when it does appear. The strongest protections in the Act have been designed for children. Platforms will be required to prevent children from accessing harmful and age-inappropriate content and to provide parents and children with clear and accessible ways to report problems online when they do arise. [...] The Act will tackle suicide and self-harm content. Any site that allows users to share content or interact with each other is in scope of the Online Safety Act. These laws also require sites to rapidly remove illegal suicide and self-harm content and proactively protect users from content that is illegal under the Suicide Act 1961. The Act has also introduced a new criminal offence for intentionally encouraging or assisting serious self-harm. Services that are likely to be accessed by children must prevent children of all ages from encountering legal content that encourages, promotes or provides instruction for suicide and self-harm. [...] Mis- and disinformation will be captured by the Online Safety Act where it is illegal or harmful to children. Services will be required to take steps to remove illegal disinformation content if they become aware of it on their services. This includes the removal of illegal, state-sponsored disinformation through the Foreign Interference Offence, forcing companies to take action against a range of state-sponsored disinformation and state-linked interference online. Companies must also assess whether their service is likely to be accessed by children and, if so, deliver additional protections for them. This includes protections against in-scope mis- and disinformation.
- 2025 UK Online Safety Act Round-Up - CMS LawNow
A couple of weeks ago Ofcom issued a fine of over £1m under the Online Safety Act 2023 ("OSA") – the biggest penalty issued under the Act since it came into force. This comes at an opportune time, as we look back at the development of online safety rules through the course of 2025. 2025 was the year the OSA moved from statute to sustained regulatory action. Key legal triggers came into force, Ofcom issued the first codes and guidance for illegal harms and children's safety, and the regulator rapidly shifted to enforcement – opening multiple investigations and issuing fines under its new powers. In this article, we summarise the principal milestones for the OSA which affect online platforms in the UK, particularly those which host user-generated content. We also explain some of the key [...] ### Looking Back… #### 26th October 2023 – The OSA Receives Royal Assent The OSA increases protection for users of online platforms in the UK (particularly social media platforms, gaming platforms and other websites and apps that host user-generated content) and gives Ofcom the power to enforce these rules. The Act received Royal Assent on 26th October 2023 following various consultations and amendments. However, the actual roll-out of the OSA has been gradual, and most of the provisions didn't take effect until this year. Implementation of the OSA requires Ofcom to develop guidance and codes of practice setting out how online platforms must meet their duties under the OSA. On that basis, Ofcom set out a phased roadmap to implementation as follows: [...] 17th March 2025 was a major milestone for the OSA, and the date when Phase 1 of the OSA roll-out roadmap was completed. From this date, online platforms became legally required to protect their users from illegal harm, and by this date they should have undertaken a risk assessment of their platforms for any illegal content* (the deadline for which was 16th March).
*Illegal content is defined in the OSA as "content that amounts to a relevant offence", with "relevant offences" including offences such as terrorism, harassment, CSEA and fraud.