Trusted Flaggers
A designation under the EU's DSA for certain NGOs whose reports of illegal content must be given priority by tech platforms. Critics see the system as formalizing the role of activist groups in online censorship.
First Mentioned
1/23/2026, 6:34:56 AM
Last Updated
1/23/2026, 6:37:00 AM
Research Retrieved
1/23/2026, 6:37:00 AM
Summary
Trusted Flaggers are specialized entities, including NGOs and government-affiliated bodies, that are granted prioritized status within the content moderation systems of major digital platforms. While the concept originated through voluntary partnerships like YouTube's Priority Flagger and Meta's Trusted Flagger programs, it was formally codified under Article 22 of the European Union's Digital Services Act (DSA). This regulation requires platforms to fast-track reports from accredited flaggers regarding illegal content, such as hate speech or terrorist material. However, the role of these entities is highly controversial; critics like Sarah B. Rogers and hosts of the All-In Podcast characterize them as key components of a "Censorship Industrial Complex." This complex is alleged to facilitate government-led suppression of protected speech—such as criticism of mass migration or COVID-19 policies—by using third-party flaggers to bypass constitutional protections like the First Amendment. In response to these top-down moderation structures, decentralized alternatives like X's Community Notes and the AI tool Grok have been proposed as more transparent, user-driven solutions.
Referenced in 1 Document
Research Data
Extracted Attributes
Criticized As
Censorship Industrial Complex / Censorship Tariff
Legal Basis (EU)
Article 22 of the Digital Services Act (DSA)
Primary Function
Identifying and notifying platforms of illegal content with prioritized review status
Eligibility Criteria
Expertise, independence from platforms, and commitment to diligence and objectivity
Voluntary Precursors
YouTube Priority Flagger, Meta Trusted Flagger, Amazon Project Zero
Reporting Obligations
Mandatory annual public reports on notices submitted and actions taken
Accreditation Authority
Digital Services Coordinator (DSC) in EU Member States
Timeline
- 2016-05-31: The EU Commission and major tech platforms announce the Code of Conduct on Countering Illegal Hate Speech Online, encouraging voluntary trusted flagger frameworks.
- 2017-10-01: Germany's Netzwerkdurchsetzungsgesetz enters into force, establishing 'Beschwerdestellen' (complaint offices) that act as early trusted flagger-like entities.
- 2022-06-16: The EU's Strengthened Code of Practice on Disinformation is published, further integrating the role of trusted partners in content moderation.
- 2022-10-19: The Digital Services Act (DSA) is officially adopted, formalizing the 'Trusted Flagger' status under Article 22.
- 2024-02-17: The DSA becomes fully applicable to all online platforms in the EU, mandating priority treatment for accredited Trusted Flaggers.
- 2025-01-01: Coimisiún na Meán (Ireland) and other EU regulators publish updated accreditation guidelines and FAQs for entities seeking Trusted Flagger status.
Wikipedia
List of fact-checking websites
This list of fact-checking websites includes websites that provide fact-checking services about both political and non-political subjects.
Web Search Results
- On “Trusted” Flaggers
On “Trusted” Flaggers, by Naomi Appelman & Paddy Leerssen. Introduction: Trusted flaggers are on the rise in platform governance. Platforms are entering into a growing array of trusted flagging arrangements, also referred to as trusted 'notifiers', 'reporters', 'partners', and so forth. The concept has also recently started appearing in legislation. And yet, the meaning of this concept remains vague and contested. 'Flagging' is the process by which third parties can report content to platforms for content moderation review. By now a "ubiquitous mechanism of governance", flagging is in principle open for all to use. But some flaggers are more equal than others. We introduce a concept of "trusted flaggers" that describes, broadly speaking, how third parties have acquired certain privileges in [...] operationalization and critique, serves as a site of contestation between competing interests and legitimacy claims in platform governance. Varieties of trusted flagging: an overview of practices. Platform policies: platform policies on trusted flagging are a useful starting point for our discussion. Some platforms explicitly mention "trusted flaggers," or its variants, but there is also a broader ecosystem of institutions and practices with similar functions referred to by other names. This essay will try to account for both. Platforms often grant flagging privileges voluntarily as part of their own content moderation policies. Importantly, these private ordering constructs follow quite naturally from conditional liability regimes such as in the EU's eCommerce Directive or the US' [...] of trusted flagging. We first discuss self-regulatory flagging partnerships on several major platforms. We then review various forms of government involvement and regulation, focusing especially on the EU context, where law-making on this issue is especially prevalent.
On this basis, we conceptualize different variants of trusted flagging, in terms of their legal construction, position in the content moderation process and the nature of their flagging privileges. We then discuss competing narratives about the role of trusted flaggers: as a source of expertise and representation; as an unaccountable co-optation by public and private power; and as a performance of inclusion. In this way, we illustrate how "trusted flagging," in its everyday operationalization and critique, serves as a site of contestation between competing interests and legitimacy claims in platform governance.
- Trusted Flaggers – Frequently Asked Questions
A Trusted Flagger is an entity accredited by the Digital Services Coordinator (DSC) in the Member State in which it is established. The role of an accredited Trusted Flagger is to identify, detect and flag illegal content to online platforms. 2. What are the accreditation conditions? To be awarded the status of Trusted Flagger, an applicant must meet the following conditions: expertise and competence for the purposes of detecting, identifying and notifying illegal content; independence from any provider of online platforms; and diligence, accuracy and objectivity in carrying out its activities for the purposes of submitting notices. 3. Who is eligible to apply for the Trusted Flagger status? [...] In terms of a Trusted Flagger's obligations, it must publish an annual report on actions and notices made in the previous year to the DSC in its country of establishment. These reports should be precise, accurate and adequately substantiated. The minimum requirements for this report are: the identity of the provider of online platforms; the type of allegedly illegal content notified; and the action(s) taken by the provider. The report should be made publicly available and include an explanation of the procedures in place to ensure that the Trusted Flagger retains its independence from online platforms. [...] The decision and award granted by An Coimisiún will be published on its website. An Coimisiún is required to notify the European Commission of the bodies it has awarded Trusted Flagger status in accordance with Article 22. The European Commission will also publish a list of Trusted Flaggers on its website: EC Trusted Flaggers list. The DSA does not impose a maximum period for the award of Trusted Flagger status. Currently, An Coimisiún grants Trusted Flagger status for a period of three years. Upon the expiry of the accreditation period, the TF status is reassessed and, where appropriate, re-granted.
Trusted Flaggers must continue to comply with the accreditation conditions and the other obligations under Article 22 for the entire period. This FAQ document is regularly updated,
- Trusted flaggers under the Digital Services Act (DSA)
## Importance of trusted flaggers Trusted flaggers form a crucial part of the DSA's strategy to tackle illegal content online. This system builds on years of voluntary cooperation between online platforms and trusted partners. The DSA has now introduced harmonized criteria to become a trusted flagger, helping to boost online safety and protect users' rights across the EU. [...] ## Trusted flaggers Trusted flaggers are special entities under the DSA. They are experts at detecting certain types of illegal content online, such as hate speech or terrorist content, and notifying it to the online platforms. The notices submitted by them must be treated with priority, as they are expected to be more accurate than notices submitted by an average user. Trusted flaggers only notify providers of online platforms of content they consider to be illegal. Providers have the sole responsibility to decide upon notices and, where justified, remove content. [...] Only EU-based entities can apply for trusted flagger status. This ensures that trusted flaggers operate within the regulatory framework of the EU, contributing to a harmonized approach to tackling illegal content online. The trusted flagger status is valid across the EU, vis-à-vis any online platform within the scope of Article 22 DSA, regardless of the place of establishment. Trusted flaggers must publish easily understandable and detailed annual reports. These must include information on notices submitted, the types of illegal content reported, and the actions taken by the online platforms. Interested in becoming a trusted flagger? Check your national DSC and keep an eye on developments in your country for more information on the application process.
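The priority-handling obligation described in this excerpt can be pictured with a small sketch. The following is a minimal illustration, not any platform's real system; the class name and tier values are invented for the example. It models a two-tier review queue in which notices from accredited trusted flaggers are dequeued before ordinary user reports, which is the behavior Article 22 requires:

```python
import heapq
import itertools

# Illustrative two-tier review queue: trusted-flagger notices are
# always dequeued before ordinary user reports, and submissions
# within the same tier are reviewed in FIFO order.
class NoticeQueue:
    TRUSTED, REGULAR = 0, 1  # lower tier value = higher priority

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserving FIFO order

    def submit(self, notice: str, trusted_flagger: bool) -> None:
        tier = self.TRUSTED if trusted_flagger else self.REGULAR
        heapq.heappush(self._heap, (tier, next(self._counter), notice))

    def next_for_review(self) -> str:
        return heapq.heappop(self._heap)[2]

queue = NoticeQueue()
queue.submit("user report: spam", trusted_flagger=False)
queue.submit("NGO report: terrorist content", trusted_flagger=True)
print(queue.next_for_review())  # → NGO report: terrorist content
```

Note that, as the excerpt stresses, priority only affects the order of review: the platform still decides the outcome of each notice itself.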
- Article 22 Digital Services Act: Building trust with ...
Trusted flaggers have long played a role in content moderation: a bilateral, voluntary affair between online platforms and individuals or organisations that are afforded prioritised access to the content moderation process. Due to their expertise, they are trusted to ‘flag’ illegal or harmful content. Article 22 Digital Services Act formalises this framework, allowing governmental and non-governmental organisations to apply for certification as trusted flaggers and requiring online platforms to treat their submitted notices on illegal content with priority and without undue delay. The certification of the first trusted flaggers under Article 22 has sparked public debate about their influence and power, especially in relation to the freedom of expression of internet users. Concerns about [...] Firstly, on selecting trusted flaggers and the requirements for certification. The Board of DSCs – a board made up of all DSCs and the European Commission established in Article 61 DSA – could create standard procedures on the appointment of trusted flaggers, particularly on the timeframes and appealability of such decisions, to equally facilitate organisations across the EU. The application procedure could be harmonised as well. Currently, it is fragmented: in some jurisdictions, for example Finland, organisations seeking certification can get into direct contact with the DSC to submit their application, whereas in other jurisdictions, such as Denmark and Austria, they must use a form. A standardised form that ensures that similar documentation is required from organisations across the [...] Originally, trusted flagger arrangements were a voluntary effort by online platforms to channel the expertise of civil society into their content moderation process. Examples of such programmes include Amazon’s Project Zero, Meta’s Trusted Flagger programme, and YouTube’s Priority Flagger programme. 
The EU regulator has adopted this idea in co-regulatory documents such as the Strengthened Code of Practice on Disinformation 2022 (comm. 21) and the Code of Conduct on Countering Illegal Hate Speech Online 2016, encouraging providers of online platforms to offer voluntary trusted flagger frameworks. Similar examples are found at the national level. For example, the now-defunct 2017 German Netzwerkdurchsetzungsgesetz created trusted flagger-like entities called Beschwerdestellen: expert
- Trusted Flagger Platform | Tech Against Terrorism
Developed and powered by Tech Against Terrorism, the Trusted Flagger Platform (TFP) is a secure online platform that enables trusted partners of Tech Against Terrorism to submit terrorist content removal requests to tech platforms hosting verified terrorist content. The TFP represents the most powerful terrorist content takedown and monitoring platform available; by using it, you also improve collective intelligence regarding terrorist content. How does it work? Through the TFP, Trusted Flaggers can submit URLs to platforms and utilise AI technology to automatically monitor the removal status of content referrals. Automated Takedown Monitoring: integrates automated web searching and GPT-4o based assessment of TVEC URL status, classifying whether content is still online with over 97% accuracy. Tiered URL submission system and secure permissions: allows users to submit URLs categorised into 5 tiers, with a secure review environment wrapped in a sophisticated permissions framework. TAT's OSINT Analysts: a team lending their expertise in assessing reported content, ensuring collective and controlled vetting of flagged content. The Platform Insights Dashboard: refreshes data every 30 days, providing users with up-to-date insights related to platform URL takedowns, successful interventions, and ongoing threats. How does it help? Through the TFP, content takedown referrals are simplified for both Trusted Flaggers and platforms.
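As a rough illustration of the "automated takedown monitoring" idea in the TFP description, here is a hedged sketch. The function names and the HTTP-status mapping are assumptions made for this example, not Tech Against Terrorism's actual implementation, which also layers an AI-based content assessment on top of the web check (omitted here):

```python
import urllib.request
import urllib.error

# Hypothetical sketch of automated takedown monitoring: fetch a referred
# URL and classify whether the content appears to still be online.

def classify_status(http_status):
    """Map an HTTP status (None = network failure) to a referral state."""
    if http_status is None:
        return "UNKNOWN"          # host unreachable: cannot confirm either way
    if http_status in (404, 410):
        return "REMOVED"          # not found / gone: content likely taken down
    if 200 <= http_status < 300:
        return "ONLINE"           # the referred content still resolves
    return "UNKNOWN"              # other codes (e.g. 403, 503) are ambiguous

def check_referral(url, timeout=10.0):
    """HEAD-request a referred URL and classify its takedown status."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as exc:
        return classify_status(exc.code)
    except (urllib.error.URLError, OSError):
        return classify_status(None)
```

A real monitor would run `check_referral` on a schedule (the TFP dashboard refreshes every 30 days) and record status transitions, since a URL flipping from ONLINE to REMOVED is what signals a successful intervention.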