Knowledge Hub

EU Code of Practice Monitor

Stay updated on how platforms are adjusting their commitments and subscriptions now that the Code of Practice on Disinformation has transitioned into a Code of Conduct.

EU Action on Disinformation and Hate Speech

Explore how EU member states are addressing disinformation and online hate speech with this comprehensive overview of their different approaches and strategies.

DSA Framework for Online Election Integrity

Explore how EU and national authorities can use DSA enforcement to protect online integrity during elections.

 

Monitoring the EU Code of Practice on Disinformation

The Code of Practice on Disinformation plays a key role in the EU’s digital regulatory framework. Originally designed as a self-regulatory tool, it was always meant to complement and reinforce the Digital Services Act. On 13 February 2025, the CoP transitioned into a legally binding Code of Conduct.
As part of this shift, major signatory platforms – including Google, Meta, Microsoft, and TikTok – have revised their commitments. In this resource, DRI, a signatory to the CoP since 2023, provides an in-depth analysis of these changes: which commitments have been withdrawn, the justifications platforms have given, and what this means for the EU’s policies on disinformation.

From Code of Practice to Code of Conduct
Changes in CoP Commitments per Topic
Changes in CoP Commitments per Platform

 


Related resources

(DRI) What Does Platforms’ Self-Reporting under the Code of Practice Tell Us about Data Access?


(EFCSN) Commitments unfulfilled: big tech and the EU Code of Practice on Disinformation


But did they really? Platforms’ compliance with the Code of Practice on Disinformation in review


 

EU Approaches to Disinformation and Online Hate Speech

How do countries tackle online hate speech and disinformation? This knowledge hub provides an overview of the different approaches EU member states take to tackling disinformation and online hate speech. Our database shows which approaches have been adopted and where, from binding laws to national task forces and media campaigns.

Working Group

Capacity Building

Awareness Raising

Illegal Behaviour

Platform Action

Regulatory Body

Media Literacy

Disinformation

Hate speech

 

Safeguarding Online Integrity During Elections: The DSA Framework

This resource presents a set of actionable measures that public authorities can take under the DSA framework to protect the integrity of online spaces in election periods.

The framework draws on lessons from the enforcement of the Digital Services Act (DSA) during Germany’s 2025 snap elections and explores how different actors can contribute to safeguarding democratic processes in the digital environment.

We focus on five key institutional players:

 

• The European Commission
• Germany’s Digital Services Coordinator (DSC)
• The European Board for Digital Services
• National regulatory authorities
• Political and electoral authorities

 

The analysis identifies potential avenues for advocacy and engagement, while also flagging gaps, limitations, and coordination challenges that need to be addressed to strengthen online election integrity across the EU.

 

The analysis below is organised by authority. For each one, we outline its competencies in elections under the DSA, the advocacy avenues available, and the gaps and challenges that remain.

European Commission – DG Connect

The European Commission is the primary enforcer of the Digital Services Act (DSA). It also coordinates soft-law initiatives and co-regulatory tools, such as the Code of Conduct on Disinformation and the Rapid Response System.

1. Indirect enforcement through ongoing open cases against very large online platforms (VLOPs) and very large online search engines (VLOSEs).

Advocacy avenues

Feeding new findings into ongoing investigations is often more efficient than trying to convince the Commission to open new cases altogether. Researchers, CSOs, and other stakeholders should therefore consider focusing on contributing evidence to these open cases.

Gaps and challenges

Enforcement actions against VLOPs and VLOSEs carry significant legal implications, so these cases often take considerable time to process. As a result, this advocacy avenue is long-term, and unlikely to resolve issues or incidents quickly (e.g., before an election).

Transparency of enforcement cases is limited. Stakeholders and the public only have access to press releases, and not to the complete reasoning behind sanctions or the specific facts of each case. Furthermore, platforms and the Commission may negotiate and reach commitments to address concerns, but CSOs are not included in these discussions.

2. Enforcement through soft-law instruments: Guidelines on Electoral Integrity.

Advocacy avenues

The Guidelines on Electoral Integrity can help CSOs and researchers design projects that assess how online platforms identify and mitigate systemic risks to civic discourse and democracy. Aligning research with these guidelines can make findings more actionable for enforcement.

Gaps and challenges

While the guidelines provide detailed and concrete suggestions for implementing DSA obligations, they remain non-mandatory. Platforms can choose other means for achieving the DSA’s objectives. Moreover, the guidelines do not contain benchmarks against which the success or failure of the suggested measures can be evaluated.

3. Enforcement through co-regulatory instruments: Codes of conduct on illegal hate speech and disinformation.

Advocacy avenues

With the integration of the CoC on Illegal Hate Speech and the CoC on Disinformation into the DSA framework, and with platforms now subject to audits of their CoC commitments, it is more important than ever to monitor their compliance. CSOs, researchers, and other stakeholders can monitor platforms’ reports and advocate for clear, detailed information covering both qualitative and quantitative key performance indicators.

Gaps and challenges

During the transition from the Code of Practice to the Code of Conduct on Disinformation, signatory platforms reduced their commitments under the CoP by 31 per cent.14 While all areas of the Code were affected, the most significant reductions occurred in measures supporting the fact-checking community (a 64 per cent decrease), followed by measures on political advertising and empowering the research community. This signals a lack of ambition in the Code, which is likely to undermine its effectiveness in holding platforms accountable and addressing disinformation threats. Many CSOs and fact-checking organisations involved in the Code's RRS do this work on a pro-bono basis, which places significant resource pressure on them for a time-consuming – yet crucial – activity.

Digital Services Coordinators (DSCs)

Digital Services Coordinators (DSCs) are national authorities designated by Member States to enforce provisions of the EU’s online platform law, the Digital Services Act (DSA). They have multiple responsibilities under the DSA, which broadly fall into three categories: (1) enforcing DSA rules for intermediary services based in their country, except for the due diligence obligations of VLOPs and VLOSEs, which fall under the Commission’s competence; (2) serving as the central hub for users’ complaints; and (3) certifying third-party actors involved in the DSA’s implementation.


 
Competencies in Elections Under the DSA

1) Establish early connections with stakeholders, VLOPs, and VLOSEs to share knowledge and resources.
2) Facilitate the publication of voter information, provide DSA-specific guidance for candidates, and promote media literacy campaigns.
3) Support research and data sharing, monitor political advertising and ad libraries, and share lessons learned in post-election reports.
4) Create incident protocols, establish networks, and ensure key escalation channels are in place to respond swiftly to complaints.

Advocacy avenues

• CSOs and researchers monitoring online speech during elections may find it valuable to engage early with their country’s DSC and take part in roundtables, tabletop exercises, and stress tests. Building coalitions with other CSOs can help strengthen these efforts and enhance their impact.

• Once the Delegated Act on Data Access is approved, DSCs will play a key role in facilitating access to non-public data from online platforms. Advocating for the effective implementation of Article 40(4) of the DSA will be crucial for researchers and CSOs monitoring online platforms.

• DSCs play a crucial role in incident response, particularly during elections, by facilitating escalation channels with other state institutions for urgent content assessment. The Toolkit highlights that CSOs and other stakeholders should be involved in these responses. This avenue is particularly important for incidents coming from non-VLOPs and non-VLOSEs.

• Complaints under Art. 53 serve as a crucial advocacy and legal tool for addressing both specific incidents and broader violations of the DSA by online platforms. Currently, nine complaints are pending before the German DSC, against TikTok, Meta, Google, YouTube, and LinkedIn.20

Gaps and challenges

• During the German elections, most DSC activities focused on engaging with VLOPs and VLOSEs. Non-VLOPs and non-VLOSEs, however, such as Telegram, also play a significant role in disseminating political content. Given their influence, CSOs and other stakeholders should advocate for a more comprehensive approach that includes these online platforms in the response framework.

• To effectively carry out their responsibilities, DSCs need adequate resources, and must maintain independence and autonomy from political pressure, especially during elections. Focus groups conducted by DRI last year revealed that political factors, such as setbacks in the Rule of Law or changes in government, often impact the pace and effectiveness of DSA implementation.21

• The messaging of DSCs differs, however, and they may have different perceptions of their mandates. After being criticised, rightly, for initially exaggerating the intervention possibilities,22 the president of BNetzA expressed caution about the DSA’s powers before the German election, emphasising in interviews that it was intended solely to target the dissemination of illegal content, and could only intervene after risks had materialised.23 In contrast, the president of Romania’s DSC (Ancom) took a far more assertive stance, even calling for the suspension of TikTok in Romania following the election controversy.24

European Board for Digital Services

The European Board for Digital Services is an independent advisory body established under the Digital Services Act (DSA). It plays a pivotal role in ensuring the consistent implementation of the DSA across the European Union. The Board facilitates effective collaboration between Digital Services Coordinators (DSCs) and the European Commission, provides expert analysis on emerging DSA-related issues, and supports the oversight of very large online platforms and search engines.


 
Competencies in Elections Under the DSA

1) Facilitating cross-border cooperation and joint investigations by DSCs;
2) Publishing an annual report outlining best practices for mitigating systemic risks (Art. 35.2); and
3) Advising the Commission on when to activate the crisis mechanism (Art. 36.1).

Advocacy avenues

The Commission has opened opportunities for CSOs and other stakeholders to discuss gaps and failures in VLOP and VLOSE risk assessments.27 These discussions, along with the Board’s annual risk mitigation report, provide a key avenue for improving the depth and effectiveness of these assessments.

Gaps and challenges

None

National administrative and judicial authorities

While not directly responsible for enforcing the DSA, national administrative and judicial authorities play a crucial role in determining the justiciability of the DSA and in establishing potential benchmarks for when an election may be considered compromised on the grounds of online integrity. Examples in a given national context include electoral authorities, judicial authorities, political financing institutions, and national security agencies.


 
Competencies in Elections Under the DSA

1) National security agencies play a vital role in identifying and mitigating foreign influence.
2) Electoral authorities are central to ensuring information about the electoral process is trustworthy.
3) Political financing institutions also play a key role when campaign financing intersects with digital political advertising.
4) Judicial authorities:
• Play a key role in clarifying and strengthening DSA provisions, ensuring their enforceability.
• Have the power to annul elections, potentially setting benchmarks for when online integrity issues compromise an election.
5) All other administrative authorities in a country have competencies under Art. 9 and Art. 10 of the DSA to alert VLOPs and VLOSEs about illegal content online.

Advocacy avenues

Mapping national institutions and their areas of responsibility is essential for understanding their roles and authority in managing platform-related processes, such as issuing orders under Articles 9 and 10 of the DSA, escalating issues, or flagging content. CSOs should also be sensitive to the risks of overblocking or disproportionate administrative or judicial decisions that threaten the rule of law and democracy.

Gaps and challenges

There is no universally accepted, evidence-based benchmark for determining when an election has been compromised from the perspective of online integrity. A weakness of the Romanian Court ruling was that it did not even propose such a benchmark of severity, even though it is clear that not every instance of disinformation renders an online campaign fundamentally unfair. Assessing the impact of influence operations – whether foreign or domestic – on election outcomes remains a highly contested issue, often leading to tensions between national authorities and potentially eroding public trust in elections and the rule of law.

There are also concerns about the power of authorities to flag and order the removal of illegal content, as this power could be misused in certain political contexts to censor content or threaten fundamental rights.

Executive actors

Executive actors are powerful drivers of media attention, as they can play a crucial role in elevating issues and placing them on the political agenda. Examples in a given national context include government and public administration institutions.

Competencies in Elections Under the DSA

Powerful drivers of media attention and agenda-setters.

Advocacy avenues

By elevating issues onto the political agenda and drawing media attention to them, executive actors can make VLOPs and VLOSEs more aware of emerging risks and more willing to activate all relevant mitigation measures.

Gaps and challenges

An overly politicised discourse surrounding content moderation, curation, and online integrity risks overshadowing the technical and often nuanced challenges of mitigating systemic online threats to elections. The most recent example, of course, is the deliberate misuse of free speech arguments to oppose platform regulation.