
New report! How European law protects us from illegal content and disinformation online

Download "Enforcing rules on illegal content and disinformation online"


Digital platforms have become central spaces for democratic debate, cultural exchange and access to information. At the same time, the very features that make them powerful tools for expression can create unique risks, owing to their market power, their reach and the lack of editorial control. This raises urgent questions about the regulation of platform content, its enforcement, fundamental rights and the responsibilities of online intermediaries. This new report - Enforcing rules on illegal content and disinformation online - published by the European Audiovisual Observatory (part of the Council of Europe in Strasbourg), offers a comprehensive and timely analysis of how European legal frameworks are responding to these challenges.

Produced in collaboration with the Institute of European Media Law (EMR) and authored by twelve leading experts, the report examines how legislation targeting disinformation and illegal or harmful online content can be enforced at European and national level, and how lawmakers and regulators seek to strike a balance between freedom of expression, on the one hand, and the protection of democratic values, on the other.

Chapter One, authored by Mark D. Cole and Sandra Schmitz-Berndt, sets the scene by examining the digital transformation of public debate and the growing power of digital platforms as an essential tool for participation in discussions concerning issues of general interest. The authors explore the conceptual and legal distinctions between illegal content and disinformation, highlighting why enforcement against disinformation (often harmful but not unlawful) raises particular regulatory difficulties. This first chapter underlines how platforms’ algorithmic curation and market dominance have altered what content is seen, shared or even removed, and how, as a result, debates around content moderation and “censorship” have become more acute.

Chapter Two, authored by Sandra Schmitz-Berndt, looks at the overarching legal framework governing content regulation and enforcement in Europe. It starts by examining the Council of Europe’s human rights-based approach, drawing on recommendations, declarations and the extensive case law of the European Court of Human Rights. The chapter then turns to the European Union legal framework, analysing enforcement measures rooted in primary EU law, the Digital Services Act and instruments addressing foreign information manipulation and interference (FIMI). By examining the work of the Council of Europe, the ECHR and EU law, the author helps to identify the principles, limits and safeguards that currently frame enforcement action across Europe.

Chapter Three, by Mark D. Cole, focuses on countering disinformation. It examines enforcement measures at EU level and assesses policy tools such as the Code of Practice on Disinformation, fact-checking initiatives and trusted flaggers. This chapter then presents national case studies from Romania (by Roxana Radu), France (by William Gilles and Irène Bouhadana) and Ukraine (by Dariia Opryshko), illustrating how different legal systems respond to disinformation in sensitive contexts such as elections and, in Ukraine’s case, wartime foreign interference.

Chapter Four, authored by Mark D. Cole, focuses on terrorist content online. It analyses enforcement mechanisms under the EU Terrorist Content Online Regulation (TCOR), including removal orders with cross-border effect. National examples from Germany (by Sandra Schmitz-Berndt) and Türkiye (by Mehmet Bedii Kaya) illustrate how these approaches are implemented in practice, particularly in situations involving heightened security risks.

Chapter Five, authored by Mark D. Cole, examines how Europe enforces rules against defamatory, hateful and violence-inciting speech. Alongside EU-level analysis, the report presents national perspectives from Ireland (by Roderick Flynn), Austria (by Clara Rauchegger) and Italy (by Giovanni di Gregorio). The authors shed light on specific rules regarding defamatory, hateful and violence-inciting speech, evolving case law and the interaction between criminal law, platform obligations and user safeguards.

Chapter Six addresses other forms of harmful content, with a particular focus on the protection of minors. The case study from Poland (by Dr Krzysztof Wojciechowski) explores the latest developments regarding alignment with EU legislation, while the case study from the United Kingdom (by Mariette Jones) delves into tools such as access restrictions, age-verification measures and the implications of the UK Online Safety Act for platform responsibilities and child protection.

Chapter Seven, drafted by Mark D. Cole and Sandra Schmitz-Berndt, provides a comparative analysis of the various national examples studied. The authors highlight converging approaches driven by overarching EU legislation, but they also look into persistent differences in national enforcement practices across Europe. This chapter identifies remaining challenges, including fragmentation, proportionality concerns and the risk of over-removal of lawful content.

The report concludes by looking ahead, underlining the need for enforcement models that are effective, proportionate and firmly grounded in fundamental rights. It shows that while Europe has made significant progress in regulating online content, ensuring a safe and open digital environment remains an ongoing and complex task.

Enforcing rules on illegal content and disinformation online is published as part of the IRIS series and is available from the European Audiovisual Observatory. 

This new report is a must-read for: European and national policymakers and legislators; media, digital and online safety regulators; judges, legal practitioners and judicial

Strasbourg, 12 February 2026