
Disinformation poses an increasing challenge to the formation of public opinion. While Article 5 of the German Constitution (Grundgesetz, GG) protects freedom of expression, this right is subject to limits, especially where disinformation contains content that is relevant under criminal law. Nevertheless, there is currently no specific criminal offense in Germany covering the spread of disinformation, which makes its legal classification particularly complex and case-dependent.
At the same time, disinformation can pose a serious threat to democracy and stability. It undermines trust in state institutions, influences elections and promotes social division. Targeted disinformation campaigns controlled by domestic political actors or foreign influencers are particularly insidious. A functioning legal framework is therefore essential to protect democratic structures without disproportionately restricting freedom of expression.
Limitations of Article 5 GG – ‘Freedom of Speech’
Article 5(1) GG guarantees the right to freely express and disseminate one’s opinion in speech, writing and pictures. Freedom of the press and freedom of reporting are also protected.[1],[2] This fundamental right forms the foundation of a democratic society and also protects uncomfortable or controversial opinions.
However, freedom of expression is not protected without limits. Article 5(2) GG stipulates that this fundamental right is limited by general laws, the statutory provisions for the protection of youth and the right to personal honour. These include in particular:
- Insult (§ 185 German Criminal Code, StGB): when disinformation violates personal honour.[3],[4]
- Defamation (§ 186 StGB) and slander (§ 187 StGB): when factually false statements damage a person’s reputation.[5],[6]
- Incitement to hatred (§ 130 StGB): when disinformation stirs up hatred against certain groups.[7],[8]
So, while deliberately false claims with defamatory or inflammatory content may be punishable by law, freedom of expression remains a key protected right that must not be jeopardised by excessive regulation.
Limitations of German Civil Law
Victims of disinformation can file lawsuits under German Civil Law (BGB). Claims under tort law and general personal rights are particularly relevant here.
Claims arising from tort law (§§ 823, 1004 BGB) include:
- Compensation (§ 823(1) BGB): if disinformation causes economic or personal damage to a person (e.g. damage to reputation with loss of income), the perpetrator can be held liable.
- Injunctive relief (§ 1004 BGB, applied analogously): anyone whose rights are infringed by disinformation can sue the perpetrator(s) for injunctive relief to prevent further dissemination.[9],[10]
Infringement of Personality Rights
The General Right of Personality protects individuals from defamatory or reputation-damaging disinformation (Article 2(1) in conjunction with Article 1(1) GG). In serious cases, those affected can demand a counterstatement under German press law, a correction, or monetary compensation (e.g. in the case of particularly serious false reports concerning someone’s personal life).[11],[12]
Limitations of Civil Law Protection
Despite these possibilities, practical problems remain:
- Anonymity of the perpetrators: much disinformation is spread via social media, often anonymously or via foreign platforms, which makes it more difficult to enforce the law.
- Lengthy proceedings: civil lawsuits can drag on for months or years, while disinformation goes viral in a matter of seconds.
- Limited deletion obligations of platforms: social media platforms only have a limited obligation to delete disinformation if it is not obviously criminal.[13]
Limitations of German Criminal Code
Although certain types of disinformation are covered by existing criminal offenses, gaps remain in the legal situation:
- Distinguishing opinion from factual allegation: much disinformation contains a mixture of subjective opinion and deliberately false claims, which makes legal classification difficult.[14]
- No general offense: unlike other countries, such as France, Germany has no specific criminal offense for disinformation.
- Technological developments: modern technologies such as deepfakes and social bots make it more difficult to identify and prosecute the disseminators.
Need for reform – proposals for better regulation
German criminal law experts from various universities are demanding that disinformation per se should not be punishable, but its dissemination mechanisms should. Deepfakes, social bots and targeted disinformation campaigns should be regulated more strictly. They suggest that legislators focus on specific regulation of digital manipulation techniques instead of introducing general criminal liability for the spread of disinformation. This would restrict freedom of expression to a lesser degree, while targeted disinformation campaigns could be combated more effectively.[15]
Furthermore, a clearer legal definition of disinformation is required so that a distinction can be made between punishable and non-punishable content. A more precise legal definition could prevent individual expressions of opinion from being wrongly classified as disinformation. There are also calls for greater responsibility on the part of platform operators who deliberately spread, or facilitate the spread of, disinformation. This could be achieved, for example, through reporting obligations and mandatory counterstatements, similar to the requirements of the Network Enforcement Act (NetzDG).[16]
An additional approach under discussion is the introduction of a criminal offense modelled on § 276 of the Austrian Criminal Code (old version), which was in force in Austria until 2015. That provision criminalized the dissemination of false and alarming rumours.[17] A similar regulation in Germany could take targeted action against false reports that cause social unrest or deliberately sabotage democratic processes. However, critics see the danger of an excessive restriction of freedom of expression here.[18]
Role of EU Legislation in Platform Regulation
The European Union has taken significant steps in regulating online platforms to curb the spread of disinformation. The Digital Services Act (DSA) imposes stricter obligations on large online platforms, requiring them to implement transparent content moderation policies and remove illegal content swiftly.[19] Additionally, the EU Code of Practice on Disinformation, though voluntary, encourages platforms to take proactive measures against disinformation.[20] These regulations complement national laws by addressing cross-border disinformation and ensuring greater accountability of tech companies. However, the challenge lies in harmonising EU-wide rules with national legislation while safeguarding fundamental rights.
Contrasting Developments in the US
In stark contrast to the EU’s regulatory approach, the United States under the second Trump administration has significantly weakened efforts to combat disinformation. Content moderation obligations for platforms have been rolled back, with tech companies facing fewer requirements to remove misleading or harmful content. Legislative proposals aiming to hold social media companies accountable have largely stalled, and disinformation is increasingly framed as a matter of "free speech," making regulatory efforts politically contentious.[21] This deregulated environment has enabled the unchecked proliferation of false information, influencing public discourse and electoral processes. The divergence between the EU and the US highlights the global challenge of finding the right balance between addressing disinformation while maintaining democratic principles.
Challenges for Law Enforcement Authorities and the FERMI Solutions
Where law enforcement authorities are entrusted with investigating disinformation, such investigations pose considerable difficulties. The lack of a clear legal basis, the anonymity of some perpetrators and jurisdictional problems between different security authorities often render effective prosecution impossible.
To overcome these challenges, legal adjustments are necessary, but so are improved cooperation between the various security authorities and the development of new technological tools. The FERMI project attempts to fill the technological void by developing tools that analyse the spread, origins and influence of disinformation-related posts on social media (Spread Analyser). Other FERMI tools estimate the future crime landscape (Dynamic Flows Modeler) and propose countermeasures (Community Resilience Management Modeler), which should greatly advance the design of mitigation efforts.
[1] Grundgesetz der Bundesrepublik Deutschland – GG: Art. 5 (1) GG; BGH NJW 1987, 2225; BVerfG NJW-RR 2000, 1209; NJW 2003, 1856
[2] Wissenschaftliche Dienste des deutschen Bundestages (WD10-MMM-067/16)
[3] Strafgesetzbuch – StGB: § 185 StGB
[4] Wissenschaftliche Dienste des Deutschen Bundestages (WD7-MMM-052/22)
[5] Strafgesetzbuch – StGB: § 186 StGB
[6] Wissenschaftliche Dienste des Deutschen Bundestages (WD7-MMM-052/22)
[7] Strafgesetzbuch – StGB: § 130 StGB
[8] Wissenschaftliche Dienste des Deutschen Bundestages (WD7-MMM-111/22)
[9] Bürgerliches Gesetzbuch – BGB: § 823 i.V.m. 1004 BGB
[10] Wissenschaftliche Dienste des Deutschen Bundestages (WD-7-MMM-052/16)
[11] Grundgesetz der Bundesrepublik Deutschland – GG: Art. 2(1) GG i.V.m. Art. 1(1) GG
[12] Dreier, H. (Ed.). (2001). Grundgesetz: Kommentar (Art. 2 I para. 22, pp. 69 ff.). C.H. Beck.; Di Fabio, U. (2001). In Maunz, T., & Dürig, G. (Eds.), Grundgesetz: Kommentar (Art. 2 para. 127 f., 39th ed. update, July 2001). C.H. Beck.; Kube, H. (2001). Persönlichkeitsrecht. In J. Isensee & P. Kirchhof (Eds.), Handbuch des Staatsrechts der Bundesrepublik Deutschland (3rd ed., Vol. VII, § 148, paras. 28 ff.). C.F. Müller.; Enders, C. (2001). In M. Merten & H. Papier (Eds.), Handbuch der Grundrechte (Vol. IV, § 89). C.F. Müller.; Starck, C. (2001). In von Mangoldt, H., Klein, F., & Starck, C. (Eds.), Grundgesetz: Kommentar (7th ed., Vol. 1, Art. 2 paras. 14 ff., 86 ff.). Franz Vahlen.; Murswiek, D. (2001). In Sachs, M. (Ed.), Grundgesetz: Kommentar (8th ed., Art. 2 paras. 59 ff.). C.H. Beck.; Hofmann, H. (2001). In Schmidt-Bleibtreu, B., & Klein, F. (Eds.), Grundgesetz: Kommentar (14th ed., Art. 2 paras. 14 ff.). Luchterhand.
[13] VG Köln, cases 6 L 1277/21 and 6 K 3769/21
[14] Zimmermann, F. (2025). Strafbarkeit von Fake News de lege ferenda. Kriminalpolitische Zeitschrift, 50(1), 76–95.
[15] Beck, M., & Nussbaum, T. (2025). Alles eine Frage der Wirkung? Zur Strafbarkeit der Verbreitung von Individuen- und gruppenbezogenen Fake News im Spiegel der Beleidigungs- und Volksverhetzungsdelikte de lege lata. Kriminalpolitische Zeitschrift, 50(1), 15–35.; Beck, M., & Nussbaum, T. (2025). Zur Strafbarkeit des Verbreitens von Fake News. Kriminalpolitische Zeitschrift, 50(1), 36–55.; Zimmermann, F. (2025). Die Strafbarkeit von Fake News de lege ferenda – mit besonderem Augenmerk auf Deepfakes, Social Bots und Filter Bubbles. Kriminalpolitische Zeitschrift, 50(1), 56–75.; Zimmermann, F. (2025). Strafbarkeit von Fake News de lege ferenda. Kriminalpolitische Zeitschrift, 50(1), 76–95.; Lammich, T. (2023). Fake News als Herausforderung des deutschen Strafrechts. Kriminalpolitische Zeitschrift, 48(3), 200–220.
[16] Netzwerkdurchsetzungsgesetz – NetzDG: § 3(2) NetzDG i.V.m. § 5 NetzDG
[17] Zimmermann, F. (2025). The criminal liability of fake news de lege ferenda – with special focus on deepfakes, social bots, and filter bubbles. Kriminalpolitische Zeitschrift, 50(1), 50–70.
[18] Ibid.
[19] Beck, M., & Nussbaum, T. (2025). Alles eine Frage der Wirkung? Zur Strafbarkeit der Verbreitung von Individuen- und gruppenbezogenen Fake News im Spiegel der Beleidigungs- und Volksverhetzungsdelikte de lege lata. Kriminalpolitische Zeitschrift, 50(1), 15–35.
[20] Ibid.
[21] Merkley, J. (2025, February 24). Welch leads colleagues in introducing bill to increase accountability from social media platforms that knowingly host false election administration content. (https://www.merkley.senate.gov/welch-leads-colleagues-in-introducing-bill-to-increase-accountability-from-social-media-platforms-that-knowingly-host-false-election-administration-content/, accessed 25.02.2025); Politico. (2025, February 24). California drops parts of social media law challenged by Elon Musk’s X. Politico.com. (https://www.politico.com/news/2025/02/24/california-drop-parts-social-media-law-challenged-elon-musk-x-00205890; accessed 25.02.2025)