An early leader in European Union (EU) and global platform regulation, Germany has taken a strong stance against dis/misinformation, with many other countries following suit. Although content moderation and social media regulation are still relatively new policy fields, Germany is a global leader, and its policies serve as a blueprint worldwide. German laws and policies on internet regulation exemplify an EU trend of offering an alternative both to China’s authoritarian monitoring and over-regulation and to the United States’ prioritization of individual freedom and its lack of stringent regulation.[1]
Germany’s social media policies include central content moderation and data protection legislation, but they are also shaped by the country’s European Union membership. When digital platforms first emerged as a new arena for information dissemination, they were largely self-regulated. However, as illegal content and dis/misinformation have proven profitable under business models that reward clicks and engagement, global regulatory trends for digital platforms are turning more proactive. Germany’s relatively stringent content regulation could be crucial in counteracting this phenomenon. As a pioneer in social media regulation, Germany’s policy choices will be critical in establishing global standards for the future, especially given its own rise in populism and the global spread of misinformation in the media.
German Social Media Use and Freedom of Expression
The overall regulatory environment in Germany is comparable to that of other stable democracies, although there are growing concerns about populism and violent incidents stemming from social media. Germany is highly industrialized with a thriving technology industry and has the largest population and economy in the EU.[2] It has a high rate of internet penetration and a sizable social media market, with around 77.7 million social media users, or 81.4% of the population, in 2024.[3] Although the absolute German population decreased in 2023,[4] internet use increased. Not only is the social media user base large, but the majority are frequent users, with two-thirds logging in to a social media platform at least once a day.[5]
In 2024, the most used social media or messaging platform in Germany was WhatsApp, followed by Instagram, then Facebook.[6] The average social media user accessed 5.4 platforms in one month,[7] indicating the diverse social media environment and variety of content consumed.
Recently, the German federal government has made digital infrastructure and 5G access a policy priority. As part of this, the 2022 Digital Strategy and Gigabit Strategy established fiber-optic and digital communication access for all areas as a goal by 2030.[8] Although access to the internet and social media is already high by global standards, these initiatives show that access is continually being improved and prioritized by the German government, which considers internet use and digital communication a public good and is committed to expanding access.
Germany is a long-standing democracy, with a score of 10, the rating for a “fully institutionalized democracy,” under the Polity5 rating system.[9] It has a long democratic history with free, fair elections and a vibrant political culture.[10] In line with this, freedom of online media consumption and freedom of expression are strongly protected in Germany, reflecting its commitment to liberal democratic principles, and German users are able to access the internet and social media sites with few restrictions. In 2023, Germany was classified as “free” with respect to internet controls by Freedom House, which listed one area of restriction, the blocking of some websites hosting political, social, or religious content, lowering Germany’s internet freedom score to 77/100.[11] Other recorded cases of government internet controls include blocked websites, new censorship laws, and new surveillance laws.[12] Therefore, although Germany’s internet restriction and online speech policies are relatively liberal by global standards, there have been recent cases of government-imposed internet restrictions. A Comparitech study found something similar: Germany has high internet freedom, with a censorship score of 3 out of 11 (a lower score indicates less censorship).[13] The score of three reflects the three methods of internet censorship the study found in Germany, including torrent restrictions and bans and restrictions on pornography. Although torrenting is not strictly illegal, its use is monitored in Germany, and fines or prison time can be threatened in certain cases;[14] law firms therefore often send cease-and-desist letters along with fines, a process permitted by the German government. Torrenting notwithstanding, however, Germany has high internet freedom, as the other categories, including political material, were not restricted. Germany also performed highly on freedom of expression in the 2023 Global State of Democracy Indices.[15] Despite this, the Global State of Democracy Report noted a troubling decline in media integrity in Western Europe, particularly in Germany and Austria.[16]
Other Factors Shaping Germany’s Social Media Landscape
Germany is a stable country with a strong government, but like any country it has social and political cleavages that fuel dis/misinformation. Germany’s 2024 Fragile States Index score of 24/120 is the most recent in a series of consistently low and decreasing scores, suggesting a strong state with the capacity to respond effectively to various crises.[17] The Fragile States Index measures countries’ risks and vulnerabilities across social, economic, political, and cohesion indicators.[18] Germany is also a welfare state, meaning there are social protections for disadvantaged groups, which improves social cohesion. However, the European refugee crisis has caused social strife, with strains on the welfare system and xenophobic trends creating potential for conflict. A report from the Competence Centre for Right-Wing Extremism and Democracy Research at Leipzig University found that one in three Germans believes that refugees or immigrants come to Germany to exploit the strong welfare state.[19] This issue has been continuously exploited by misinformation and disinformation tactics on social media in recent years. The far right has increasingly used social media to sensationalize the inflow of migrants and refugees and its alleged link to crime, and to mainstream conspiracy theories about migrants.[20]
Likewise, Germany scored relatively low on the Global Terrorism Index (GTI) in 2024; however, it has experienced one of the sharpest deteriorations in scores in Europe over the past decade.[21] The GTI recorded 19 terrorist attacks in 2021, all conducted by far-left terrorist groups and individuals targeting businesses and infrastructure.[22] Social media has also contributed to this score. For instance, the livestreaming service Twitch was used to livestream a synagogue attack in Halle, Germany, in 2019, with about 2,200 people watching the 35-minute stream.[23] The attacker, Stephan Balliet, had far-right ideological motivations, a military background, and weapons training.[24] The assault resulted in two deaths and several injuries before police arrived. The video was also posted to various white supremacist channels on the instant messaging service Telegram. It remained on the platform for approximately 30 minutes and is still available on some sites such as 4chan.[25] Livestreaming technology and social media have made it easier to spread extremist thought and broadcast criminal activity.
Further, there has been a rise in populism and increased partisanship in Germany. The far-right populist party Alternative for Germany (AfD) is an example of this. The AfD began as a relatively marginal party founded in 2013, but it performed very well in the 2017 federal election and became a major opposition party in the Bundestag.[26] The AfD has focused its campaigns on Facebook, maintaining more accounts, posts, and interactions than other political parties.[27] It is also notable that in the 2021 election, all major parties except the AfD pledged to follow a code of conduct for digital campaign strategy.[28] By targeting issues that provoke emotional reactions, such as immigration or crime, and by encouraging participation and interaction with the party within social media platforms, the AfD is able to propagate its image and message more effectively than other parties.
Germany is a long-standing model of representative government, with solid constitutional protections for democracy and a political culture that values them. However, as elsewhere, social media has been used to disseminate violent events and far-right ideologies, as discussed above. The trend of fringe ideologies using social media to propagate disinformation and influence politics is a rising concern in Europe and across the world.[29]
Platform Liability and Content Moderation Policy
Compared to other countries, Germany has numerous initiatives against disinformation.[30] Formulating policy on content moderation and platform liability is difficult for lawmakers, as it must remain in line with constitutional values while addressing current issues effectively. Any legislation must balance proactively keeping such content offline with maintaining civil rights such as freedom of speech and expression.
Content Moderation Policy
Germany’s main content moderation policy is the Network Enforcement Act (Netzwerkdurchsetzungsgesetz), or NetzDG.[31] The law was passed in 2017, with amendments added in 2021. NetzDG aims to protect against hate speech, misinformation/fake news, and other illegal acts on the internet. Its scope is limited to social media platforms with over two million users.[32] Platforms still maintain internal guidelines and take down content according to those standards, but they must now also identify content that violates NetzDG standards and take action against it under NetzDG processes.
Under the NetzDG, platforms are required to maintain clear procedures for content takedowns and blocking, provide accessible pathways to report content, and, if they receive more than 100 complaints, publish data on complaint trends in public transparency reports. Applicable social media companies must take down clearly illegal content within 24 hours and more ambiguous illegal content within seven days, or forward the case to a recognized self-regulatory institution.[33] However, the demarcation between clearly illegal and ambiguous illegal content is not detailed in the law. Social media companies with more than 100 complaints are required to communicate their process for managing these complaints and removing content. Removal requests and complaints can be issued by users, complaints bodies, or the government. From March 2020, ambiguous cases were forwarded to the German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM), which the Federal Office of Justice recognized as a self-regulation institution meeting independence and transparency requirements. Such indeterminate illegal content was then evaluated by a NetzDG review committee whose decisions were binding on the social media companies. However, the FSM ceased these operations once NetzDG was superseded by the broader EU Digital Services Act (DSA), which is discussed later in the paper.[34]
In 2021, NetzDG was amended to include increased transparency and accessibility requirements for complaint reporting, an appeals procedure for complaints, and a stipulation that digital platforms must forward information on complaints to criminal authorities.[35] Content that violates criminal law will be deleted and could also result in criminal prosecution. However, Google, Meta, TikTok, and Twitter (now X) filed a lawsuit against the amendment and currently do not have to forward information on their users and complaints to the Federal Criminal Police Office.[36]
In the third and fourth quarters of 2022, Facebook received 125,195 complaints related to NetzDG regulations, leading to the removal of 17,242 pieces of content, or 13.77%, an increase from the latter half of 2020.[37] Likewise, X (formerly Twitter) has seen an increasing share of complaints result in removal or blocking: in the first half of 2023, X received 1,101,456 NetzDG-related complaints and took action on 24.28% of them, up from 11.11% in 2021.[38] TikTok received 202,747 complaints in the first half of 2023, 16.33% of which resulted in the removal of content or suspension of the user, compared with 174,224 complaints and a 6.28% action rate in the same period of 2021.[39] For YouTube, Google reported 193,131 NetzDG complaints in the first half of 2023, 15.98% of which resulted in removal.[40]
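The action rates above are simple ratios of complaints acted on to complaints received. As a minimal illustration (the input figures come from the transparency reports cited above; the helper function and any derived counts are our own and not part of any platform’s reporting), the rates can be reproduced as follows:

```python
# Minimal sketch: NetzDG action rates as ratios of actions to complaints.
# Input figures are those quoted from the cited transparency reports;
# the helper function itself is illustrative, not any platform's API.

def action_rate(complaints: int, actions: int) -> float:
    """Share of complaints that led to removal or blocking, in percent."""
    return 100 * actions / complaints

# Facebook, Q3-Q4 2022: 17,242 removals out of 125,195 complaints.
print(f"{action_rate(125_195, 17_242):.2f}%")  # -> 13.77%

# Conversely, an approximate action count can be derived from a reported rate,
# e.g. X in H1 2023 (24.28% of 1,101,456 complaints). This is a derived
# estimate, not a figure reported by the platform.
print(round(1_101_456 * 0.2428))  # -> roughly 267,000 actions
```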
Critics are concerned that the trend of increasing complaints and actions taken by digital platforms under NetzDG correlates with self-censorship and the hampering of freedom of speech and expression. The sheer number of content removals and bans leaves many wondering how accurate or effective the system is. In 2018, the well-known German satirical magazine Titanic was suspended and had content taken down for hate speech, despite the content being clearly satirical.[41] Cases such as this show the danger in content moderation that legitimate political discourse can be stifled. Critics therefore argue that there need to be mechanisms for recourse after takedowns and clearer processes.
Google has filed a complaint over the law, arguing that the requirement to submit user data to law enforcement on suspicion of a crime is an overstep.[42] The content prohibited under NetzDG includes “insult, malicious gossip, defamation, public incitement to crime, incitement to hatred, disseminating portrayals of violence and threatening the commission of a felony” and relates to explicitly criminal content.[43] The primary mechanism the law provides to prevent these acts is more transparent complaint management. Germany, in particular, bans the use of symbols or “propaganda material of unconstitutional and terrorist organisations,”[44] and under NetzDG this ban applies to social media as well. Because the banned content was already criminal and the prohibition is simply being applied to new forums, the NetzDG does not “introduce a new liability regime nor does it render previously legal speech illegal. It rather sets up a compliance regime under which the complaints management is to become more transparent and presumably more effective,” according to one analysis.[45] The law aims to create a regulatory system for the enforcement of content regulation on digital platforms, hence its name, the Network Enforcement Act.[46]
In 2019, Facebook was fined two million euros for underreporting illegal content complaints in Germany and for inaccessible reporting forms for users.[47] The Federal Office of Justice stated that the complaint process was difficult to find, resulting in artificially low numbers and thus noncompliance with NetzDG standards. In October 2022, Telegram was also fined by the Federal Office of Justice for noncompliance with NetzDG: it was fined over five million euros for failing to develop tools for reporting illegal content and for failing to establish a legal entity in Germany for interactions with the government.[48] The enforcement of NetzDG against Facebook and Telegram shows that the legislation has consequences for media intermediaries and can lead to fundamental changes in their behavior. This is especially true for Telegram, which is seen as a safe space for radical opinions and content.[49]
With the passage of NetzDG, some predicted that users would overreport and that freedom of speech and expression would be suppressed or called into question; this continues to be debated. However, a report by the Counter Extremism Project and the Centre for European Policy Studies found that this was not the case.[50] There have been no reports of content being taken down based on false claims and no evidence of widespread blocking of users or takedowns of content. On the contrary, the report offered recommendations on how the law could go even further to prevent illegal content, including targeting terrorist content, improving transparency, and making complaint standards more consistent across companies.
In 2020, Germany’s federal states ratified the Interstate Media Treaty, or Medienstaatsvertrag (MStV), to comply with the 2018 EU revision of the Audiovisual Media Services Directive,[51] which applies to all member states and was required to be implemented by September 2020.[52] The treaty updated the previous Rundfunkstaatsvertrag, or State Broadcasting Treaty, in acknowledgement of the evolution of traditional media transmission. This is a significant step in updating the language of legislation to reflect the media landscape more accurately: rather than targeting only television and radio services, the law now refers more broadly to “media platforms, user interfaces, media intermediaries, and video sharing services.”[53] This language better captures the increasingly complicated and interconnected nature of digital platforms. The legislation was enacted to increase transparency and decrease algorithmic discrimination. Significantly, media providers are now required to disclose information about how their algorithms operate, designate authorized representatives and recipients in Germany for matters relating to the legislation, ensure that content is easily retrievable, and ensure non-discrimination.[54] State Media Authorities are empowered to conduct review processes under these criteria and issue fines of up to 500,000 euros.[55]
Although German content moderation policy sets standards and consequences, much still depends on self-regulation through platform guidelines. For example, in 2021, Meta removed several Instagram and Facebook accounts linked to Querdenken, a Germany-based group that denies the COVID-19 pandemic. Meta cited violations of its internal Community Standards, including “posting harmful health misinformation, hate speech and incitement to violence,” shutting down around 150 accounts because of threats of offline violence.[56] This action by Meta was unprecedented in establishing consequences for “coordinated social harm” online.[57] In 2021, the German Federal Court of Justice (Bundesgerichtshof, BGH) issued a decision on Facebook’s internal policies for deleting posts and blocking accounts that violate its terms of use and community standards. The BGH ruled that Facebook must inform users about actions taken against their accounts and the reasoning behind them, and provide a method for disputing the decision.[58] This decision extends regulatory processes beyond the illegal content stipulated in law to digital platforms’ internal guidelines.
Platform Liability Policy
Under NetzDG, platforms must comply with specific content moderation obligations in order to continue operating in Germany.[59] For example, platforms must provide mechanisms for users to easily report illegal content, mechanisms for redress, and increased transparency regarding content policy. Once content is reported, platforms are obligated to investigate whether it violates criminal law and then take it down or block it within 24 hours of receiving the complaint for “clearly illegal content,” or within seven days for other illegal content. Finally, social media platforms must inform users of all decisions made regarding their complaints and content. Platforms that do not comply, for instance by failing to set up a complaints system or to take down illegal content, are subject to fines of up to 50 million euros for the company or five million euros for the person responsible for managing the complaint system.[60]
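To make these obligations concrete, the sketch below models the complaint-handling deadlines described above as a simple decision function. It is an illustrative simplification of the statutory process, not an implementation of any platform’s actual moderation system; the category names and the function itself are hypothetical.

```python
# Illustrative, simplified model of NetzDG complaint-handling deadlines:
# 24 hours for clearly illegal content, seven days otherwise, with the option
# of referring ambiguous cases to a recognized self-regulation body (e.g. FSM).
# Category and function names are hypothetical.

from datetime import timedelta
from enum import Enum, auto

class Assessment(Enum):
    CLEARLY_ILLEGAL = auto()    # content manifestly violating the Criminal Code
    POSSIBLY_ILLEGAL = auto()   # ambiguous cases requiring closer review
    LEGAL = auto()

def handle_complaint(assessment: Assessment) -> tuple[str, timedelta | None]:
    """Return the required action and the statutory deadline for a complaint."""
    if assessment is Assessment.CLEARLY_ILLEGAL:
        return "remove or block", timedelta(hours=24)
    if assessment is Assessment.POSSIBLY_ILLEGAL:
        return "remove, block, or refer to self-regulation", timedelta(days=7)
    return "no removal; inform complainant and uploader of the decision", None

print(handle_complaint(Assessment.CLEARLY_ILLEGAL))
```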
Some have criticized the high fines, arguing that they may lead to self-censorship and over-removal by platforms seeking to avoid penalties. However, only systemic violations of NetzDG lead to penalties, not individual cases of noncompliance. One early report found that the law’s passage did not lead to mass takedowns of digital content in Germany.[61] Although the law has made it easier to report illegal content online, it has not systematically changed the behavior of online platforms; rather than creating sweeping change, NetzDG has institutionalized the government’s position toward digital platforms. It is only Germany’s first step in shaping a global order for content moderation and platform liability.
Digital platforms are liable to moderate content not only under the German Criminal Code but also under German copyright law. In March 2021, internet service providers and the copyright industry formed a joint “Clearing Body for Copyright on the Internet” (CUII) to initiate Domain Name System (DNS) blocking of copyright-infringing content on digital platforms.[62] Previously, these cases were decided by the judiciary; now, taking CUII recommendations into account, the Federal Network Agency (Bundesnetzagentur, BNetzA) examines whether a proposed block would violate net neutrality. Content can therefore be blocked for copyright infringement without a court order, as was previously required.
In May 2021, German lawmakers passed several laws amending the existing Act on the Management of Copyright and Related Rights by Collecting Societies.[63] These new laws harmonized German law with the EU Directive on Copyright in the Digital Single Market (DSM). The liability of platforms for monitoring copyright-infringing content is covered by the German Copyright Service Provider Act (Urheberrechts-Diensteanbieter-Gesetz, UrhDaG), which came into force in August 2021. The DSM Copyright Directive stipulates that platforms are liable for user-uploaded content that violates copyright law. Under UrhDaG, however, platforms are not liable for users’ copyright violations if they fulfill the licensing and blocking obligations and other processes detailed in the legislation; blocking procedures include pre-emptive upload filters. Also, not all digital platforms are covered by UrhDaG, only those whose main purpose is to store and organize large amounts of user-uploaded, copyright-protected content and that compete with similar platforms for profit.[64] Content on digital platforms is thus moderated through a diverse range of laws addressing different types of violations, copyright being one of them.
Platforms are also liable under German law through the Federal Data Protection Act (Bundesdatenschutzgesetz, or BDSG).[65] This act primarily concerns the storage and exposure of personal data. Under it, platforms with at least ten employees involved in processing personal data must appoint a “data protection officer.”[66] In addition, personal data may be processed in the context of employment with the employee’s consent. Violators can be subject to up to three years in prison or a fine.
Most recently, in May 2023, the Federal Ministry of Justice (BMJ) announced a proposal for a law on protection against digital violence.[67] Whereas NetzDG focused on enforcing the criminal code online by placing responsibility on digital platforms, this law strengthens private law enforcement to target individuals engaging in violence online. Digital platforms would still be liable for digital violence under this law and would have several requirements to fulfill. If passed, civil claims could be brought to compel platforms to block accounts found to be participating in digital violence, especially violence that seriously violates a victim’s personal rights. During this process, the account holders would be notified and could comment on or dispute the blocking. The suspension or blocking would last for a “reasonable period of time,” at the discretion of the digital platform.[68] For charges to be filed against the offending account holder, platform operators would be required to store personal data such as IP addresses for future civil or criminal proceedings.[69] Critics view this aspect as restricting the right to anonymity and free speech.
European Union Regulation and Other International Organizations
Germany is a member of the European Union, and all member states must meet the standards of the Copenhagen Criteria, the initial rules determining whether a nation is eligible to join the EU, as well as various directives and regulations; failure to implement or follow these can lead to infringement proceedings and fines. One such EU regulation is the General Data Protection Regulation (GDPR), whose purpose is to protect individual rights over personal data, which affects advertising and content on social media. Many social media marketing tactics use demographic or geographic user data to tailor ads to users. Under the GDPR, however, this data is regulated, and users must consent to have their data processed. The German BDSG was created to supplement and harmonize with the GDPR.
Previously, the EU’s platform liability policy was determined by the 2000 E-Commerce Directive, which set standardized liability rules for online intermediaries across member states. When this legislation was enacted, the question of how liable intermediaries should be for third-party content was just beginning to be discussed. The directive’s answer was the “safe harbour” principle: online intermediaries are not liable for third-party content unless they were aware of its illegality and took no action to moderate it, making self-regulation the main mechanism of content moderation.[70] As this was an early piece of regulation, many gaps were quickly identified. The legislation did not make clear which organizations would be subject to these policies, especially as social media and methods of information dissemination continued to evolve, nor did it clearly define what types of content are illegal or how and to what degree intermediaries are expected to intervene.
In October 2022, the Council of the European Union approved the Digital Services Act (DSA).[71] The act became fully applicable on February 17, 2024, and applies to all digital platforms in the EU. Its aim is to modernize the E-Commerce Directive, to which all member states are subject, and to harmonize member states’ differing regulations concerning illegal content. In particular, it reconciles the differences between Germany’s NetzDG and similar laws in Austria (KoPl-G) and France (the “Loi Avia”), meaning that these laws will have to be amended or repealed and new laws relating to illegal content and disinformation online passed.[72]
Under the DSA, social media platforms are considered “intermediaries,” and the act introduces new disclosure requirements for algorithms, advertisements, and content moderation. Social media companies specifically face new obligations, including disclosing algorithm mechanisms to regulators and being transparent about content deletion. Platforms must conduct annual risk analyses of their products relating to the dissemination of illegal content, public security, discrimination, freedom of expression and information, gender-based violence, and other concerns, and implement mitigating measures based on these analyses. This regulation therefore amounts to “induced self-regulation” rather than proactive regulation.[73]
However, the DSA does define several consequences for noncompliance, including temporary suspension of a platform in the European market and fines of up to six percent of a platform’s annual revenue. These consequences would apply to social media platforms in the EU with over 45 million users, categorized as either “very large online platforms” (VLOPs) or “very large online search engines” (VLOSEs); as of November 2023, 19 services had received these designations.[74] Also, under the EU E-Commerce Directive, member states must notify the European Commission of legislation relating to “information society services.”[75]
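As a rough illustration of the two thresholds just described, the 45-million-user designation line and the fine ceiling of six percent of annual revenue, the following sketch applies them to placeholder numbers (the user count and revenue figure below are invented for illustration, not real platform data):

```python
# Illustrative only: the DSA thresholds described in the text applied to
# placeholder figures. The example user count and revenue are invented.

VLOP_USER_THRESHOLD = 45_000_000  # EU users required for VLOP/VLOSE designation
MAX_FINE_SHARE = 0.06             # fines of up to 6% of annual revenue

def is_very_large_platform(eu_users: int) -> bool:
    return eu_users >= VLOP_USER_THRESHOLD

def max_fine_eur(annual_revenue_eur: float) -> float:
    return MAX_FINE_SHARE * annual_revenue_eur

print(is_very_large_platform(50_000_000))  # True: above the 45 million mark
print(f"{max_fine_eur(10e9):,.0f}")        # 600,000,000 euro ceiling on 10 billion in revenue
```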
In contrast to the German NetzDG, the DSA does not set deletion deadlines or require reporting by digital platforms to governmental institutions; rather, it places the responsibility on the platforms themselves to self-regulate. The act is also written in fairly vague language, as its scope is considerable. Future interactions with the act by digital platforms, such as Zalando’s lawsuit over its categorization as a VLOP, will continue to establish the boundaries of the DSA and encourage active participation by digital platforms.[76] As much of this content moderation regulation is relatively new, its scope and specific procedures will evolve.
Oversight of DSA implementation falls to the European Commission, supported by national Digital Services Coordinators (DSCs) and the European Centre for Algorithmic Transparency.[77] So far, the European Commission has opened two formal DSA investigations, into X and TikTok, both significant VLOPs. The investigation into X, opened in December 2023, was the first ever under the DSA and was based on claims of “breaches in its transparency obligations, failure to counter illegal content and disinformation, and deceptive design practices.”[78] On February 19, 2024, the Commission announced a formal DSA investigation into TikTok, focused on allegations that TikTok had failed to meet DSA requirements on illegal content, protection of minors, and data access.[79] European Commission Director Wezenbeek has stated that future investigations and DSA priorities will include protecting children’s rights, election integrity, and hate speech.[80] These proceedings will be very important in establishing precedent for the reach and power of these relatively recent regulations, and they allow for more direct discussion with specific platforms about how to apply the policies.
Tied to the DSA, the EU also approved the Digital Markets Act (DMA) around the same time.[81] Under this act, certain large companies classified as “gatekeepers,” providers of “core platform services” that engage in monopolization tactics or hold significant market power, as defined by criteria such as annual turnover, average market capitalization, and number of active users, are subject to restrictions.[82] Facebook, in particular, is expected to be targeted by the act. Gatekeepers will be unable to reuse personal data across their services; for example, data from Facebook’s subsidiary WhatsApp could not be used for Facebook. In addition, advertisers and other business users will be able to access data concerning the ads and publications they post on gatekeeper platforms. This means that smaller social media platforms will be less stifled by the large platforms that hold the majority of the market, allowing for a greater diversity of platforms and different ways of interacting with social media.
The EU has also made terrorist content online a specific agenda for reform. In April 2021, the European Parliament approved new rules on terrorist content online. Under this legislation, digital platforms must act on terrorist content removal notices issued by member state authorities within one hour of receiving the notice. Content that “incites, solicits or contributes to terrorist offenses; provides instructions for such offenses; or solicits people to participate in a terrorist group” is subject to removal notices.[83] The legislation began to apply in June 2022 and provides for fines at the discretion of member countries, not to exceed 4% of the platform’s global annual turnover.[84] Platforms are also obliged to maintain transparency and publish information in an annual report on their takedown actions and how they identify terrorist content; however, they are not obliged to proactively monitor or filter for terrorist content. Platforms are responsible not only for taking down content after receiving a notice but also for ensuring that similar content is not uploaded again.[85] The methodology for ensuring this is left to the platform and is not detailed in the legislation, though examples are listed, such as “appropriate technical or operational measures or capacities such as staffing or technical means to identify and expeditiously remove or disable access to terrorist content, mechanisms for users to report or flag alleged terrorist content.”[86] Critics have said that the law is too vague about the specific actions digital platforms must take and that the one-hour deadline could induce self-censorship to avoid fines. However, it would be difficult to be more specific, as the regulation applies to all member states, which have different social media landscapes and different terrorism challenges. One way to address potential self-censorship would be to comply with the initial takedown but then conduct a post-takedown analysis of whether the content truly meets the guidelines, addressing immediate security concerns while also protecting freedom of expression.
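One plausible technical means of meeting the “no re-upload” obligation, which the regulation leaves open, is hash matching against previously removed material. The sketch below illustrates the one-hour removal deadline and such a hash check; it is a hypothetical, simplified example, not a description of any platform’s actual tooling or of measures mandated by the regulation.

```python
# Hypothetical sketch: tracking the one-hour removal deadline and blocking
# exact re-uploads of removed material via hashing. Simplified for illustration.

import hashlib
from datetime import datetime, timedelta, timezone

removed_hashes: set[str] = set()  # fingerprints of content already removed

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def process_removal_order(data: bytes, received_at: datetime) -> datetime:
    """Record the content's hash and return the one-hour removal deadline."""
    removed_hashes.add(fingerprint(data))
    return received_at + timedelta(hours=1)

def is_reupload(data: bytes) -> bool:
    """True if an upload byte-for-byte matches previously removed content."""
    return fingerprint(data) in removed_hashes

deadline = process_removal_order(b"<flagged content>", datetime.now(timezone.utc))
print(deadline, is_reupload(b"<flagged content>"))  # exact copies are caught; edited copies are not
```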
The EU Commission published the Code of Practice on Disinformation in 2018 as a framework to harmonize technology firms’ understanding of disinformation and to establish a policy and self-regulatory structure. Its commitments include increased review of ad placements, transparency of political advertising, ensuring that bots are clearly identified, prioritizing authentic and relevant information, and tracking disinformation.[87] Significant signatories include Google, Facebook, and Twitter (now X), which pledged to prepare yearly self-reports for the Commission. The code was revisited in 2021, when the Commission published its “Guidance to strengthen the Code of Practice on Disinformation.”[88] The revised version introduced expanded commitments, including more extensive participation, demonetizing disinformation, providing user tools to flag disinformation, and increased fact-checking and monitoring.[89]
The demonetization of disinformation is especially significant. The strengthened code calls for more stringent review of eligibility for content monetization and for working with ad sellers that have shown a commitment to banning disinformation content.[90] One reason disinformation is so prevalent in the digital sphere is that it is profitable: fake news with eye-catching titles or scandalous stories generates more attention, clicks, and revenue for websites.[91] The code is intended to take away part of the incentive and revenue that come from disinformation.
Another main improvement of the code is the addition of performance indicators. Previously, although there were broad values and goals, there was no way for a company to measure its success in fulfilling the code, or to assess whether the code is an effective policy tool against disinformation. As of 2023, these structural indicators, covering measures such as the prevalence and sources of disinformation, have passed their pilot stage.[92] The code is also relevant under the DSA, as it is considered a precursor to the DSA’s anticipated regulatory framework and an outline for compliance with it: Article 45 of the DSA recommends signing the EU’s voluntary codes of conduct, including the Code of Practice on Disinformation.[93] The revised code is thus an opportunity for companies to work with the Commission to voluntarily revise their content moderation and disinformation policies in anticipation of the DSA.[94] X withdrew from the code in May 2023, following the company’s incomplete report submission to the EU in 2022, illustrating the code’s voluntary nature.[95] However, as the DSA became obligatory in August 2023, X will nonetheless be subject to disinformation regulations and potential future legal repercussions.
In December 2023, the European Parliament and the Council reached a political agreement on the European Media Freedom Act (EMFA), proposed by the European Commission in 2022.[96] Together with a recommendation by the Commission, it advocates for editorial independence and media ownership transparency. It builds on the DSA and DMA, using the DSA’s definition of a VLOP and similar language, and seeks to uphold the independence of media and journalists, conduct media pluralism tests, and establish requirements for the allocation of state advertising. These activities would be overseen by the proposed European Board for Media Services, which would issue opinions and create guidelines. If approved by the EU Parliament and Council, the act will become binding on member states 15 months later. This policy and the regulations that directly moderate social media content will complement each other in encouraging more ethical and trustworthy digital media; combining binding rules from regulatory authorities with softer policy that addresses the broader issues in digital communication is crucial.
In addition to its EU membership, Germany is also part of the Group of Seven (G7). Through the G7’s Hiroshima AI Process, launched at the 2023 Hiroshima Summit, members developed voluntary guidelines that complement the EU AI Act currently under review.[97] The resulting Comprehensive Policy Framework includes guidelines on generative AI, which are particularly important for digital platforms. Generative AI is an emerging phenomenon with little regulation and the potential to exacerbate dis/misinformation online. The policies include potential digital watermarking of AI-generated content and protections for intellectual property and personal information, with the goal of creating an international standard.[98] G7 Digital and Tech Ministers also discussed future monitoring methods, a related international body under the OECD, and outreach to non-member states for implementation. The European Commission, which participates in the G7 as a non-enumerated member, was involved in the drafting process and will incorporate the results into the EU AI Act as well.[99] Although G7 policies are less binding than EU policies, this showcases the aligned policy goals of some of the most economically powerful countries, including Germany.
Conclusion
German laws on social media and content moderation, especially NetzDG, have been used as templates for similar laws around the world, including in France, Russia, Austria, and countries in Southeast Asia. However, the law has been widely criticized by social media companies for its unclear language and high financial penalties.[100] Others, such as human rights groups, criticize the law because it could hamper freedom of speech and expression and lead to censorship; Human Rights Watch stated that “[NetzDG] can lead to unaccountable, overbroad censorship and should be promptly reversed.”[101] Nevertheless, as technology and human interaction with it change and develop, new legislation is needed to monitor and regulate them. The German model of regulation illustrates a moderate alternative between authoritarian over-censorship and laissez-faire social media deregulation. Germany is a global leader in digital platform regulation, spanning content moderation and antitrust law, and was among the first to enact regulations attempting to address problems on digital platforms such as mis/malinformation.[102]
Endnotes
[1] Shahbaz, A., & Funk, A. (2021). Freedom on the Net 2021: The Global Drive to Control Big Tech. Freedom House; Freedom House. https://freedomhouse.org/report/freedom-net/2021/global-drive-control-big-tech
[2] Germany – The World Factbook. (2023, October 6). CIA.GOV; Central Intelligence Agency of the United States. https://www.cia.gov/the-world-factbook/countries/germany/#introduction
[3] Kemp, S. “Digital 2024: Germany.” DataReportal – Global Digital Insights, Kepios, 21 Feb. 2024, https://datareportal.com/reports/digital-2024-germany.
[4] Kemp, S. “Digital 2024: Germany.”
[5] “Germany Social Media Statistics 2023 | Most Used Popular Platforms – The Global Statistics.” The Global Statistics, 10 Oct. 2023, https://www.theglobalstatistics.com/germany-social-media-statistics/?expand_article=1.
[6] Kemp, S. “Digital 2024: Germany.”
[7] Kemp, S. “Digital 2024: Germany.”
[8] Broadband in Germany | Shaping Europe’s digital future. (2021). European Commission. Retrieved February 25, 2024, from https://digital-strategy.ec.europa.eu/en/policies/broadband-germany
[9] Center for Systemic Peace. (2021). The Polity Project. PolityProject. Retrieved December 16, 2022, from https://www.systemicpeace.org/polityproject.html
[10] Freedom House. (2022). Germany: Freedom in the World 2022 Country Report. Freedom House. Retrieved December 16, 2022, from https://freedomhouse.org/country/germany/freedom-world/2022
[11] Freedom House. (2024). Germany: Freedom in the World 2024 Country Report. Freedom in the World 2024; Freedom House. https://freedomhouse.org/country/germany/freedom-world/2024
[12] Germany: Freedom on the Net 2022 Country Report. (2022). Freedom House; Freedom House. https://freedomhouse.org/country/germany/freedom-net/2022
[13] Bischoff, P. (2023, October 16). Internet Censorship 2024: A Map of Internet Censorship and Restrictions – Comparitech. Comparitech; Comparitech Limited. https://www.comparitech.com/blog/vpn-privacy/internet-censorship-map/
[14] Schachner, T. (2022, September 9). How to torrent in Germany (safely and privately) in 2022. VPNBrains. Retrieved December 17, 2022, from https://www.vpnbrains.com/blog/torrenting-in-germany/
[15] Germany | The Global State of Democracy. (2023). International IDEA. Retrieved June 24, 2024, from https://www.idea.int/democracytracker/country/germany
[16] International IDEA. (2022). Global State of Democracy Report 2022: Forging social contracts in a time of discontent. Global State of Democracy Initiative. Retrieved December 16, 2022, from https://idea.int/democracytracker/gsod-report-2022
[17] Country Dashboard. (2024). Fragile States Index. Retrieved June 24, 2024, from https://fragilestatesindex.org/country-data/
[18] The Fragile States Index Team. (2023). About. Fragile States Index Annual Report 2023. Retrieved June 24, 2024, from https://fragilestatesindex.org/wp-content/uploads/2023/06/FSI-2023-Report_final.pdf
[19] Nielsen, N. (2018, November 9). Xenophobia on the rise in Germany, study finds. Euobserver.com; EUobserver. https://euobserver.com/migration/143336
[20] Seibriger, M. (2024, April 18). Changing tides: Discourse towards migrants and asylum seekers on Facebook and X in Germany in 2023. ISD. Retrieved April 29, 2024, from https://www.isdglobal.org/digital_dispatches/changing-tides-discourse-towards-migrants-and-asylum-seekers-on-facebook-and-x-in-germany-in-2023/
[21] Institute for Economics & Peace. Global Terrorism Index 2024: Measuring the Impact of Terrorism, Sydney, February 2024. Available from: http://visionofhumanity.org/resources (accessed June 24, 2024)
[22] Institute for Economics and Peace. (2022, March). Global Terrorism Index 2022: Measuring the Impact of Terrorism. Retrieved December 16, 2022, from http://visionofhumanity.org/resources
[23] Graham, M. G., & Haselton, T. (2019, October 9). About 2,200 people watched the German synagogue shooting on Amazon’s twitch. CNBC. Retrieved December 16, 2022, from https://www.cnbc.com/2019/10/09/the-german-synagogue-shooting-was-streamed-on-twitch.html
[24] “Far-Right Terrorism in Germany: Shooting Exposes Lapses in Security Apparatus – DER SPIEGEL.” DER SPIEGEL | Online-Nachrichten, DER SPIEGEL, 11 Oct. 2019, https://www.spiegel.de/international/germany/far-right-terrorism-in-germany-shooting-exposes-lapses-in-security-apparatus-a-1291075.html.
[25] Graham, M. G., & Haselton, T. (2019)
[26] Lees, C. (2018). The ‘Alternative for Germany’: The rise of right-wing populism at the heart of Europe. Politics, 38(3), 295-310. https://doi.org/10.1177/0263395718777718
[27] Diehl, Jörg, et al. “Germany: AfD Populists Dominate on Facebook – DER SPIEGEL.” DER SPIEGEL | Online-Nachrichten, DER SPIEGEL, 29 Apr. 2019, https://www.spiegel.de/international/germany/germany-afd-populists-dominate-on-facebook-a-1264933.html.
[28] Jaursch, Julian. “Disinformation in the 2021 German Federal Elections: What Did and Did Not Occur | Institut Montaigne.” Institut Montaigne, Institut Montaigne, 10 May 2021, https://www.institutmontaigne.org/en/expressions/disinformation-2021-german-federal-elections-what-did-and-did-not-occur.
[29] Schmidt, N. (2019, April 19). Far-right groups shout the loudest on social media. Investigate Europe; Investigate Europe. https://www.investigate-europe.eu/posts/far-right-groups-shout-the-loudest-on-social-media
[30] Bayer, J., By, & Hernández-Echevarría, C. (2021, October 13). Policies and measures to counter disinformation in Germany: The power of informational communities: Heinrich Böll stiftung: Brussels Office – European Union. Heinrich-Böll-Stiftung. Retrieved December 16, 2022, from https://eu.boell.org/en/2021/10/13/policies-and-measures-counter-disinformation-germany-power-informational-communities
[31] Federal Ministry of Justice. (2017). Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act, NetzDG) – Basic Information (2017). Bundesministerium der Justiz. Retrieved December 16, 2022, from https://www.bmj.de/DE/Themen/FokusThemen/NetzDG/NetzDG_EN_node.html
[32] Tworek, H. (2019, April 15). An Analysis of Germany’s NetzDG Law. Transatlantic Working Group; University of British Columbia. https://www.ivir.nl/publicaties/download/NetzDG_Tworek_Leerssen_April_2019.pdf
[33] FSM’s role of self-regulation under the German Network Enforcement Act. (2020, August 17). INHOPE. Retrieved February 10, 2024, from https://www.inhope.org/EN/articles/fsms-role-of-self-regulation-under-the-german-network-enforcement-act
[34] NetzDG. (2023). FSM. Retrieved February 10, 2024, from https://www.fsm.de/en/fsm/netzdg/
[35] Gesley, J. (2021, July 6). Germany: Network Enforcement Act Amended to Better Fight Online Hate Speech. Library of Congress. Retrieved February 22, 2024, from https://www.loc.gov/item/global-legal-monitor/2021-07-06/germany-network-enforcement-act-amended-to-better-fight-online-hate-speech/
[36] Noyan, O. (2022, February 2). Big tech opposes Germany’s enhanced hate speech law. Euractiv. Retrieved February 21, 2024, from https://www.euractiv.com/section/internet-governance/news/german-reinforcement-of-hate-speech-law-faces-opposition-from-big-online-platforms/
[37] Freedom House. (2022). Germany: Freedom in the World 2022 Country Report.
[38] Twitter Netzwerkdurchsetzungsgesetzbericht: Januar – Juni 2023. (2023). X Transparency Center. Retrieved February 19, 2024, from https://transparency.twitter.com/content/dam/transparency-twitter/country-reports/germany/NetzDG-Jan-Jun-2023.pdf
[39] NetzDG Transparenzbericht Jan – Jun 2021. (2023). TikTok. Retrieved February 19, 2024, from https://www.tiktok.com/transparency/de-de/netzdg-2021-1/
[40] Removals under the Network Enforcement Law – Google Transparency Report. (2023). Google Transparency Report. Retrieved February 19, 2024, from https://transparencyreport.google.com/netzdg/youtube?hl=en
[41] Martin, D. (2018, January 6). Satire magazine back on Twitter after ban – DW – 01/06/2018. DW. Retrieved February 19, 2024, from https://www.dw.com/en/german-satire-magazine-titanic-back-on-twitter-following-hate-speech-ban/a-42046485
[42] Reuters. (2021, July 27). Google takes legal action over Germany’s expanded hate-speech law. Reuters. https://www.reuters.com/technology/google-takes-legal-action-over-germanys-expanded-hate-speech-law-2021-07-27/
[43] Federal Ministry of Justice. (2017). Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act, NetzDG) – Basic Information (2017). Bundesministerium der Justiz. Retrieved December 16, 2022, from https://www.bmj.de/DE/Themen/FokusThemen/NetzDG/NetzDG_EN_node.html
[44] German Criminal Code (Strafgesetzbuch – StGB) (2021). Retrieved from https://www.gesetze-im-internet.de/englisch_stgb/englisch_stgb.html.
[45] Schmitz, Sandra and Berndt, Christian, The German Act on Improving Law Enforcement on Social Networks (NetzDG): A Blunt Sword? (December 14, 2018). Available at SSRN: https://ssrn.com/abstract=3306964 or http://dx.doi.org/10.2139/ssrn.3306964
[46] Zurth, Patrick, The German NetzDG as Role Model or Cautionary Tale? Implications for the Debate on Social Media Liability (April 27, 2021). 31 Fordham Intellectual Property, Media & Entertainment Law Journal 1084 (2021), Available at SSRN: https://ssrn.com/abstract=3668804 or http://dx.doi.org/10.2139/ssrn.3668804
[47] Escritt, T. (2019, July 2). Germany fines facebook for under-reporting complaints. Reuters. Retrieved December 17, 2022, from https://www.reuters.com/article/uk-facebook-germany-fine-idUKKCN1TX1I0
[48] Tech & Terrorism: Germany Fines Telegram For Failing To Comply With Online Content Moderation Law. (2022, October 21). Counter Extremism Project. Retrieved February 10, 2024, from https://www.counterextremism.com/press/tech-terrorism-germany-fines-telegram-failing-comply-online-content-moderation-law
[49] Germany: Telegram becoming a ‘medium for radicalization’. (2022, January 26). AP News. Retrieved February 10, 2024, from https://apnews.com/article/coronavirus-pandemic-technology-health-business-germany-35249e78c65b9010a5bc67e9ec1e06b8
[50] Echikson, W., & Knodt, O. (2018, November). Germany’s NetzDG: A Key Test for Combatting Online Hate. Counter Extremism Project. Retrieved February 10, 2024, from https://www.counterextremism.com/sites/default/files/CEP-CEPS_Germany%27s%20NetzDG_020119.pdf
[51] European Union: European Parliament, European Parliament legislative resolution of 2 October 2018 on the proposal for a directive of the European Parliament and of the Council amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services in view of changing market realities (COM(2016)0287 – C8-0193/2016 – 2016/0151(COD)), 02 October 2018, https://www.europarl.europa.eu/doceo/document/TA-8-2018-0364_EN.html?redirect#title1
[52] Hansen, M. (2020, February 13). Germany Likely to Adopt Unique Regulatory Regime for Intermediaries to Media Services. Inside Global Tech. Retrieved February 8, 2024, from https://www.insideglobaltech.com/2020/02/13/germany-likely-to-adopt-unique-regulatory-regime-for-intermediaries-to-media-services/
[53] The new German State Media Treaty – legal requirements on telemedia. (2020, August 5). Simmons & Simmons. Retrieved February 8, 2024, from https://www.simmons-simmons.com/en/publications/ckdhmlhufqy9g09265hhz6tw1/the-new-german-state-media-treaty—legal-requirements-on-telemedia
[54] Simmons & Simmons, 2020.
[55] OECD DIRECTORATE FOR FINANCIAL AND ENTERPRISE AFFAIRS COMPETITION COMMITTEE. (2021, December 3). News Media and Digital Platforms – Note by Germany. Retrieved February 8, 2024, from https://one.oecd.org/document/DAF/COMP/WD(2021)69/en/pdf
[56] Gleicher, N. (2021, September 16). Removing New Types of Harmful Networks | Meta. Meta. Retrieved February 13, 2024, from https://about.fb.com/news/2021/09/removing-new-types-of-harmful-networks/
[57] Heldt, A. (2021, September 28). Facebook suspends accounts of German Covid-19-deniers: Can „Coordinated Social Harm“ be a justification for limiting freedom of expression? Verfassungsblog. Retrieved February 13, 2024, from https://verfassungsblog.de/querdenker-suspension-fb/
[58] Etteldorf, C. (2021). [DE] FEDERAL SUPREME COURT FINDS FACEBOOK TERMS OF USE INEFFECTIVE IN RELATION TO HATE SPEECH. IRIS Merlin. Retrieved February 21, 2024, from https://merlin.obs.coe.int/article/9273
[59] Federal Ministry of Justice. (2017)
[60] Federal Ministry of Justice. (2017)
[61] Echikson, William and Knodt, Olivia, Germany’s NetzDG: A Key Test for Combatting Online Hate (November 22, 2018). CEPS Policy Insight, Available at SSRN: https://ssrn.com/abstract=3300636
[62] Cooke, C. (2021, March 15). New organisation launched in Germany to allow web-blocking without court orders | Complete Music Update. CMU. Retrieved February 26, 2024, from https://archive.completemusicupdate.com/article/new-organisation-launched-in-germany-to-allow-web-blocking-without-court-orders/
[63] Freedom House. (2023). Germany: Freedom in the World 2023 Country Report.
[64] Leška, R. (2022, February 28). Implementation of Art. 17 DSM Directive into German National Law – the German Act on the Copyright Liability of Online Content Sharing Service Providers (UrhDaG). Kluwer Copyright Blog. Retrieved February 27, 2024, from https://copyrightblog.kluweriplaw.com/2022/02/28/implementation-of-art-17-dsm-directive-into-german-national-law-the-german-act-on-the-copyright-liability-of-online-content-sharing-service-providers-urhdag/
[65] Federal Data Protection Act (BDSG) (n.d.). bill. Retrieved December 16, 2022, from https://www.gesetze-im-internet.de/englisch_bdsg/englisch_bdsg.html#p0012.
[66] Federal Data Protection Act (BDSG)
[67] German Law on the Protection of Digital Violence Must Not Curtail Civil Rights – eco. (2023, May 24). eco – Association of the Internet Industry. Retrieved February 27, 2024, from https://international.eco.de/presse/german-law-on-the-protection-of-digital-violence-must-not-curtail-civil-rights/
[68] Clasen, A. (2023, April 17). Germany plans legislation to block cyber-hate accounts. Euractiv. Retrieved February 27, 2024, from https://www.euractiv.com/section/platforms/news/germany-plans-legislation-to-block-cyber-hate-accounts/
[69] Meineck, S., & Pitz, L. (2023, November 24). Digitale Gewalt: Bundesregierung im Blindflug. Netzpolitik. Retrieved February 27, 2024, from https://netzpolitik.org/2023/digitale-gewalt-bundesregierung-im-blindflug/
[70] Madiega, T. (2020, April 16). Reform of the EU liability regime for online intermediaries. European Parliament. Retrieved February 8, 2024, from https://www.europarl.europa.eu/RegData/etudes/IDAN/2020/649404/EPRS_IDA(2020)649404_EN.pdf
[71] Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, (2020), 15 December
[72] epicenter.works. (2020, September 16). First Analysis of the Austrian anti-hate speech law (netdg/koplg). European Digital Rights (EDRi). Retrieved December 17, 2022, from https://edri.org/our-work/first-analysis-of-the-austrian-anti-hate-speech-law-netdg-koplg/
[73] Bayer, J., By, & Hernández-Echevarría, C. 2021.
[74] Tackling disinformation: the EU Digital Services Act explained. (2023, November 11). Siren Associates. Retrieved February 8, 2024, from https://sirenassociates.com/policy-papers/the-eu-digital-services-act-overview-and-opportunities/
[75] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (2000) Official Journal L 178, P. 0001 – 0016
[76] Chee, F. Y. (2023, June 27). Zalando sues EU Commission over online content rules, seen as first challenge. Reuters. Retrieved February 27, 2024, from https://www.reuters.com/business/retail-consumer/zalando-sues-eu-commission-over-landmark-online-content-rules-2023-06-27/
[77] Siren Associates, 2023.
[78] Miller, G. (2023, December 18). EU Tests DSA with Investigative Proceedings Against Musk’s X | TechPolicy.Press. Tech Policy Press. Retrieved February 27, 2024, from https://www.techpolicy.press/eu-tests-dsa-with-investigative-proceedings-against-musks-x/
[79] DSA: Commission opens formal proceedings against TikTok. (2024, February 19). European Commission. Retrieved February 27, 2024, from https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926
[80] Miller, G. (2024, February 19). The Digital Services Act Is Fully In Effect, But Many Questions Remain | TechPolicy.Press. Tech Policy Press. Retrieved February 27, 2024, from https://www.techpolicy.press/the-digital-services-act-in-full-effect-questions-remain/
[81] Digital Markets Act (2020)
[82] Kawaguchi, T. (2024, April 28). How democratic states are regulating digital platforms. The Japan Times. Retrieved May 31, 2024, from https://www.japantimes.co.jp/commentary/2024/04/28/world/digital-platform-regulation/
[83] Leška, R. (2022, February 28)
[84] Leška, R. (2022, February 28)
[85] Goujard, C. (2022, June 7). Online platforms now have an hour to remove terrorist content in the EU. POLITICO.eu. Retrieved February 27, 2024, from https://www.politico.eu/article/online-platforms-to-take-down-terrorist-content-under-an-hour-in-the-eu/
[86] Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (2021) EUR – Lex
[87] European Commission. (2022, June 16). 2018 Code of Practice on Disinformation | Shaping Europe’s digital future. EU Commission; European Union. https://digital-strategy.ec.europa.eu/en/library/2018-code-practice-disinformation
[88] Commission presents guidance to strengthen the Code of Practice on Disinformation. (2021, May 26). European Commission – European Commission; European Union. https://ec.europa.eu/commission/presscorner/detail/en/ip_21_2585
[89] European Commission. (2022, June 16) 2022 Strengthened Code of Practice on Disinformation | Shaping Europe’s digital future. European Union. https://digital-strategy.ec.europa.eu/en/library/2022-strengthened-code-practice-disinformation
[90] Tanner, B. (2022, August 5). EU Code of Practice on Disinformation. Brookings; Brookings Institute. https://www.brookings.edu/articles/eu-code-of-practice-on-disinformation/
[91] Dizikes, P. (2018, March 8). Study: On Twitter, false news travels faster than true stories. MIT News; Massachusetts Institute of Technology. https://news.mit.edu/2018/study-twitter-false-news-travels-faster-true-stories-0308
[92] Monitoring disinformation online and the effectiveness of the Code of Practice on Disinformation: The first pilot measurement of Structural Indicators is now available. (2023, October 25). Centre for Media Pluralism and Freedom; European University Institute. https://cmpf.eui.eu/first-pilot-measurement-of-structural-indicators-on-disinformation-2/
[93] Digital Services Act (2020)
[94] Guidance to strengthen the Code of Practice on Disinformation – Questions and Answers. (2021, May 26). European Commission ; European Union. https://ec.europa.eu/commission/presscorner/detail/pl/QANDA_21_2586
[95] Krukowska, E. (2023, May 27). Twitter Withdraws From EU Disinformation Code: Commissioner. Time. Retrieved February 8, 2024, from https://time.com/6283183/twitter-withdraws-from-eu-disinformation-code-commissioner-says/
[96] Commission welcomes political agreement on EMFA. (2023, December 15). European Commission. Retrieved February 8, 2024, from https://ec.europa.eu/commission/presscorner/detail/en/ip_23_6635
[97] Commission welcomes G7 leaders’ agreement on Guiding Principles and a Code of Conduct on Artificial Intelligence. (2023, October 30). European Commission. Retrieved February 8, 2024, from https://ec.europa.eu/commission/presscorner/detail/en/ip_23_5379
[98] Hirosawa, M. (2023, December 2). G7 agrees on first comprehensive guidelines for generative AI. Nikkei Asia. Retrieved February 8, 2024, from https://asia.nikkei.com/Business/Technology/G7-agrees-on-first-comprehensive-guidelines-for-generative-AI
[99] Commission Welcomes G7 Leaders’ Agreement on Guiding Principles and a Code of Conduct on Artificial Intelligence | EEAS. (2023, November 2). EEAS. Retrieved February 8, 2024, from https://www.eeas.europa.eu/delegations/montenegro/commission-welcomes-g7-leaders-agreement-guiding-principles-and-code-conduct-artificial-intelligence_en?s=225
[100] Hate Aid. (2019, June 17). Stellungnahme zum Gesetzentwurf zur Änderung des Netzwerkdurchsetzungsgesetzes [Statement on the draft bill amending the Network Enforcement Act]. HateAid. Retrieved December 17, 2022, from https://cdn.netzpolitik.org/wp-upload/2017/05/Facebook_Stellungnahme_zum_Entwurf_des_NetzDG.pdf
[101] Human Rights Watch. (2020, October 28). Germany: Flawed social media law. Human Rights Watch. Retrieved December 17, 2022, from https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law
[102] Germany’s New Competition Rules for Tech Platforms. (2021, January 19). Jones Day. Retrieved February 19, 2024, from https://www.jonesday.com/en/insights/2021/01/germany-adopts-new-competition-rules
Bibliography
AP News. (2022, January 26). Germany: Telegram becoming a ‘medium for radicalization’. Retrieved February 10, 2024, from https://apnews.com/article/coronavirus-pandemic-technology-health-business-germany-35249e78c65b9010a5bc67e9ec1e06b8
Bayer, J., & Hernández-Echevarría, C. (2021, October 13). Policies and measures to counter disinformation in Germany: The power of informational communities. Heinrich-Böll-Stiftung, Brussels Office – European Union. Retrieved December 16, 2022, from https://eu.boell.org/en/2021/10/13/policies-and-measures-counter-disinformation-germany-power-informational-communities
Bischoff, P. (2023, October 16). Internet Censorship 2024: A Map of Internet Censorship and Restrictions. Comparitech.
Center for Systemic Peace. (2010). Polity IV Country Report 2010: Germany. Polity IV Country Reports 2010. Retrieved December 17, 2022, from http://www.systemicpeace.org/polity/Germany2010.pdf
Center for Systemic Peace. (2021). The Polity Project. Retrieved December 16, 2022, from https://www.systemicpeace.org/polityproject.html
Central Intelligence Agency of the United States. (2023, October 6). Germany – The World Factbook. CIA.gov. https://www.cia.gov/the-world-factbook/countries/germany/#introduction
Centre for Media Pluralism and Freedom; European University Institute. (2023, October 25). Monitoring disinformation online and the effectiveness of the Code of Practice on Disinformation: The first pilot measurement of Structural Indicators is now available. https://cmpf.eui.eu/first-pilot-measurement-of-structural-indicators-on-disinformation-2/
Chee, F. Y. (2023, June 27). Zalando sues EU Commission over online content rules, seen as first challenge. Reuters. Retrieved February 27, 2024, from https://www.reuters.com/business/retail-consumer/zalando-sues-eu-commission-over-landmark-online-content-rules-2023-06-27/
Clasen, A. (2023, April 17). Germany plans legislation to block cyber-hate accounts. Euractiv. Retrieved February 27, 2024, from https://www.euractiv.com/section/platforms/news/germany-plans-legislation-to-block-cyber-hate-accounts/
Cooke, C. (2021, March 15). New organisation launched in Germany to allow web-blocking without court orders | Complete Music Update. CMU. Retrieved February 26, 2024, from https://archive.completemusicupdate.com/article/new-organisation-launched-in-germany-to-allow-web-blocking-without-court-orders/
Counter Extremism Project. (2022, October 21). Tech & Terrorism: Germany Fines Telegram For Failing To Comply With Online Content Moderation Law. Retrieved February 10, 2024, from https://www.counterextremism.com/press/tech-terrorism-germany-fines-telegram-failing-comply-online-content-moderation-law
Country Dashboard. (2024). Fragile States Index. Retrieved June 24, 2024, from https://fragilestatesindex.org/country-data/
DER SPIEGEL. (2019, October 11). Far-Right Terrorism in Germany: Shooting Exposes Lapses in Security Apparatus. DER SPIEGEL. https://www.spiegel.de/international/germany/far-right-terrorism-in-germany-shooting-exposes-lapses-in-security-apparatus-a-1291075.html
Diehl, J., et al. (2019, April 29). Germany: AfD Populists Dominate on Facebook. DER SPIEGEL. https://www.spiegel.de/international/germany/germany-afd-populists-dominate-on-facebook-a-1264933.html
Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (2000) Official Journal L 178, P. 0001 – 0016
Dizikes, P. (2018, March 8). Study: On Twitter, false news travels faster than true stories. MIT News; Massachusetts Institute of Technology. https://news.mit.edu/2018/study-twitter-false-news-travels-faster-true-stories-0308
Echikson, W., & Knodt, O. (2018, November). Germany’s NetzDG: A Key Test for Combatting Online Hate. Counter Extremism Project. Retrieved February 10, 2024, from https://www.counterextremism.com/sites/default/files/CEP-CEPS_Germany%27s%20NetzDG_020119.pdf
Echikson, W., & Knodt, O. (2018, November 22). Germany’s NetzDG: A Key Test for Combatting Online Hate. CEPS Policy Insight. CEPS. Available at SSRN: https://ssrn.com/abstract=3300636
eco – Association of the Internet Industry. (2023, May 24). German Law on the Protection of Digital Violence Must Not Curtail Civil Rights. Retrieved February 27, 2024, from https://international.eco.de/presse/german-law-on-the-protection-of-digital-violence-must-not-curtail-civil-rights/
EDRi. (2020, September 16). First analysis of the Austrian anti-hate speech law (NetzDG/KoPlG). European Digital Rights (EDRi). Retrieved December 17, 2022, from https://edri.org/our-work/first-analysis-of-the-austrian-anti-hate-speech-law-netdg-koplg/
EEAS. (2023, November 2). Commission Welcomes G7 Leaders’ Agreement on Guiding Principles and a Code of Conduct on Artificial Intelligence | EEAS. Retrieved February 8, 2024, from https://www.eeas.europa.eu/delegations/montenegro/commission-welcomes-g7-leaders-agreement-guiding-principles-and-code-conduct-artificial-intelligence_en?s=225
Escritt, T. (2019, July 2). Germany fines facebook for under-reporting complaints. Reuters. Retrieved December 17, 2022, from https://www.reuters.com/article/uk-facebook-germany-fine-idUKKCN1TX1I0
Etteldorf, C. (2021). [DE] Federal Supreme Court finds Facebook terms of use ineffective in relation to hate speech. IRIS Merlin. Retrieved February 21, 2024, from https://merlin.obs.coe.int/article/9273
European Commission. (2021, May 26).Commission presents guidance to strengthen the Code of Practice on Disinformation. European Union. https://ec.europa.eu/commission/presscorner/detail/en/ip_21_2585
European Commission. (2021, May 26). Guidance to strengthen the Code of Practice on Disinformation – Questions and Answers. European Union. https://ec.europa.eu/commission/presscorner/detail/pl/QANDA_21_2586
European Commission. (2021). Broadband in Germany | Shaping Europe’s digital future. Retrieved February 25, 2024, from https://digital-strategy.ec.europa.eu/en/policies/broadband-germany
European Commission. (2022, June 16). 2018 Code of Practice on Disinformation | Shaping Europe’s digital future. EU Commission; European Union. https://digital-strategy.ec.europa.eu/en/library/2018-code-practice-disinformation
European Commission. (2022, June 16) 2022 Strengthened Code of Practice on Disinformation | Shaping Europe’s digital future. European Union. https://digital-strategy.ec.europa.eu/en/library/2022-strengthened-code-practice-disinformation
European Commission. (2023, December 15). Commission welcomes political agreement on EMFA. Retrieved February 8, 2024, from https://ec.europa.eu/commission/presscorner/detail/en/ip_23_6635
European Commission. (2024, February 19). DSA: Commission opens formal proceedings against TikTok. Retrieved February 27, 2024, from https://ec.europa.eu/commission/presscorner/detail/en/ip_24_926
European Commission (2023, October 30). Commission welcomes G7 leaders’ agreement on Guiding Principles and a Code of Conduct on Artificial Intelligence. Retrieved February 8, 2024, from https://ec.europa.eu/commission/presscorner/detail/en/ip_23_5379
European Union: European Parliament, European Parliament legislative resolution of 2 October 2018 on the proposal for a directive of the European Parliament and of the Council amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services in view of changing market realities (COM(2016)0287 – C8-0193/2016 – 2016/0151(COD)), 02 October 2018, https://www.europarl.europa.eu/doceo/document/TA-8-2018-0364_EN.html?redirect#title1
Federal Data Protection Act (BDSG). (n.d.). Retrieved December 16, 2022, from https://www.gesetze-im-internet.de/englisch_bdsg/englisch_bdsg.html#p0012
Federal Ministry of Justice. (2017). Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act, NetzDG) – Basic Information (2017). Bundesministerium der Justiz. Retrieved December 16, 2022, from https://www.bmj.de/DE/Themen/FokusThemen/NetzDG/NetzDG_EN_node.html
Freedom House. (2022). Germany: Freedom in the World 2022 Country Report. Freedom House. Retrieved December 16, 2022, from https://freedomhouse.org/country/germany/freedom-world/2022
Freedom House. (2024). Germany: Freedom in the World 2024 Country Report. Freedom in the World 2024; Freedom House. https://freedomhouse.org/country/germany/freedom-world/2024
Fukuzawa, N. (2013). A Model of the Welfare System in FRG (Federal Republic of Germany) after WWII in Respect to Its Economic Order (Whither European Models of Capitalism?). Political Economy Quarterly, 49(4), 43–53. J-STAGE. https://doi.org/10.20667/peq.49.4_43
German Criminal Code (Strafgesetzbuch – StGB) (2021). Retrieved from https://www.gesetze-im-internet.de/englisch_stgb/englisch_stgb.html
Gesley, J. (2021, July 6). Germany: Network Enforcement Act Amended to Better Fight Online Hate Speech. Library of Congress. Retrieved February 22, 2024, from https://www.loc.gov/item/global-legal-monitor/2021-07-06/germany-network-enforcement-act-amended-to-better-fight-online-hate-speech/
Gleicher, N. (2021, September 16). Removing New Types of Harmful Networks | Meta. Meta. Retrieved February 13, 2024, from https://about.fb.com/news/2021/09/removing-new-types-of-harmful-networks/
Google. (2023). Removals under the Network Enforcement Law. Google Transparency Report. Retrieved February 19, 2024, from https://transparencyreport.google.com/netzdg/youtube?hl=en
Goujard, C. (2022, June 7). Online platforms now have an hour to remove terrorist content in the EU. POLITICO.eu. Retrieved February 27, 2024, from https://www.politico.eu/article/online-platforms-to-take-down-terrorist-content-under-an-hour-in-the-eu/
Graham, M. G., & Haselton, T. (2019, October 9). About 2,200 people watched the German synagogue shooting on Amazon’s Twitch. CNBC. Retrieved December 16, 2022, from https://www.cnbc.com/2019/10/09/the-german-synagogue-shooting-was-streamed-on-twitch.html
Hansen, M. (2020, February 13). Germany Likely to Adopt Unique Regulatory Regime for Intermediaries to Media Services. Inside Global Tech. Retrieved February 8, 2024, from https://www.insideglobaltech.com/2020/02/13/germany-likely-to-adopt-unique-regulatory-regime-for-intermediaries-to-media-services/
Hate Aid. (2019, June 17). Stellungnahme zum Gesetzentwurf zur Änderung des Netzwerkdurchsetzungsgesetzes [Statement on the draft bill amending the Network Enforcement Act]. HateAid. Retrieved December 17, 2022, from https://cdn.netzpolitik.org/wp-upload/2017/05/Facebook_Stellungnahme_zum_Entwurf_des_NetzDG.pdf
Heldt, A. (2021, September 28). Facebook suspends accounts of German Covid-19-deniers: Can „Coordinated Social Harm“ be a justification for limiting freedom of expression? Verfassungsblog. Retrieved February 13, 2024, from https://verfassungsblog.de/querdenker-suspension-fb/
Hirosawa, M. (2023, December 2). G7 agrees on first comprehensive guidelines for generative AI. Nikkei Asia. Retrieved February 8, 2024, from https://asia.nikkei.com/Business/Technology/G7-agrees-on-first-comprehensive-guidelines-for-generative-AI
Human Rights Watch. (2020, October 28). Germany: Flawed social media law. Human Rights Watch. Retrieved December 17, 2022, from https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law
INHOPE. (2020, August 17). FSM’s role of self-regulation under the German Network Enforcement Act. Retrieved February 10, 2024, from https://www.inhope.org/EN/articles/fsms-role-of-self-regulation-under-the-german-network-enforcement-act
Institute for Economics and Peace. (2022, March). Global Terrorism Index 2022: Measuring the Impact of Terrorism. Retrieved December 16, 2022, from http://visionofhumanity.org/resources
International IDEA. (2022). Global State of Democracy Report 2022: Forging social contracts in a time of discontent. Global State of Democracy Initiative. Retrieved December 16, 2022, from https://idea.int/democracytracker/gsod-report-2022
International Institute for Democracy and Electoral Assistance. (2023). Germany 2023 Democracy. The Global State of Democracy Indices. Retrieved June 24, 2024, from https://www.idea.int/gsod-indices/
Jaursch, J. (2021, May 10). Disinformation in the 2021 German Federal Elections: What Did and Did Not Occur. Institut Montaigne. https://www.institutmontaigne.org/en/expressions/disinformation-2021-german-federal-elections-what-did-and-did-not-occur
Jones Day. (2021, January 19). Germany’s New Competition Rules for Tech Platforms. Retrieved February 19, 2024, from https://www.jonesday.com/en/insights/2021/01/germany-adopts-new-competition-rules
Kawaguchi, T. (2024, April 28). How democratic states are regulating digital platforms. The Japan Times. Retrieved May 31, 2024, from https://www.japantimes.co.jp/commentary/2024/04/28/world/digital-platform-regulation/
Kemp, S. (2024, February 21). Digital 2024: Germany. DataReportal – Global Digital Insights; Kepios. https://datareportal.com/reports/digital-2024-germany
Kosse, F., & Piketty, T. (2020, July). Electoral cleavages and socioeconomic inequality in Germany 1949–2017. World Inequality Database. Retrieved December 16, 2022, from https://wid.world/document/electoral-cleavages-and-socioeconomic-inequality-in-germany-1949-2017-world-inequality-lab-wp-2020-15/
Krukowska, E. (2023, May 27). Twitter Withdraws From EU Disinformation Code: Commissioner. Time. Retrieved February 8, 2024, from https://time.com/6283183/twitter-withdraws-from-eu-disinformation-code-commissioner-says/
Lees, C. (2018). The ‘Alternative for Germany’: The rise of right-wing populism at the heart of Europe. Politics, 38(3), 295-310. https://doi.org/10.1177/0263395718777718
Leška, R. (2022, February 28). Implementation of Art. 17 DSM Directive into German National Law – the German Act on the Copyright Liability of Online Content Sharing Service Providers (UrhDaG). Kluwer Copyright Blog. Retrieved February 27, 2024, from https://copyrightblog.kluweriplaw.com/2022/02/28/implementation-of-art-17-dsm-directive-into-german-national-law-the-german-act-on-the-copyright-liability-of-online-content-sharing-service-providers-urhdag/
Madiega, T. (2020, April 16). Reform of the EU liability regime for online intermediaries. European Parliament. Retrieved February 8, 2024, from https://www.europarl.europa.eu/RegData/etudes/IDAN/2020/649404/EPRS_IDA(2020)649404_EN.pdf
Martin, D. (2018, January 6). Satire magazine back on Twitter after ban. DW. Retrieved February 19, 2024, from https://www.dw.com/en/german-satire-magazine-titanic-back-on-twitter-following-hate-speech-ban/a-42046485
Meineck, S., & Pitz, L. (2023, November 24). Digitale Gewalt: Bundesregierung im Blindflug [Digital violence: The federal government is flying blind]. Netzpolitik. Retrieved February 27, 2024, from https://netzpolitik.org/2023/digitale-gewalt-bundesregierung-im-blindflug/
Meta. (2023, January 31). Facebook veröffentlicht zehnten NetzDG-Transparenzbericht [Facebook publishes tenth NetzDG transparency report]. Über Meta. Retrieved February 19, 2024, from https://about.fb.com/de/news/2023/01/facebook-veroeffentlicht-zehnten-netzdg-transparenzbericht/
Miller, G. (2023, December 18). EU Tests DSA with Investigative Proceedings Against Musk’s X. Tech Policy Press. Retrieved February 27, 2024, from https://www.techpolicy.press/eu-tests-dsa-with-investigative-proceedings-against-musks-x/
Miller, G. (2024, February 19). The Digital Services Act Is Fully In Effect, But Many Questions Remain. Tech Policy Press. Retrieved February 27, 2024, from https://www.techpolicy.press/the-digital-services-act-in-full-effect-questions-remain/
NetzDG. (2023). FSM. Retrieved February 10, 2024, from https://www.fsm.de/en/fsm/netzdg/
Nielsen, N. (2018, November 9). Xenophobia on the rise in Germany, study finds. Euobserver.com; EUobserver. https://euobserver.com/migration/143336
Noyan, O. (2022, February 2). Big tech opposes Germany’s enhanced hate speech law. Euractiv. Retrieved February 21, 2024, from https://www.euractiv.com/section/internet-governance/news/german-reinforcement-of-hate-speech-law-faces-opposition-from-big-online-platforms/
OECD. (2016). Germany. OECD Better Life Index; Organisation for Economic Co-operation and Development. https://www.oecdbetterlifeindex.org/countries/germany/
OECD Directorate for Financial and Enterprise Affairs Competition Committee. (2021, December 3). News Media and Digital Platforms – Note by Germany. Retrieved February 8, 2024, from https://one.oecd.org/document/DAF/COMP/WD(2021)69/en/pdf
Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, (2020), 15 December
Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (2021) EUR – Lex
Reset Tech. (2021, September 20). 9 months after the Capitol Hill insurrection, Big Tech puts the German election at risk. Retrieved February 21, 2024, from https://public.reset.tech/documents/210831_Reset_Facebook_Bundestagswahl_EN.pdf
Reuters. (2021, July 27). Google takes legal action over Germany’s expanded hate-speech law. Reuters. https://www.reuters.com/technology/google-takes-legal-action-over-germanys-expanded-hate-speech-law-2021-07-27/
Schachner, T. (2022, September 9). How to torrent in Germany (safely and privately) in 2022. VPNBrains. Retrieved December 17, 2022, from https://www.vpnbrains.com/blog/torrenting-in-germany/
Schmidt, N. (2019, April 19). Far-right groups shout the loudest on social media. Investigate Europe; Investigate Europe. https://www.investigate-europe.eu/posts/far-right-groups-shout-the-loudest-on-social-media
Schmitz, S., & Berndt, C. (2018, December 14). The German Act on Improving Law Enforcement on Social Networks (NetzDG): A Blunt Sword? Available at SSRN: https://ssrn.com/abstract=3306964 or http://dx.doi.org/10.2139/ssrn.3306964
Seibriger, M. (2024, April 18). Changing tides: Discourse towards migrants and asylum seekers on Facebook and X in Germany in 2023. ISD. Retrieved April 29, 2024, from https://www.isdglobal.org/digital_dispatches/changing-tides-discourse-towards-migrants-and-asylum-seekers-on-facebook-and-x-in-germany-in-2023/
Shahbaz, A., & Funk, A. (2021). Freedom on the Net 2021: The Global Drive to Control Big Tech. Freedom House; Freedom House. https://freedomhouse.org/report/freedom-net/2021/global-drive-control-big-tech
Shahbaz, Funk, Brody, Vesteinsson, Baker, Grothe, Barak, Masinsin, Modi, & Sutterlin (Eds.). (2023). Freedom on the Net 2023. Freedom House. freedomonthenet.org
Simmons & Simmons. (2020, August 5). The new German State Media Treaty – legal requirements on telemedia. Retrieved February 8, 2024, from https://www.simmons-simmons.com/en/publications/ckdhmlhufqy9g09265hhz6tw1/the-new-german-state-media-treaty—legal-requirements-on-telemedia
Siren Associates. (2023, November 11). Tackling disinformation: the EU Digital Services Act explained. Retrieved February 8, 2024, from https://sirenassociates.com/policy-papers/the-eu-digital-services-act-overview-and-opportunities/
Tanner, B. (2022, August 5). EU Code of Practice on Disinformation. Brookings; Brookings Institute. https://www.brookings.edu/articles/eu-code-of-practice-on-disinformation/
The Fragile States Index Team. (2023). About. Fragile States Index Annual Report 2023. Retrieved June 24, 2024, from https://fragilestatesindex.org/wp-content/uploads/2023/06/FSI-2023-Report_final.pdf
The Global Statistics. (2023, October 10). Germany Social Media Statistics 2023 | Most Used Popular Platforms. https://www.theglobalstatistics.com/germany-social-media-statistics/?expand_article=1
The World Bank. (2021). Access to electricity (% of population) – Germany. World Bank Global Electrification Database. Retrieved November 26, 2023, from https://data.worldbank.org/indicator/EG.ELC.ACCS.ZS?locations=DE
TikTok. (2023). NetzDG Transparenzbericht Jan – Jun 2021 [NetzDG transparency report, January–June 2021]. Retrieved February 19, 2024, from https://www.tiktok.com/transparency/de-de/netzdg-2021-1/
Twitter. (2023). Twitter Netzwerkdurchsetzungsgesetzbericht, Januar – Juni 2023 [Twitter Network Enforcement Act report, January–June 2023]. X Transparency Center. Retrieved February 19, 2024, from https://transparency.twitter.com/content/dam/transparency-twitter/country-reports/germany/NetzDG-Jan-Jun-2023.pdf
Tworek, H. (2019, April 15). An Analysis of Germany’s NetzDG Law. Transatlantic Working Group; University of British Columbia. https://www.ivir.nl/publicaties/download/NetzDG_Tworek_Leerssen_April_2019.pdf
United Nations Development Programme. (2022, December 14). Human climate horizons data platform. Home | Human Development Reports. Retrieved December 17, 2022, from https://hdr.undp.org/
Zurth, P. (2021, April 27). The German NetzDG as Role Model or Cautionary Tale? Implications for the Debate on Social Media Liability. 31 Fordham Intellectual Property, Media & Entertainment Law Journal 1084 (2021). Available at SSRN: https://ssrn.com/abstract=3668804 or http://dx.doi.org/10.2139/ssrn.3668804