The gendered disinformation playbook in Germany is a warning for Europe

Election posters of Germany's top candidates for chancellor—Annalena Baerbock, co-leader of Germany's Green party; Olaf Scholz of the Social Democratic Party; and Armin Laschet, leader of the Christian Democratic Union—are displayed in Berlin.

Gendered disinformation attacks online are a well-known tactic that illiberal actors around the world—including Russia, Hungary and Brazil—have developed to undermine their opponents. By building on sexist narratives, these actors intimidate women in order to silence critics, consolidate power, and undermine democratic processes. Such disinformation tactics are being imported to the West and are increasingly being adopted by both foreign actors and the far right in Europe.  

Recent elections in Germany provided ample evidence for how such attacks are deployed. Russian state-backed media amplified disinformation and provided more extensive negative coverage regarding Annalena Baerbock, the Green Party’s candidate for chancellor, compared to her male rivals, according to data from the Alliance for Securing Democracy, German Marshall Fund, and the Institute for Strategic Dialogue. Amid mounting concerns about disinformation and foreign interference, Germany has adopted the world’s toughest law against online hate speech and harassment—the Network Enforcement Act (NetzDG). But this wasn’t enough to overcome the disinformation and gender-based online violence facilitated by social media platforms.   

These developments in Germany provide important lessons for European policymakers at work crafting updated regulations. One major piece of that reform package is the Digital Services Act (DSA), which is intended to create a safer, more open digital space across the European Union, greater platform accountability, and more democratic oversight—especially through recently proposed amendments. These changes to the DSA would improve platform accountability for algorithms, force large platforms to assess algorithms’ impact on fundamental rights, and mandate risk assessments regarding platforms’ impact on “the right to gender equality.” Because online abuse is facilitated by platforms’ design features, platforms need to be obligated to identify, prevent, and mitigate the risk of gender-based violence taking place on and being amplified by their products. If the DSA is ever going to address gendered disinformation, it is critical that these amendments be adopted. Whether that happens depends on the extent to which lawmakers in Brussels understand and care about the risks to future elections when women and gender equality are undermined online. 

Against this backdrop, understanding what happened in Germany and how it illustrates the formula of gendered disinformation could not be more relevant.  

Gendered disinformation targeting Baerbock 

As Germany’s election results came in, it became clear that Baerbock would not be the next chancellor. With less than 15% of the vote, her Green Party trailed far behind Olaf Scholz’s Social Democrats and Armin Laschet’s Christian Democrats. Many, even in her own party, feared such a result. Public opinion polls measuring support for each candidate showed that Baerbock entered the election as the front-runner, yet over a five-month period her support dropped by more than ten points. While Baerbock made mistakes as a candidate, including in her handling of a cheating scandal and a claim of plagiarism, her candidacy also suffered due to gendered disinformation and predictable bias against women. Baerbock’s mistakes, as is the case for most women candidates, became outsized sources of attack and soon morphed into disinformation that played on societal bias and sexism. Unlike men, whose qualifications for political office are usually assumed, women candidates must prove they are qualified and “look the part”—which is why character-based attacks, like those faced by Baerbock, are particularly damaging for women.  

From the beginning of her candidacy, Baerbock faced a disproportionate array of vicious online attacks from both foreign and domestic actors, often steeped in sexism. According to a recent study by the Institute for Strategic Dialogue, 18 of the most shared Facebook posts about Baerbock contained false information or allusions to conspiracy theories, compared to only three of the most shared posts about Scholz and none of the posts regarding Laschet. This trend was replicated on Telegram, where 43 of the most popular posts about Baerbock contained misinformation or conspiratorial narratives, compared to only 26 posts about Laschet and 17 of the posts about Scholz.  

Not only was Baerbock attacked more often than her male competitors, the nature of the attacks against her was different. Attacks on Baerbock tended to be more personal (often referring to her “incompetence” as opposed to her policy proposals or ideas), hyperbolic (on Facebook, Baerbock was described 15 times more often than Scholz and 7.5 times more often than Laschet as a “danger to Germany”), and frequently sexist, containing clear references to her gender. On Telegram, a fake quote circulated that Baerbock wanted to ban cats and dogs as pets to help combat climate change. She was immediately inundated with hate and threats of violence online. Around the same time as the circulation of the fabricated quote, a fake nude photo of Baerbock went viral with a quote insinuating she had engaged in sex work. On Telegram alone, it was viewed more than 150,000 times. Other social media posts threatened to share Baerbock’s home address, contained slurs about her appearance, and attempted to discredit her qualifications.  

Online abuse and disinformation against women in Germany 

This isn’t only about Baerbock. Online abuse against women is pervasive in Germany. In July, the research firm Pollytix partnered with RESET to survey a sample of German internet users eligible to vote and found that 85% of them see hate comments on the internet as a big problem for society. While 38% of internet users eligible to vote reported having already been personally exposed to numerous forms of hate on the internet, the numbers are even higher for young women, people with an immigrant background, gender-nonconforming people, and members of the LGBTQIA+ community. One in three women between the ages of 18 and 34 reported having been sexually harassed on social media, and more than half of the women interviewed described their mental health being negatively affected by the use of social media. 

Gender-based abuse on social media affects German women’s ability to take part in political debates and infringes on their freedom of expression. More than one in three German women report having become more cautious about expressing their opinions online. Only a small minority, 15%, said they frequently express their views on political issues online, while 63% rarely or never do so. 

Online abuse particularly affects women elected officials and candidates for public office. A recent study showed that nearly nine in ten female MPs in Germany have been targeted by online hate speech and threats. A February study by Der Spiegel found that 69% of female members of parliament in Germany had experienced “misogynistic hatred as members of the Bundestag.” And a report from the Institute for Strategic Dialogue on the 2018 Bavarian elections showed that several public figures were targeted by harassment and disinformation campaigns run by the Alternative for Germany, the populist right-wing German nationalist party. 

The role of digital technology  

Sexism on the campaign trail is nothing new, but digital technology has made it much easier for gendered disinformation campaigns to be organized, amplified, and cheaply financed. Conventional wisdom—echoed by the digital platforms themselves—holds that social media merely reflects societal sexism and misogyny. But this view ignores the way social media acts as a behavioral modification system, encouraging groups of individuals to behave in ways they otherwise would not and using algorithmic techniques to manipulate people. 

Companies that do not address the harms caused by their products enable the mainstreaming of sexist attacks that curtail democratic debate and legitimize conspiratorial and often illicit content aimed at women. Spread across platforms, shared in a coordinated manner, and bolstered by simulated interest, such attacks can reach a wide audience. The design of digital platforms sometimes reflects pre-existing sexism and bias against women in politics, but it also amplifies them. Harmful narratives are boosted through algorithms that make such content sticky and often viral, serving companies’ commercial interests at the expense of society.  

That is why Facebook pages from the Alternative for Germany (AfD) achieve on average five times as many interactions as the pages belonging to the Christian Democrats, Greens, or Social Democrats, according to research from the NGO Hope not Hate. Indeed, the AfD built its online presence ahead of the 2019 elections for the European Parliament in part by creating a large, dense network of suspect Facebook accounts that promoted AfD posts. Despite researchers and journalists revealing that this effort violated Facebook’s rules, little was done to curtail it.  

While Facebook and other major digital platforms generally have terms of service, community standards, and codes of conduct that supposedly ban hate speech, harassment, and the promotion of violence, these rules are not properly enforced. On more than one occasion, platforms have committed to doing better but failed to deliver needed reforms. Facebook in particular has a history of misleading consumers, academics, and lawmakers about what is happening on its platform, often disregarding information on harms caused by its products. In April, the Facebook whistleblower Sophie Zhang revealed how the company has allowed some authoritarian world leaders to use social media to “deceive the public or harass opponents,” despite being alerted to evidence of wrongdoing. Internal Facebook documents show that the company has repeatedly failed to combat hate speech targeting minority groups and to hire enough local staff to quell religious sectarianism in countries with a history of violent conflict. The Facebook whistleblower Frances Haugen recently described to UK and U.S. lawmakers how the company studies its products’ failures and then buries results that don’t promote the company’s interests, including on the health and safety of kids, manipulation of politics and elections, incitement to violence around the world, and anti-vaxx content. By failing to enforce its own standards and rules, and by creating special exemptions for accounts with large followings, Facebook has actively endangered women and girls on its platforms and has consistently contributed to the spread of disinformation that has put democratic processes worldwide at risk. 

The limits of NetzDG and the DSA’s opportunity 

In 2017, legislators in the German Bundestag tried to address gendered disinformation with the passage of NetzDG, which required social media companies to take timely action on illegal content, including hate speech and threats, after it had been flagged by users. Under this law, many of the attacks against Baerbock during the election should have been flagged and taken down, but NetzDG proved unable to protect Baerbock and candidates like her. During the most recent federal election, the German nonprofit HateAid reported 100 manifestly illegal comments from public Facebook pages and groups of the AfD. In 33 cases, Facebook saw no violation, and the comments remained on the platform.  

In the aftermath of the German election, the weakness of NetzDG has become apparent. The high volume of hate speech amplified by coordinated sharing, a lack of enforcement and oversight, and content deemed legal by social media platforms made NetzDG largely toothless in the 2021 election. As demonstrated by HateAid’s report, some forms of abuse are subjectively deemed legal by the platforms, and smaller platforms like Telegram are exempt from the law in its entirety. Even when content is removed, it’s a one-time action, and the systemic problem remains. The lack of transparency about the decision-making process on flagged content is also highly problematic. Information about removed posts isn’t shared with authorities, making prosecutions almost impossible.  

Legal frameworks like NetzDG can only go so far, and current EU legal frameworks do not address cyberviolence and harms to women online. Rather than focusing on what is to be designated illegal or legal content, regulators should demand greater transparency and accountability from digital media companies for the harms caused by the malign use of their products, and strengthen provisions that address gender-based abuse and online harms.  

This is where the EU’s attempt to consolidate and update existing pieces of legislation on illegal content comes into play. The DSA would maintain existing principles like the exemption of platform liability for user content but would also require platforms to expeditiously remove illegal content and take action to detect, identify, and act against such content. Through these measures, the DSA would create important accountability and responsibility practices. Recent proposed amendments would require the platforms to have a risk mitigation plan in place to reduce online harms and an audit to check their work. With these provisions to strengthen oversight, the DSA may shift platform focus away from simple profit and allow more decisive action on gender-based online violence.  

The DSA also contains other provisions relevant to gender-based online violence, such as the obligation to establish points of contact and legal representatives, a complaint and redress mechanism, out-of-court dispute settlement, trusted flaggers, measures against abusive reporting and counter-reporting, codes of conduct, and crisis response cooperation. As the recent German election illustrates, public authorities need stronger mechanisms to require platform transparency and due diligence obligations to curtail harms. Through amendments ensuring that the DSA forces platforms to “price in” societal risks to their business operations, the legislation can shift platforms’ priorities to focus more directly on gender-based online violence. 

Digital platform reform must be anchored in the reality of how the ecosystem works in practice, particularly given the growing body of evidence around the experiences of women candidates and elected officials. Gendered disinformation is used by authoritarian regimes to silence political opponents and keep women from pursuing leadership roles, blocking democratic processes. The influence of Russian state-backed media in shaping the narrative around Baerbock in the German election, where the NetzDG is supposed to provide one of the strongest protections against online abuse in the world, should act as a wakeup call for the rest of Europe. 

Kristina Wilfore is a global democracy and disinformation specialist, co-founder of #ShePersisted, and Adjunct Professor at The George Washington University Elliott School of International Affairs.