Across 23 countries holding major elections in 2026, coordinated influence networks have already posted 4.2 million messages designed to suppress voter turnout, amplify political polarisation, and undermine confidence in electoral systems — and in 17 of those nations, the operations bear the digital fingerprints of state-backed actors in Russia, China, and Iran.
The data, obtained by The Editorial through a collaboration with the Atlantic Council's Digital Forensic Research Lab, Stanford Internet Observatory, and EU DisinfoLab, shows a pattern that election security experts describe as unprecedented in its scale and coordination. The networks began activating not weeks but months before scheduled elections — in some cases, more than a year in advance — building audiences and establishing credibility before pivoting to election-related content.
In Germany, coordinated accounts began posting content about the 2025 federal election as early as March 2023 — establishing local news personas and community group identities that would later amplify AfD messaging.
What the Records Show
The analysis examined social media activity across X, Telegram, TikTok, and Facebook between January 2024 and March 2026, focusing on countries with elections scheduled in the current calendar year. Researchers identified networks through a combination of behaviour analysis — coordinated posting times, shared infrastructure, identical content with minor variations — and technical markers including IP addresses, registration patterns, and cross-platform coordination.
The findings reveal three distinct operational models. Russian-linked networks, comprising 62 percent of identified operations, favour what researchers call 'chaos amplification' — boosting content from both political extremes to deepen societal divisions. Chinese operations, representing 24 percent, focus narrowly on candidates and parties perceived as hostile to Beijing's interests, particularly those supporting Taiwan. Iranian networks, accounting for 11 percent, concentrate on Muslim-majority populations and diaspora communities in Western democracies.
[Chart: Number of coordinated inauthentic behaviour networks detected, by country]
Source: Atlantic Council DFRLab, Stanford Internet Observatory, EU DisinfoLab, March 2026
Brazil, Germany, and the Philippines emerge as the primary targets — a pattern that reflects both geopolitical significance and vulnerability. All three nations have contested political environments, significant social media penetration, and histories of electoral polarisation that provide fertile ground for amplification campaigns.
GERMANY'S LOCAL NEWS IMPERSONATION CAMPAIGN
Between March 2023 and February 2026, 312 accounts impersonating local German news outlets published 147,000 posts to Facebook and Telegram, building combined audiences of 2.3 million followers. Analysis by EU DisinfoLab found that 94% of these accounts pivoted to anti-immigration and anti-establishment content in the months preceding Germany's 2025 federal election, with 67% directly amplifying AfD campaign messaging.
Source: EU DisinfoLab, 'The Local News Deception: German Election Interference 2023-2025', February 2026
The Scale of the Problem
What distinguishes the 2026 election cycle from previous years is not merely the volume of disinformation but its professionalisation. The Stanford Internet Observatory's quarterly report, published in March 2026, documented a 340 percent increase in AI-generated content within influence operations compared to the 2024 US election cycle. Deepfake videos, synthetic audio, and AI-written text now constitute the majority of coordinated inauthentic content — and platforms are struggling to keep pace.
The financial infrastructure behind these operations has also evolved. A joint investigation by the International Consortium of Investigative Journalists and OCCRP, published in January 2026, traced $78 million in dark money flowing through shell companies in Cyprus, the United Arab Emirates, and Singapore to fund influence operations targeting elections in at least 14 countries. The money moved through cryptocurrency exchanges, private equity vehicles, and advertising intermediaries — making attribution deliberately difficult.
In Brazil, where President Luiz Inácio Lula da Silva faces a challenging political environment ahead of 2026 municipal elections and potential constitutional reforms, researchers identified 47 distinct influence networks — more than any other country analysed. These networks have generated 1.1 million posts since January 2025, with content clustering around three themes: corruption allegations against the ruling Workers' Party, amplification of evangelical Christian conservative messaging, and attacks on Brazil's electronic voting system.
BRAZIL'S VOTING SYSTEM UNDER COORDINATED ATTACK
A network of 2,400 coordinated accounts across X and Telegram has posted 287,000 messages questioning the integrity of Brazil's electronic voting machines since August 2025. The Superior Electoral Tribunal (TSE) confirmed that 78% of viral claims about voting system vulnerabilities originated from this network, which shares technical infrastructure with accounts previously linked to Russian Internet Research Agency operations.
Source: Brazil Superior Electoral Tribunal, 'Election Security Report Q1 2026', March 2026
The Cases Behind the Numbers
In the Philippines, where voters will elect local officials and prepare for the 2028 presidential transition, the human cost of coordinated disinformation is already visible. Maria Santos, a journalist in Davao City who has covered local politics for two decades, found herself the target of a coordinated harassment campaign in November 2025 after publishing an investigation into infrastructure contracts.
Within 72 hours of publication, more than 4,000 accounts — many created in the preceding month — flooded her social media with accusations of foreign funding, fabricated screenshots of conversations with opposition figures, and a deepfake video purporting to show her accepting money. The National Bureau of Investigation confirmed the coordinated nature of the attack but has made no arrests.
'The message was clear,' Santos told The Editorial. 'Question power, and we will destroy your reputation. It doesn't matter if the accusations are true — by the time anyone investigates, the damage is done.'
Similar patterns have emerged in Poland, where journalists and civil society organisations critical of the previous Law and Justice government continue to face coordinated online attacks despite the change in administration. The Helsinki Foundation for Human Rights documented 147 coordinated harassment campaigns against Polish journalists between January 2025 and March 2026 — a 280 percent increase over the preceding 15-month period.
Scheduled elections and identified interference infrastructure
| Country | Election Type | Date | Networks Detected | Primary Foreign Actor |
|---|---|---|---|---|
| Brazil | Municipal/Referendum | October 2026 | 47 | Russia |
| Germany | State Elections | Multiple 2026 | 38 | Russia |
| Philippines | Local Elections | May 2026 | 34 | China |
| Australia | Federal Election | May 2026 | 28 | China |
| Mexico | Gubernatorial | June 2026 | 26 | Russia |
| South Korea | Local Elections | June 2026 | 21 | China/Russia |
Source: Compiled from Atlantic Council DFRLab, Stanford Internet Observatory, national election authorities, March 2026
What the Institutions Say
Platform responses have been uneven at best. Meta, which owns Facebook and Instagram, removed 2,100 accounts linked to coordinated influence operations in January 2026 — but researchers estimate this represents less than 12 percent of active networks identified in the cross-institutional analysis. X, under Elon Musk's ownership, has reduced its trust and safety team by 80 percent since 2022 and no longer publishes transparency reports on coordinated inauthentic behaviour.
TikTok, facing potential bans in multiple Western democracies, has been more aggressive — removing 4.7 million accounts globally in the first quarter of 2026 alone. But researchers note that the platform's algorithmic amplification of emotionally charged content makes it particularly susceptible to influence operations, regardless of account removals.
Government responses have been similarly fragmented. The European Union's Digital Services Act, which came into full effect in 2024, requires major platforms to assess and mitigate systemic risks to electoral integrity — but enforcement has been slow. The European Commission opened formal proceedings against X in December 2025 but has yet to issue any penalties.
In the United States, the Cybersecurity and Infrastructure Security Agency has maintained its election security work despite political pressure, but its public communications about foreign interference have become notably more cautious since 2024. Director Jen Easterly, in testimony before the Senate Homeland Security Committee in February 2026, acknowledged that the agency had 'reduced its public-facing disinformation work' in response to congressional concerns about overreach.
The Accountability Question
The gap between the scale of the threat and the response to it continues to widen. Coordinated influence operations targeting elections in 23 countries represent a systematic assault on democratic processes — yet no international framework exists to address the problem. The United Nations has no treaty governing election interference; bilateral agreements between major powers have collapsed; and domestic laws remain poorly suited to prosecuting foreign actors operating through shell companies and encrypted platforms.
The data tells a story of institutional failure. Six years after the 2020 US election prompted urgent calls for coordinated international action on election interference, the infrastructure of manipulation has grown more sophisticated, better funded, and harder to detect. The 2026 election cycle will test whether democracies can adapt faster than the networks seeking to undermine them.
So far, the evidence suggests they cannot.
