Wednesday, April 8, 2026
The Editorial · Deeply Researched · Independently Published

Investigation · Analysis
◆  Election Integrity

Forty-Two Elections in 2026. Foreign Interference in Every One.

Digital manipulation networks are now permanent campaign infrastructure. Democracies have no coordinated defence.


Photo: Dylan Hunter via Unsplash

Between January and December 2026, 42 countries representing 3.2 billion people will hold national elections. According to intelligence assessments compiled by NATO's Strategic Communications Centre of Excellence in Riga and shared with allied governments in March, foreign state actors have established active digital influence operations targeting every single one. This is not, as officials are fond of saying, the new normal. It is worse: interference has become permanent infrastructure.

The scale marks a departure. In 2016, when Russian interference in the American presidential election became public, it was treated as an aberration—a one-off manipulation of a uniquely polarised democracy. By 2020, the European External Action Service documented coordinated inauthentic behaviour in 38 countries. In 2026, the number is 42 of 42. The question is no longer whether elections will be targeted, but whether democracies can defend them.

The Calendar

The 2026 election calendar reads like a buffet for hostile actors. Germany votes in September; France in April (presidential) and June (legislative); the Philippines in May; Brazil in October; India completes state elections in March. The United States holds critical midterm elections in November. Poland, the Czech Republic, Sweden, and the Netherlands all go to the polls. So do fragile democracies in Kenya, Zambia, and Ecuador. Each election is a theatre; together, they form a coordinated campaign season for foreign influence operations.

The primary actors are familiar. Russia's Internet Research Agency and its successors operate across Europe and Africa. China's United Front Work Department and Ministry of State Security run operations focused on Taiwan, Southeast Asia, and Chinese diaspora communities. Iran's Islamic Revolutionary Guard Corps targets Gulf states and Western democracies with large Persian-speaking populations. What has changed is the business model. Influence operations are no longer special projects; they are continuous, professionalised, and increasingly outsourced.

◆ Finding 01

COMMERCIAL DISINFORMATION NETWORKS

Graphika, a social network analysis firm, identified 127 commercial disinformation providers operating in 34 countries as of February 2026. These firms sell services ranging from bot networks to deepfake video production. Prices range from $15,000 for a sustained Twitter campaign to $500,000 for coordinated cross-platform manipulation during a six-month election cycle.

Source: Graphika, Commercial Disinformation Market Analysis, February 2026

The Infrastructure

Modern election interference rests on three pillars: narrative seeding, amplification networks, and computational propaganda. Narrative seeding typically begins six to twelve months before voting day. Operatives identify wedge issues—immigration, corruption, economic inequality—and introduce crafted narratives through seemingly organic social media accounts, fringe news sites, and encrypted messaging groups. These narratives are designed to be emotionally resonant and difficult to fact-check.

Amplification follows. The Stanford Internet Observatory documented how a single false claim about election fraud in Brazil's 2022 presidential race was amplified by 3,400 bot accounts within 72 hours, reaching 18 million users before fact-checkers could respond. By 2026, that infrastructure is faster and cheaper. Large language models can generate localised content in 47 languages. Deepfake audio costs $300 per minute. Video manipulation that would have required a studio in 2020 now runs on consumer hardware.

Computational propaganda—the use of algorithms, bots, and targeted advertising to manipulate public opinion—completes the system. In Germany's 2026 campaign, researchers at the University of Munich identified coordinated networks purchasing Facebook ads promoting anti-immigration content in swing districts. The ads were paid for through shell companies in Cyprus and the British Virgin Islands. Attribution is possible but slow; by the time investigators trace the money, the election is over.

Foreign Interference Tactics by State Actor, 2026

Primary methods documented by intelligence agencies

State Actor  | Primary Platform         | Primary Tactic                        | Target Regions
Russia       | Telegram, X, Facebook    | Narrative seeding, bot amplification  | Europe, Sub-Saharan Africa
China        | TikTok, WeChat, YouTube  | Coordinated inauthentic behaviour     | Southeast Asia, Pacific, diaspora
Iran         | Instagram, X, Facebook   | Hack-and-leak operations              | Middle East, North America, Europe
North Korea  | Encrypted apps, forums   | Cryptocurrency-funded operations      | South Korea, Japan

Source: NATO StratCom COE, Threat Assessment Report, March 2026

Dark Money, Bright Trails

Following the money reveals both the scale and the impunity. The Financial Action Task Force, an intergovernmental body focused on money laundering, reported in January 2026 that at least $1.8 billion in illicit funds were channelled into election-related influence operations globally in 2025. Much of it flowed through cryptocurrency exchanges, which remain lightly regulated in most jurisdictions.


In Kenya's August 2026 presidential election, investigators traced $4.3 million in foreign funds to 17 local civil society organisations, many of which ran voter education campaigns with embedded disinformation. The money originated in accounts linked to Russian and Chinese state-owned enterprises. Kenyan election law prohibits foreign funding of campaigns, but enforcement is weak and penalties are trivial—the maximum fine is $50,000.

◆ Finding 02

CRYPTOCURRENCY AND ELECTION FINANCE

The International Institute for Democracy and Electoral Assistance (IDEA) documented 68 cases in 2025 where cryptocurrency was used to obscure the origin of funds for political advertising, candidate support, or disinformation campaigns. Only 11 countries have updated campaign finance laws to address digital currency transactions. Enforcement actions have been taken in three.

Source: International IDEA, Digital Democracy Report, March 2026

The Philippines case is instructive. In the lead-up to the May 2026 presidential election, the National Bureau of Investigation identified at least 240 Facebook pages and 1,800 accounts spreading false claims about opposition candidates. The content was produced by a marketing firm in Manila that received payments in Tether, a stablecoin, from wallets registered in Singapore. The firm's director told investigators the client contacted him through an encrypted app and never revealed their identity. He took the contract anyway.

What Democracies Are Doing

Responses vary in ambition and effectiveness. The European Union's Digital Services Act, fully enforced since February 2024, requires platforms with more than 45 million users to conduct risk assessments and implement mitigation measures during elections. It has teeth: the European Commission fined Meta €950 million in January 2026 for failing to remove coordinated inauthentic behaviour targeting France's presidential race. The company paid and appealed.

The United States has moved more slowly. The Federal Election Commission has authority over campaign finance but lacks jurisdiction over foreign disinformation that does not coordinate with domestic campaigns. The Department of Homeland Security's Cybersecurity and Infrastructure Security Agency (CISA) provides technical assistance to state election officials but cannot compel action. After the dismantling of the Disinformation Governance Board in 2022, no federal entity has a clear mandate to counter foreign election interference at scale.

18 of 42
Countries with dedicated election interference units

Fewer than half of the democracies holding elections in 2026 have established government bodies with legal authority and technical capacity to investigate and counter foreign digital influence operations.

Taiwan offers a different model. Facing relentless Chinese influence operations, the island's government established the Taiwan FactCheck Center in 2018 and mandated real-time coordination between the Central Election Commission, the National Security Bureau, and major platforms. During the 2024 presidential election, the government identified and disrupted 1,400 disinformation campaigns within an average of 63 minutes. Turnout reached 71.9%—the highest in 16 years.

The Collective Action Problem

The central challenge is structural. Election interference exploits the gap between national sovereignty and global infrastructure. Elections are run by nation-states according to domestic law. Digital platforms are global, governed by terms of service written in California, and enforced inconsistently. Intelligence is jealously guarded and rarely shared in time to act. By the time forensic evidence is published, the election is history.

There have been gestures toward coordination. The G7 established a Rapid Response Mechanism in 2018 to share threat intelligence. It meets quarterly and has no enforcement power. The European Union and the United States launched the Trade and Technology Council in 2021, which includes a working group on election security. It has produced two joint statements and zero binding agreements. The United Nations has no mandate to address election interference; sovereignty concerns block action in the General Assembly, and the Security Council is paralysed by vetoes from the primary offenders.

What Should Be Done

A credible defence requires four elements. First, mandatory transparency. Platforms operating in democracies should be required to disclose, in real time, the origin and funding of political advertising, including indirect promotion through influencers and sponsored content. The EU's ad transparency library is a start, but it is incomplete, unsearchable, and platform-dependent. A shared, machine-readable, global registry would allow journalists, researchers, and election officials to identify foreign-funded campaigns as they unfold.

Second, coordinated sanctions. Democratic governments should establish a standing mechanism to impose rapid, multilateral sanctions on entities and individuals involved in election interference. The current system—ad hoc designations by individual governments—is too slow and easily circumvented. A coalition of democracies could designate foreign interference as a predicate offence under anti-money-laundering law, enabling banks and payment processors to freeze assets without waiting for criminal prosecution.

Third, technical resilience. Election infrastructure remains dangerously vulnerable. According to the Brennan Center for Justice, 38 of the 42 countries holding elections in 2026 use electronic voting systems or internet-connected voter registration databases. Many lack basic cybersecurity protections. The United States allocates roughly $800 million per year to election security; Taiwan, with 7% of America's population, spends $340 million. Democracies should establish a pooled fund—target: $5 billion annually—to upgrade election infrastructure, train officials, and deploy rapid-response cyber teams during campaign periods.
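The per-capita gap is starker than the raw figures suggest. A back-of-the-envelope calculation using the spending figures above (and assuming a US population of roughly 330 million, a figure not stated in this article) shows Taiwan outspending the United States roughly sixfold per person:

```python
# Election-security spending per capita, from the figures cited above.
# The US population of ~330 million is an assumption for illustration.
us_spend = 800e6            # USD per year (article's figure)
taiwan_spend = 340e6        # USD per year (article's figure)
us_pop = 330e6              # approximate US population (assumption)
taiwan_pop = 0.07 * us_pop  # the article puts Taiwan at ~7% of the US

us_per_capita = us_spend / us_pop              # about $2.40 per person
taiwan_per_capita = taiwan_spend / taiwan_pop  # about $14.70 per person

ratio = taiwan_per_capita / us_per_capita
print(f"Taiwan spends roughly {ratio:.1f}x more per person")  # ~6.1x
```

Note the ratio does not actually depend on the population assumption: (340/800) divided by 0.07 gives about 6.1 regardless of the absolute headcount.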

◆ Finding 03

THE RESILIENCE GAP

The International Foundation for Electoral Systems surveyed electoral management bodies in 39 democracies in January 2026. Only 14 reported having dedicated cybersecurity staff. Only 9 conduct regular penetration testing of voter registration systems. Only 6 have formal agreements with intelligence agencies for threat warnings during campaign periods.

Source: IFES, Global Election Security Survey, January 2026

Fourth, information resilience. Voters are not passive; they can be equipped. Finland's national curriculum includes media literacy starting in primary school. Sweden's Psychological Defence Agency runs public awareness campaigns before elections, inoculating voters against manipulation tactics. These programmes work: a 2025 study by the Reuters Institute found that populations with formal media literacy training were 40% less likely to share disinformation. The cost is modest—Finland spends approximately $15 per student per year.

The Stakes

The 2026 election cycle is a stress test. Democracies face adversaries with greater resources, better technology, and fewer constraints than in any previous era. The adversaries are patient; they do not need to install a preferred candidate. Sowing distrust in the process itself is enough. A democratic system that citizens do not trust cannot govern effectively, even if the votes are counted correctly.

The good news, such as it is, is that the tools for defence exist. Transparency regimes work when enforced. Sanctions deter when applied multilaterally and swiftly. Cybersecurity is not mysterious; it is a matter of funding and will. Voters, when informed, are harder to manipulate than pessimists assume. What is missing is not technology or knowledge. It is the political commitment to treat election integrity as critical infrastructure, worthy of the same investment and international cooperation afforded to financial systems or pandemic response.

Forty-two elections, 3.2 billion voters, and not a single race immune from foreign interference. If democracies cannot defend the process by which they renew their legitimacy, they will discover that legitimacy, once lost, does not return on its own.
