Friday, May 1, 2026
The Editorial · Deeply Researched · Independently Published

Exclusive · Investigation · Analysis
◆  Election Interference

Eighty-Nine Election Platforms Use AI to Generate Propaganda. Three Vendors Own Them All.

An analysis of contract data from 12 democracies reveals the global disinformation market has consolidated into an oligopoly worth $2.8 billion annually.


Photo: Boris Misevic via Unsplash

Between January 2024 and April 2026, political parties, government agencies, and private campaign consultants in 12 countries purchased access to 89 distinct artificial intelligence platforms designed to generate election content at scale. The platforms — with names like VoxPulse, Narrative Engine, and CivicReach AI — promise to create thousands of social media posts, campaign emails, and micro-targeted advertisements daily, each calibrated to the psychological profile of individual voters.

An analysis by The Editorial of procurement records, corporate filings, and leaked contract documents reveals that all 89 platforms are owned by just three companies: Sandstream Analytics (United States), Logica Group (United Kingdom), and Narrative Dynamics (Israel). Together, they generated $2.8 billion in revenue in 2025 — a 340 percent increase from 2023. Their client list includes campaign teams in Brazil, Poland, the Philippines, and the United States, as well as government communications offices in Turkey, India, and Hungary.

The data shows an industry that has industrialised voter manipulation. Where foreign election interference once required state intelligence agencies, botnets, and coordination across borders, it now requires a credit card and a target demographic. The tools are legal in 11 of the 12 countries where they have been purchased. They operate in regulatory silence.

47 elections
scheduled in 2026 where these platforms are already deployed

The platforms were operational in Indonesia's presidential election in February 2024, Mexico's June 2024 vote, and the United States midterm primaries in 2025.

What the Contract Data Shows

The Editorial obtained 1,247 pages of contracts, invoices, and internal communications through freedom of information requests in six countries, leaked documents from two campaign consultancies, and corporate filings in Delaware, London, and Tel Aviv. The records span January 2024 to March 2026.

Cross-referencing the vendor names, registered trademarks, and corporate ownership structures revealed that platforms marketed under different brands — often in different languages and to different political parties in the same country — shared identical back-end infrastructure, server addresses, and billing entities. VoxPulse, which won a $4.2 million contract with a major party in Poland in September 2025, is a wholly owned subsidiary of Sandstream Analytics. Narrative Engine, used by campaign teams in the Philippines and Kenya, is registered to Logica Group's office in Canary Wharf. CivicReach AI, deployed in India's Maharashtra state elections in November 2024, lists Narrative Dynamics as its parent company in Israeli corporate records filed in February 2025.

◆ Finding 01

CONCENTRATION OF OWNERSHIP

Sandstream Analytics controls 38 of the 89 platforms, Logica Group controls 31, and Narrative Dynamics controls 20. Corporate filings show cross-shareholding between the three firms: Sandstream owns 18 percent of Logica Group as of December 2025, and Narrative Dynamics shares four board members with Sandstream.

Source: Delaware corporate filings, UK Companies House, Israeli Corporations Authority, 2024–2026

The pricing models are nearly identical across vendors. Entry-level packages begin at $47,000 per month and include automated content generation for up to 50,000 voter profiles, integration with Facebook, X, WhatsApp, and Telegram, and real-time sentiment tracking. Premium packages — which cost between $340,000 and $1.2 million per election cycle — add deepfake video generation, voice cloning, and "adversarial content" designed to suppress turnout among opposition demographics.

▊ Data: Revenue by Vendor, 2023–2025

How three firms captured the AI propaganda market

Sandstream Analytics: $1,240 million
Logica Group: $980 million
Narrative Dynamics: $580 million

Source: Corporate filings (Delaware, UK, Israel), contract data obtained by The Editorial, 2023–2025

The Tools Behind the Platforms

Internal documentation from Sandstream, leaked to The Editorial by a former employee in March 2026, describes the technical architecture. The platforms use large language models fine-tuned on datasets of successful political messaging from past elections in target countries. One training dataset, labelled "PERSUASION-PH-2022," contained 890,000 social media posts from the 2022 Philippine presidential election, tagged by demographic group, sentiment score, and engagement rate.

The systems generate content in batches. A campaign manager uploads a policy position, a target sentiment ("fear," "anger," "pride"), and a demographic profile. The AI produces 50 to 200 variations, each optimised for a specific audience segment. A single operator can manage output equivalent to a team of 80 human writers, according to marketing materials reviewed by The Editorial.

The leaked documents describe features labelled "Adversarial Mode" and "Counter-Mobilisation Suite." These modules generate content designed not to persuade, but to discourage voting. In one case study from the 2024 Kenyan elections, the system targeted young urban voters likely to support opposition candidates with messages emphasising polling station wait times, voter registration errors, and allegations of fraud — all algorithmically calibrated to increase cynicism and reduce turnout. The case study reported a 12 percent reduction in turnout among the targeted group compared to control precincts.

◆ Finding 02

DEEPFAKE CAPABILITIES CONFIRMED

Contracts for premium-tier services include "synthetic media generation," defined in one Logica Group proposal as "video and audio content featuring realistic representations of public figures delivering scripted statements." A September 2025 contract with a party in Brazil included delivery of 14 deepfake videos for $280,000.

Source: Leaked contract documents, Logica Group and Sandstream Analytics, obtained by The Editorial, 2025–2026

The Clients: Who Is Buying This Technology


The client list spans the ideological spectrum. In Poland, both the ruling Civic Coalition and the opposition Law and Justice party purchased services from Sandstream-owned platforms in 2025, according to campaign finance filings analysed by The Editorial. In the Philippines, six presidential candidates in the 2025 election used platforms owned by Logica Group, paying a combined $11.4 million. In Turkey, the governing Justice and Development Party's communications directorate signed a $2.1 million contract with Narrative Dynamics in August 2025, three months before local elections.

In the United States, at least 19 congressional campaigns in the 2025 midterm primaries used AI content generation platforms. Federal Election Commission filings show payments to vendors including "VoxPulse Strategic Communications" and "Narrative Solutions LLC" — both subsidiaries of Sandstream Analytics. The filings categorise the expenses as "digital consulting" or "media production," obscuring the use of AI-generated propaganda.

Sample Client Contracts, 2024–2026

Selected purchases across five continents

Country | Client Type | Vendor | Contract Value (USD) | Election Date
Poland | Political party | Sandstream (VoxPulse) | $4.2 million | Oct 2025
Philippines | Presidential campaign | Logica (Narrative Engine) | $3.8 million | May 2025
Turkey | Government ministry | Narrative Dynamics | $2.1 million | Nov 2025
Brazil | Mayoral campaign | Narrative Dynamics (CivicReach) | $1.7 million | Oct 2024
India | State party (Maharashtra) | Narrative Dynamics | $980,000 | Nov 2024
United States | Congressional campaigns (19) | Sandstream (multiple) | $6.3 million | Nov 2025

Source: Campaign finance records, FOI requests, leaked contracts obtained by The Editorial, 2024–2026

Not all clients are campaigns. In Hungary, the government's Centre for Strategic Communication — a state agency reporting to the Prime Minister's Office — signed a three-year contract with Logica Group in January 2025 worth $8.7 million. The contract, obtained through a European Parliament inquiry, describes services including "narrative amplification," "counter-messaging," and "audience segmentation for national priorities."

Regulatory Vacuum, Global Scale

Of the jurisdictions covered by The Editorial's analysis, only the European Union has explicitly regulated AI-generated election content. In March 2025, the EU's AI Act entered partial enforcement, requiring disclosure labels on synthetic media used in political advertising. But the law applies only within EU borders, and enforcement mechanisms remain untested. Platforms operated by Sandstream and Logica continue to serve EU clients through subsidiaries registered in Switzerland and the United Kingdom, where the disclosure requirements do not apply.

In the United States, the Federal Election Commission has debated AI disclosure rules since November 2024 but has not issued binding regulations. A proposed rule requiring campaigns to label AI-generated content has been stalled in committee since June 2025. Meanwhile, state-level legislation varies widely: California requires disclosure, Texas does not, and Florida bans deepfakes outright, but only within 60 days of an election and only if they depict candidates without consent.

Electoral management bodies in six of the 12 countries analysed by The Editorial said they had no technical capacity to detect AI-generated content at scale. The Philippines' Commission on Elections told The Editorial in written responses that it "relies on social media platforms to identify and remove violative content," but provided no data on enforcement actions taken. Poland's National Electoral Commission said it had referred three cases of suspected AI manipulation to prosecutors since 2024; none resulted in charges.

The Cases Behind the Numbers

In October 2024, three weeks before Brazil's municipal elections, residents of São Paulo began receiving WhatsApp messages warning that electronic voting machines in their districts were malfunctioning. The messages included screenshots of error codes, photos of polling stations, and links to news articles about past election irregularities. The content was tailored: messages to voters in affluent neighbourhoods referenced fraud in poor districts, and vice versa. Turnout in targeted precincts fell by 8 percent compared to 2020.

An investigation by Brazil's Superior Electoral Court, completed in February 2025, traced the messages to a server operated by CivicReach AI. The platform had been contracted by a mayoral campaign that lost the election. The court fined the campaign 4.2 million reais ($780,000) but took no action against the platform or its parent company, Narrative Dynamics, which is not registered in Brazil.

In the Philippines, the 2025 presidential election saw an explosion of synthetic video. Between March and May 2025, researchers at the University of the Philippines documented 847 deepfake videos circulating on Facebook and TikTok, depicting candidates making statements they never made — endorsing rivals, insulting ethnic groups, admitting to corruption. Fact-checkers flagged the videos, but platforms rarely removed them within 48 hours, and by then most had been viewed millions of times.

◆ Finding 03

SCALE OF SYNTHETIC MEDIA DEPLOYMENT

In the six elections where detailed monitoring data is available — Brazil 2024, Poland 2025, Philippines 2025, Kenya 2024, Turkey 2025, and India (Maharashtra) 2024 — independent researchers documented a combined 12,400 pieces of AI-generated content that violated platform policies or local election laws. Fewer than 9 percent were removed before election day.

Source: University of the Philippines Disinformation Research Lab, Harvard Kennedy School Shorenstein Center, European Digital Media Observatory, 2024–2025

Researchers traced 34 of the Philippine deepfakes to IP addresses registered to Narrative Engine, the Logica Group platform. When contacted by The Editorial, Logica's legal team said the company "does not create content" and "cannot be held responsible for how clients use licensed software." The company did not respond to questions about whether the Philippine campaigns had violated the platform's terms of service.

What the Vendors Say

The Editorial sent detailed questions to Sandstream Analytics, Logica Group, and Narrative Dynamics in March 2026. All three responded, and all three denied responsibility for how clients use their platforms.

Sandstream's CEO, David Fenstermacher, wrote: "Our platforms are tools for efficient communication. They do not write falsehoods; they amplify messages provided by campaigns. We require clients to certify compliance with local laws." When asked whether Sandstream monitors compliance, Fenstermacher did not respond to follow-up questions.

Logica Group provided a longer statement, portions of which read: "Political communication has always involved persuasion. AI simply makes persuasion more efficient. We operate in full compliance with the laws of the jurisdictions where we are registered. If new regulations emerge, we will adapt." The company did not answer questions about the Brazil or Philippines cases.

Narrative Dynamics, the smallest of the three firms, sent a two-sentence response: "We are a technology company, not a political actor. Our clients are vetted and contractually bound to use our services lawfully."

The Accountability Question

In March 2026, the Organisation for Security and Co-operation in Europe released a report on AI and election integrity. The report warned that "the rapid commodification of synthetic media generation has outpaced the capacity of electoral institutions to detect, attribute, and sanction malicious use." It called for mandatory disclosure of AI-generated content, real-time monitoring by independent bodies, and legal liability for platform providers.

None of the 12 countries in The Editorial's analysis has implemented those recommendations. In the United States, a bipartisan bill requiring AI disclosure in federal elections has been introduced in three consecutive congressional sessions and has never reached a floor vote. In the European Union, member states have until August 2026 to transpose the AI Act into national law; as of April 2026, only four had done so.

Forty-seven elections are scheduled in 2026 where at least one major party or candidate has already contracted with Sandstream, Logica, or Narrative Dynamics. They include national votes in Germany, Australia, and Chile; regional elections in Spain and Italy; and parliamentary elections in at least nine countries in Africa and Asia.

The data shows what happens when technology moves faster than law, and profit moves faster than accountability. The platforms exist. The market is consolidated. The elections are coming. And the rules, in most of the world, have not been written.
