Elections: how disinformation campaigns aim to wreck public trust in them

Why public trust in elections is being undermined by global disinformation campaigns


Public trust in elections is being targeted around the world by a series of disinformation campaigns from a range of international players. This is eroding confidence in how votes are counted.

Over the last decade, the almost unlimited capacity of individuals and organisations to publish information via websites, social media and other outlets – limited only by time and manpower – has given disinformation campaigns a new set of media to exploit.

With the Brazilian election coming up this autumn, analysts have already suggested that public trust in voting processes is being targeted with tactics similar to those used around the last US presidential election. Like former US president Donald Trump, Brazilian president Jair Bolsonaro has begun undermining public confidence in the democratic process by claiming that elections were fraudulent.

Bolsonaro has also raised questions about both electronic voting and the vote-counting process.

Disinformation campaigns often begin well before elections to create confusion and allow the losers to challenge results. During Mexico’s 2021 election, disinformation was spread through social networks in a bitter and polarised campaign. There was evidence of organised trolls spreading insults and attacks against candidates, and a rise in fake news stories about the election.

These tactics are being used across the world. The European Parliament said the “most systemic threats to political processes and human rights arise from organised attempts to run coordinated campaigns across multiple social media platforms”. A 2019 report found evidence of organised social media manipulation campaigns in 70 different countries, employing armies of “cyber troops” (300,000 in China, for instance) to influence public opinion on various issues and create political chaos. And a US foundation has raised concerns about new state laws shifting election administration powers to political or partisan bodies.

Chinese disinformation campaigns have also been reported: one to discredit Taiwan’s presidential candidate Tsai Ing-wen, and another directed against Hong Kong’s pro-democracy activists. Twitter took down 900 fake accounts used by the Chinese authorities and suspended a further 200,000 new accounts linked to another Chinese network.

How do they work?

Disinformation campaigns often rely on an enormous volume of messages, delivered through a variety of methods. They use traditional media such as newspapers, radio and television, but disinformation is also spread via websites, social media, chat rooms and satellite broadcasting, combining text, photographs and videos put out through thousands of fake accounts.




Read more: Why Brazil’s Bolsonaro is following Trump’s pre-election playbook


Internet “troll farms” are often set up, with teams of people putting out misleading messages to counter political viewpoints or other narratives. These farms employ workers on 12-hour shifts, 24 hours a day, with quotas of 135 posted messages per worker per day.

One example is the Russian Internet Research Agency (also known as Glavset), ostensibly a private company but one that appears to be funded by the Russian government, and which now operates under different guises as part of “Project Lakhta”. It spreads Kremlin disinformation on social media under different names, using false identities and false information.

Using a variety of sources that employ different narratives and arguments but point to the same conclusion is more persuasive, because it conceals the fact that the propaganda ultimately derives from the same source. A study conducted by Harvard University’s Kennedy School of Government on the use of Twitter as a forum for disinformation found: “Evidence from an analysis of Twitter data reveals that Russian social media trolls exploited racial and political identities to infiltrate distinct groups of authentic users, playing on their group identities.”

Russia is also accused of mounting various campaigns to influence elections, including the US presidential elections in 2016 and 2020. Academic analysis of how the Russian Internet Research Agency used social media showed how it specifically targeted “self-described Christian patriots, supporters of the Republican party and of presidential candidate Donald Trump”.

The Russian governing elite believes that the west is committed to transforming the post-Soviet countries using non-military instruments of warfare, including economic instruments, the spreading of ideas about democracy and human rights, and support for NGOs and human rights activists, with the purpose of inducing “colour revolutions” that will topple governments. Russia claims that, by conducting information warfare, it is only responding to western methods.

The overall purpose is to create mistrust of the core institutions of liberal democracy, including parliaments, mainstream media, elections and the judiciary.

Governments can respond by introducing regulations to combat the spread of disinformation, but this is controversial because it forces governments to define the limits of free speech. In practice, it means introducing and further developing elaborate codes of practice and guidelines for the internet and social media. Another tool is the development of fact-checking networks.

If disinformation creates a widespread public belief that elections are “stolen” or manipulated, it undermines belief in public institutions that are essential to democratic governance. Such disinformation campaigns therefore pose a very serious threat to liberal democracy and public order. This is the outcome that some of the state actors are seeking. The development of the instruments to deal with this challenge is only just beginning.


Christoph Bluth does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.