Researchers' Zone:

Monks checking their phones in Myanmar. Facebook was integral in spreading disinformation that facilitated the Rohingya genocide in Myanmar. (Photo: Shutterstock)

Disinformation goes South

In the Global South, social media monopolies and a surge in digital media users allow information operations to reach millions and affect political elections and developments. The consequences can be fatal, as seen in Myanmar.

Political actors have always tried to outsmart or even outmanipulate their rivals. This is the nature of the political game. But we have entered a new playing field through the immense proliferation of modern Information and Communication Technologies (ICT) to the Global South, from Africa to Asia to Latin America.

Campaigns of disinformation and so-called fake news that we have seen launched during elections in the West over the past years, from the US to the UK, are now increasingly employed in less developed countries, from Kenya to Myanmar.

The services of political technologists are in high demand, as information operations, even when addressing a very local political context, may be launched from anywhere in the world and target audiences anywhere else.

With increasingly sophisticated aesthetics and credibility, these operations are now shaping elections in the developing world.

Here, they are not least facilitated by a combination of growing internet access and concentration of social media and news outlets, a dangerous cocktail when mixed with varying levels of media literacy, unstable regimes, and ethnic tensions.

Two trends are disturbing democracy

Elections worldwide are increasingly characterized by two system-upsetting trends.

The first trend is the ability to reach large numbers of voters quickly and often at low cost.

This may range from large-scale and well-polished campaigns run on private or state-controlled networks to much simpler and crowd-fueled campaigns rolled out on social media platforms; and from wall-to-wall campaigning to micro-targeting, where small groups of voters, or even individual voters, are targeted by tailor-made messages.

The second trend, when appearing in its most extreme form, is the generation and spread of disinformation, defined as 'intentionally wrong information' or simply lies. Disinformation has, of course, always been part of the political game, but modern ICT allow many more users to spread false information to much larger audiences.

And it seems that this fact alone has brought many more disinformers out into the open.

The two trends combine in a myriad of ways. The spectrum ranges from disinformation campaigns run by a state on its own state-controlled networks and targeting a global audience, at one end, to the single political activist or 'citizen reporter' posting regular information on social media, at the other.

The former end of the spectrum is where resourceful actors, often supported by political technologists representing domestic or foreign companies, operate in a manipulative manner in order to achieve certain effects.

Increased internet access in the Global South

In the Global South, information operations can exploit the combined effects of growing ICT and social media monopolies. The International Telecommunication Union reported an internet penetration breakthrough in 2018 as it assessed that more than half of the world’s population is now using the internet.

The fastest increase in internet use is seen in Africa: Whereas only 2.1 per cent of Africans had access to the internet in 2005, a quarter of Africans had access in 2018. And the number of mobile users on the continent has quadrupled to almost 800 million people, the majority of whom have smartphones with access to the internet.

The vast majority of Facebook's almost 150 million African users access the platform through their phones.

Some of the figures on penetration may even obscure real and higher levels of access. 'Community watching', where several people gather around a single screen, may give more people access to, for instance, news and thereby also potentially expose them to information operations.

Studies, for instance by the Massachusetts Institute of Technology, show that disinformation travels faster than regular information, and it is very likely that people will also share or request access to disinformation more often than to regular information. It is simply more popular.

Disinformation can have vast consequences

At the same time, and with alarming speed, the consequences of overreliance on social media for information and news are becoming clear.

According to the UN, Facebook was a central platform for facilitating the Rohingya genocide in Myanmar, not least through the spread of disinformation by the Burmese military, whose specialists were reported to have been trained in Russia.

At the same time, all African elections of 2019 thus far seem to have been affected by disinformation campaigns, from Senegal to South Africa.

In May 2019, Facebook announced that it had banned the Israeli 'political consulting group' Archimedes, which had reached millions of people with political manipulation during elections in Nigeria and other African countries, masquerading as local news organizations.

Almost a million US dollars had been spent on discrediting opposition figures. And by the time Facebook decided to intervene, numerous elections had already been targeted and possibly affected.

The company used hundreds of fake accounts, pages, and groups on Facebook to try to influence groups during elections. Its website allegedly promised that the company could 'change reality according to our client's wishes'.

One of the company's fake Nigerian accounts included a banner image depicting presidential opposition candidate Atiku Abubakar as Darth Vader, the Star Wars villain, holding a sign stating 'Make Nigeria Worse Again'.

In 2018, an undercover video by British Channel 4 showed Cambridge Analytica executives bragging about influencing Kenya's presidential elections in 2013 and 2017. And in the spring of 2019 in Nigeria, where tens of millions of voters turned out to elect their new president, violence erupted at many voting sites as disinformation on social media escalated local tensions, for example through fake reports of ethnic violence.

Operations can have a variety of effects

The information operation – whether executed by a state or by a single political activist – is designed to have a more or less well-defined effect on a target audience. That is the link between the digital domain and the cognitive domain.

While the information operation is executed within and through the support of the digital domain, its purpose is to influence the political preferences of a target audience. This process takes place in the cognitive domain of the intended audience.

Needless to say, the hoped-for effects are as varied as the human imagination. Information operations may be conducted in the hope of strengthening or weakening adherence to certain norms or the socio-political cohesion of one or more states.

They may be designed to deflate or inflate political, religious, or ethnic tensions. Or they may be engineered to legitimize or delegitimize political, religious, or cultural actors.

Of particular concern is the possible use of information operations to initiate and legitimize intra- or interstate conflict, to violate the rights of minority groups or to bolster non-democratic and/or illiberal rule.

Susceptible to exploitation

The digital media landscapes in the Global South vary enormously. In Ethiopia, Chinese state-owned ZTE helped the former authoritarian government build a closed digital infrastructure allowing for control and regulation.

In other countries, Facebook or Facebook-owned services such as WhatsApp dominate internet use, with many telecom operators including free access to them without imposing any data costs on users, de facto favoring them as information channels.

Facebook seems to be working more closely with civil society and digital media rights groups than before, recruiting native speakers of local languages, but questions remain as to whether it has learned from past experiences in Myanmar and elsewhere.

Only a limited group of developing countries have data protection and privacy laws, and many still struggle with cocktails of varying levels of media literacy, unstable regimes, and ethnic tensions. What binds them together, in all their differences, is their susceptibility to exploitation, from within or from without, as information operations increasingly move South.

An earlier and somewhat different version of this article appeared as a DIIS policy brief earlier this year. Read the Danish version on Videnskab.dk.

References:

Adam Moe Fejersskov's profile (DIIS)

Flemming Splidsboel Hansen's profile (DIIS)

'Half the world is now connected to the internet—driven by a record number of Africans'. Quartz Africa (2018).

'Study: On Twitter, false news travels faster than true stories'. MIT News (2018).

'U.N. investigators cite Facebook role in Myanmar crisis'. Reuters (2018).