New report details how Facebook is grappling with spectre of Russia’s half-truths and support for coups in Africa
Facebook is struggling to contain pro-Russian and anti-western posts that are contributing to political instability in West Africa, investigators and analysts have said.
The platform, which has expanded rapidly across the continent in recent years, has made significant investment in content moderation, but still faces enormous challenges in curbing deliberate disinformation campaigns. One major area of concern is the strategically important Sahel region, which has suffered a series of military takeovers over the past 18 months.
Campaigns on Facebook appear to have prepared the ground for many of the coups, pushing an anti-western, pro-Russian agenda that has undermined governments. The efforts are similar to the “hybrid warfare” campaign launched by Moscow in Ukraine and elsewhere.
A report by investigators from the Digital Forensic Lab, a global network of digital forensic researchers run by the US-based thinktank the Atlantic Council, reveals how pro-Russian Facebook pages in Mali coordinated support for anti-democracy protests and for the Wagner Group, a controversial Russian private military contractor invited into the unstable country last year after the military’s overthrow of President Bah N’daw.
The US and others have alleged that Wagner is funded by the powerful businessman Yevgeny Prigozhin, who is closely linked to Vladimir Putin. It has a growing presence in Africa and its mercenaries have been deployed in Mozambique, Sudan, Libya and in Central African Republic, where Wagner Group fighters committed human rights abuses while fighting alongside government forces against rebels, according to a group of independent UN experts. Prigozhin and the Kremlin have denied any knowledge of Wagner.
Western officials described Wagner as the “thin end of the wedge” and a “Trojan Horse” for a Russian effort to extend its influence covertly in resource-rich and unstable parts of the continent. Earlier this year, France announced that it was withdrawing thousands of troops from Mali, ending a near decade-long effort to fight Islamist insurgents from bases there.
The Wagner Group has deployed between 400 and 600 fighters, trainers and support staff to Mali and appears to have already launched offensive operations against extremists.
In early April Human Rights Watch reported that suspected Russian mercenaries had participated in an operation with Mali’s army in March in which about 300 civilians died. HRW did not mention Wagner specifically.
The DFR Lab identified a coordinated network of five pages pushing narratives that promoted Russian intervention in Mali while disparaging the West, and France in particular. The pages have published nearly 24,000 posts and are followed by more than 140,000 accounts.
In September 2021, the pages of the network began promoting Wagner as an alternative to the French forces. Pages in the network frequently posted identical content, often less than 20 seconds apart, the DFR Lab found.
In a second report, DFR investigators found that pro-Russian content spread on Facebook in West Africa in the months before the military takeover in Burkina Faso in January. Hours after the coup there, demonstrators in Ouagadougou, the country’s capital, chanted pro-Russian and anti-French slogans.
Independent fact-checkers labelled several posts as misleading, including a pro-Russian page that repurposed images of apparently well-equipped hobbyists in combat gear as Russian soldiers.
The DFR Lab has an ongoing partnership with Facebook to independently monitor the platform for disinformation campaigns, with a particular emphasis on election interference, and receives funding from Meta.
Facebook declined to take down the pages described in the reports when alerted, saying that, although the posts were clearly part of a coordinated effort, the pages did not appear to be a front for unidentified users and so were not “inauthentic”.
Many posts that breached Facebook’s moderation policies have been labelled as false or misleading after being investigated by third-party fact-checkers, a spokesperson said.
Toussaint Nothias, research director at the Digital Civil Society Lab at Stanford University, who has worked extensively on Facebook, said the decision was surprising and underlined the big challenge of effective content moderation.
“The boundary between inauthentic and authentic coordinated behaviour is very tricky to manage. Authentic coordinated behaviour can often resemble social movements and determining when this behaviour is harmful depends largely on the context and standpoint,” Nothias said.
A spokesperson for Facebook’s owner, Meta, said the company took the problem of coordinated campaigns seeking to manipulate public debate very seriously and was taking aggressive steps to fight the spread of misinformation in Africa as elsewhere.
“We’ve built the largest global third-party fact-checking network of any platform and in the last few years we’ve more than doubled the number of countries we cover across the continent,” the spokesperson said. “While nobody can eliminate misinformation from society entirely, we continue to consult with outside experts, grow our fact-checking programme and improve our technology to tackle it on our services in the most comprehensive and effective way possible.”
Facebook has moved repeatedly in recent years to take down hundreds of “inauthentic” accounts targeting Africa, many linked to Moscow.
In October 2019, Facebook took down three networks of accounts linked to Prigozhin. The accounts were actively seeking to influence domestic politics in eight countries: Madagascar, the CAR, Mozambique, the Democratic Republic of the Congo, Ivory Coast, Cameroon, Sudan and Libya.
In 2020, Facebook removed a second Russian-led network of professional trolls, outsourced to Ghanaian and Nigerian operatives.
The DFR report on the recent Mali accounts – and Facebook’s decision not to take down the network – underlines the risk of actors exploiting loopholes in the company’s policies.
Experts say influencing efforts have become more sophisticated and are likely to involve hiring individuals who will push a particular narrative. This makes it much harder to establish whether such operations are “inauthentic” and thus breach Facebook policy.
“It’s not bots any more. The major [social media] firms are very good at identifying bots … There are real people behind the accounts. The trend has been a tactical shift for Prigozhin-linked operations. Local operators offer a better understanding of the context on the ground and a level of plausible deniability,” said Shelby Grossman, a research scholar at the Stanford Internet Observatory.
Jean Le Roux, author of the new DFR report, said those behind the most recent networks could have been people in Mali who were genuinely supportive of Russia’s efforts and anti-French, or members of a “franchising operation using locals who know the slang, the vernacular”.
- The Guardian report