For people around the world, the now-iconic images of a man in a horned headdress roaming the US Capitol during the January 6 insurrection came as a shock. For Kate Starbird, the images were frighteningly familiar. ‘QAnon Shaman’ – the online persona of Jacob Anthony Chansley, or Jake Angeli – is a known superspreader of conspiracy theories that her research group has been monitoring for years.
The storming of the Capitol was “this physical manifestation of all of these digital characters we’ve been studying”, says Starbird, a social scientist at the University of Washington in Seattle, who investigates the spread of disinformation on social media. “To see all of that come alive in real time was horrifying, but not surprising.”
Starbird is among a cadre of researchers in the United States and abroad who study the way disinformation and conspiracy theories take root and spread through social and mass media.
As US president and a prolific tweeter, Republican Donald Trump turned their research upside down when he helped to push typically fringe theories into the mainstream – most recently by downplaying the coronavirus pandemic and promoting the unfounded claim that the US presidential election had been stolen from him.
With Trump out of office, this group of researchers is now working to make sense of the deluge of data that they’ve collected from platforms such as Twitter and Facebook. It’s been a lesson in modern populism: a world leader amplified once-obscure conspiracy theories, with each tweet and retweet strengthening the ideas and emboldening their supporters. Now, researchers are retooling to understand – and prepare for – what comes next.
During his presidency, Trump frequently retweeted followers linked to the notorious conspiracy theory QAnon, a narrative that originated in 2017 and claims that a powerful cabal of Democrats and elites is trafficking and abusing children – and that Trump is fighting them. Although Trump never endorsed QAnon, he repeatedly refused to condemn the conspiracy theory in interviews and once praised its followers for their support.
One debate in the conspiracy-theory research community is whether Trump pushed more people into QAnon or whether he merely emboldened those who already believed. Polling suggests that QAnon adherents remain a small, if increasingly vocal, minority, says Joseph Uscinski, a political scientist at the University of Miami in Florida who has been tracking public support for the theory for several years. Others argue that polls don’t necessarily capture radicalization at the extremes.
QAnon has clearly gained ground under Trump in recent years, says Joan Donovan, a disinformation researcher at Harvard University in Cambridge, Massachusetts. The activity that she and her team monitor online, together with the real-world protests and political rallies taking place, adds up to “a growing interest in or dedication to these ideas”, she argues.
Researchers like Donovan knew QAnon was primed to embrace the theory that the 2020 US presidential election was rigged. They had already watched QAnon merge with the anti-vaccine movement to back theories that the coronavirus was engineered to earn money for vaccine makers.
Trump began pushing the idea that the election would be illegitimate when he suggested that postal ballots could be falsified. Things came to a head at a January 6 rally, when Trump told attendees, “If you don’t fight like hell, you’re not going to have a country anymore.” He then called on them to march to the US Capitol, just as Congress was preparing to certify Democrat Joe Biden as the next US president.
The false narrative about the election was a landmark – albeit discomfiting – opportunity for researchers to study how disinformation spreads across the internet. In July, Starbird teamed up with Renee DiResta, chief researcher at the Stanford Internet Observatory in California, and others in the Election Integrity Partnership to track – and correct – disinformation on social-media platforms such as Twitter, Facebook and TikTok. The team is still sifting through its data, but Starbird says the work is illuminating how social media makes it possible for populist leaders such as Trump to build constituencies and wield power.
In one case study, the researchers tracked false claims that Sharpie pens given to voters in Illinois and Arizona resulted in damaged ballots that were unreadable by voting machines. Seeded by Trump’s narrative about election fraud, these claims originated among his supporters on Twitter and were later amplified by members of his own family and right-wing influencers, helping to spread the message much farther and bring it into the mainstream.
Efforts to set the record straight, including Twitter affixing warning labels to prominent tweets, failed as the narrative spread at the grass-roots level among smaller, unverified accounts, the researchers found.
“We see this interplay between the elites and their audiences, who are actually collaborating with each other to create false narratives,” says Starbird. Social media becomes a testing ground for ideas that then gain momentum and are often picked up by conservative media outlets such as Fox News, she adds. “What we’re learning is that mass media and social media are actually very integrated.”
Trump used this echo chamber to drive conspiratorial thinking about the US election at all levels of the Republican Party: 147 congressional Republicans voted against certifying Biden’s election in the early hours of January 7, just after the insurrection. In a national poll conducted days later, nearly half of Republicans questioned the outcome of the election and opposed Biden’s inauguration.
“Conspiracy theories are fundamentally a form of political propaganda,” says Quassim Cassam, a philosopher at the University of Warwick in Coventry, UK. Although Trump failed to overturn the election, Cassam says the former president was very successful at mobilizing his political base – and radicalizing the Republican Party.
In the wake of the Capitol insurrection, Twitter banned Trump, disconnecting him from his nearly 89 million followers, and took down more than 70,000 accounts linked to disinformation about campaign fraud and conspiracy theories. Facebook and Google’s YouTube have also suspended Trump’s accounts.
These actions have stifled the conversation online: Starbird’s team analysed its network of influential Twitter users and found that an entire section tied to QAnon disappeared overnight. But Starbird says the extremists they have been following will always find new platforms to spread their dangerous ideas.
Law-enforcement agencies remain on high alert: on January 27, the Department of Homeland Security released a terrorism bulletin warning that ideologically motivated violent extremists who object to the presidential transition could continue “to mobilize to incite or commit violence” in the coming months.
Although they are still analysing mountains of data, many disinformation researchers say it’s already clear that new regulations will be needed to govern the internet, tech giants and the content that their users post online. Donovan says the Biden administration should conduct a comprehensive review of social media, including the algorithms that drive search and recommendation engines, as well as the ways in which technology companies have profited from spreading disinformation and conspiracy theories.
“The gatekeeping power of mass media has now shifted to these platform companies,” says Donovan. “We need them to be much more transparent about what they are doing, and we need regulation so that they know what the guard-rails are.”
- A Nature magazine report