US president Biden’s criticism of Facebook is a double win for Fox News. Not only does it draw attention away from the network’s own culpability for the vaccination gap, but it feeds a potent right-wing narrative about government and Big Tech colluding to silence conservatives.
Biden and Facebook have spent the past several weeks slugging it out over misinformation about the ability of the vaccines currently in use to combat coronavirus.
“I just think that this kind of coordination between big government and the big monopoly corporation, boy, that is scary stuff. And it really is censorship,” Missouri senator Josh Hawley said Thursday on – where else? – Fox News.
That sense of outrage easily sustained conservative media throughout the weekend, with both pundits and Republican lawmakers weighing in on, as Ted Cruz put it, “their willingness to trample on free speech, to trample on the Constitution, to use government power to silence you, everything we feared they might do.”
It’s easy to see why the White House would spend political capital beating up on Facebook rather than Fox News: Facebook might actually listen. Biden has no leverage over right-wing media. When a Fox News host questions the safety or wisdom of vaccination, it isn’t a lapse in enforcement; it’s tonight’s programming. Many people at Facebook, by contrast, would prefer not to be responsible for poisoning America’s public health information environment.
“TV and radio, particularly conservative TV and radio, are essentially getting a free pass right now, even though they’re doing amazing harm.”
Which, according to Facebook, they aren’t. In a blog post last week, Guy Rosen, Facebook’s vice president of integrity, argued that Facebook has been a force for good when it comes to vaccinations. He noted that “more than two billion people have viewed authoritative information about Covid-19 and vaccines on Facebook” since the start of the pandemic, while the company has “removed over 18 million instances of Covid-19 misinformation.”
And, he claimed, Facebook has already complied with all eight of the surgeon general’s recommendations, including Murthy’s suggestion that companies “give researchers access to useful data to properly analyse the spread and impact of misinformation.”
In fact, Facebook notoriously does not provide access to the data needed to understand what’s happening on its platform. Notice, for example, that Rosen’s blog post doesn’t mention how many times users have seen unreliable information about Covid or vaccines.
Facebook publicises statistics about engagement with posts – likes, shares, and so on – but refuses to disclose data about “reach,” meaning how many people see a piece of content. Nor does it provide any concrete details about its efforts to reduce the spread of misinformation.
“The public has no idea what Facebook is or is not doing to combat vaccine misinformation, and doesn’t have any sense of how bad or not-bad the problem is,” said Rand, the MIT professor.
“There’s lots of work being done within the company by lots of smart people to try to reduce the impact of misinformation, but they don’t really tell much about it.”
Rand said platforms like Facebook should partner with outside researchers on empirical studies about what does and doesn’t work to combat vaccine misinformation – and publicise the results. He noted that Facebook is sitting on enough data to measure how exposure to posts about vaccines affects real-world behaviours.
“They’re doing randomised controlled trials on vaccine misinformation every day, they just don’t think of it that way,” he said.
The irony is that, by providing some insight into how it approaches the problem, Facebook seems to have wandered into the worst possible balance between transparency and secrecy. YouTube makes comparatively little information available to researchers, helping it fly under the political and regulatory radar despite its massive importance.
Facebook, meanwhile, provides just enough data through CrowdTangle for researchers and reporters to bludgeon the company, but then conceals the evidence that it claims would vindicate it.
“They’ve sort of painted themselves into a corner by giving enough data to make them look bad, but then saying, ‘Well, behind closed doors we have data that makes us look OK,’” said Jenny Allen, a doctoral student at MIT who is researching the comparative influence of social media and TV news. “It’s the worst of both worlds.”
Facebook could learn something from the vaccine development process itself. The reason it’s possible to talk about vaccine “misinformation” in the first place – why it isn’t purely a matter of opinion – is that the data behind the vaccines’ efficacy and risks have all been made public.
No one with any sense will believe Facebook’s claims about its own public health interventions until it does likewise.
- A Nature report