Opinion: Facebook founder Zuckerberg can’t fix what he won’t own up to


In June 2017, Mark Zuckerberg changed Facebook’s mission. Speaking at the company’s first Community Summit in Chicago, he explained that the best part of Facebook is its “meaningful groups,” those that address a user’s passions or needs and connect them with others who share those interests.

At the time, there were 100 million people in meaningful groups; he wanted to grow that number to a billion. Zuckerberg believed this so much that he changed Facebook’s core goal from “connecting the world” to “giving people the power to build community and bring the world closer together.”

In a post explaining this, he wrote, “Communities give us that sense that we are part of something greater than ourselves, that we are not alone, and that we have something better ahead to work for.”

More than three years later, some of those groups have done exactly what Mark Zuckerberg envisioned: They banded together for a passionately held common cause. But the “something greater than ourselves” probably wasn’t what he had in mind: overthrowing the peaceful transfer of power following a fair and certified election in the United States.

Other platforms like Parler might have been instrumental in organising extremists to assault the US Capitol building. But Parler’s members were already committed to the cause. Facebook’s community-building algorithms were effective in drawing some of its massive audience from the sidelines and into the maw of radicalism and sedition.

In fact, Facebook’s own algorithms seem to pump up membership in those groups. A Wall Street Journal article from May 2020 reported an alarming finding from Facebook’s own researchers. According to a 2016 internal study, “64 percent of all extremist group joins are due to our recommendation tools … Our recommendation systems grow the problem.”

The article also revealed that the company’s efforts to address this were stifled by interference from the company’s political wing, ever sensitive to criticisms from the right. And just this week a New York Times article outlined several cases where relatively sane people were driven deep into seditious crazy town once they discovered that Facebook widely circulated their most transgressive anti-democratic posts, winning them status and followers.

One user found that the more he posted deranged Trumpist messages, the more followers Facebook sent his way, and soon he was hosting a meaningful group based on election denial, with tens of thousands of members.

It was almost like an embodiment of what Zuckerberg had described as the feeling of “we are not alone.” By becoming an anti-democracy person, he’d found other people, all reinforcing one another’s awfulness.

This week Facebook COO Sheryl Sandberg gave a rare interview. As always, she cautioned that the company wasn’t perfect, but her overall message was that Facebook’s policies were by and large working.

“Was there anything you thought Facebook could have done sooner?” asked her interlocutor. Sandberg replied that while Facebook knew that the protests were being organised online, it had generally done its job by removing violent groups like Proud Boys, QAnon, and Stop the Steal.

The latter group garnered 320,000 followers before Facebook took it down, and the corresponding hashtag wasn’t banned until five days after the January 6 insurrection.

She assigned serious blame to others. “I think these events were largely organised by platforms that don’t have our ability to stop hate and don’t have our standards and don’t have our transparency,” she said.

Twitter CEO Jack Dorsey seemed more candid in admitting that his company fell short. Like Sandberg, he defended the timing of his company’s ban of Donald Trump following the riot. But he also admitted that on the absolutely critical issue of how speech can hurt society, his company blew it.

“I feel a ban is a failure of ours ultimately to promote healthy conversation,” he wrote. “And a time for us to reflect on our operations and the environment around us.”

Social media is not the only culprit. Fox News has been a conscious arsonist of the conflagration that has seared our social fabric. And of course, politicians, from Trump on down, bear tremendous culpability.

But the mechanisms that Facebook, Twitter and YouTube use for growth and engagement have been too easily exploited to feed a beast that now threatens our democracy. Embracing community is great—but not when the community is dangerous or destructive. Fixing this won’t be easy. But admitting failure is a first step.

A few months before the Community Summit in Chicago, Mark Zuckerberg released a manifesto about Facebook as a builder of communities. I got a preview and a chance to discuss it with him, and I wrote it up for Backchannel (now found in the WIRED archives):

Zuckerberg’s views on informed communities – and how they get their news – go well beyond the fake-news controversies that have bedevilled the company recently. The CEO himself admits that he did not help things by saying at a conference right after the election that he didn’t think fake news on his service affected the outcome.

“I might have messed that one up by not giving the broader context, and people thought that the narrow thing was how I think about this broadly,” he said.

“The question of common understanding and common ground is even bigger. Let’s say you can wave a magic wand and get rid of all misinformation. We could still be moving into a world where people are so polarised that they will use a completely different set of true facts to paint whatever narrative they want to fit their world view.”

I told Zuckerberg that right now my News Feed is basically … Trump, Trump, Trump, married, Trump, baby, Trump. I wondered how much of his News Feed was dominated by posts about our new president.

“It’s a good amount,” he said. But he sees it as a temporary aberration. The issue for him is not just our domestic situation but “a serious global thing” where people need to be better informed – not just by news, but by each other.

Though he touts recent tools that Facebook introduced to give lower rankings to inaccurate or overhyped news stories, he also admits that it’s a work in progress.

“I just want to make sure there’s common ground, that everyone has the ability to share what they want and that nuance doesn’t get lost,” he said.

  • A Wire report