In the days it took Facebook CEO Mark Zuckerberg to personally respond to news that a political data firm used by President Trump’s campaign had obtained information on 50 million users without their knowledge, the media obsessed over his silence.
“Where is Mark Zuckerberg?” the BBC blared. “Missing from Facebook’s Crisis: Mark Zuckerberg,” the New York Times wrote.
The feverish focus on Zuckerberg exemplified his ridiculously outsize influence on our lives. That one 33-year-old technology CEO, who created a cool way to share information, should be perceived as critical not only to his company’s future but to the future of democracy is absurd. He should not be this powerful.
Facebook’s sense of responsibility — or its abilities — has not grown in tandem with its influence. It has failed to anticipate threats. And when it solves one security problem, another pops up. It has already shown it cannot completely prevent its service from being used for bad purposes.
Facebook has also proved how much we enjoy sharing with each other. Social media is probably an irreversible part of our culture now. The problem is that, unlike Uber users disgusted by its sexual harassment culture, who could switch to Lyft, people fed up with Facebook have no alternative that gives them access to such a network.
Zuckerberg, in his first response, said the company had already taken steps to “prevent bad actors from accessing people’s information,” pledged to further restrict developers’ data access and said he’s “serious about doing what it takes to protect our community.”
But we’ve heard similar pledges before.
Like when the Federal Trade Commission found that Facebook engaged in deceptive practices by making public the data that users considered private.
Or when terrorists used Facebook to recruit.
Or as people began committing suicide on Facebook Live.
Or as fake news permeated the service.
Or as hate speech spread on Facebook in Myanmar, fueling a genocide there.
Zuckerberg pledged to have 20,000 employees working on security by the end of the year. Strikingly, he acknowledged that the company doesn’t have a handle on what happened with people’s data in the past. “Even if you solve the problem going forward, there’s still the issue of … were there apps which could have gotten access to more information, and potentially sold it, without us knowing,” Zuckerberg told the New York Times as part of a media blitz that included interviews with CNN and Recode, too. He promised a “full investigation of every app that got access to a large amount of information.”
His statements were a belated but seemingly genuine expression of a desire to fix the problem.
But we can’t wait any longer for Facebook to figure out how to protect us. It’s time for those preaching more humane, responsible technology to form a social network with the algorithms and humans needed to prevent abuse, one that isn’t afraid to feature credible, fact-based journalism and to eradicate fake news. One that is transparent about how its business model works. Perhaps one of more modest ambition that won’t become as dominant or as hard to police, and that will help dilute Facebook’s negative impacts. I’d pay to be part of such a social network so that my likes and shares weren’t how the company made money.
The revelation that Cambridge Analytica obtained so many Facebook users’ information is only the latest example of the company’s negligence. The story hit a nerve here and in Europe, and the furor poses serious legal, financial and reputational threats to the company.
But even if Facebook wins lawsuits and avoids regulation that could undercut a business model that relies on marketing user data, the social media giant will inevitably face subsequent crises unless it finally abandons the idea that it’s just a neutral technology service without responsibility for what happens there.
Facebook clung to this definition in 2012, when it was hijacked by jihadists seeking new followers. “Social media is no longer simply a fun way to share updates on the harmless idiosyncrasies of our lives,” I wrote then. “It can undermine national security, and there ought to be a more robust discussion between the Bay Area technology world and Washington on what to do about it.”
That didn’t happen. Then came other awful behavior, like teen bullying and suicides on Facebook Live, which led to even more lawsuits but no fundamental change.
Facebook grew to dominate more of our lives, including our news consumption, where it helped amplify conspiracy theories, fake news and propaganda from a foreign power during the 2016 election campaign.
Its executives initially played down the notion that fake news influenced the results. Then the company put out ridiculously small estimates of the number of people affected. Finally, it acknowledged that 126 million users may have seen Russian propaganda (researchers put the number even higher). General counsels from Facebook, Twitter and YouTube (sites that were similarly manipulated) testified on Capitol Hill and got out of there without pledging much. Russia will try to exploit them all again in November.
Still, the Cambridge Analytica story may be even more damning. Beyond the reporting, Facebook insiders are exposing the company’s true priorities. “They really didn’t want to know, to a certain extent, what was happening with the data once it left Facebook,” Sandy Parakilas, an operations manager at Facebook in 2011 and 2012, told NPR on Tuesday. Parakilas says he pushed the company for “more protection” of user data. But “they didn’t seem to prioritize protecting users over the growth of Facebook apps,” he added.
The risks of third-party applications have long been known. “This is a story about an ecosystem full of privacy risk and the inevitable abuse that resulted,” said Daphne Keller, director of intermediary liability at Stanford’s Center for Internet and Society.
Even if Zuckerberg moves to stem that risk, the damage has been done. “What worries me is that this data trove is out there,” Krishna Bharat, the founder of Google News, told me. “There’s not a chance it was deleted, since it is so valuable to campaigns. Who is going to use it in the future? People don’t change that much, and it will be current. Combined with fake memes, you can still inflict great harm on the democratic process.”
Sadly, I can’t entirely escape Facebook yet. As a journalist, I still rely on it as an essential reporting tool. I might still want to view information in a Facebook parent group at my daughter’s school. As a first step, I’ve downloaded an archive of my posts and deactivated my account, which still allows me to log back in should I need information I can’t get elsewhere. There’s no way to ensure that my past data aren’t misused. But at least I — and you — have the power not to post there anymore.
Janine Zacharia, a former Washington Post reporter, is the Carlos Kelly McClatchy visiting lecturer at Stanford University.