It wasn’t long after Mark Zuckerberg took the stage at the Aspen Ideas Festival Wednesday that he was heckled by the audience. Facebook’s CEO was talking to Cass Sunstein, the Harvard Law School professor who has also served as a Facebook adviser, and discussing the complexities of combatting election interference. One problem, Zuckerberg proclaimed, is that Facebook doesn’t have a mechanism for deterring foreign governments from running influence operations. It can take down posts; it can delete fake accounts; but it can’t cut the internet connections of Russia’s Internet Research Agency. “As a private company, we don’t have the tools to make the Russian government stop,” he declared. “Our government is the one that has the tools to apply pressure to Russia, not us.” As he spoke, an elderly voice hollered from the back, “Not true!”
The moment was symbolic for Zuckerberg and the trust he and his company have lost in recent years. The Aspen Ideas Festival is a quiet, thoughtful place. Heckling is rare. But Facebook has drawn anger and derision everywhere this year, even at thought-leader conferences in the mountains. Still, Zuckerberg soldiered on, making his point, which happens to be largely true.
Not long thereafter, Facebook’s leader faced his second gauntlet. Sunstein prodded him on Facebook’s decision not to remove a maliciously edited video of House Speaker Nancy Pelosi, appearing drunk, that had gone viral in May. “Why not make the policy, as of tomorrow, be that if reasonable observers could not know that it's fake, then it will be taken down?” asked Sunstein. The audience responded with enthusiastic applause that seemed to startle the professor. “That is the first time I have ever gotten applauded,” he said, smiling.
Zuckerberg began his response by noting that this is a topic of intense discussion at the company. Deepfakes may be a different category from the kind of false news that Facebook has extensive experience dealing with. Still, Facebook starts to think about its response with the principle it uses when responding to false statements: Limit their distribution but do not remove them completely. You can limit reach without stifling speech.
His easiest move would have been to leave the answer there. Instead, Zuckerberg decided to defend his company’s decision and tie it to broader principles. “We exist in a society where people value and cherish free expression, and the ability to say things including satire,” he said. He then doubled down, saying he did not think anyone should want “a private company to prevent you from saying something that it thinks is factually incorrect.” He added, “That to me just feels like it's too far and goes away from the tradition of free expression and being able to say what your experience is through satire and other means.”
By the end of the exchange, Zuckerberg had done something that has become increasingly rare in the tech industry the past few years: defend free speech with a hammer, not a shrug. Many of the hardest decisions in tech have come down in recent years to tradeoffs between safety and speech. After a decade of favoring speech, most executives have recently chosen safety. Alex Jones has been banned. Artificial intelligence systems have been set up to filter speech that is cruel. Anonymity has been made harder. Zuckerberg, though, seemed to be taking a stand, and when he finished the audience gave him a hearty round of applause.
The rest of the interview covered familiar ground. Zuckerberg noted that there are deep tradeoffs around data portability and privacy. “Part of the issue today is that we offer people so many choices over so many different things, and so many controls, that it ends up not feeling accessible. And a lot of the time, if you want to design a simple product that people understand, you just want to make choices for people that reflect what you think their best interests are.” That’s true! But it also sounded a little dark.
Predictably, he argued that antitrust would not be a good solution to the woes for which people blame Facebook. Having multiple small social media companies, he said, wouldn’t make it easier to protect privacy or defend elections. Moreover, he added, the acquisitions of Instagram and WhatsApp should not be undone, because Facebook had made those services more innovative, not less. “Yes, some mergers can be bad for innovation. These weren’t.” Sunstein, pressed for time, didn’t have a chance to truly counter. Fortunately, though, this particular topic is going to come up at the festival again, and also before Congress and the Federal Trade Commission.
The audience had ushered the previous speaker—the rapper Common—out with a standing ovation. When Zuckerberg’s talk ended, everyone seemed enthusiastic, if a bit less so. Still, he only got heckled once.