The Guy Who Wrote Facebook's Content Rules Says Its Politician Hate Speech Exemption Is 'Cowardice'

Last Tuesday, Facebook vice president Nick Clegg announced that Facebook would give politicians more leeway than other users to post offensive speech, and that their assertions would not be fact-checked. That sent Dave Willner over the edge. Two nights later, Willner posted a long explanation—on Facebook, of course—attacking the policy. The 35-year-old tech worker described the social network’s new stance as “foolish, wrong, and a significant betrayal of the original democratizing ideals of Facebook.”

That essay is notable not just for its well-argued points but for who wrote it: Dave Willner is Facebook’s former head of content standards. More than 10 years ago, as part of the team monitoring content on the nascent social network, he took an ad hoc list of no-no’s and helped create the document that is the foundation of the company’s content standards. (Though the current version is longer and more detailed, Willner says Facebook’s hate speech rules haven’t changed much in the last decade. “What has changed is the willingness of politicians to say things that are clearly racist, sexist, etc.,” he says.) Willner left Facebook in 2013 and now heads community policy at Airbnb. His wife, Charlotte, who worked with him at Facebook, heads Pinterest’s trust and safety team—making Willner half of online speech moderation’s First Couple.
Though Facebook says it will still remove content from politicians that encourages violence or harm, Willner argues that allowing hate speech—whether it comes from a politician or a private-citizen white supremacist—can create a dangerous atmosphere. He cites research from the Dangerous Speech Project, which studies the types of public speech that spark violence, to back up his claim. He also charges that Facebook’s exception now makes politicians a privileged class, enjoying rights denied to everyone else on the platform. Not only is Facebook avoiding hard choices, Willner says, it is betraying the safety of its users to placate the politicians who have threatened to regulate or even break up the company. “Restricting the speech of idiot 14-year-old trolls while allowing the President to say the same thing isn't virtue,” writes Willner. “It's cowardice.”

Willner intended his post, distributed to “friends-only,” to generate discussion and attention from former colleagues, both inside and outside Facebook. That it did. One top executive who engaged with Willner was Andrew “Boz” Bosworth, Facebook’s outspoken vice president of AR/VR and one of the engineers who created the News Feed. Taking pains to note that he is not part of the content standards team, Bosworth described Facebook’s decision as a reasonable balance between maintaining standards and letting people know what their political leaders really think. “I just feel you aren't paying enough respect to the newsworthiness case,” he wrote in a comment under Willner’s post. “I'm not convinced the horror of the speech is greater than the horror of it going unnoticed.”
Another participant in the fray was Paul C. Jeffries, Facebook’s former head of legal operations. Speaking like someone who has spent a lot of time with lawyers—though he holds a physics degree—Jeffries says of Facebook that “they aren’t really applying different rules to regular folks and politicians. It’s the same rule, it just evaluates [sic] different.”

The conflict is genuinely knotty. As Bosworth notes, it’s a classic tradeoff between two values. Twitter gives a pass to Donald Trump’s hateful tweets because CEO Jack Dorsey believes it’s in the public good to document what the president says. And Facebook has used newsworthiness as a moderation factor for several years. In December 2015, Facebook’s top executives debated what to do with Trump’s denigration of Muslims, which some employees thought clearly violated its standards. They allowed it to stand. A year later, Facebook took down a post from a Norwegian writer that included the iconic “Terror of War” photo, which depicts children fleeing a US napalm attack in Vietnam, because it showed a naked young girl. Responding to the outcry that the company was censoring the Pulitzer Prize–winning image, Facebook reversed itself. Thereafter, “newsworthiness” formally became a factor in treating speech that otherwise violated the company’s standards.