Pages and groups are the tools Facebook misinformation peddlers love the most. Creating a network of anonymous pages is one of the easiest ways to quickly spread fake news or propaganda on the social network. This tactic has most famously been used by Russian trolls—even long after the 2016 presidential election. Earlier this month, Facebook took down a cohort of deceptive pages linked to Russian state media. Now, the social network has changed its policies to better enable cracking down on not just individual pages, but entire networks of fraudulent pages and groups.
Facebook has historically played Whac-A-Mole when it comes to systems of fraudulent pages, even when they’re run by the same person. If a troll runs two fake news pages but only one of them violates Facebook’s policies, the company can’t take down the other until it breaks the rules as well. That loophole has allowed propagandists to simply shift their efforts to other existing pages after Facebook closes down one arm of their operation. But starting today, the social network will begin removing entire factions of pages and groups, even when not all of them have individually met Facebook’s criteria to be removed.
In a blog post announcing the change, Facebook said it “may now also remove other Pages and Groups with similar names that are maintained by the same person, even if that specific Page or Group has not met the threshold to be unpublished on its own.”
In some situations, the social network has already gotten more aggressive with networked pages; in August of last year, for example, Facebook expunged a band of inauthentic pages that appeared to have originated in Iran. It's also repeatedly removed coordinated trolling efforts from Russia. But under the new policy, the company won't need to demonstrate infractions from every single page to justify a sweeping takedown.
Facebook will also launch a new control panel Thursday for page managers, designed to make it easier for them to understand when their posts have breached Facebook’s Community Standards. The Page Quality tab will display content that Facebook recently removed, and cite the rule it broke. For example, it might inform a page manager that their video was taken down for going against the social network’s rules forbidding hate speech. The menu won’t display all policy violations, but it does include things like graphic violence, harassment, bullying, and nudity and sexual activity. Notable omissions include spam, clickbait, and intellectual property violations.
Page Quality will also show page managers when their content has been rated “False,” “False Headline,” or “Mixture” (primarily misleading, but containing some true information) by third-party fact-checkers like the Associated Press or PolitiFact. When fact-checkers assign these ratings to posts, Facebook reduces how many people see them in their News Feeds.
Internal Facebook documents leaked to Motherboard last year indicated that the social network uses different deletion thresholds for pages depending on the type of content violation they commit. For example, if a page manager receives five “strikes” for hate speech in a 90-day period, Facebook instructs moderators to delete their page. If a page or group has more than two “elements” of sexual solicitation, such as its description, photo, or title, it gets deleted. (It's possible these policies have since been revised, but they help color how Facebook thinks about policing pages and groups.)
The Page Quality tab will likely reduce some of the confusion page managers experience on Facebook, where it can be difficult to understand how or why the social network is moderating content. If you don’t know a fact-checker labeled your post fake news, it’s easy to think Facebook isn’t showing it to people for more sinister reasons. The tool makes these kinds of actions more transparent, especially for those who are in charge of pages with large followings that generate hundreds of notifications a day. But it likely won’t mean much for bad actors who already intend to skirt Facebook’s rules in the first place.
These new features and updates are part of wider changes Facebook has made over the past two years, which are designed to make it harder to spread misinformation and propaganda on its platform. Many of those actions have focused on tightening its advertising policies; the social network now has strict requirements for organizations that want to run so-called issue ads, for instance. These new tweaks take aim at another problem: fraudulent pages and groups that don’t need to rely on paid advertising to reach an audience.
There’s one issue, however, that Facebook has yet to address: It’s still possible to run Facebook pages anonymously. Pages can then create their own affiliated groups, allowing bad actors to erect entire communities without revealing their identities. Facebook gives page managers the ability to list their “Team Members,” but the feature is optional. It’s understandable why the platform works this way; the social media manager for a nonprofit or publication might not want their work connected to their personal Facebook profile, for instance. But it makes it almost impossible for users to understand where a page or group came from. In July, for example, a Facebook group that purported to be a safe space for sexual assault survivors was taken over by trolls who harassed its members. The group was run by an anonymous Facebook page, so the victims had no way to discern the identity of their harassers.
Facebook has made pages more transparent by disclosing the date they were created and whether their name has recently changed, but so far it has stopped short of requiring users to disclose their identities when they create them. That loophole will continue to make it easy for bad actors to construct networks of fraudulent pages, but at least now Facebook has given itself the authority to take them all out in one fell swoop.