Encouraging users to sample other news sources, one study finds, can instead cause them to double down on their beliefs. Another found that logging off entirely reduces polarization but leaves people politically disengaged and uninterested. Sixteen years in, we’re still only beginning to understand how Facebook is shaping us.
Internally, the company has begun to quietly acknowledge the trade-off for users between staying informed and being algorithmically driven toward divisive content. But little has been done. Politicians remain focused on accusations of bias, and Facebook is busy proving it treats conservative and liberal users impartially. Executives routinely emphasize the matching Republican and Democratic criticism of the platform. If neither side is happy, they say, neither is being favored.
But conservative and liberal users have very different experiences when using Facebook. That’s not because of politically motivated decisions around what’s allowed on the platform. Rather, it reflects the way Facebook organizes information to reward “engaging” content. The focus on ferreting out bias obscures this and makes even practical solutions seem implausible.
While there’s little evidence that Facebook is biased against conservative users, University of Virginia professors Brent Kitchens and Steven Johnson found that, by maximizing for engagement and attention, Facebook’s algorithms push conservatives toward more radical content than they push liberals. Kitchens and Johnson analyzed the news habits of over 200,000 users who agreed to share their browsing data. They found that Facebook pushed conservatives, unlike its moderate or liberal users, to read dramatically more radical content over time.

“All the platforms end up providing a sort of diversifying effect,” explains Kitchens, associate director of Virginia’s Center for Business Analytics. “If you're reading more news from Facebook, you're going to get a wider variety of news. But there's also a polarizing effect. The diversity of information gets a little wider, but it also shifts more extreme.”
The study compared respondents’ use of Facebook, Reddit, and other platforms with their news habits. Kitchens and Johnson scored 177 news sites on a numbered political spectrum, with DailyKos and Salon furthest left, Breitbart and InfoWars furthest right, and USA Today around the center. In the months when conservative users were most active on Facebook, they read news sites far more conservative than their average, clicking links from InfoWars and Breitbart over staples like Fox News. By contrast, news consumption by liberal users shifted far less dramatically on the authors' scale.
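The authors' measure can be pictured as a simple calculation: each outlet gets a position on a left-right scale, and a user's news diet in a given month is summarized by the average position of the sites they visited. The sketch below illustrates the idea; the site scores are hypothetical placeholders, not the study's actual values, and `average_slant` is an illustrative function, not the authors' code.

```python
# Illustrative sketch of the study's scoring idea, NOT the authors' method.
# Site scores are hypothetical placeholders on a -1 (far left) to +1
# (far right) scale, loosely following the ordering described in the text.
SITE_SLANT = {
    "dailykos.com": -0.9,   # hypothetical: furthest left
    "salon.com":    -0.8,
    "usatoday.com":  0.0,   # roughly the center
    "foxnews.com":   0.6,
    "breitbart.com": 0.9,   # hypothetical: furthest right
    "infowars.com":  1.0,
}

def average_slant(visits):
    """Mean slant of the sites visited in a month; unknown sites are skipped."""
    scores = [SITE_SLANT[s] for s in visits if s in SITE_SLANT]
    return sum(scores) / len(scores) if scores else 0.0

# A shift from mainstream conservative outlets toward more radical ones
# shows up as a rising monthly average on the scale:
before = average_slant(["foxnews.com", "usatoday.com"])        # 0.3
after = average_slant(["breitbart.com", "infowars.com", "foxnews.com"])
print(before, after)  # the "after" diet scores further right than "before"
```

On this kind of scale, the finding reads as: in high-Facebook-activity months, conservative users' monthly averages moved markedly rightward, while liberal users' averages barely moved.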
This polarizing effect stands in stark contrast to Reddit. When conservative users were most active on Reddit, they actually shifted toward news sites the authors judged more moderate than what they typically read. Kitchens and Johnson hypothesize that the most salient differences between Facebook and Reddit lie not in the content itself but in how each platform structures and feeds news and information to users.

“The impacts we’re seeing are by design. Facebook knows what's going on with its platform,” says Johnson. “If it wanted to change it, it could.”
The authors identified a few major differences between Facebook and other sites. First, Facebook requires reciprocal friendship, which encourages a feed of like-minded people and reduces the chance of seeing opinion-challenging content; its algorithms create feedback loops that perpetually show users what the platform thinks they want to see. Second, Reddit offers more anonymity than Facebook. Because users don’t necessarily have reciprocal bonds, people with different views can gather and share links in the same thread. Reddit’s algorithms prioritize interests, not friendships, and in the course of interactions on nonpartisan topics, the authors say, there’s a much higher likelihood users will come across links to sites outside their typical news diet.