Facebook has become an incredibly successful advertising platform in part because it allows marketers to show people ads using fine-grained categories, which are generated based on an individual’s behavior. Facebook says this allows it to show users ads that are more relevant to their interests. But its data collection practices have also led to a series of privacy scandals over the past several years, and increased scrutiny from lawmakers around the globe.
In response to questions about its targeting practices, Facebook has said that anyone can use the platform’s ad preferences menu to see and control how Facebook has categorized them. But a new survey from Pew Research Center suggests that the vast majority of US users don't know Facebook keeps a list of their interests and traits this way. When respondents found out, most said they were uncomfortable with the assumptions Facebook had made about them.
From September 4 to October 1, 2018, Pew Research Center surveyed Facebook users about the data the social media platform collects on them. Among the findings:
- 74% of Facebook users said they did not know Facebook maintained a list of their interests
- 51% of users said they were not comfortable with Facebook compiling this information
- 27% of users said the list on their ad preferences page did not represent them very or at all accurately
From September 4 to October 1, Pew asked a nationally representative sample of 963 adult Facebook users to examine their “Your ad preferences” page, a menu where Facebook users can adjust ad-related privacy settings like third-party tracking. There, you can tell the social network to stop using your relationship status, employer, or where you went to school to target you with ads. If you don’t want to see ads for alcohol, children, or pets, this is also where you can tell Facebook that. In the Pew study, 74 percent of respondents said they didn’t know the page existed until they were directed to look at it. After being shown their ad preferences, 51 percent of respondents said they were not very or not at all comfortable with the fact that Facebook had created such a list of their traits and interests in the first place. And 27 percent reported that the classifications were not very or not at all accurate.
Facebook has repeatedly emphasized that its users can control how their personal data is used for targeted advertising. In a blog post from April last year, in the wake of the Cambridge Analytica scandal, Rob Goldman, Facebook’s vice president of ads, mentioned the ad preferences page four different times. But even Facebook CEO Mark Zuckerberg has admitted that most Facebook users do not adjust the settings found there. “Some people use it. It’s not the majority of people on Facebook,” he said when he testified before the Senate’s Commerce and Judiciary committees that month.
“Your ad preferences” can be hard to understand if you haven’t looked at the page before. At the top, Facebook displays “Your interests.” These groupings are assigned based on your behavior on the platform and can be used by marketers to target you with ads. They can include fairly straightforward subjects, like “Netflix,” “Graduate school,” and “Entrepreneurship,” but also more bizarre ones, like “Everything” and “Authority.” Facebook has generated an enormous number of these categories for its users. ProPublica alone has collected over 50,000, including those only marketers can see.
According to Pew’s survey, 33 percent of US Facebook users said they were assigned 21 or more of these categories, and another 27 percent were given between 10 and 20. Respondents who spent more time on Facebook, or who had been using the service longer, reported that Facebook assigned them more topics. Fifty-nine percent of Pew’s participants said the categories Facebook created for them accurately represented their interests, while 27 percent said the social network was not very or not at all accurate in describing them.
How accurately Facebook characterized users’ traits appeared to be connected to how comfortable they were with the company’s practices: 78 percent of people who said the information was inaccurate were uncomfortable with their ad preferences, whereas only 48 percent of those who thought the information was correct felt the same way. Put another way, if you think Facebook has assumed something wrong about you, you’re more likely to feel uneasy about the fact that it’s making assumptions in the first place. (Eleven percent of respondents reported that they weren’t assigned any categories and were instead told “You have no behaviors.”)
Pew also asked participants whether Facebook assigned them a political category, such as “liberal” or “conservative,” and any “multicultural affinity”—which Facebook says are groups of people “whose likes and other activity on Facebook suggest they’re interested in content relating to particular ethnic communities — African American, Hispanic American and Asian American.” There is no affinity classification for whites. These groupings have gotten Facebook into trouble in the past. In 2016 and 2017, ProPublica reported they could be used to show housing ads only to whites, possibly violating laws like the Fair Housing Act. Housing advocates including the National Fair Housing Alliance filed a federal lawsuit against Facebook in March of last year over the practice.
The Pew study shows that Facebook’s multicultural affinity assignments don’t always reflect the racial or ethnic groups people say they are a part of. Only 21 percent of respondents said they were assigned a multicultural affinity, and 60 percent of those said they had a very or somewhat strong affinity for the group they were assigned. But only 57 percent of people assigned an affinity said they considered themselves to be a member of the racial or ethnic group Facebook assigned them. In addition, half of the people Pew surveyed said the social network assigned them a political affinity, 73 percent of whom said the categorization was very or somewhat accurate.
Americans already say they don’t trust Facebook and are spending less time on the platform. Fifty-four percent say they have also recently adjusted their privacy settings, according to another Pew study published in September. But this latest survey indicates they still don’t know how the social network’s targeted advertising works. As US lawmakers prepare to debate more privacy legislation this year, Facebook’s practices will likely continue to draw scrutiny. If finding out how Facebook targets ads makes its users uncomfortable, as the Pew survey suggests, that could spell trouble for the social media giant in the year ahead.