When Algorithms Think You Want to Die

Dr. Ysabel Gerrard is a lecturer in digital media and society at the University of Sheffield. Her research on social media content moderation has been featured in venues like The Guardian and The Washington Post. She also consults for social media companies, including Instagram.

The problem TikTok has right now is that its For You page is working exactly as it should: It gives users a personalized, and therefore pleasurable, experience by showing them what they likely want to see. I’ve previously written about the same problem playing out on Instagram, Pinterest, and Tumblr. Recommendation algorithms like this are the bread and butter of social media platforms. The happier you are on a platform, the likelier you are to stay, and if you stay, the company can keep profiting from the data you generate.

But the problem, one most major social media companies have faced, is that recommendation algorithms aren’t really trained to make moral and health-related judgements about the kinds of content they recommend. Do you like cats? TikTok thinks you do, based on what you’re liking and searching for, so its algorithm will show you more cats. Yay cats! But the exact same formula applies to potentially harmful forms of content. Do you have anorexia? TikTok thinks you do, so here’s a bunch of triggering videos. Have at it!
In a recent BuzzFeed article, some TikTok users shared anecdotes of randomly receiving recommendations for pro-ana videos through their For You page. It is difficult to describe pro-ana behaviors without triggering readers, but they might involve sharing diet tips and purging methods, writing personal stories, and pairing up with a “buddy” to further encourage weight loss. We know from charities like Beat that eating disorder patients often report feeling “triggered” by certain images or words. If a TikTok user continuously sees triggering posts on their For You page, this could very well harm them. But one of the frustrations social media researchers have is that the inner workings of recommendation systems like the For You page are notoriously opaque, making it difficult to figure out why particular users see certain recommendations while others don’t. A recent New Media & Society article notes how social media users often create elaborate theories for figuring out how recommender systems work, what the author calls “algorithmic gossip.”
Without dismissing anyone’s claims about their For You recommendations, readers should know that users who are not engaging with videos related to eating disorders are highly unlikely to have them randomly recommended. A TikTok spokesperson explained that users can also adjust the content they see by, for example, “hearting” videos, clicking “not interested,” and following users. “In doing so, through time users will see more of the content they prefer.”

Whenever stories like BuzzFeed’s appear, I always worry that social media companies will respond by panicking and prohibiting all content relating to eating disorders, even if it’s about recovery or support.
Researchers have long known that social media and older online communities can offer support for people with stigmatized conditions like eating disorders. For example, Reddit’s decision to remove the r/proED sub in 2018 was met with outcry from community members who explained that, despite its name, the sub wasn’t actually used as a space to promote eating disorders and functioned more like a support group.
If moderated appropriately, there’s no reason TikTok can’t offer a space for people to express their feelings and share their experiences in a highly creative way. TikTok could also become a helpful resource for people struggling with eating disorders. Secrecy is one of the hallmarks of an eating disorder, meaning social media sometimes exists as a sufferer’s only form of support. With this in mind, TikTok could develop genuinely useful eating disorder resources beyond sending users a list of contact details for local charities, “the 2020 equivalent of handing a teen a tri-fold brochure,” as psychiatrists Neha Chaudhary and Nina Vasan recently wrote in WIRED. Pinterest, for example, has pioneered a series of wellbeing exercises that it recommends to users searching for self-harm-related Pins.