Children Stream on Twitch—Where Potential Predators Find Them

Last weekend, a young girl held up her smartphone and hit the Go Live button on Twitch’s mobile app. Her stream appeared under Twitch’s Just Chatting section, where people livestream themselves talking to their viewers. She explained that she was about to do her morning routine. Within minutes, 11 viewers filed in, including her best friend and several strangers. One viewer asked her age. They stuck around after she said she was 10. As she got out her toothbrush, one anonymous viewer asked in an adjoining chat window whether she had WhatsApp and said she was cute. Another told her she was beautiful. Her friend typed into chat, “Bro I do not like this at all,” and logged off. Later in the stream, the girl noted that a friend of hers had gotten 15 followers in an hour and, comparing her own follower count, asked whether she was ugly or weird. (The channel was active and had 15 followers until yesterday.)
According to Twitch’s Terms of Service, you have to be 13 to stream on the platform. But a WIRED investigation turned up dozens of Twitch accounts seemingly operated by children under that age, including another girl who admitted to being 11. In their videos, which crop up every few minutes under Twitch’s Just Chatting section, apparent children livestream themselves talking while playing games like Fortnite, performing dances popular on TikTok, or sitting at home and communicating with a small number of viewers. WIRED has viewed several messages from viewers to these apparent children containing inappropriate comments, questions, or demands, and identified some accounts that follow multiple apparent children.

“Both our desktop and mobile apps prevent users from creating accounts if they enter a date of birth that indicates they are under 13,” a Twitch spokesperson told WIRED. Twitch pointed to its reporting system as a way to fight inappropriate behavior toward children, adding that reports of streamers who appear too young make up “an extremely small proportion of the reports we receive.” Twitch did not directly answer whether it has dedicated resources to combat these incidents. “We take action on content that is reported to us when it violates our rules, including issuing warnings, removing the content, and suspending accounts for various lengths of time, including and up to indefinitely,” the spokesperson said.
While over half of US children own smartphones by age 11, the Children's Online Privacy Protection Act (COPPA) prevents apps from collecting data from children under 13 without parental consent. Most social media platforms place age limits on who can sign up in the first place. Last year, Google and YouTube paid $170 million to settle allegations brought by the Federal Trade Commission for violating COPPA. The FTC alleged that some YouTube content was aimed at children, and because children watched, the company collected their data. “At the end of the day it comes down to, are the platforms doing everything they’re legally supposed to do to limit that from happening?” says Brad Shear, a lawyer who specializes in social media and privacy. In his view, expeditiously responding to reports about improper behavior on the platform qualifies as a “responsible” approach to COPPA.

Twitch’s mobile app has relatively few barriers for children who know to input an older age in the sign-up form. They can create an account and stream within minutes after a quick email verification. Over the past few days, a half-dozen viewings of Twitch’s recently started live channels in the Just Chatting section have all turned up at least one apparent child within the top five or 10 entries. Many of them appear to be on mobile. Live videos on Twitch are archived and disappear after 14 days.