The videos all made the same claims: both Skelton and Desselle had been vaccinated against Covid-19 shortly before developing their tremors, and the vaccine, they alleged, was to blame. There is no evidence that this is the case. But on Facebook, the truth rarely matters. For days, the videos spread unchecked, racking up millions of views and tens of thousands of comments. Devoid of context and, even now, difficult to fact-check, their spread is the latest salvo in the struggle to debunk vaccine disinformation and misinformation. To date, the videos have been shared by Facebook groups that push natural and alternative medicines, by anti-vaxxers, by 5G conspiracy theorists and by the far right.
According to CrowdTangle, an insights tool owned and operated by Facebook, Skelton’s first video, published on January 7, had been watched by over 4.4 million people by January 19. On Twitter, the video has been shared 10,300 times and viewed 1.4 million times, according to Lydia Morrish, a social media journalist at First Draft.

CrowdTangle data shows that one of Griner’s videos, first posted on January 10, was viewed more than 5.2 million times. Although some commenters were sceptical, most of the comments left under the videos were from people who seemed worried about the alleged effects of the vaccine: some extolled the benefits of faith healing, others shared big pharma conspiracy theories or hawked products that they said might help the women recover. One commenter expressed hope that doctors would find a cure for the vaccine.
As the videos took Facebook by storm they started to seep outwards, cropping up in WhatsApp groups and on the messaging app Telegram. Here, they bounced from channel to channel, ripping through the far-right and QAnon-adjacent groups that have burgeoned on the platform in the weeks following the Capitol Hill insurrection. One version of the Desselle video circulating on Telegram has been watched more than 100,000 times. Junk news and alternative outlets ran stories about both Skelton and Desselle, and the latter’s story was covered in a segment on RT, a news network controlled by the Russian state.
The sheer shock value of the videos perhaps made their spread understandable, but it also made them dangerous in the midst of a pandemic, and at the very outset of a public health campaign that is already grappling with unprecedented levels of vaccine hesitancy in some countries. The danger is compounded by the fact that few of the claims made in the videos can be verified.

Skelton, an employee of a care home in Oakland City, Indiana, claims to have received the Moderna Covid-19 vaccine on January 4. The tremors, she says in a Facebook Live from January 13, in which she contorts on a bench wearing a pink jumper, started three days later. Skelton did not respond to multiple requests for comment. The care home where she works did not reply to multiple emails asking whether Skelton had indeed received the vaccine.
Skelton herself published a picture of what appeared to be a US Centers for Disease Control and Prevention (CDC) vaccination card on her Facebook profile on January 16. But the vaccine lot listed on the card does not appear to have been linked to any report of an adverse reaction in Indiana, according to VAERS, the reporting system run by the CDC and the Food and Drug Administration. In fact, all eight adverse reactions to Covid-19 vaccines reported in Indiana since the start of 2021 involved people over 60; Skelton, according to her vaccination card, is in her forties. Anyone can report an adverse reaction to VAERS, including vaccine recipients themselves, but doctors and practitioners are strongly encouraged to do so when they encounter what appears to be an adverse reaction.