USA: Twitter Is Weaponized Once Again, This Time It’s About Vaccination

1 Sep, 2018
By Chuck Dinerstein — August 23, 2018

Rasputin was an early adopter.

As it turns out, the Russians have meddled in more than simply elections. If their goal is to spread dissent and discord, then there can be no more passionate a topic, at least for some, than vaccination. A new article in the American Journal of Public Health sheds some badly needed light on how Twitter is being used in a cultural war with established science.

Twitter is a platform of expression, and anti-vaccine advocates have a significant social media presence. It is estimated that nearly half of vaccine-related comments are anti-vaccination; the remainder are pro-vaccination or neutral. For those inclined to distrust vaccines, the vaccine “hesitant,” a large number of Tweets supporting an anti-vaccination agenda increases their hesitancy. And if you think that vaccination is not a real public health issue, consider that the EU is experiencing a record outbreak of measles this year: 37 deaths and 41,000 cases so far.

The study reports on a review of a 1% sample of Tweets gathered between 2014 and 2017, roughly 1.8 million in all. Within this larger data set, the researchers also sampled Tweets with vaccine-related keywords. They then identified the source of Tweets based on publicly known accounts from the Internet Research Agency – the Russian government agency tied to all those political disinformation campaigns. They also took a qualitative look at #VaccinateUS, a hashtag known to be tied to the Russian government.

Before jumping into their findings, let’s quickly mention some vocabulary. A bot is an automated producer of content; it may be evidently automated, like much of what we know to be spam, or it can be disguised to give it a more human appearance. Trolls are actual humans who misrepresent themselves to promote discord and distress.
One last player: content polluters, humans and their bots whose goal is to disseminate unsolicited content and malware; you might consider those annoying pop-up ads a type of content polluter.

The Study

The study was retrospective and observational. Using a previously validated “tool” called the Botometer, the researchers categorized the source of Tweets as likely bots, likely humans, or unknown. The categorization was based on a host of analytics looking at content, grammar, and linguistic patterns, as well as other identifiers. Most Tweets could not be well characterized as bot or human.

They found that trolls and sophisticated bots (though serving as amplifiers of human controllers) posted equal numbers of pro- and anti-vaccine positions. Their goal was to amplify their voice – the disinformation was in the number and variety of posts, not the content. When they considered the vast majority of Tweets where characterizing the source was impossible, the posts were more polarizing and more anti-vaccination.

What is perhaps more telling, and in turn more useful for those of us who try to separate the real from the fake, were their qualitative findings regarding #VaccinateUS, a known Russian source. Again, posts were not wholly pro or con vaccination: 43% favored vaccination, 38% were against, and 19% presented neutral vaccination content. As you might expect for individuals writing in a second language, the researchers identified more unnatural word choices and irregular phrasing; spelling and punctuation, thanks to spell-checkers, were not an issue. Another “tip-off” for these sources was that messages were more explicitly linked to US politics, featuring emotionally tinged words like freedom or constitutional rights. These phrases were markedly different from the vaccine Tweets of humans, which more often used terms like parental choice or focused on specific vaccine-related legislation.
In fact, Russian disinformation far more frequently spoke in generalities than about local specifics – another suggestion that a distant source was Tweeting. In addition to differences in word choice, Russian bots, when referencing conspiracy, once again chose the US government rather than secret organizations or shadowy philanthropies. Many of the Russian-generated arguments focused on concerns that were already divisive, like racial or ethnic divisions. Anti-vaccination arguments by humans, by contrast, characterize risk regardless of socioeconomic status. Russian Tweets also included provocative or open-ended questions, designed to prolong the discussion and, again, amplify the debate and the debaters.

Content polluters represented a separate category, and it seemed that the vaccination debate was used as a marketing tool. Here the Tweets were far more likely to be polarizing anti-vaccination messages designed as clickbait, leading the user to unwanted content and the surreptitious downloading of malware. The authors pointed out the irony that following anti-vaccine content may “increase the risks of infection by both computer and biological viruses.”

Fake science

The Russians, and presumably other “state” actors, have found that the most effective way to wage war is not through the expenditure of lives and treasure, but increasingly at the computer terminal. And their weapon is not outright lies, the things that fact-checking can readily identify. Their real tool is two-fold: amplifying voices so that noise overwhelms signal, and at the same time sowing distrust so that even a clearer signal is misidentified. The Cold War ended not because Russia surrendered; it ended because we demonstrated that we could and would outspend the Russians, and in their effort to keep up, their fragile economy imploded. Today’s war features an attack on the information that informs our lives.
It heightens our divisions and discredits our institutions, and without a guiding compass we are adrift and increasingly vulnerable. The Russian meddling in our politics is one small front in a much larger battle. Disinformation is finding its way into our scientific debate, and it is being provided by actors far more powerful and nuanced than the concerns we might have about studies funded by corporations and their vested interests. I can often detect a conflict of interest in a study, but it is far, far harder to detect false amplification – the astroturfing that John Oliver falsely accused the ACSH of participating in. It is far harder to have a reasoned discussion, as we try to do, than to deliver a divisive rant.

Source: Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate, American Journal of Public Health. DOI: 10.2105/AJPH.2018.304567

This entry was posted on Saturday, September 1st, 2018 at 7:08 am and is filed under Latest News.
