In brief:
- Fake news and misinformation can spread quickly online.
- In a health setting, misinformation has dangerous consequences.
- This is especially relevant during the current pandemic.
- Example: Zostavax and how misinformation can spread.
- What can you do to address misinformation quickly?
Against the backdrop of an ongoing pandemic, with scientists and analysts grappling with data to advise governments on the safest course of action, the prominence of online misinformation becomes all the more dangerous. Over the past few weeks the world has witnessed a major shake-up of the US political landscape and a breakthrough in the search for a viable COVID-19 vaccine candidate, both of which have been subject to intense digital and social media attention, analysis and, in some cases, misinformation.
Such is the potential for harm that the UK Labour party is calling for the introduction of criminal penalties for social media companies that fail to remove fake news stories about COVID-19 vaccines.
As COVID-19 vaccine development strides forward, it is imperative that information disseminated through online channels is carefully weighed against the evidence and its claims tested against reliable data sources. Where this does not happen, the population health consequences can be severe.
A prime example of a vaccine suffering from targeted misinformation online is the herpes zoster (shingles) vaccine Zostavax.
Zostavax: an example of misinformation
To be clear, Zostavax is not without its faults: the Merck vaccine will no longer be sold in the US as of July 2020, amid numerous patient reports of side effects for which the manufacturer is being sued. Beyond that, research shows Zostavax reduces the risk of shingles by only 51%. Shingrix is seen as the superior option, and it is the GSK vaccine that the Advisory Committee on Immunization Practices (ACIP) now recommends in Zostavax's place.
Despite valid cause for complaint, in some cases it is seemingly innocuous content that can tarnish a reputation. Misinformation becomes an instrument of harm when an individual or group, knowingly or unknowingly, manipulates content so that it no longer reflects the original author's intent.
On 7 November 2017, an article by Claire Dwoskin published by the Children's Medical Safety Research Institute (CMSRI) claimed that scientists had 'proven' that those vaccinated with Zostavax can infect the unvaccinated with chickenpox. Dwoskin's revelations repeated those of a Newstarget article published earlier in 2017 and were based on a 2011 study in the Journal of Infectious Diseases. The science cited by CMSRI and Newstarget found that in some cases varicella zoster virus (VZV) DNA sequences were present in skin samples and saliva after immunization. The study's authors concluded that, theoretically, the saliva of the recently immunized could be a potential source of transmission. In reality, this theoretical transmission has never been observed in a clinical or even experimental setting, with the CDC stating that no such case has ever been reported. This pertinent context was omitted from both the CMSRI and Newstarget reports.
The sensationalist headlines and claims running throughout the CMSRI and Newstarget articles were far from justified by the body of medical evidence and went well beyond the findings of the original research.
The story began to gain traction when both the CMSRI and Newstarget articles were widely shared on Twitter in late 2017. On 8 December a CMSRI Twitter post was retweeted 19 times, despite Twitter user @mcfunny pointing out the fallacy. The story's reach picked up momentum through December, spiking on the 16th when the now-suspended 'Health Ranger' account shared the article, receiving 111 retweets.
This social media sharing behaviour highlights the importance of addressing the spread of misinformation early. Once a story begins to develop momentum it becomes much harder to stop, particularly where the story masquerades as science-based. In the case of Zostavax, poor reporting may have tarnished public perception of its successor Shingrix, or of vaccines more widely.
HCPs correct misinformation
As negative reports of Zostavax continued to be shared on Twitter throughout 2018, healthcare professionals (HCPs) addressed misleading information and shared their own experiences of Zostavax side effects:
Redness on injection site is an adverse event. Not being able to fully lift arm over head = limits normal activity (for 1 day). For those that were actually going to get HZ zostrix and zostavax is their best and only option. For those debilitated after HZ – same answer.
— Graham MacKenzie PhC (@grahamcmackenzi) October 11, 2018
I am a general internist in practice for nearly 25 years. In that time I have administered influenza vaccines, Pneumovax, and Zostavax. I have never seen a vaccine 'reaction' more serious than some prolonged redness and swelling at the administration site.
— Karl Krohn (@KarlKrohn) October 15, 2018
What can you do to stop misinformation?
With the knowledge that information can be altered in any number of ways before rapid online dissemination, vigilant fact-checking takes on heightened importance. One method that can help verify a story, or at the very least establish its credibility, is to do a little homework on the author of the source content.
For example, Influence Watch describes CMSRI author Claire Dwoskin as a ‘left-of-center political activist who also supports organizations focused on promoting debunked studies suggesting that vaccines cause various autoimmune and inflammatory diseases, including autism’.
In a healthcare context, listening to the voices of medical professionals, online or in person, is a valuable way to separate fact from fiction. Understanding who influences the medical community and what its members share online is of great benefit as the public looks to build a better knowledge of safe health practices.
To understand more about how CREATION.co is helping to stop the spread of misinformation, sign up for our monthly eJournal or get in touch; we'd love to help.