One thing the Covid-19 pandemic has brought to light is just how strongly fake news influences people. When it comes to health information, this is particularly concerning: individuals who believe these reports and stories put themselves and others at unnecessary risk. Who's to blame? What can be done to reduce fake health news? How do you change people's perceptions?
Did you know that 80% of people use the internet to search for health information?
That equates to more than 1 billion health-related searches on Google every day. While it’s good that people care about their health and want to learn more outside of an appointment with a provider, the concern is that many people cannot discern the validity of the information that they view.
Another issue is that fake health news stories get millions of views, likes, and shares on social media every year. These stories do everything from claiming cures for serious conditions to pushing far-reaching conspiracy theories involving governments and medical communities. Unfortunately, they spread before the medical community can respond, because traditional media simply doesn't move at the speed of social media. When reliable, vetted news stories aren't available, misinformation fills the void.
Health misinformation isn't new; books written in the 1990s already documented medical conspiracy theories. The difference now is the speed at which misinformation proliferates.
It’s important to understand that misinformation and disinformation are two different things. Misinformation is the accidental spread of misleading and false information. Disinformation is the deliberate and coordinated spread of misleading and false information. The outbreak of misinformation creates an “infodemic,” which is eroding trust in the healthcare system and practitioners.
One survey showed that between 2017 and 2018, general trust in healthcare fell by 20% in the United States and by 4% worldwide. Over the last several decades, policy decisions have become driven more by ideology and politics than by facts and evidence. The result is a growing mistrust of science, institutions, and the counsel of leading experts.
There is growing concern among health professionals that false information and lack of trust could keep people from taking the steps necessary to stay healthy: discouraged from taking preventive measures, they also become hesitant to seek care when they get sick. These factors make it more difficult for doctors to advise patients and require extra time and effort to correct patients' inaccurate views, which can have serious long-term health effects.
One example on the prevention side is vaccination rates dropping below the level the population needs for herd immunity. Vaccines are considered safe by the medical and scientific communities, yet a few well-funded anti-vaccination activists, who have no medical training or expertise, promote false claims that vaccines cause harm and death.
Over two-thirds of anti-vaccine sites present information as "scientific evidence" to support their claims, and close to a third use anecdotes to reinforce them. Part of the problem is bots: automated social media accounts that use artificial intelligence to mimic the appearance and manner of a human user in order to promote specific narratives. Bots are being used to amplify anti-vaccine views because they can target particular groups.
A telling example of how health misinformation can be detrimental is the nationwide measles outbreak in 2019, when the United States reported its largest number of measles cases since 1992. According to the Centers for Disease Control and Prevention, several factors contributed, including importation through international travel and pockets of low vaccination driven by misinformation about the measles vaccine. Much of that misinformation traces back to a 1998 study that claimed a link between vaccines and autism.
Despite the study being debunked and retracted, its claims have continued to spread over the last two decades. Published estimates show that, depending on the area, the cost of a measles outbreak ranges from $9,862 to $1,063,936, with a median cost per case of $32,805.14. This demonstrates just how burdensome a disease outbreak can be for the public health system.
One of the biggest reasons fake news goes viral is sensational headlines. These grab readers' attention and often appeal to their emotions, which makes people more likely to believe and share them, especially in times of uncertainty and crisis. People are also more likely to believe claims made in social media posts by peers, colleagues, or family members. Social media algorithms compound the problem: they reward engagement rather than accuracy, which helps misinformation spread. In addition, fake news articles are written to be easily understood, while medical studies are challenging to read because they contain scientific jargon, figures, and tables that the general public can't interpret.
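To make the algorithmic point concrete, here is a minimal, purely illustrative sketch of engagement-based feed ranking. The posts, weights, and scoring function are invented for this example and do not reflect any real platform's system; the point is simply that when a ranking rewards engagement alone, accuracy never enters the equation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    likes: int
    shares: int
    comments: int
    accurate: bool  # known to us for illustration; invisible to the ranker

def engagement_score(post: Post) -> float:
    """Score a post purely by engagement; accuracy is not an input."""
    # Hypothetical weights: shares spread content furthest, so weight them most.
    return post.likes + 2.0 * post.comments + 3.0 * post.shares

feed = [
    Post("Study finds modest benefit from new therapy",
         likes=120, shares=15, comments=30, accurate=True),
    Post("Doctors HATE this one weird cure!",
         likes=900, shares=400, comments=250, accurate=False),
    Post("Health agency updates screening guidance",
         likes=80, shares=10, comments=12, accurate=True),
]

# Rank the feed: the sensational, false post rises to the top because
# outrage drives engagement, and engagement is all the ranker sees.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):7.1f}  accurate={post.accurate}  {post.headline}")
```

Nothing in the scoring function penalizes falsehood, which is exactly the dynamic described above.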
Health Feedback and the Credibility Coalition studied the ten most popular health articles of 2018, all published by well-known news sources. The study found that 75% of the articles "were either misleading or included some false information," and only 26% were ranked as highly scientific.
A different study done by Stanford researchers in 2019 tracked the online activity surrounding the false idea that cannabis cures cancer. Cancer is one of the most popular subjects when it comes to health misinformation, and marijuana is often alleged to cure it. The Stanford researchers found that online searches for the two have grown at 10 times the search rate for other standard medical therapies.
Other data shows that the top 50 fake health news articles garnered more than 12 million shares, comments, and reactions in 2019, mainly on Facebook. As health misinformation grew more prevalent, lawmakers, doctors, and health advocates began pressuring social media platforms to adopt policies banning or limiting the spread of false health information that had gone unchecked. In response, Facebook said it had been working on reducing the spread of health misinformation and issued a statement: "While we have made progress this year, we know there is more work to do. We hope to continue our partnership with health organizations to expand our work in this space."
A survey of 1,020 people aged 40–80 in the United States, published in Health Psychology, asked participants to rate the perceived accuracy of 24 recent Facebook and Twitter posts. The posts included true and false information about cancer treatments, human papillomavirus (HPV) vaccines, and statin medications. After reviewing each post, participants categorized it as completely false, mostly false, mostly true, or completely true.
The research team also collected data on participants' education level, interest in alternative medicine, income, age, and health knowledge. The results showed that people with lower education levels and less knowledge of healthcare issues were more likely to believe inaccurate information, as were individuals who distrusted the healthcare system or viewed alternative treatments favorably. Not surprisingly, people who believed false claims on one topic were also likely to believe misinformation on other health topics.
A recent study by journalism and mass communication researchers at the University of Kansas examined what makes people susceptible to false health information. The findings argue that big tech companies have a responsibility to help prevent the spread of misleading and dangerous information.
For the study, the group shared a fake news story with more than 750 participants. The story claimed that a deficiency of vitamin B17 (a vitamin that doesn't exist) could cause cancer. There were eight versions of the article: one carried a doctor's byline with a short description of medical credentials; another described the author as a lifestyle blogger and mother of two with a background in creative writing. Some versions followed a journalistic style, while others used more casual language, and some were labeled as suspicious or unverified.
Researchers then measured whether the way the article was presented affected participants' perception of its credibility, and whether they would follow the article's recommendations or share it on social media. The findings showed that presentation didn't affect how people perceived the information; individuals highly interested in health information were likely to share news they found, whether credible or not. However, if the article carried a flag stating it was not verified, participants were significantly less likely to find it credible, follow its recommendations, or share it. The results suggest that relying on audience members to identify fake news has a long way to go, because doing so requires mental work, and that platforms themselves can influence whether information is believed or shared. The Kansas study isn't alone in its findings. The Media Insight Project discovered that "people who see an article from a trusted sharer, but one written by an unknown media source, have much more trust in the information than people who see the same article from a reputable media source shared by a person they do not trust."
The COVID-19 pandemic has contributed significantly to the infodemic. During the pandemic, the public drastically increased its news consumption, which increased the risk of encountering fake health news. A good example is the misinformation surrounding masks, which help prevent the spread of the virus: many stories claimed they weren't effective. If you dislike wearing masks because they're uncomfortable, you're more likely to believe those stories.
When it comes to vaccines against the virus, data released in January 2021 by the Johns Hopkins Center for Communication Programs indicated that only 63% of those interviewed across 23 countries would get one. This is well below the 75% that public health experts recommend for a population to reach herd immunity.
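As a rough sketch of where a figure like 75% comes from: the classic herd immunity threshold depends on the basic reproduction number R0, the average number of people one infected person goes on to infect in a fully susceptible population. The formula below is the standard epidemiological one; the value of R0 is an illustrative assumption, not a number cited in the survey above.

```latex
\text{herd immunity threshold} = 1 - \frac{1}{R_0},
\qquad
R_0 = 4 \text{ (assumed)} \;\Rightarrow\; 1 - \tfrac{1}{4} = 0.75 = 75\%
```

The more contagious a disease (the larger R0), the higher the share of the population that needs immunity before transmission dies out.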
Many health experts say they've never seen misinformation as widespread as what's happening with COVID. Some blame an uncoordinated pandemic response made up of an assortment of good and bad plans. Combined with the decline in public trust in healthcare, this created the perfect storm. In fact, recent research suggests that in the first three months of 2020, nearly 6,000 people worldwide were hospitalized, and roughly 800 died, because of misinformation surrounding the coronavirus.
While the problem of health misinformation online is becoming clear, the solution isn't. What the evidence does show is that an educated public is key to reducing the spread of both misinformation and disease. Unfortunately, fact checks for health misinformation are rare and can't compete with the virality of the claims they're trying to correct.
The goal should be to increase media and health literacy so that people learn to successfully navigate information from multiple sources. Public health officials should use targeted messaging and outreach, and local agencies should establish and maintain relationships with local media outlets, which are more trusted than national ones. This can help get accurate information to the public and increase the amount of legitimate information available online.
Public health agencies should routinely engage with audiences on digital and social channels with frequent, timely, reliable, and transparent posts. This will create situational awareness, dispel rumors, and establish them as the media’s first point of contact for health information. Over time, consistent communication will encourage public trust and build an audience that can further spread trusted information.
The World Health Organization (WHO) is taking this approach by teaming up with the United Kingdom government to create and distribute material that combats misinformation through a series of campaigns. The first, called Stop the Spread, launched in May/June 2020 on BBC World television and the BBC website and apps. Its goal was to raise awareness of the sheer volume of misinformation about COVID-19 and urge people to double-check the information they consume.
In August, the partners launched Reporting Misinformation, which not only encouraged people to verify information but also showed them how to report misinformation on various social media platforms. It was shared in five international languages and reached millions of people globally; at its launch, it became the second most viewed COVID-19-related page on the WHO website.
The UK Cabinet Office also partnered with Cambridge University to create an innovative online game, Go Viral!, in which players discover how fake experts and bogus remedies are used to discredit real news, and how false rumors get shared and promoted. Scores are shareable, and players are pointed to WHO's COVID-19 'mythbusters'. Research demonstrates that playing the game just once can reduce the perceived reliability of fake news by an average of 21%.
IMPACT is another organization that uses social media and communication to circulate and amplify evidence-based science around public health issues. The group has three main goals: to transmit accurate information on COVID-19 to the community, to communicate health-based information to government officials at the state and national levels, and to fight the COVID-19 "infodemic."
Organizations aren't the only ones fighting misinformation about Covid-19; many healthcare professionals are taking to social media to set the record straight. Providers can reach individuals on a smaller scale in several ways, such as sharing links to sites that provide reputable information, including popular news outlets that tend to get health news and information right. Above all, healthcare professionals shouldn't be afraid to speak up in social media groups when they see misinformation being spread. The key is not to blame the individual, but the idea.
It can be hard for any individual to know which sources to trust, given the nature of science, especially science that's brand new and constantly changing. Here are some key things to watch out for:
- Watch for articles that overstate research findings, fail to provide context, or exaggerate a single thread of evidence
- Be wary of poorly written articles
- Check whether the article or post cites credible sources
- Be skeptical of articles that claim their findings are irrefutable
- Check whether other outlets are covering the same information
- Look at the comments to see whether readers are calling the story out
- Think critically about the source's intentions, such as whether it's linked to anything being promoted in the article
- Don't fall for overly simplified conclusions in headlines
- Be on guard when reading about emotionally charged and divisive topics
- Make sure the author interpreted the information correctly by investigating the primary sources
- Don't rely on social media for your news
An infodemic creates uncertainty, which fuels skepticism and distrust. This is the perfect environment for fear, anxiety, finger-pointing, stigma, violent aggression, and rejection of proven public health measures. Infodemics can't be stopped, but they can and must be managed through campaigns and collaborations. Each of us has a responsibility to verify the information we consume and share with others.