False information is spreading rapidly, and it is causing real harm. We have encountered misleading claims before, but the problem is worse now: misinformation is more pervasive than ever, and it is damaging our health.
A recent survey found that 20% of people in the U.S. believe Covid vaccines contain a microchip. That's one in five respondents. The belief stems from the false idea that Bill Gates wants to track your movements. Another 46% were unsure whether the microchip theory was false, despite there being no plausible way for it to be true.
These numbers are concerning. In our fast-paced information environment, it is increasingly hard to separate fact from fiction, reason from baseless claims, wishful thinking from microchips.
Not too long ago, I fell for a misleading headline about “Covid parties” where people were supposedly intentionally infecting themselves and others. I reacted strongly without verifying the facts. The truth is, Covid parties are mostly just a rumor. I ended up contributing to the misinformation and collective anxiety.
I research misinformation as part of my job at the University of Alberta. I’m a professor of law and public health, focusing on health policy and how science is presented to the public. Even with my expertise, I can be swayed by misleading stories that align with my values and emotions. It’s embarrassing.
We're living in a time when misinformation is rampant: an "infodemic," as the World Health Organization called it in early 2020, in which harmful false information spreads like an uncontrollable disease.
One of the issues we face is that we’ve gotten used to accepting nonsense, sometimes in subtle ways and sometimes quite obviously. Some popular wellness figures have embraced fake science as a fundamental part of their image.
People like Andrew Wakefield, a discredited former doctor, launched the harmful "vaccines cause autism" myth in a paper that The Lancet later retracted. Unfortunately, this misinformation about vaccine safety has continued to spread, finding new audiences.
A sad reality is that men appear particularly susceptible to misinformation, and it's affecting our health. Research from the University of Delaware shows that men are more likely to believe Covid conspiracy theories, and other studies suggest they worry less about the damaging effects of misinformation.
Men are also less inclined to get the Covid vaccine. There are many reasons for this hesitation, but the fact that men are more likely to accept and be influenced by Covid conspiracy theories is a significant part of the problem.
That makes it especially important for men to adopt strategies for maintaining a healthy information diet and for treating what they consume with skepticism.
Our information environment has turned into a chaotic, confusing mess that is harming our health and well-being. Various factors make it increasingly hard for us to avoid, or even identify, harmful falsehoods and divisive pandering, at precisely the moment in history when we desperately need facts and clarity.
The infodemic has contributed to a decline in confidence in scientific institutions, as those spreading misinformation often aim to create doubt and distrust.
The scientific community also shares some responsibility, with occasional flawed research and poorly communicated findings causing confusion.
That's how science operates: evidence evolves, recommendations change, and transparency about those changes is crucial. Just remember that non-scientific voices will sound definitive when real scientists don't yet have all the necessary data. It's better to wait until they do.
But there’s hope! By using a few critical-thinking tools and being aware of the tactics used to spread misinformation, we can navigate through the noise.
So How Did We Get Here?
Understanding why misinformation, half-truths, and misunderstandings are undermining science-backed information is not straightforward; it’s a complex interplay of various factors. However, if I had to pinpoint the most significant contributor to this era of misinformation, it’s undoubtedly social media. President Joe Biden went as far as to say that misinformation on social media is “killing people,” a concern supported by mounting evidence.
Research from McGill University in 2020 found that if you consume news through social media, you’re more likely to believe and share misinformation. Pew Research Center reached a similar conclusion through its analysis. Moreover, research has linked the spread of Covid misinformation in popular culture to specific platforms.
For instance, an analysis in 2020 found that over half of misleading claims about COVID-19 on the global Poynter Coronavirus Facts Database originated on Facebook.
Misinformation spreads rapidly and extensively. In August 2021, Facebook reported its most viewed content from January to March 2021, revealing that the most viewed post was a misleading article implying the COVID-19 vaccine caused someone’s death.
This misinformation was viewed almost 54 million times by Facebook users in the U.S. during that three-month period, fueling anti-vaccine advocacy.
This noise, as Biden highlighted, has caused significant harm: deaths, hospitalizations, increased stigma, discrimination, and distorted health and science policies. Early in the pandemic, a rumor spread primarily through social media that methanol could cure Covid; it was linked to over 800 deaths and thousands of hospitalizations.
A study by the Vaccine Confidence Project in 2021 found that online misinformation about COVID-19 vaccines significantly contributed to hesitancy, putting our ability to achieve herd immunity at risk.
Several reasons contribute to this issue. Our current information landscape is chaotic, making it challenging to carefully consider facts, especially when headlines appeal to our emotions. We tend to react swiftly to the impressions content creates.
For instance, humans are naturally inclined to remember and respond to negative and frightening information due to a universal negativity bias. Media experiments have shown that negative headlines outperform positive ones.
Social media exposure is increasingly recognized as a source of stress. When we're stressed, we're more prone to believe and spread misinformation, creating a cycle of anxiety and false information. The term "doomscrolling" captures this irony.
The algorithms on social media platforms play a significant role in perpetuating this cycle. They push harmful and fear-inducing misinformation into our feeds.
These misleading narratives, like microchips in vaccines or 5G causing COVID-19, are designed to align with our interests and values.
A 2020 analysis estimated that Facebook’s algorithm generated 3.8 billion views of health misinformation in a year.
Moreover, lies, fake news, and pseudoscience can often seem more captivating than the straightforward truth. Misinformation can spread rapidly while the truth is still trying to catch up.
Conspiracy theories and misinformation may offer a comprehensive narrative explaining events, which is particularly appealing when scientific answers are lacking or uncertain, as was the case during the pandemic.
Ideological alignment is another factor. Misinformation tends to align with specific ideologies, making it more appealing to those who share those beliefs.
People are more likely to share content on social media that aligns with their political views, even if it’s false. It’s important to acknowledge that misinformation can be fueled by ideology across the political spectrum.
Being aware of this connection between ideology and misinformation is vital for everyone, regardless of their political stance.
Experts have long understood that social media has both positive and negative aspects. It has the power to unite us by connecting us to communities, friends, and families. However, it can also divide us, especially concerning ideologies.
Dr. Kate Starbird, an associate professor at the University of Washington and an expert on misinformation, emphasizes that social media platforms amplify political polarization.
They worsen existing divisions and enable the exploitation of these differences for personal gain. Discussions about COVID-19 quickly became politically polarized when the pandemic began.
An analysis from the University of Cincinnati, examining social media interactions at the pandemic’s start, revealed that some influential voices were politically motivated.
Our information landscape is now largely shaped by social media, driven by a harmful mixture of fear, distrust, uncertainty, and political polarization.
Recognizing the underlying forces fueling misinformation is crucial to curbing its dissemination. Dr. Starbird’s primary advice for identifying misinformation is to pay attention to your emotions.
If a piece of content triggers a surge of political righteousness, pause before sharing it.
She emphasizes that this feeling is often a red flag for misinformation. The rest of this series offers further guidance on how to assess the credibility of the information you come across.
Timothy Caulfield holds the Canada Research Chair in Health Law and Policy at the University of Alberta.
He has authored over 300 academic papers and written the book “Your Day, Your Way: The Fact and Fiction Behind Your Daily Decisions.” Additionally, he hosts the documentary show “A User’s Guide to Cheating Death.”