Why Facts Alone Don’t Convince People to Change Their False Beliefs

Ray Williams
Feb 7, 2024

Psychological research suggests that once our minds are made up on important matters, changing them can be as challenging as stopping a train hurtling at full speed, even when there’s danger straight ahead.

Conspiracy theories and fake news wash over our society, mainly through social media. Examples include the claims that childhood vaccines cause autism, that Democratic presidential nominee Hillary Clinton was running a child sex-trafficking ring out of a pizza shop, that climate change is natural and temporary, that 5G wireless technology helps spread COVID-19, and that the Earth really is flat.

One might expect that, in the face of these false beliefs and fake news stories, presenting believers with the factual truth would cause them to reassess and change their views.

It doesn’t. Why?

The answer lies in two areas. The first is the sheer power of the false information flooding our world. The second is the threat that facts pose to people’s false worldviews and beliefs.

The Impact of False Information

Cailin O’Connor and James Owen Weatherall, writing in Scientific American, argue that misinformation spreads quickly on social media platforms, often through friends. The authors state: “Putting the facts out there does not help if no one bothers to look them up. It might seem like the problem here is laziness or gullibility — and thus, the solution is merely more education or better critical thinking skills. But that is not entirely right. Sometimes, false beliefs persist and spread even in communities where everyone works hard to learn the truth by gathering and sharing evidence. In these cases, the problem is not unthinking trust. It goes far deeper than that.”

Stephan Lewandowsky and colleagues argue in their article in Psychological Science in the Public Interest: “The widespread prevalence and persistence of misinformation in contemporary societies, such as the false belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example, the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children, have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on research and public-information campaigns aimed at rectifying the situation.”

The term “fake news” has been weaponized in political discourse, most notably by former President Trump, to discredit media outlets that fact-check false statements. This tactic underscores the broader issue of misinformation’s ability to undermine the public’s understanding of critical issues, such as climate change. Studies by Caitlin Drummond and Maxwell Boykoff illustrate how exposure to misinformation can skew public perception of the scientific consensus on climate change, highlighting the media’s role in perpetuating these misconceptions.

In another study, published in 2004, Boykoff examined coverage in major newspapers from 1988 through 2002. He found that 50% of the 636 randomly selected articles gave roughly equal attention to skeptics’ arguments about supposedly natural causes of climate change and to the scientific consensus that humans are the major cause of climate change.

Conspiracy theorists — coronavirus deniers — have been using the hashtag #FilmYourHospital to encourage people to visit local hospitals to take pictures and videos to prove that the COVID-19 pandemic is an elaborate hoax. The belief in this conspiracy theory rests on the baseless assumption that if hospital parking lots and waiting rooms are empty, then the pandemic must not be real or is not as severe as reported by health authorities and the media. This bogus theory joins a parade of false, unproven and misleading claims about the virus that are widespread on social media, including allegations that 5G wireless technology somehow plays a role in the spread of COVID-19, or that consuming silver particles or drinking lemon water prevents or cures the infection. None of these claims is supported by scientific evidence.

The Societal Cost of Misinformation

A healthy democracy relies on an educated and well-informed populace. If most people believe something factually incorrect, that misinformation can form the basis for political and societal decisions that run counter to a society’s best interests. Misinformed individuals may also make decisions for themselves and their families that have negative consequences. For example, following the unsubstantiated claims that vaccinations cause autism, many parents decided not to immunize their children, which has resulted in a marked increase in vaccine-preventable diseases, and with them preventable hospitalizations and deaths, as well as large, unnecessary expenditures on follow-up research and public-information campaigns aimed at rectifying the situation.

Reliance on misinformation differs from ignorance, which is the absence of relevant knowledge. Ignorance, too, can have obvious detrimental effects on decision-making, but, perhaps surprisingly, those effects may be less severe than those arising from reliance on misinformation. For example, those who most vigorously reject the scientific evidence for climate change are also those who believe they are best informed about the subject.

Cailin O’Connor and James Owen Weatherall explain: “In recent years, the ways in which the social transmission of knowledge can fail us have come into sharp focus. Misinformation shared on social media websites has fueled an epidemic of false belief, with widespread misconceptions concerning topics ranging from the COVID-19 pandemic to voter fraud, whether the Sandy Hook school shooting was staged and whether vaccines are safe . . . One consequence is the largest measles outbreak in a generation. ‘Misinformation’ may seem like a misnomer here. After all, many of today’s most damaging false beliefs are initially driven by propaganda and disinformation, which are deliberately deceptive and intended to cause harm. But part of what makes disinformation so effective in an age of social media is the fact that people exposed to it share it widely among friends and peers who trust them, with no intention of misleading anyone.”

Governments and Politicians

In 2003, the narrative set forth by President Bush and his administration was clear and unequivocal: Saddam Hussein possessed weapons of mass destruction (WMDs), and Iraq was a pivotal battleground in the “War on Terror,” ostensibly due to connections with al-Qaida. However, the unfolding events would starkly contradict these assertions, revealing that WMDs were nonexistent in Iraq and the alleged ties to al-Qaida were unfounded. Despite these revelations, a significant portion of the American populace remained anchored to the initial claims by the Bush administration. Astonishingly, years after the invasion, roughly 30% of Americans persisted in the belief that WMDs had been found, while approximately half continued to endorse the idea of a link between Iraq and al-Qaida. This discrepancy persisted even in the face of widespread reporting to the contrary and the eventual bipartisan consensus in the U.S. dismissing both the presence of WMDs and the al-Qaida connection.

Media’s Role in Perpetuating Misinformation

Mainstream media’s role in disseminating and perpetuating misinformation cannot be overstated. Media outlets often simplify, misinterpret, or sensationalize scientific findings. Additionally, the American media landscape is characterized by a tradition of striving for “balance” in reporting, which can sometimes lead to the inclusion of unfounded or debunked viewpoints in order to present a balanced story. This approach can mislead the public, as in instances where unqualified opinions are given the same weight as expert analysis on scientific matters.

The phenomenon of selective exposure, fueled by the expansion of cable TV, talk radio, and the internet, has further complicated the landscape. People now have the luxury of curating their news consumption to align with their existing beliefs, often leading to a reinforcement of misconceptions. Steven Kull and colleagues, in their work published in Political Science Quarterly, highlighted how misinformation varies dramatically among the public depending on their preferred news sources, from Fox News viewers being the most misinformed to National Public Radio listeners being the least.

The internet has exacerbated this issue by fragmenting the information ecosystem into echo chambers. Political blogs and news sites often link only to sources that echo their perspective, rarely engaging with opposing viewpoints. This has led to the formation of “cyber-ghettos,” spaces where like-minded individuals reinforce each other’s beliefs, contributing to the polarization of political discourse. Surveys show that more than half of blog readers prefer sources that align with their views, while only a minority seeks out differing opinions, indicating a significant challenge in bridging the gap in political and societal discourse.

Why Factual Information Can’t Convince People to Change Their False Beliefs

In their groundbreaking 1956 work, When Prophecy Fails, psychologist Leon Festinger and his colleagues delve into the psychology of a UFO cult whose predicted arrival of a mother ship did not come to pass. Rather than acknowledging their mistake, “members of the group sought frantically to convince the world of their beliefs,” engaging in “a series of desperate attempts to erase their mounting dissonance by making more predictions in the hope that one would eventually materialize.”

The exploration of cognitive dissonance extends in Carol Tavris and Elliot Aronson’s 2007 publication, Mistakes Were Made (But Not by Me). Here, the authors compile evidence from thousands of experiments showcasing how individuals tailor facts to align with their pre-existing beliefs to mitigate cognitive dissonance.

Further investigations into cognitive biases by Dartmouth College’s Brendan Nyhan and the University of Exeter’s Jason Reifler unveil a phenomenon known as the backfire effect, “in which corrections increase misperceptions among the group in question.” This effect emerges when individuals perceive their worldview or self-identity as being under threat. For instance, in experiments where participants were exposed to false articles asserting the existence of WMDs in Iraq, corrections stating the contrary led to polarized responses based on political affiliations. Liberals tended to accept the correction, while conservatives, in many cases, doubled down on the belief in WMDs, suggesting an even stronger conviction post-correction. This illustrates how “the belief that Iraq possessed WMD immediately before the U.S. invasion persisted long after the Bush administration itself concluded otherwise.”

Research has suggested that changing minds is so challenging because exposing someone to a new perspective on an issue inevitably prompts a desire to justify their beliefs.

The difficulty of changing minds is attributed to the defensive mechanisms triggered when new information challenges a person’s core beliefs. Research led by Gregory Trevors, published in Discourse Processes, proposes that the backfire effect is less about factual contestation and more about the threat to the individual’s identity, which elicits negative emotions and hampers the processing of corrective information.

A particularly telling study concerning public health involved the misconception around the flu vaccine, believed by 43 percent of Americans to cause the flu, a notion debunked by science. Yet, attempts to correct this misconception, as documented by Nyhan and Reifler in Vaccine, failed to increase vaccination intentions and paradoxically made individuals less likely to vaccinate. This was attributed to individuals with “high concerns about vaccine side effects” recalling other anxieties to justify their initial stance upon encountering corrective information, a dynamic rooted in motivated reasoning.

Why is it that as false beliefs went down, so did intentions to vaccinate? The explanation suggested by the researchers is that the participants who had “deep concerns about vaccine side effects brought other concerns to mind in an attempt to maintain their prior attitude when presented with corrective information.” Motivated reasoning is a psychological principle that might explain this behavior: we are often open to persuasion regarding information that fits our beliefs, while we are more critical of, or even outright reject, information that contradicts our worldview.

This is not the first time vaccine-safety information has been found to backfire. In an earlier randomized controlled trial, the same team of researchers compared messages from the CDC aiming to promote the measles, mumps and rubella (MMR) vaccine. They found that debunking myths about MMR and autism had a similarly counterproductive result, reducing some false beliefs but also, ironically, reducing intentions to vaccinate.

Politicians Lie or Distort the Truth

Politicians frequently lie, distort the truth, ignore facts and create fake news or misinformation to support their beliefs.

A revealing study in Behavioural Public Policy by Dan M. Kahan and his team showed that individuals with strong mathematical abilities tend to solve problems accurately only when the solutions align with their political beliefs. For instance, “Liberals” excelled in solving a math problem that implied gun control reduces crime, whereas “Conservatives” performed well when the solution suggested that gun control increases crime.

Jay Van Bavel, an NYU psychology professor, attributes this phenomenon to an “identity-based” model of political belief: “Often, the actual consequences of particular party positions matter less to our daily lives than the social consequences of believing in these party positions. Our desire to hold identity-consistent beliefs often far outweighs our goals to hold accurate beliefs.” This preference underscores how the social benefits of aligning with a political party or group, such as a sense of belonging, often eclipse the pursuit of truth.

A study by Serge Moscovici and Marisa Zavalloni published in the Journal of Personality and Social Psychology showed that in-group discussions can lead people to hold more extreme beliefs than they would on their own, a phenomenon known as group polarization. This often happens at political rallies and speeches.

People May Change Their Views But Not Their Behavior

One might expect that correcting misinformation would lead to a change in behavior, yet studies show this is often not the case. Thomas Wood found that correcting the misconception that vaccines cause autism did not significantly increase vaccination rates among parents. Similarly, Brendan Nyhan and his colleagues in Political Behavior observed that while people could update their beliefs about Donald Trump when presented with factual corrections, their voting intentions remained unchanged. This suggests that even when people acknowledge factual information, it does not necessarily impact their actions.

Researchers have noted that “even after the evidence for their beliefs has been refuted, people fail to make appropriate revisions in those beliefs,” indicating a robust resistance to changing long-held views.

Steven Sloman and Elke U. Weber, in both the Behavioral Scientist and Cognition: International Journal of Cognitive Science, discuss strategies for improving political discourse and reducing bias. They emphasize that merely providing accurate information does not guarantee rational, non-partisan thinking due to motivated cognition. Gordon Pennycook and David Rand’s work supports this, suggesting that distinguishing real from fake news requires critical thinking, which is often bypassed for quick judgments on social media.

Sloman and Weber advocate for news reporting based on evidence rather than balancing differing opinions, especially on scientific issues, to prevent the false equivalence of opinions with facts.

The challenge lies in overcoming motivated reasoning, where people’s cognitive processes selectively support their pre-existing beliefs. This tendency indicates that more information alone cannot change deeply entrenched beliefs. Alternative approaches, such as deliberative democratic models that encourage open-minded discussions and empathy through role-playing, might offer more effective pathways for changing minds.

People deploy motivated reasoning to use their cognitive processes in ways that support their beliefs. There is considerable evidence that people are more likely to arrive at the conclusions they want to arrive at, though their ability to do so is constrained by their ability to construct seemingly reasonable justifications for those conclusions. This means that simply providing people with more information and facts is unlikely to change a long-held or deeply held belief. One possible solution is to structure conversations in ways that encourage people to come to the discussion with an open mind. In deliberative democratic models, for example, group discussions are often held in private, giving people more freedom to express their opinions and keep an open mind, since they do not fear immediate backlash from their group members if they change their stance. Probing each other with questions, rather than dismissing a view or argument outright, also helps. Perhaps the most effective approach is to prompt people to empathize by having them role-play as an advocate of the opposing side or as an “unbiased expert.”

Elizabeth Svoboda, writing in the University of California, Berkeley’s Greater Good, notes that our opinions are often based on emotion and group affiliation, not facts. While it’s easy to conclude that people’s views are barometers of their moral elevation, the more nuanced truth is that many factors help explain deeply entrenched beliefs. Indeed, some partisans are focused on policy issues above all else. But for others, the brain’s tendency to stay the course may play a more significant role. Psychological research suggests that once our minds are made up on essential matters, changing them can be as challenging as stopping a train hurtling at full speed, even when there’s danger straight ahead.

Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion delves into the significant role emotions play in our decision-making processes. Haidt challenges the notion that we’d be better off without emotions by pointing out the moral deficiencies of psychopaths, who reason without feeling, and highlighting the innate morality observed in babies, who feel but don’t yet reason. This leads to the conclusion that emotions are not obstacles but essential pathways for moral and other decisions. Neurologist Antonio Damasio’s research supports this, showing that patients with damage to emotional-processing areas struggle with basic decision-making tasks.

We need emotions to function properly. The question is, how can we change minds when emotions seem so irrational?

The Elephant and the Rider

The psychology of changing your mind has a lot to do with how your brain is structured. The amygdala, an almond-shaped cluster of cells near the base of the brain, governs our emotions, emotional behavior, and motivation. The prefrontal cortex (PFC), by contrast, handles relatively advanced belief processing, including doxastic inhibition, which allows us to deliberate as if something were not true, and the intuitive feeling of being “right” or “wrong” about certain beliefs. Within the PFC, the dorsolateral prefrontal cortex governs executive function, reason, and logic.

According to behavioral researchers Daniel Kahneman and Amos Tversky, the more primitive parts of the brain, like the amygdala, cannot process complicated information. While the PFC can make rational decisions considering long-term consequences, making these decisions can be exhausting for that part of the brain. This is why people find it easier to make quick decisions rather than think long and hard about issues.

Haidt further explores this interplay of emotion and reason by adapting Plato’s allegory, likening rationality to a rider and emotions to an elephant. So, when the elephant moves, the rider often has no say. And, because we’re highly social creatures that need to be able to explain to others why we behaved as we did — like, why we didn’t do the dishes — the rider has evolved to justify the elephant’s actions expertly. In this sense, Haidt says, the rider often acts like the elephant’s press secretary!

So, is that it, then? Are we destined to be at the mercy of our emotions to guide our behavior? Is the role of rationality to explain our behavior to other people?

Thankfully, Haidt believes, the answer is “no.” Although our emotions significantly sway our behavior, it’s not true that reason is a slave to the passions, as the philosopher David Hume once put it. Haidt believes that the influence can go both ways, even though the elephant wins most of the time.

What Can We Do?

Now that we know why people fall into faulty thinking patterns, how do we change other people’s minds?

Addressing how to change minds, Tali Sharot in The Influential Mind: What the Brain Reveals About Our Power to Change Others outlines seven core elements, including the importance of finding common ground, using positive framing, and providing immediate rewards over threats. Sharot emphasizes the power of curiosity and the need for calmness in receptiveness to change, cautioning against the influence of group conformity.

  1. The first core element, prior beliefs, involves seeking common ground. Find out which of the other person’s beliefs you agree with, instead of bombarding him or her with facts and figures that support your side of a debatable topic.
  2. The second core element, emotions, involves framing our views positively rather than negatively. Positive framing is easier to process and broadens one’s thoughts and actions.
  3. Third, it is more effective to offer an immediate positive reward than to issue a threat.
  4. Fourth, to influence others, they must be given a sense of control or “agency”; otherwise, they will feel angry and frustrated and resist attempts at persuasion. One can likewise gain others’ trust by giving them control over the choices presented to them.
  5. The fifth core element is “curiosity,” or a desire to know. Before giving information to others, we must point out the gap in their knowledge and show them how they can benefit from filling it. Responsibly disseminating information about health and well-being, community safety, and similar topics are examples of community activities that can help bridge knowledge gaps.
  6. As for the sixth element, one must assess the other person’s mental and emotional state. People who are calm and relaxed are more receptive to being influenced than people in stressful or threatening situations.
  7. Lastly, Sharot cautions against group conformity, or the “knowledge and acts of other people,” which comprises the seventh element. She notes that in some situations we must be wary of being influenced by other people, especially on social media or in political campaigns, where the truthfulness of the information cannot be verified.

New research emphasizes the efficacy of personal stories over statistics in bridging divides. Kurt Gray’s work, published in the Proceedings of the National Academy of Sciences, suggests that personal experiences, particularly those involving harm or vulnerability, garner more respect than factual arguments. This is because experiences, especially personal or closely related ones, are perceived as more genuine and difficult to dismiss compared to facts, which are easily doubted or discounted. Rebuttals can sometimes backfire, leading people to double down on their original position.

“In moral disagreements, experiences seem truer than facts,” said Kurt Gray, a psychologist and director of the Center for the Science of Moral Understanding at the University of North Carolina.

For the new research, Gray and his colleagues focused on how facts versus experiences affected people’s perceptions of their opponent’s rationality and respect for that opponent. Over 15 separate experiments, they found that, although people think they respect opponents who present facts, they have more respect for opponents who share personal stories.

Ultimately, people can always come up with a way to doubt or discount facts, Gray said, but personal experiences are harder to argue away. “It’s just so hard to doubt when someone tells you, ‘Look, this terrible thing happened to me,’” he said.

In The Enigma of Reason, Hugo Mercier and Dan Sperber argue that reasoning primarily serves social cooperation rather than truth-seeking, leading to cognitive biases like confirmation bias. This social foundation of reasoning helps explain why political affiliations can so profoundly distort our capacity for objective thought.

Strategies for overcoming these biases include fostering a culture of scientific curiosity, emphasizing accuracy, and engaging in self-affirmation exercises to be more open to conflicting information. Peter T. Coleman’s work at Columbia University suggests that positive, nuanced discussions can lead to more satisfying and complex political dialogues.

Despite the challenges, psychological research offers hope for overcoming our predisposition toward cognitive dissonance and science denial. Understanding the pleasure derived from affirming our beliefs and the discomfort associated with changing them highlights the deeply ingrained nature of our convictions. Yet, by acknowledging our social reliance on group identity and exploring methods to mitigate the fear of exclusion, we can begin to address the root causes of our resistance to change. David Ropeik’s insights into risk perception underscore the importance of carefully navigating these social and psychological terrains to foster a more open and informed discourse.

The influence of political affiliation on our reasoning capabilities is profound, as elucidated in various studies. The essence of these findings is that our allegiance to specific political ideologies can significantly skew our ability to process information objectively. For instance, one study highlighted that individuals with strong mathematical skills excelled at solving problems only when the outcomes aligned with their political beliefs: liberals solved a math problem correctly only when the result suggested that gun control reduces crime, and conservatives only when it suggested that gun control increases crime.

This bias extends beyond numerical problems to perception itself. In an experiment where participants watched a video of protestors, their interpretation of the protest’s nature and intensity was shaped by whether they believed the protest aligned with or opposed their political views, even though all participants viewed the same footage.

The echo chambers created by social media, selective news consumption, and surrounding ourselves with like-minded individuals can intensify our views, pushing us towards more extreme positions. This phenomenon, known as group polarization, is compounded when exposure to opposing viewpoints does not open minds but entrenches beliefs further. Studies have shown that even structured exposure to the opposite political spectrum, such as following diverse Twitter accounts, often leads to increased polarization rather than understanding.

The defensive reaction to opposing opinions and facts — sometimes resulting in the ‘backfire effect’ where individuals double down on their beliefs despite contradictory evidence — underscores the challenge of changing minds. However, newer research suggests that effective fact-checking can mitigate misinformation, though changing deeply ingrained beliefs and behaviors remains complex.

Ultimately, while our capacity for reason is formidable, it is susceptible to biases, especially when aligned with our ‘tribal’ affiliations. Embracing complexity, fostering curiosity, and validating different perspectives may offer pathways to more open-minded discourse. As described by Festinger, the discomfort of cognitive dissonance and the neurological response to changing beliefs highlight the inherent challenges in altering deeply held convictions.

In Denying to the Grave: Why We Ignore the Facts That Will Save Us, Sara and Jack Gorman discuss the mental energy required to grapple with scientific information, suggesting that complexity and fear of exclusion from one’s social group can hinder the acceptance of new ideas. The neurological pleasure derived from maintaining one’s beliefs versus the discomfort associated with changing them illustrates the hardwired nature of our convictions.

Blocking out the information we disagree with — creating social media echo chambers, reading partisan news, or only surrounding ourselves with friends who agree with us — can also lead to our opinions becoming more extreme. Several psychological studies have shown that group discussions can lead people to hold more extreme beliefs than they would on their own — a phenomenon known as group polarization. Our tendency to surround ourselves with only like-minded opinions may be one of the reasons why Republicans and Democrats are rapidly becoming more polarized.

But, even if people do expose themselves to beliefs they disagree with, that won’t necessarily make things better. More exposure to the other side can sometimes backfire and cause people to become more entrenched in their beliefs. One study paid Twitter users to follow accounts that would retweet tweets from their political opponents — liberals would see conservative tweets, and conservatives would see liberal tweets. It didn’t cause people to open their minds to the other side. Instead, liberals became more liberal, and conservatives became more conservative.

We often react defensively to opinions we disagree with, viewing them as threats to our identity. We also do the same with facts: When confronted with facts we disagree with, we often do not change our perceptions. Past research suggested that fact-checking could lead to a “backfire effect,” causing people to double down and become even more stubborn in their beliefs. Facebook discovered, for instance, that warning users that an article was false caused people to share that article even more. While the notion of a “backfire effect” is alarming, more recent research undercuts the idea, suggesting that fact-checking, if done properly, can often successfully correct misperceptions.

However, research suggests that correcting misperceptions isn’t enough to change behavior. For instance, one study found that successfully correcting the false belief that vaccines cause autism didn’t encourage some parents to vaccinate their children. Other studies found that correcting false beliefs about Trump caused people to change their beliefs, but this did not change how much they supported Trump. In other words, while you can get people to understand the facts, the facts don’t always matter.

So What Can We Learn From All of This?

First, facts and scientific evidence are not a powerful or easy way to encourage people to abandon false or inaccurate beliefs and perspectives. Second, people embrace fake news, misinformation and disinformation because of their beliefs, even when those beliefs can be proven wrong, in many cases as a demonstration of tribal loyalty. Third, engaging in dialogue in a non-threatening manner, using personal stories so that defense mechanisms are not triggered, has a greater likelihood of success.


Ray Williams

Author / Executive Coach. Helping People Live Better Lives and Serve Others