Why It Is So Difficult to Change People’s Minds with Facts

Ray Williams
Feb 26, 2022

There is a commonly held belief that the way to change people’s opinions or views is to present them with facts and scientific evidence. Yet recent research shows this strategy often does not work.

Rebuttals can sometimes backfire, leading people to double down on their original position. A paper published in Discourse Processes suggests why: when people read information that undermines their identity, it triggers feelings of anger and dismay that make it difficult for them to take the new facts on board.

This partially explains why there is such a proliferation of fake news and false information on serious problems facing us.

The Influence of Fake News and Misinformation

It was widely reported on social media that Pope Francis had endorsed Donald Trump. The supposed endorsement originated on the satirical website WTOE 5 News, under the headline “Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement.” In the bogus statement, the pope is purported to comment on the FBI investigation of Democratic presidential nominee Hillary Clinton, writing, “The FBI, in refusing to recommend prosecution after admitting that the law had been broken on multiple occasions by Secretary Clinton, has exposed itself as corrupted by political forces that have become far too powerful.”

In the made-up article, the Pope then goes on to acknowledge that he is not endorsing Trump as the pope, but rather “as a concerned citizen of the world.”

WTOE 5 News owns up to being a fake news website on its “About” page. “WTOE 5 News is a fantasy news website,” it says. “Most articles on wtoe5news.com are satire or pure fantasy.”

And yet millions of people reshared this story on social media, and many believe it to this day. This is an example of the proliferation of fake news: stories presented so that they appear to be legitimate headlines but are entirely made up by groups or media personalities trying to influence the public, without providing any evidence.

In another example, in 2016, Edgar Maddison Welch, a 28-year-old man from Salisbury, North Carolina, arrived at the Comet Ping Pong restaurant in Washington, D.C., and fired three shots from an AR-15-style rifle that struck the restaurant’s walls, a desk, and a door. He believed, based on a fabricated conspiracy story, that Democratic presidential nominee Hillary Clinton was running a child sex-trafficking ring out of the restaurant.

Former President Trump seized upon the words “fake news” early in his term of office and used them as a weapon against the media, particularly when fact-checking disproved his false claims.

Image: The Fact Checker, Washington Post

Climate Change

Repeated exposure to fake news and false claims about climate change can erode people’s belief that climate change is human-caused. Research by Arizona State University Assistant Professor Caitlin Drummond found that exposure to fake news about climate change may weaken people’s belief in human-caused climate change and their perception of the scientific consensus on it.

In 2008, Maxwell Boykoff, who is now a University of Colorado professor, published a study in the journal Climatic Change that looked at news programs on ABC, CNN, NBC, and CBS from 1995 through 2004. Boykoff found that 70 percent of these networks’ global warming stories “perpetuated an informational bias” by including the unscientific views of climate skeptics.

In another study, published in 2004, Boykoff looked at coverage in major newspapers from 1988 through 2002. He found that 50 percent of the 636 randomly selected articles gave roughly equal attention to skeptics’ arguments about the supposedly natural causes of climate change and to the scientific consensus that humans are the major cause.

Despite clearly verified data endorsed by 97 percent of the world’s climate scientists, climate change deniers, many of whom hold positions of power in government, politics, and business, still refuse to acknowledge the truth.

COVID Deniers

An April 2020 article in the conservative National Post stated: “In the midst of a global pandemic, conspiracy theorists have found yet another way to spread dangerous disinformation and misinformation about COVID-19, sowing seeds of doubt about its severity and denying the very existence of the pandemic.”

For the past year and a half, conspiracy theorists — coronavirus deniers — have been using the hashtag #FilmYourHospital to encourage people to visit local hospitals and take pictures and videos to prove that the COVID-19 pandemic is an elaborate hoax. The belief in this conspiracy theory rests on the baseless assumption that if hospital parking lots and waiting rooms are empty, then the pandemic must not be real or is not as severe as reported by health authorities and the media. This bogus theory joins a parade of false, unproven, and misleading claims about the virus that are widespread on social media, including allegations that 5G wireless technology somehow plays a role in the spread of the virus, or that consuming silver particles or drinking lemon water prevents or cures it. These claims are false and not supported by any scientific evidence.

And then there are the COVID vaccine deniers and conspiracy theorists.

David Gorski, writing in the publication Science-Based Medicine, explains: “One aspect of the COVID-19 pandemic that I haven’t really written much about yet is the developing unholy alliance between COVID-19 deniers (who peddle in conspiracy theories and falsely claim that the disease isn’t that bad and/or that the lockdowns and social distancing are not — or no longer — necessary and should be lifted to alleviate the catastrophic damage to our economy that mitigation efforts have unavoidably caused) and the anti-vaccine movement (which predictably peddles misinformation and conspiracy theories about how COVID-19 is being weaponized as a plot to impose forced universal vaccination — or even that the disease was created by Bill Gates for that very purpose, along with H1N1 influenza and Ebola! — or how the influenza vaccine supposedly makes one more susceptible to coronavirus; spoiler alert: it doesn’t).”

Gorski goes on to say: “Although it might seem odd, those of us who’ve studied conspiracy theories for a long time almost immediately realized that an alliance between the anti-vaccine movement and COVID-19 deniers would be entirely natural, and expected it. Both groups of conspiracy theorists share an intense distrust of government, particularly the CDC and FDA. Both share an equally intense distrust of big pharma, while glorifying individual freedom above all else, with anti-vaxxers invoking ‘health freedom’ and ‘parental rights’ and COVID-19 deniers invoking absolute bodily autonomy and the ‘right’ to do whatever they want, including violating social distancing. Both groups’ beliefs are rooted in conspiracy theories, with the central conspiracy theory of the anti-vaccine movement being that the government, big pharma, and the medical profession have evidence that vaccines cause autism and harm but are covering it up. Both have a tendency towards germ theory denial, laboring blissfully under the delusion that, because they are so ‘healthy,’ because they live such exemplary lifestyles, exercise, and eat the ‘right’ foods, disease cannot harm them. It thus makes sense that COVID-19 deniers don’t think that COVID-19 is a threat to them or their loved ones, just as anti-vaxxers don’t think vaccine-preventable diseases are a threat to them and their loved ones either.”

The Impact of Misinformation Online

Cailin O’Connor and James Owen Weatherall, writing in Scientific American, argue that misinformation spreads quickly on social media platforms, often through friends. People tend to trust information posted online by people they know personally, without fact-checking it. The authors state: “Putting the facts out there does not help if no one bothers to look them up. It might seem like the problem here is laziness or gullibility — and thus that the solution is merely more education or better critical thinking skills. But that is not entirely right. Sometimes false beliefs persist and spread even in communities where everyone works very hard to learn the truth by gathering and sharing evidence. In these cases, the problem is not unthinking trust. It goes far deeper than that.”

Stephan Lewandowsky and colleagues argue in their article in Psychological Science in the Public Interest: “The widespread prevalence and persistence of misinformation in contemporary societies, such as the false belief that there is a link between childhood vaccinations and autism, is a matter of public concern. For example, the myths surrounding vaccinations, which prompted some parents to withhold immunization from their children, have led to a marked increase in vaccine-preventable disease, as well as unnecessary public expenditure on research and public-information campaigns aimed at rectifying the situation.”

The Societal Cost of Misinformation

A healthy democracy relies on an educated and well-informed populace. If a majority of people believe something that is factually incorrect, the misinformation can form the basis for political and societal decisions that run counter to a society’s best interest. Also, if individuals are misinformed, they may make decisions for themselves and their families that have negative consequences. For example, following the unsubstantiated claims that vaccinations cause autism, many parents decided not to immunize their children, which has resulted in a marked increase in vaccine-preventable disease, and hence preventable hospitalizations and deaths, as well as the unnecessary expenditure of large amounts of money on follow-up research and public-information campaigns aimed at rectifying the situation.

Reliance on misinformation differs from ignorance, which is the absence of relevant knowledge. Ignorance, too, can have obvious detrimental effects on decision making, but, perhaps surprisingly, those effects may be less severe than those arising from reliance on misinformation. For example, those who most vigorously reject the scientific evidence for climate change are also those who believe they are best informed about the subject.

Governments and Politicians

In 2003, President Bush and his administration proclaimed there was no doubt that Saddam Hussein had weapons of mass destruction (WMDs). The Bush administration also identified Iraq as the front line in the “War on Terror” and implied that it had intelligence linking Iraq to al-Qaida. Of course, we now know that no WMDs were ever found in Iraq and that the link to al-Qaida was unsubstantiated. Despite the truth, large segments of the U.S. public continued to believe the administration’s earlier claims, with some 20% to 30% of Americans believing that WMDs had actually been discovered in Iraq years after the invasion, and around half of the public endorsing links between Iraq and al-Qaida. These mistaken public beliefs persist to this day, even though the invasion was followed by published corrections, and even though the nonexistence of WMDs in Iraq and the absence of links between Iraq and al-Qaida were eventually widely reported and became the official bipartisan U.S. position.

Mainstream Media

The mainstream media can oversimplify, misrepresent, or overdramatize scientific results.

There is also a tradition in American mainstream media of journalists presenting a “balanced” story, even when balance is not appropriate. For example, if the national meteorological service issued a severe weather warning for tomorrow, no one would — or should — be interested in your neighbor’s opinion to the contrary. For good reason, a newspaper’s weather forecast relies on expert assessment and excludes lay opinions.

A major Australian TV channel recently featured a self-styled climate “expert” whose diverse qualifications included authorship of a book on cat palmistry. This asymmetric choice of “experts” leads to the perception of a debate about issues that were in fact resolved in the relevant scientific literature long ago.

Steven Kull and his colleagues, writing in Political Science Quarterly, have shown that the level of belief in misinformation among segments of the public varies dramatically according to preferred news outlets, running along a continuum from Fox News (whose viewers are the most misinformed on most issues) to National Public Radio (whose listeners are the least misinformed overall).

The growth of cable TV, talk radio, and the Internet has made it easier for people to find news sources that support their existing views, a phenomenon known as selective exposure. When people have more media options to choose from, they are more biased toward like-minded media sources. The growth of Internet “news” in particular has led to a fractionation of the information landscape into “echo chambers” — that is, (political) blogs that primarily link to other blogs of similar persuasion and not to those with opposing viewpoints. More than half of blog readers seek out blogs that support their views, whereas only 22% seek out blogs espousing opposing views, a pattern that has led to the creation of “cyber-ghettos.” These cyber-ghettos have been identified as one reason for the increasing polarization of political discourse.

So why doesn’t factual and scientific information convince people to change or abandon their conspiracy theories or false beliefs?

Past research has suggested that one reason changing minds is so challenging is that exposing someone to a new perspective on an issue inevitably prompts a desire to justify their current beliefs.

Research led by Gregory Trevors and his colleagues, published in Discourse Processes, was motivated by the idea that the backfire effect may not be about which side is winning that mental arms race at all. Instead, these researchers believe the problem occurs when new information threatens the recipient’s sense of identity. This triggers negative emotions, which are known to impair the understanding and digestion of written information.

According to one study, 43 percent of the US population wrongly believes that the flu vaccine can give you the flu. In fact, any adverse reaction to the vaccine, beyond a brief fever and aching muscles, is rare. It stands to reason that correcting this misconception would be a good move for public health, but the study, by Brendan Nyhan and Jason Reifler, published in Vaccine, found that debunking this false belief had a seriously counterproductive effect.

Why is it that as false beliefs went down, so did intentions to vaccinate? The explanation suggested by the researchers is that participants who had “high concerns about vaccine side effects brought other concerns to mind in an attempt to maintain their prior attitude when presented with corrective information.” A psychological principle that might explain this behavior is motivated reasoning: we are often open to persuasion when it comes to information that fits with our beliefs, while we are more critical of, or even outright reject, information that contradicts our world view.

This is not the first time that vaccine safety information has been found to backfire. The year before, the same team of researchers conducted a randomized controlled trial comparing messages from the CDC aiming to promote the measles, mumps, and rubella (MMR) vaccine. The researchers found that debunking myths about MMR and autism had a similarly counterproductive result — reducing some false beliefs but also, ironically, reducing intentions to vaccinate.

In their book The Enigma of Reason, cognitive scientists Hugo Mercier and Dan Sperber argue that we use reasoning to justify beliefs we already hold and to make arguments to convince others. They say that this facilitates social cooperation but may not be effective in establishing the “truth.” That is because, as psychological scientists have shown, we are susceptible to a myriad of cognitive distortions, such as confirmation bias, in which we seek out information that confirms what we already believe and screen out conflicting information.

Politicians Lie or Distort the Truth

Politicians frequently lie, distort the truth, ignore facts and create fake news or misinformation to support their beliefs.

One study by Dan M. Kahan and colleagues, published in Behavioural Public Policy, found that people with strong math skills were only good at solving a math problem if the solution conformed to their political beliefs. Participants identified as liberals were only good at solving the problem if the answer showed that gun control reduced crime; participants identified as conservatives were only good at solving it if the answer showed that gun control increased crime.
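To see what such a problem looks like, consider a worked illustration in the spirit of Kahan’s task (the numbers here are hypothetical, not taken from the study’s materials). Suppose participants are told that among cities that banned concealed handguns, crime went down in 223 and up in 75, while among cities with no ban, crime went down in 107 and up in 21. The intuitive shortcut is to compare the big raw numbers (223 versus 107), which suggests the ban worked. The correct approach is to compare proportions: crime fell in 223 of 298 ban cities, about 75 percent, versus 107 of 128 no-ban cities, about 84 percent, so crime was actually more likely to fall where there was no ban. Kahan’s finding was that highly numerate partisans managed this ratio reasoning far more often when the correct answer happened to flatter their politics.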

NYU psychology professor Jay Van Bavel explains the results of studies like these with his “identity-based” model of political belief: “Oftentimes, the actual consequences of particular party positions matter less to our daily lives than the social consequences of believing in these party positions. Our desire to hold identity-consistent beliefs often far outweighs our goals to hold accurate beliefs. This may be because being a part of a political party or social group fulfills fundamental needs, like the need for belonging, which supersede our need to search for the truth.”

A study by Serge Moscovici and Marisa Zavalloni published in the Journal of Personality and Social Psychology showed that in-group discussions can lead people to hold more extreme beliefs than they would on their own — a phenomenon known as group polarization. This often happens at political rallies and speeches.

People May Change Their Views But Not Their Behavior

We would assume that, presented with information correcting their mistaken beliefs, individuals would alter their behavior. Often, however, they don’t. One study by Thomas Wood and Ethan Porter found that correcting the false belief that vaccines cause autism didn’t actually encourage some parents to vaccinate their children. A study by Brendan Nyhan and colleagues published in Political Behavior found that correcting false claims made by Donald Trump caused people to change their beliefs, but they were still prepared to vote for him anyway. In other words, while you can get people to understand the facts, the facts don’t always matter.

Even after “the evidence for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.

Steven Sloman and Elke U. Weber, writing in the Behavioral Scientist and in Cognition: International Journal of Cognitive Science, shared some psychological insights on how to improve political discourse and reduce bias and polarization. They make the point that changing people’s views involves changing societal norms. They say that “providing accurate information does not promote clear, non-partisan thinking, because people use motivated cognition to deploy the fallacious reasoning that supports their beliefs.” They cite the work of Gordon Pennycook and David Rand, who argue that the ability to distinguish real from fake news is governed by one’s ability to reason: people fall for fake news when they fail to engage in sufficient critical thinking. Hence, to help people recognize and reject misinformation, we should teach them (or nudge them) to slow down and think critically about what they see on social media.

Sloman and Weber also suggest that news broadcasters should act on the basis of the balance of evidence, rather than the balance of opinion, when reporting scientific issues. All too often, news reports present a false equivalence in which fringe opinion is given the same weight as the scientific evidence.

People deploy motivated reasoning to use their cognitive processes in ways that support their beliefs. There is considerable evidence that people are more likely to arrive at conclusions they want to arrive at, though their ability to do so is constrained by their ability to construct seemingly reasonable justifications for those conclusions. This means that just providing people with more information and facts is unlikely to change their conclusion if it rests on a long-held or deeply held belief.

One possible solution is to structure conversations in ways that encourage people to come to the discussion with an open mind. In deliberative democratic models, for example, group discussions are often held in private, which gives people more freedom to express their opinions, but also to keep an open mind, since they do not fear immediate backlash from their own group members if they change their stance. It also helps to probe each other by asking questions rather than dismissing a view or argument. Perhaps the most effective way is to get people to empathize by having them role-play as an advocate of the opposing side, or as an “unbiased expert.”

Elizabeth Svoboda, writing in the University of California, Berkeley’s Greater Good, notes that our opinions are often based in emotion and group affiliation, not facts. While it’s easy to conclude that people’s views are barometers of their moral elevation, the more nuanced truth is that a broad range of factors helps explain deeply entrenched beliefs. Certainly, some partisans are focused on policy issues above all else. But for others, the brain’s tendency to stay the course may play a larger role. Psychological research suggests that once our minds are made up on important matters, changing them can be as difficult as stopping a train hurtling at full speed, even when there’s danger straight ahead.

In his excellent book, The Righteous Mind: Why Good People Are Divided by Politics and Religion, Jonathan Haidt presents some of these scientific findings. Importantly, he shows how emotions play a critical role in decision-making.

Does this make you think we’d be better off without emotions? Not so, says Haidt. Psychopaths reason but don’t feel emotions like the rest of us, and they’re severely deficient morally as a result. Babies, on the other hand, feel but don’t reason, and yet they have the beginnings of morality.

What should we take away from these findings? That emotions aren’t getting in our way — they’re the avenue through which we make decisions, both moral and otherwise. Neurologist Antonio Damasio found that patients with brain damage in areas that process emotions often struggle to make even the most routine decisions.

We need emotions to function properly. The question is: how can we change minds when emotions seem so irrational?

The Elephant and the Rider

Jonathan Haidt builds on Plato’s allegory with one of his own, comparing the relationship between rationality and emotion to a rider sitting atop an elephant. The rider represents rationality; the elephant, emotions. Although it seems like the rider is in charge and directing the elephant, the elephant has much more control over the situation than you might think.

Emotions, Haidt argues, are strong motivators of behavior. So, when the elephant moves, the rider often has no say in the matter. And, because we’re highly social creatures that need to be able to explain to others why we behaved as we did — like, why we didn’t do the dishes — the rider has evolved to expertly justify the actions of the elephant. In this sense, Haidt says, the rider often acts like the elephant’s press secretary!

So, is that it, then? Are we destined to be at the mercy of our emotions to guide our behavior? Is the role of rationality simply to explain our behavior to other people?

Thankfully, Haidt believes, the answer is “no”. Although our emotions do significantly sway our behavior, it’s not true that reason is a slave to the passions, as the philosopher David Hume once put it. Haidt believes that the influence can go both ways, even though the elephant wins out most of the time.


Ray Williams

Author / Executive Coach: Helping People Live Better Lives and Serve Others