Here’s a book worth reading: Misbelief: What Makes Rational People Believe Irrational Things (HarperCollins, 2023) by Dan Ariely, a professor of psychology at Duke. It’s a straightforward analysis of how we as thinking humans are essentially hard-wired to believe a lot of things that aren’t true, and it’s the best book on this subject that I have come across. That problem seems to have gotten worse lately, and it’s not just social media that is to blame. Why are we so often suckers for misinformation? Read on. I’ll highlight a few ideas from the book, then think a bit about Mormon misbelief.

The author describes the term misbelief as “a distorted lens through which people begin to view the world, reason about the world, and then describe the world to others” (p. 15). The more you read in this book, the less confidence you have in our human ability to discern facts and reliably digest them. Or perhaps your confidence in human rationality is already pretty low, in which case the book will explain why that is. Here are a few sentences from the publisher’s page:

Misinformation, it turns out, appeals to something innate in all of us—on the right and the left—and it is only by understanding this psychology that we can blunt its effects. Grounded in years of study as well as Ariely’s own experience as a target of disinformation, Misbelief is an eye-opening and comprehensive analysis of the psychological drivers that cause otherwise rational people to adopt deeply irrational beliefs. 

First, this is not a recent development, and he notes a few examples. When the Roman emperor Nero died in 68 AD, there were rampant rumors that his death was faked and he would return. In the 1960s, there were persistent rumors that Paul McCartney had died and been replaced by a look-alike. There were (and still are?) claims that the Apollo moon landings were faked. So it’s not the emergence of social media or Trump’s popularization of the term and concept of “fake news” that created the problem. It’s rooted in the human psyche and how we process or misprocess information. The author looks at the emotional, cognitive, personality, and social elements that create and sustain the problem.

Emotional Elements and Stress. Stress is a big factor here. There is predictable stress (big project deadline, gotta file those taxes, a looming root canal) that we can generally handle. Then there is unpredictable stress, which can push us over the edge: unexpected job loss, unexpected death of a loved one, a cancer diagnosis. The Covid pandemic is another example, not a personal challenge but a broad global event that created unpredictable stress for just about everyone on the planet, at the same time and for an extended period. No wonder it generated so many conspiracy theories. Stress is cumulative, and it affects how you think. It takes up some of your cognitive bandwidth, leaving you fewer mental resources to accurately judge and analyze information. It’s unfair, really: at the moment you get an unwelcome diagnosis from the doctor, the resulting stress limits your ability to think clearly when you go to the Internet to look up facts and various treatments (some reliable, some not) for your serious condition. So big stress makes people more vulnerable to adopting misbeliefs and affirming conspiracy theories.

One way to deal with such a crisis is to identify a villain, because any explanation, any speculated cause, regardless of validity, is better than having no idea what is threatening your health or your life. Can’t get the mortgage you want because interest rates went up? It’s easier to blame the Fed than to read and digest a twenty-page essay on how social and economic forces interact in complex ways to determine market interest rates. Picking a villain (Dr. Fauci or the Chinese) gives us a sense of control when faced with unpredictable stress caused by a complex or confusing event (the Covid pandemic).

Cognitive Elements. A big factor here is confirmation bias, where we look for info that supports our desired belief and tend to ignore any conflicting information. I’m sure you’ve chewed on this concept a time or two already. An interesting discussion concerned how we use search engines. If you read an article that claims the mob engineered the assassination of Kennedy, then search “mafia killed Kennedy,” you’ll pull up plenty of links confirming that idea. The author suggests we try searching the opposite side of the thesis as well: “the mafia did not kill Kennedy” will pull up an entirely different set of links and information. Social media algorithms are a big culprit here. You don’t enter search terms; Facebook (or whatever) does the selecting automatically, feeding you posts and stories that confirm the story or two you recently read. What if those algorithms were tweaked to feed you some stories supporting and some stories contesting the views you signal to the app through your clicks and reading? I think that would make the world a better place.

Here’s another problem: motivated reasoning. It seems that rather than learning facts, analyzing them, and arriving at rational conclusions, we humans tend to quickly leap to a desired conclusion, then seek out facts and arguments to support it. That’s why so many of the conclusions and opinions people cling to seem so resistant to contrary facts and persuasion. Here’s another helpful concept: solution aversion. Say there’s a big problem and a proposed solution, even a credible and effective one. If we don’t like the solution, possibly because it conflicts with our other beliefs or our ideology, we are more likely to pivot back and simply deny the problem rather than adopt the solution. And then there is the Dunning-Kruger effect: people who acquire a little knowledge on a subject often become quite overconfident in their views and more willing to broadcast them. That generates a lot of noisy misbeliefs on social media and the Internet. The bottom line of the author’s sixty pages on the cognitive pitfalls of the human psyche: we’re not as smart or as rational as we think we are. Not by a long shot.

Personality and Social Factors. I haven’t read the second half of the book yet, but I won’t let that stop me. On personality, the author summarizes: “Why are some people more susceptible [to misbelief] than others? Individual differences and certain personality traits such as patternicity, the tendency to trust one’s intuitions, certain decision-making biases, and narcissism add to the picture” (p. 221). Patternicity is our habit of seeing patterns where none exist. The decision-making biases he refers to are the conjunction fallacy, illusory correlations, and hindsight bias. You’ll like this quote about narcissism:

Stressed-out narcissists have a particularly strong need to explain what is going on in general, and specifically the lack of attention they are receiving, and with that need comes a higher tendency to move down the funnel of misbelief. (p. 221)

For social factors, I’ll just tell you the chapter title: “Ostracism, Belonging, and the Social Attraction of Misbelief.” That kind of says it all, along with a section title in the chapter: “How Social Groups Solidify Misbeliefs.” The Internet and social media certainly make it easier for those holding strong opinions on this or that topic, including of course various conspiracy theories, to get together online and reinforce each other’s misbeliefs.

Mormon Misbeliefs

You could take the concepts noted above and apply them to the misbeliefs of any denomination, religion, or ideology. It’s a general problem, not just a Mormon problem. But I’m going to consider Mormon misbeliefs, because that’s what we do here. And wow, there’s a lot of low-hanging fruit, but I’m going to consider just a few examples and rely on readers to chime in with other points. And let’s be quite clear about one thing right up front: misbelief is not a good thing. Believing false ideas may sometimes be harmless, but it can often be dangerous to oneself and to others. And I think most people would agree with the statement, “I would like to identify and correct my own misbeliefs.” So here are some links between the earlier discussion and Mormon culture and practice. I’ll just take one from each of the four categories.

Mormon Villains. Mormons love villains, possibly because there is so much bad stuff that comes our way or that is self-inflicted. Satan and “the world” come to mind as frequently cited villains. Mormons are trained to think this way. And sometimes the membership becomes the villain. When senior leadership plans and puts in place a new program or practice and it turns out to work poorly or fail entirely, they’ll blame the standard villains, Satan and “the world,” but they’ll also blame the members for not being diligent or committed or faithful or whatever. And Mormons love the stories where villains meet their just rewards, such as the Korihor story. Once you recognize this concept and habit, you’ll start seeing it everywhere in LDS discourse.

Confirmation Bias. I’m thinking here of just one aspect: the regular directives by senior leadership to get information about LDS doctrine and history only from LDS sources, which are deemed to be reliable and credible. You don’t have to go to the trouble of ignoring conflicting information if you never encounter conflicting information. So in the Church, confirmation bias isn’t simply a feature of human reasoning; it has become a feature of the institution. Let me note a contrary view. I recall Richard Bushman recommending that when members encounter a troubling LDS issue, they should read further but also read across the whole spectrum: read an LDS publication, read an apologist, read a non-LDS academic (or a non-apologist LDS academic), read a critic. Get information from all sides. That’s good advice.

Narcissism. Well, leaders in every social institution tend to be a little full of themselves. Humble, self-effacing, shy, unconfident people rarely rise through a hierarchy and become leaders. But reading through the DSM-5 definition of narcissism on page 219 does set off some warning bells: “pervasive pattern of grandiosity.” “Needs and requires excessive admiration.” “Has a sense of entitlement.” The point, of course, is not to point fingers at this or that leader. The point is that narcissists are particularly liable to adopt convenient misbeliefs. The top-down nature of LDS church governance means that misbeliefs embraced by LDS leaders, particularly LDS Presidents, quickly become incorporated into LDS publications, curriculum, and discourse by other LDS leaders. They quickly become institutional misbeliefs.

Ostracism and Belonging. Flip sides of the same coin. The LDS community of believers and most ward congregations have a strong sense of community and belonging. Step outside a few Mormon boundaries and a person will generally find themselves ostracized, simply by the workings of Mormon interpersonal relations rather than some formal order from a local leader. Remember the link here to misbelief: a strong sense of community provides a social incentive to adopt and retain misbelief. Correct or refute a Mormon misbelief in a Sunday class and see what happens. You won’t get many thank-yous (although you may get one or two who approach you privately and agree with you or at least appreciate the comment).

Conclusion. Forewarned is forearmed, they say, so recognizing these elements that play into our misbeliefs (in all areas of life) can be very beneficial. Of course, identifying and rejecting a misbelief doesn’t happen in a vacuum. It can be very jarring to acknowledge and reject a misbelief one earlier embraced. And a previously trusted source from whom some misinformation came may now be trusted less, or dismissed as completely unreliable.

  • What do you think about the author’s points that I summarized in the first half of the post?
  • Any Mormon misbeliefs (or Mormon practices that encourage misbeliefs) that you’d like to add to my list?
  • Here’s a tough one: any success stories in helping a friend or family member or class member recognize a Mormon misbelief and move beyond it in a positive way?