Here’s a book worth reading: Misbelief: What Makes Rational People Believe Irrational Things (HarperCollins, 2023) by Dan Ariely, a professor of psychology at Duke. It’s a straightforward analysis of how we as thinking humans are essentially hard-wired to believe a lot of things that aren’t true, and it’s the best book on this subject that I have come across. That problem seems to have gotten worse lately, and it’s not just social media that is to blame. Why are we so often suckers for misinformation? Read on. I’ll highlight a few ideas from the book, then think a bit about Mormon misbelief.
The author describes the term misbelief as “a distorted lens through which people begin to view the world, reason about the world, and then describe the world to others” (p. 15). The more you read in this book, the less confidence you have in our human ability to discern facts and reliably digest them. Or perhaps your confidence in human rationality is already pretty low, in which case the book will explain to you why this is the case. Here are a few sentences from the publisher’s page:
Misinformation, it turns out, appeals to something innate in all of us—on the right and the left—and it is only by understanding this psychology that we can blunt its effects. Grounded in years of study as well as Ariely’s own experience as a target of disinformation, Misbelief is an eye-opening and comprehensive analysis of the psychological drivers that cause otherwise rational people to adopt deeply irrational beliefs.
First, this is not a recent development, and he notes a few examples. When the Roman emperor Nero died in 68 AD, there were rampant rumors that his death was faked and he would return. In the 1960s, there were persistent rumors that Paul McCartney had died and been replaced by a look-alike. There were (and still are?) claims that the Apollo moon landings were faked. So it’s not the emergence of social media or Trump’s popularization of the term and concept of “fake news” that created the problem. It’s rooted in the human psyche and how we process or misprocess information. The author looks at the emotional, cognitive, personality, and social elements that create and sustain the problem.
Emotional Elements and Stress. Stress is a big factor here. There is predictable stress (a big project deadline, gotta file those taxes, a looming root canal) that we can generally handle. Then there is unpredictable stress, which can push us over the edge: unexpected job loss, the unexpected death of a loved one, a cancer diagnosis. The Covid pandemic is another example, though it was not a personal challenge so much as a broad global event that created unpredictable stress for just about everyone on the planet at the same time and for an extended period. No wonder it generated so many conspiracy theories. Stress is cumulative, and it affects how you think. It takes up some of your cognitive bandwidth, leaving you fewer mental resources to accurately judge and analyze information. It’s unfair, really: at the moment you get an unwelcome diagnosis from the doctor, the imposed stress limits your ability to think clearly when you go to the Internet and look up facts and various treatments (some reliable, some not) for your serious condition. So big stress makes people more vulnerable to adopting misbeliefs and affirming conspiracy theories.
One way to deal with such a crisis is to identify a villain, because any explanation, any speculated cause, regardless of validity, is better than having no idea what is threatening your health or your life. Can’t get the mortgage you want because interest rates went up? It’s easier to blame the Fed than to read and digest a twenty-page essay on how social and economic forces interact in complex ways to determine market interest rates. Picking a villain (Dr. Fauci or the Chinese) gives us a sense of control when faced with unpredictable stress caused by a complex or confusing event (the Covid pandemic).
Cognitive Elements. A big factor here is confirmation bias, where we look for info that supports our desired belief and tend to ignore any conflicting information. I’m sure you’ve chewed on this concept a time or two already. An interesting discussion was how we use search engines. If you read an article that claims the mob engineered the assassination of Kennedy, then search “mafia killed Kennedy,” you’ll pull up plenty of links confirming that idea. The author suggested we try searching the opposite side of the thesis as well: “the mafia did not kill Kennedy” will pull up an entirely different set of links and information. Social media algorithms are a big culprit here. You don’t enter search terms; Facebook (or whatever) does it automatically, feeding you posts and stories that confirm the story or two you recently read. What if those algorithms were tweaked to feed you some stories supporting and some stories contesting the data you feed the app through your clicks and reading? I think that would make the world a better place.
Here’s another problem: motivated reasoning. It seems to be the case that rather than learning facts, analyzing them, and arriving at rational conclusions, we humans tend to quickly leap to a desired conclusion, then seek facts and arguments to support that conclusion. That’s why so many conclusions and opinions that people cling to seem so resistant to contrary facts and persuasion. Here’s another helpful concept: solution aversion. Say there’s a big problem and a proposed solution, let’s say a credible and effective solution. If we don’t like the solution, possibly because it conflicts with other beliefs or our ideology, we are more likely to pivot back and simply deny the problem rather than adopt the solution. And then there is the Dunning-Kruger effect, where people often, when getting a little knowledge on a subject, become quite overconfident in their views and more willing to broadcast those views. That generates a lot of noisy misbeliefs on social media and the Internet. The bottom line on the author’s sixty pages of discussion on the cognitive pitfalls of the human psyche: we’re not as smart or as rational as we think we are. Not by a long shot.
Personality and Social Factors. I haven’t read the second half of the book yet, but I won’t let that stop me. On personality, the author summarizes: “Why are some people more susceptible [to misbelief] than others? Individual differences and certain personality traits such as patternicity, the tendency to trust one’s intuitions, certain decision-making biases, and narcissism add to the picture” (p. 221). Patternicity is our habit of seeing patterns where none exist. The decision-making biases he refers to are the conjunction fallacy, illusory correlations, and hindsight bias. You’ll like this quote about narcissism:
Stressed-out narcissists have a particularly strong need to explain what is going on in general, and specifically the lack of attention they are receiving, and with that need comes a higher tendency to move down the funnel of misbelief. (p. 221)
For social factors, I’ll just tell you the chapter title: “Ostracism, Belonging, and the Social Attraction of Misbelief.” That kind of says it all, along with a section title in the chapter: “How Social Groups Solidify Misbeliefs.” The Internet and social media certainly make it easier for those holding strong opinions on this or that topic, including of course various conspiracy theories, to get together online and reinforce each other’s misbeliefs.
Mormon Misbeliefs
You could take the concepts noted above and apply them to the misbeliefs of any denomination, religion, or ideology. It’s a general problem, not just a Mormon problem. But I’m going to consider Mormon misbeliefs, because that’s what we do here. And wow, there’s a lot of low-hanging fruit, but I’m going to consider just a few and rely on readers to chime in with other points. And let’s be quite clear about one thing right up front: misbelief is not a good thing. Believing false ideas may sometimes be harmless, but can often be dangerous to oneself and to others. And I think most people would agree with the statement, “I would like to identify and correct my own misbeliefs.” So here are some links between the earlier discussion and Mormon culture and practice. I’ll just take one from each of the four categories.
Mormon Villains. Mormons love villains, possibly because there is so much bad stuff that comes our way or that is self-inflicted. Satan and “the world” come to mind as frequently cited villains. Mormons are trained to think this way. And sometimes the membership becomes the villain. When senior leadership plans and puts in place a new program or practice, then it turns out to not work well or be a total failure, they’ll blame the standard villains Satan and “the world,” but they’ll also blame the members for not being diligent or committed or faithful or whatever. And Mormons love the stories where villains meet their just rewards, such as the Korihor story. Once you recognize this concept and habit, you’ll start seeing it everywhere in LDS discourse.
Confirmation Bias. And I’m thinking here of just one aspect, the regular directives by senior leadership to get information about LDS doctrine and history only from LDS sources, which are deemed to be reliable and credible. You don’t have to go to the trouble of ignoring conflicting information if you never encounter conflicting information. So in the Church, confirmation bias isn’t simply a feature of human reasoning, it has become a feature of the institution. Let me note a contrary view. I recall Richard Bushman recommending that when members encounter a troubling LDS issue, they should read further but also read across the whole spectrum: read an LDS publication, read an apologist, read a non-LDS academic (or a non-apologist LDS academic), read a critic. Get information from all sides. That’s good advice.
Narcissism. Well, leaders in every social institution tend to be a little full of themselves. You rarely get humble, self-effacing, shy, unconfident people who rise through any hierarchy and become leaders. But reading through the DSM-5 definition of narcissism on page 219 does set off some warning bells: “pervasive pattern of grandiosity.” “Needs and requires excessive admiration.” “Has a sense of entitlement.” The point, of course, is not to point fingers at this or that leader. The point is that narcissists are particularly liable to adopt convenient misbeliefs. The top-down nature of LDS church governance means that misbeliefs embraced by LDS leaders, particularly LDS Presidents, quickly become incorporated into LDS publications, curriculum, and discourse by other LDS leaders. They quickly become institutional misbeliefs.
Ostracism and Belonging. Flip sides of the same coin. The LDS community of believers and most ward congregations have a strong sense of community and belonging. Step outside a few Mormon boundaries and a person will generally find themselves ostracized, simply by the workings of Mormon interpersonal relations rather than some formal order from a local leader. Remember the link here to misbelief: a strong sense of community provides a social incentive to adopt and retain misbelief. Correct or refute a Mormon misbelief in a Sunday class and see what happens. You won’t get many thank-yous (although you may get one or two who approach you privately and agree with you or at least appreciate the comment).
Conclusion. Forewarned is forearmed, they say, so recognizing these elements that play into our misbeliefs (in all areas of life) can be very beneficial. Of course, identifying and rejecting a misbelief doesn’t happen in a vacuum. It can be very jarring to acknowledge and reject a misbelief one has earlier embraced. A previously trusted source from whom some misinformation came may now be trusted less or dismissed as completely unreliable.
- What do you think about the author’s points that I summarized in the first half of the post?
- Any Mormon misbeliefs (or Mormon practices that encourage misbeliefs) that you’d like to add to my list?
- Here’s a tough one: any success stories in helping a friend or family member or class member recognize a Mormon misbelief and move beyond it in a positive way?

First, am I the only one who, while reading the piece above, thinks of a person in his/her life who embodies the various characteristics described above? It’s almost like the author (Dave B) knows the person I am thinking of.
Second, am I the only one here who is horrified when thinking how many of these characteristics applied to me in my Mormon past? All I can say is I’m sorry.
I may still believe in some unbelievable myths, who knows. But I guarantee you the myths I might believe in have nothing to do with religion generally or Mormonism specifically. That’s progress, no?
I’m not sure how this type of analysis applies in a religious context. As a practicing Mormon I believe in a God who isn’t seen and can’t be proved, miracles that are irrational, an afterlife that is unprovable, etc. That is the nature of faith. “Rational” thought that relies on provable facts just doesn’t apply to faith, arts, music, or love. That’s always been my problem with the “scientific method”: it thinks it’s Queen.
I’ve done a deep dive into cognitive biases and how they affect us and I find the topic fascinating. To Josh H., I would say that I can almost guarantee that you still believe in some unbelievable myths that you’re not aware of. We all do, whether we were raised in Mormonism or not, and whether we’ve left Mormonism or not.
Based on what I’ve read on the topic, I’m still susceptible to all the same biases that I was susceptible to when I was a TBM; now I’m just open to being influenced by other sources besides the LDS church. I still have confirmation bias, I still look for villains, I still engage in us vs. them thinking, etc., just from a different viewpoint. Since becoming aware of biases, I try my best to see them in myself and eliminate them… but that’s the thing about our biases and misbeliefs, it’s REALLY hard for us to see them in ourselves.
Even still, I do think it’s beneficial to examine our biases as best we can, particularly through the lens of: “Do my biases and misbeliefs cause harm to others? Do they cause harm to myself?” And then make changes as needed so that we are not causing harm to others or ourselves.
Lily,
There is a difference between the scientific method and “scientism.” Scientism is the absolute belief in the power of science. It sees any other academic or cultural endeavor (philosophy, the arts, religions, morality, etc.) as incapable of revealing truth. In other words, one can highly value the scientific method and what the various sciences have revealed and still look to the arts, philosophy and religion to develop and convey truths and meanings outside of the scientific method. Plenty of scientists and others who use the scientific method are not guilty of scientism.
I am prone to scientism, because science has given us some of my favorite stuff: my car, my phone, a world free from smallpox, antibiotics, the lights, the plumbing, flights, the Internets. I don’t have to take anybody’s word for science. Either it’s observable/replicable, or it’s bad science. Science can make horrors too, and hasn’t always done a good job checking itself. Still, a big net win.
Having said that, I love the arts. To be honest, the line between art and science is kind of blurry in my head. Historically, some of the best artists were scientists (or vice versa). Good art moves me. You and I might not like the same things, but if enough of us love the same art, it lasts so that more of us can be moved. Another win for humanity.
Where I think we’ve done a not so good job is on the questions of religion and morality. I have no personal doubt that the sacred exists. I’ve experienced it as a category distinct from art or science. But for others it doesn’t. Or we get into arguments about whose sacred trumps whose. We all have very different moral priorities that seem to come into conflict all the time, and everyone is always trying to make sure their moral view wins. Ugliness results.
I’m not denying the need for religion or a moral system; I just think that despite our great teachers over the ages, humanity has done a really poor job with these epistemologies relative to art and science. It’s almost like we aren’t mature enough for religion or morality, maybe because of all the biases and fallacies that Dave B walks us through above.
Great post!
Latter-day Saints believe in angels and gold plates–a fact that I’m not ashamed of. Even so, we can certainly be gullible at times–buying into goofy ideas about anything from moneymaking to prophetic lore.
I think this little bit of counsel from Moroni 7 is useful here:
18 And now, my brethren, seeing that ye know the light by which ye may judge, which light is the light of Christ, see that ye do not judge wrongfully; for with that same judgment which ye judge ye shall also be judged.
Case in point: Two weeks ago a survey showed that 33% of Republicans believe that Taylor Swift is a secret government operative. Commentators said that she was going to take the field at Super Bowl halftime – with Travis Kelce in tow (who, I believe, was working at the time) – and announce that all Swifties were to unite behind President Biden and reelect him. Still holdin’ my breath on that one.
I’ve been reading Homo Deus by Yuval Noah Harari and boy oh boy has my worldview been shattered several times. For example, I used to believe that I consist of a single indivisible self. But it turns out there is good evidence that this is a misbelief (look up cases of split brain patients).
Over the course of my life I have also believed such incorrect notions as:
– The Mormon prophet speaks face to face with God in the temple
– We are angelic beings of refined matter piloting corrupt mortal vessels
– Joseph Smith was a paragon of monogamy and his relationship with Emma was healthy and exemplary
– The physics of the universe obey the commands of the Priesthood
These beliefs eventually led me into a whole lot of cognitive dissonance and anxiety. I’m learning more and more that most things in life can be explained by one simple fact: we are simply primates whose chief evolutionary advantage (the ability to cooperate in large groups using language and shared mythology) has catapulted us into a civilization our primate brains are not fit to handle in a wholly responsible manner.
Yuval Noah Harari is always fun.
Someone else I have enjoyed is Robert Sapolsky. He has some really interesting lectures on YouTube, including some on the biology of shamanism and religiosity.
I will always recommend his book, “Behave: The Biology of Humans at Our Best and Worst.”
“And Mormons love the stories where villains meet their just rewards, such as the Korihor story.”
Yet again, I find myself wondering what kind of Mormons permabloggers hang around or have been exposed to. I also know this isn’t the first time you’ve brought Korihor up over the years.
I personally, along with most people I know, find the Korihor story sad. For all I know, the two of us could have been best friends in the premortal existence. I find his situation very saddening. As much as many may hate to admit it, it’s really only three or four bad decisions that separate an active member of the Church from a serial killer, or in this case, a spiritual serial killer. He really could have been any one of us. But it’s this verse that really hits me hardest in the chapter:
55 But Alma said unto him: If this curse should be taken from thee thou wouldst again lead away the hearts of this people; therefore, it shall be unto thee even as the Lord will.
Korihor had the entire chapter to sincerely repent, and given Alma’s background, I really don’t think he would have made this pronouncement unless he knew without a doubt that God knew without a doubt that this wouldn’t be the case. Korihor was just a plain bad man who did get what he deserved. That doesn’t mean I’m happy about his outcome. I can still mourn for him.
The book sounds very interesting, and reminds me of BB’s Christmas post. I think some of this thinking can lead to mild paranoia, but on the other hand, I can see the necessity. I also wonder if we need to apply bias checking multiple times. The Primary narrative doesn’t always add up, but when I apply the same standards of questioning, the critics’ narrative doesn’t always hold up either. And whether people care to admit it, I think there will always be a need to place some amount of faith in whatever narrative you’ve finally come to accept.
A common misbelief of my father was that living in the Telestial Kingdom wouldn’t be pleasant with murderers and other sinners. He didn’t realize that the millennium was actually time for them to repent and become great people. Living with people in the Telestial Kingdom will be far more pleasant than living with the people we associate with now.
I think most people miss the whole point of the Korihor story. The point is what he was preaching that is identified as anti-Christ. He was preaching that people who do well do so because of their own good management, and those who do poorly do so because of their own bad management. That is an anti-Christ philosophy, whereas accepting that some of us are blessed with certain advantages and others aren’t, and that we don’t get to control who gets what blessings when, is a Christlike philosophy. The rain falls on the just and the unjust; good people suffer and bad people still get blessings. That’s reality. Believing people deserve what they get and control what they get is anti-Christ and contrary to reality.
“But I’m going to consider Mormon misbeliefs, because that’s what we do here.”
Perhaps more germane and introspective would be to identify Exmo Misbeliefs.
Kirkstall wrote:
“We are angelic beings of refined matter piloting corrupt mortal vessels”
I have no idea what refined matter is. Ether? Quarks? Multi-dimensional? Unfortunately, I know that few are truly angelic. And my body is more corruptible as I age. It hurts.
“The physics of the universe obey the commands of the Priesthood”
If this was an absolute, BYU would win far more athletic contests. If the 1980 Holiday Bowl was a result of priesthood command, it apparently only works when Catholics play.
Eli,
Whenever I hear the Korihor story I think of Sean Hannity, Alex Jones and Tucker Carlson and wish we had a modern-day Alma to make them mute. However, I don’t want them trampled to death.
Old Man,
I doubt anyone here would believe me, but I really was thinking more in terms of not only being cut off from any support from the Lord, but also of being left unable to do any more damage. I find that extremely frightening and yet still merciful in coming up short of any number of other things the Lord could have done. Korihor deserved that. Being trampled isn’t something I’d wish on anyone either, nor do I believe the Lord would, but being left to your own devices, and in this case without your main weapon (speech), could naturally lead to some dire consequences.
Kirkstall:
“These beliefs eventually led me into a whole lot of cognitive dissonance and anxiety. I’m learning more and more that most things in life can be explained by one simple fact: we are simply primates whose chief evolutionary advantage (the ability to cooperate in large groups using language and shared mythology) has catapulted us into a civilization our primate brains are not fit to handle in a wholly responsible manner.”
While I certainly understand the desire to simplify our philosophy of meaning, my guess is that the vast majority of people — those who have lived as paupers and peasants or servants and slaves — have hoped that their mortal lives didn’t amount to the sum total of their existence. And like Charlie Brown who was happy to get a secondhand valentine–they’d be happy with any good news that might come their way about a second chance at life regardless of how strange or out of sync with reality it may seem.
“misbelief is not a good thing”
Is it possible that the above statement, taken as a universal truth, is itself a misbelief? For example, throughout history many humans have believed in a deity that rewards or punishes us depending on how ethically we behave. It could be argued that this belief has benefited society even if the belief isn’t true.
From a secular viewpoint, it seems possible that evolution has wired our brains in such a way that we’re better off believing certain falsehoods. Unrealistic optimism may be, on the whole, a healthier worldview than 100% accuracy.
We like to say we believe in the truth but in reality, most of our beliefs are probably based on misbeliefs or a lack of knowledge.
Look at the complications of the Mountain Meadows Massacre. The truth would have been to leave the settlers alone and let them pass through. But who would have, or could have, said that truth in those times, when the prevailing beliefs were that the settlers were from Missouri, that they had killed the prophet, and that we needed to follow the church leaders, along with any number of other justifications for doing what was done, including dressing up like Indians? It took years before any form of justice was done for accountability, and years again before any form of honest explanation, which still probably missed the point.
A more secular example would be election denial, justification for Jan. 6th, or Nikki Haley misrepresenting why the Civil War was fought.
Decisions we make about the cars we drive, the houses we live in, the communities we live in, and to a greater extent the ones we avoid are all based more on misbeliefs than they are on hard cold truth and facts.
Excellent piece, Dave. Same to the follow up.
I wrote four essays and presented twice at Sunstone addressing the subject of “why are conspiracy theories so prevalent in Mormon communities?” You might find some of it worthwhile: https://tokensandsigns.org/in-the-mind-of-a-mormon-conspiracist/