I was recently listening to an interview with actor / comedian Tig Notaro. She talked about a mentor who really supported and believed in her when she was a teen struggling with learning disabilities. The mentor was kind when Tig needed it, but she also said several really hurtful things about LGBTQ people that were particularly harmful to Tig, a young lesbian grappling with her identity. Decades later, now that Tig is pretty famous and successful, in a loving relationship, a cancer survivor, she happened upon this former mentor on social media. She noticed that the mentor was now very pro-LGBTQ online, and she asked her to lunch. The woman was as supportive of her as ever and as enthusiastic about her success. Tig mentioned the anti-LGBTQ views the mentor had shared back in the day, and the mentor explained that she knew she had said those things, and that she wished she hadn’t. She said that at the time she didn’t realize that they weren’t really her views–they were views she was given by others that she mistakenly accepted without really thinking about them. She apologized and felt remorseful.
This was such an interesting story to me. Why do people profess to believe something they don’t actually believe? That’s a very salient question to anyone raised Mormon, although it’s obviously got much broader application as well. Being raised in the church, I remember a war in my thoughts at times between the idea that I should always be totally honest, and the idea that I was supposed to “bear testimony” of church doctrines, some of which I either didn’t believe or didn’t really care about or didn’t really know what I thought about. The church, particularly at that time, was teaching the youth that it wasn’t enough to say you “believe”–you had to say you “know.” Rather than embracing the uncertainty inherent in faith, we were told that belief was a temporary stop on the way to absolutely certain knowledge, and that only when we had achieved knowledge (of things that are essentially unknowable) would we have a “real” testimony.
“A testimony is found in the bearing thereof,” we were told. It occurred to me that this could mean that you could tell whether something was true based on how you felt when you said it. We would have said that the spirit would testify that the thing was either true or false. The thing is, you can feel something’s true because you want it to be true or because you fear it to be true, and ruminating about your fears or desires may make your emotions stronger, but it has no bearing on reality.
But it also occurred to me as an adult that the more you said something, even if it wasn’t true, the more you came to believe it was true, like Stuart Smalley’s daily affirmations of “I’m good enough. I’m smart enough. And doggone it, people like me” or like Trump’s claims that any election he lost was stolen. If every time I make a mistake I hit myself in the head and say “Stupid! Stupid! Stupid!” it might reinforce the belief that I’m stupid. Neither of these things (using our feelings to test statements or saying something until we believe it more strongly) really has anything to do with how true something is, though.
There are reasons that religions require attestations of belief. It’s the sort of thing that draws a boundary between insiders and outsiders. If you declare belief in a specific doctrine, you are showing that you accept the group’s worldview and authority. It also makes it difficult for you to distance yourself later on if you realize you don’t believe it. You will not only lose face socially, but you will lose face to yourself. It’s embarrassing to realize that you said or did something harmful that you didn’t even believe in the first place but did basically out of peer pressure, group identity, or a desire to signal that you were part of the group.
That’s not to say that all attestations are negative–on the contrary, they are often aspirational reminders of values that encourage good behavior or moral clarity, not just the more questionable values of loyalty and conformity. Consider the Hippocratic Oath to do no harm or the ethics pledges required by the bar association. These are also attestations. We talked about vocational awe in last week’s post–the mission statements we use in companies and other organizations can link us to a higher purpose. A doctor’s belief in first doing no harm may not provide a clear roadmap for how to respond to every situation, but it will give the doctor pause in making ethical choices. That’s a good thing.
There is a risk with some of these attestations of belief that less virtuous psychological needs like belonging or staying out of trouble can outweigh the weightier ones like “doing what’s right.” Here’s how it works when we attest to a belief we aren’t really sure we hold:
- A person publicly affirms a belief.
- They experience tension if their private beliefs don’t match.
- To reduce that tension, they gradually adjust their internal beliefs.
Conformity + repetition = internalization. That’s a formula that makes it much harder over time to let go of beliefs we didn’t even believe in the first place.
It also has social consequences. When many people publicly state the same thing, whether they believe it or not, it creates:
- a sense that the belief is widely accepted or “common sense”
- social pressure for others to align
- a stable framework for interpreting events that may not accord with factual reality
The collective affirmation supports and sustains the alternate reality. Group cohesion depends more on commitment from members than it does on the sincerity of their belief. For example, if you were a Russian soldier conscripted into an unjust war, the war relies on your actions and commitment, not on whether you really believe in the war or not.
- Have you ever attested something that you knew you didn’t believe? How did you know you didn’t believe it?
- What happened when you had to change what you said you believed? Did you lose face socially? How did you feel about your former statements of belief?
- Do you see the social consequences of attestations of belief in things that are either untrue (e.g. stolen elections) or unknowable (e.g. belief in God)? Does this differ from attestations about moral values (e.g. human rights)?
Discuss.
