We are taught that we should lead others by “persuasion and long-suffering.” How do we do that in such polarized times, when we can’t even agree on one set of facts? How do we do that when in-group and out-group thinking overrides our ability to listen to those on the other side of the divide, whether in religion or politics?

I was reading a neuroscience article on seven factors that can help you be more persuasive, taken from a book called The Influential Mind. The brain operates using several rules that help it evaluate new information, and if we understand those rules, we are more likely to make an argument that is persuasive to others.

Prior Beliefs. Don’t start by telling others why they are wrong. New beliefs are easier to accept when they build on established ones. Information that contradicts our beliefs is met with skepticism, or even a knee-jerk rejection.

Emotion. When two people feel the same emotion, that sync can make their minds more likely to agree. If you can laugh with someone, you are more likely to be on the road to persuasion.

Incentives. Avoid warnings and instead focus on rewards. What’s in it for them? Why should they care?

Agency. People don’t respond well to orders; they like options. If the other person feels that their choice is compelled, they will shut down and become stubbornly opposed.

Curiosity. Simply telling someone something is less persuasive than adding a hook, a “what’s in it for me,” that gets them to think more about it and see why it matters. Find fun facts that are genuinely fun, and share those. Share case studies that prompt them to question rather than leading them straight to your conclusion.

State of Mind. When people are in a depressed mood, they are more susceptible to conservative ideas. When they are upbeat, they are willing to take more risks and consider progressive thoughts. Consider what happens in sports: a team that is losing is more likely to play it safe, while a team that is winning might try new or creative plays. To change your ideas, you have to feel secure enough to open your mind.

Other people. When you make an idea sound like it is popular or widely accepted, it’s easier for the brain to accept it. In sales, we use a tactic called “social vetting” in which you take a customer’s preference and explain that it’s the most popular choice. In politics, the term “Overton window” refers to the breadth of options that are considered socially acceptable; outlier opinions can move that window further in one direction or the other. A broader exposure to social groups leads to more acceptance of new ideas.

Bear in mind that in most cases people don’t admit they are wrong, or even perceive that they were mistaken. The brain rewrites their memories to fit the new belief. They usually think they always believed it, even if that belief was hidden, secret, or a “gut feeling.” That’s just how the brain works. Even if they remember, as a fact, that they once believed something different, that’s not the same as actually re-experiencing what it was like to believe differently.

On the You Are Not So Smart podcast, David McRaney has talked about how to persuade others who hold false beliefs, particularly conspiracy-theory thinking, without being a disagreeable bully (which is also ineffective). The method he uses is very non-confrontational. The key to helping someone else be open-minded and curious is to be open-minded and curious yourself. He asks the person to rate their belief on a scale of 1 to 10 (a percentage works too). Then you ask the person why that number isn’t higher, and you listen. Then you ask why the number isn’t lower, and you listen. It’s a good start toward letting someone actually assess their own views. At the end of the day, we might know what we believe, but we seldom know why we believe it. This type of inquiry helps people figure out the basis for that belief. People can’t question their beliefs when they feel unsafe.

For example, in politics or religion, a lot of our “reasons” for what we believe boil down to group loyalty. In a sense this is like Stockholm Syndrome: belonging to this group is familiar and hasn’t killed us, so we end up identifying with it. It reminds me of the “home pride” of people who have never really travelled. That home pride is real and worthwhile, but it’s also based on a lack of exposure to other options. And yet nobody is persuaded by being told, “You just don’t know what better options are out there!” Instead, they have to come to their own conclusions.

If you inquire about someone else’s beliefs using this method, it sometimes prompts them to do likewise, and you can introduce your own thinking that might broaden the options for them. But even if not, they may become curious and look for new information on their own when they realize their commitment to an idea is lower than they thought. To reiterate, here are the steps:

  1. Rate your belief on a scale of 1 to 10, with 10 meaning you believe it to the point of claiming you “know” it, and 1 meaning you don’t believe it at all or think the opposite is true.
  2. If it’s less than 10, explain why your rating isn’t higher. What’s holding you back?
  3. If it’s more than 1, explain why your rating isn’t lower.

In his book about cults and mind control, Steve Hassan uses similar techniques when attempting to deprogram someone from a cult. He first makes sure that the individual will be safe and supported, and he offers family members some helpful advice: understand that the decision is ultimately up to the individual, don’t imagine the cult to be more powerful than it is, and keep trying to have a relationship with the person. Cults are ideas, and wrong ideas don’t have to separate us from our loved ones, even if that’s what the “cult” wants (including political cults; Hassan applies the term very broadly).

A leadership coach I worked with 20 years ago said that the person in a relationship who has the most understanding bears the most responsibility for how the relationship goes, and that’s a thought I’ve held onto ever since. Of course, everyone who has now heard that idea bears more responsibility for how their relationships go, because otherwise you’re saying you have less awareness than the other person.

I was at a prog-mo meetup 15 years ago where someone asked everyone what percentage belief they had in the BOM. Nobody was at 100%. A few were very low. Some were higher. Most were somewhere in the middle. My number now would be much lower than it was then. Today’s task is to use this method to examine your own beliefs, whatever they may be.

  • Have you used this 3-step method? Did your confidence in the belief increase or decrease as a result?
  • Can you think of something you hold as a strong belief and use this method in the comments?
  • Can you think of a time when your belief level changed over time? Did it increase or decrease? What was the thing you believed? Why did it change?

Discuss.