I recently learned about something called the continued influence effect. I read about it in a newspaper article here. This effect refers to the observation that outdated or incorrect information can continue to influence our beliefs even after it’s been corrected. From the article:

The continued influence effect manifests when we are presented with evolving information, such as with news updates around current events or with a changing scientific understanding around topics related to health. Knowing about this effect can help us understand how we’re all susceptible to the influence of outdated or problematic information, especially when trying to navigate the deluge of news and information we receive nowadays.

They give an example of MSG in Chinese food. It all started with a flawed 1968 report describing a set of symptoms supposedly caused by MSG, including headache, throat swelling, and stomach pain, developing shortly after ingesting the food additive, which is used for its delicious umami flavor. Later studies have shown that MSG in normal serving sizes has little or no effect on the vast majority of people, yet the idea still persists that MSG is bad for you, even earning the name “Chinese Restaurant Syndrome.”

So why is it so difficult to update our attitudes when new information is presented? In trying to answer this question, researchers conducted a study in 1994 in which participants were asked to read a series of reports about a warehouse fire. From the article:

These reports included information about volatile materials, like oil paint and pressurized gas canisters, found in a closet. Later, participants were given updated information that the closet was actually empty; there were no volatile materials found. However, despite this correction, participants still relied on the original reports when making later judgments, saying, for example, the reason they thought the fire was particularly intense was that oil fires are difficult to extinguish. 

This continued reliance on the retracted information was not due to a failure to understand the new information; the majority of participants were able to accurately report that the closet was indeed empty when asked directly. Yet they still relied on the retracted information in their future judgments related to the fire’s cause.

Instead, the continued influence effect can be thought of as a result of how our brains store and update information. When we receive a correction about something we already know, we don’t simply erase the old information from our minds. Rather, representations of both the old information and its correction coexist in our brain’s knowledge networks, each playing a role in guiding our future judgments and beliefs. We’re especially likely to rely on old information when it’s the only explanation we have, or the one that comes to mind most easily. When searching for reasons why our stomachs hurt, the warnings of fellow Yelpers make a compelling argument when they’re the most plausible explanation we can think of.

The good news is that this tendency also lends itself to a solution: When giving a correction, also provide people with new information to take the place of the old. In the case of the warehouse fire study, researchers found that when, along with the corrections about the contents of the closet, they also shared an alternative potential cause for the fire — arson materials found elsewhere — participants relied less on the outdated information. Similarly, you might explain to people that perhaps the reason for their post-orange chicken funk isn’t the MSG, but instead that the dish they chose had a lot of oil, a culprit for many of the same symptoms. 

You can see this problem happening in real time in the current political climate in the USA. Musk says that $50 million was spent on condoms for Hamas, then Trump says it was $100 million (this all really happened), then it gets debunked over and over again, but now both claims are living rent free in your head. The $50 million number will influence your ideas about the need for foreign aid even though you know it is false. Add “vaccines cause autism,” or any number of other pieces of bad science that get stuck in our heads.

Since this is a Mormon-themed blog, let’s steer this towards the Church. It was said over and over that Blacks did not hold the priesthood because they were less valiant in the pre-mortal life, and that dark skin was a curse. Then the Church comes along, first trying to gaslight you into thinking they never taught that, and then saying that it is wrong. Now you have both ideas in your head, with both influencing how you think about your fellow Black church members.

According to the article, the way to get the bad information out of your head is to offer an alternative reason for the prohibition on Blacks holding the priesthood. The way to do that is to tell the truth: that Brigham Young and many of the leaders were racist. That is too hard, so many members live with the continued influence effect. The only solution the Church seems to see as viable is to just wait until all the members who learned the old reasons die out.

You can add any number of Church examples here. Joseph Smith translated the BofM looking through glasses called the Urim and Thummim, or he just looked at the plates directly and translated them. This has all been replaced by the real truth: he used a stone in a hat for all the pages that we have today as the BofM. The Church even published a photo of the stone he used. But those pictures from the Ensign are still in our heads, still influencing our thoughts.

What are some examples of continued influence effect you see in the church?