It’s impossible to go through life without meeting people who hold beliefs so deeply entrenched that no amount of discussion or evidence can convince them to change their minds. If you’ve experienced this, you’re not alone; research confirms it. In a 2014 study published in Pediatrics, Nyhan et al. examined whether intensive education on the safety of vaccines could change the attitudes of anti-vaxxers. They found that refuting claims of a link between vaccines and autism actually decreased their intent to vaccinate!
Closed-Circuit Thinking is one of the most puzzling of cognitive processes. It’s characterized by an impenetrable defence system capable of denying reality and even the evidence of one’s experience. Its three simple rules are:
- My belief is always right.
- When my belief appears to be wrong…see Rule #1.
- Anyone who tries to tell me otherwise proves Rule #1.
While some may blame Closed-Circuit Thinking for many conflicts and for holding back progress, it’s also highly adaptive. Closed-Circuit Thinking fills a basic need for certainty while reducing the stress of ambiguity. Certainty has survival value by enabling rapid decisions and action in the face of danger or opportunity. Closed-Circuit Thinking is highly empowering…even if it’s sometimes wrong!
As early as the 1920s, Gestalt psychologists noticed that human beings consistently look for the simplest, most orderly, and most symmetrical explanations. They called this the Law of Prägnanz. In 1972, psychologist Jerome Kagan proposed that the drive to resolve ambiguity and achieve certainty is hard-wired into the personality. In 1996, Kruglanski and Webster described the need for “cognitive closure” as a stable personality dimension that also varies situationally.
This is in contrast to other dimensions such as Introspectiveness, Defensiveness, Narcissism, etc., where one sits consistently somewhere on the dimension, independent of the situation. With the Need for Certainty, an individual may tolerate ambiguity and uncertainty about most issues yet engage in Closed-Circuit Thinking about others. A rigorously empirical, critically thinking scientist can simultaneously hold very traditional religious views about the nature of the Universe. A person may be very open-minded about new experiences yet rigidly closed-minded about a personal food orientation, e.g., veganism or gluten-free eating.
It’s tempting to believe that people using Closed-Circuit Thinking may not be too bright. That would be a mistake. Research going back to the early 1960s consistently shows that there is no correlation between intelligence and Closed-Circuit Thinking. Some very smart and successful people hold illogical and indefensible beliefs. There is still, after all, a Flat Earth Society of some 500 members in the West, which supports its views with “scientific” evidence.
Closed-Circuit Thinking is a result of the interplay between perceived personal vulnerability and how individuals understand and accept truth.
How Do We Know What’s True?
Epistemology is the branch of Philosophy that concerns itself with knowledge and truth: how we come to know what we know.
The secular world embraces Empiricism (the reliance on evidence) and The Scientific Method with its three pillars of Observation, Reproducibility, and Measurement. This approach has led to astonishing developments in technology, medicine, and physics, from which we benefit and, at times, suffer unintended consequences.
But long before Science, human beings knew things about their world. They relied on what epistemology calls Phenomenology: direct personal experience and consciousness. We are fast learners. “Fool me once, shame on you; fool me twice, shame on me,” goes the old adage. Think of the brain as a vast prediction machine. It takes the memories of a few experiences and combines them with present situations to predict potential strategies and outcomes. And it does this constantly! Jeff Hawkins, founder of the Redwood Neuroscience Institute, writes, “Prediction is not just one of the things your brain does. It is the primary function of the neocortex, and the foundation of intelligence.” Experience remains a powerful source of personal truth.
Some beliefs are so culturally embedded that they become what epistemology calls Self-Evident Truths. They are true without the need for further proof and are ostensibly the result of “common human reasoning”. The US Declaration of Independence is a notable example. It states, “We hold these Truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.” We’re often shocked that many cultures don’t share these values, and even feel politically empowered to “help” them embrace these truths.
Revealed Truth is related to self-evident truth in that it too requires no further proof. Its legitimacy comes from the experiences of one or more people, to whom truth was revealed through spiritual experience or a special connection to the Universe. These revelations were then codified, written down, and transmitted across many generations of believers.
All truth sources can be co-opted by Closed-Circuit Thinking. Empiricists may smugly believe that Science provides a degree of immunity from Closed-Circuit Thinking. This is a delusion. Conflicting research studies, vested interests, and failed consensus among scientists can all lead to the cherry-picking of research in support of virtually any belief. A recent article in Leadership Insights addressed this issue.
Truth sources that don’t invite questioning, challenge, new evidence, debate, critical thinking, and discussion, fuel Closed-Circuit Thinking.
Implications for Managers and the Rest of Us
- No matter how smart, self-aware, and open-minded you think you are, everyone is vulnerable to Closed-Circuit Thinking. It’s a unique feature of the Need for Certainty that it picks and chooses the areas it defends. The most important question you can ask yourself about anything you feel strongly about is, “Why do I believe this to be true?” The answers may surprise you.
- In any discussion, try to understand the other person’s truth source. Is it based on personal experience? Can they back up their claims with empirical sources, even if these are subject to motivated reasoning? Do they make frequent reference to the truth being “obvious”, “clear”, or “traditional”, or say, “It’s always been this way”? These are often code for never having challenged the validity of the belief. Once again, a critical question may be, “Help me to understand why this is true.”
- If you’re in a discussion involving Closed-Circuit Thinking, you’re better off changing the subject than engaging further. Any attempt to change another’s belief, even with overwhelming empirical evidence, is only likely to strengthen it. It’s a waste of time and counterproductive. When dealing with a colleague or subordinate, it’s more effective to modify the work context or their responsibilities than to try to change their behaviour through discussion, performance appraisals, counselling, etc. These are likely to be perceived as adversarial and as further proof that their belief must be true (see Rule #3 at the beginning of this article).
Closed-Circuit Thinking, when it does change, usually does so as a result of overwhelming personal experience. The popular media is rich with anecdotal stories of anti-vaxxers who changed their minds once their own children came down with a preventable disease. In a 2015 study, Horne et al. found that confronting anti-vaxxers with accounts of children who had been injured or scarred by preventable diseases produced a much higher rate of attitude and behavioural change than education alone.
By Dr. Steve Courmanopoulos, PhD. / Image courtesy of Ashley Batz and Unsplash.
Dr. Courmanopoulos is the Senior Partner and CEO of Medius International Inc., a global consulting firm providing expertise in three areas: Intelligence, Strategy, and Organizational Development.
 Nyhan, B., Reifler, J., Richey, S., and Freed, G.L. (2014). Effective Messages in Vaccine Promotion: A Randomized Trial. Pediatrics, 133(4), e835–e842. doi:10.1542/peds.2013-2365
 Gestalt Psychology. Retrieved February 19, 2017 from, https://en.wikipedia.org/wiki/Gestalt_psychology
 Kagan, J. (1972). Motives and development. Journal of Personality and Social Psychology, 22(1), pp. 51-66.
 Sorrentino, R.M. and Roney, C. (2013). The Uncertain Mind: Individual Differences in Facing the Unknown. Psychology Press.
 The Flat Earth Society. Website retrieved February 21, 2017 from http://theflatearthsociety.org/home/
 Empiricism. Retrieved February 21, 2017 from https://en.wikipedia.org/wiki/Empiricism
 Phenomenology. Retrieved February 21, 2017 from https://en.wikipedia.org/wiki/Phenomenology_(philosophy)
 Hawkins, J. and Blakeslee, S. (2005). On Intelligence. St. Martin’s Griffin Press.
 Single Studies Are A Fool’s Paradise. Leadership Insights. Retrieved February 21, 2017 from http://mediusinternational.com/main/index.php/2017/01/17/single-studies-are-a-fools-paradise-and-a-marketers-dream-or-nightmare/
 Motivated Reasoning: The Most Dangerous Defense. Leadership Insights. Retrieved February 21, 2017 from http://mediusinternational.com/main/index.php/2015/11/19/motivated-reasoning-the-most-dangerous-defense/
 Horne, Z., et al. (2015). Countering antivaccination attitudes. PNAS, 112 (33) pp. 10321-10324; published ahead of print August 3, 2015, doi:10.1073/pnas.1504019112