2008: WHAT HAVE YOU CHANGED YOUR MIND ABOUT? WHY?

Jamshed Bharucha
Psychologist; President Emeritus, Cooper Union
Education as Stretching the Mind

I used to believe that the paramount purpose of a liberal education was threefold:

1) Stretch your mind, reach beyond your preconceptions; learn to think of things in ways you have never thought before.

2) Acquire tools with which to critically examine and evaluate new ideas, including your own cherished ones.

3) Settle eventually on a framework or set of frameworks that organize what you know and believe and that guide your life as an individual and a leader.

I still believe #1 and #2. I have changed my mind about #3, which I would now replace with the following:

a) Learn new frameworks, and be guided by them.

b) But never get so comfortable as to believe that your frameworks are the final word, recognizing the strong psychological tendencies that favor sticking to your worldview. Learn to keep stretching your mind, keep stepping outside your comfort zone, keep venturing beyond the familiar, keep trying to put yourself in the shoes of others whose frameworks or cultures are alien to you, and have an open mind to different ways of parsing the world. Before you critique a new idea, or another culture, master it to the point at which its proponents or members recognize that you get it.

Settling into a framework is easy. The brain is built to perceive the world through structured lenses — cognitive scaffolds on which we hang our knowledge and belief systems.

Stretching your mind is hard. Once we've settled on a worldview that suits us, we tend to hold on. New information is bent to fit, information that doesn't fit is discounted, and new views are resisted.

By 'framework' I mean any one of a range of conceptual or belief systems — either explicitly articulated or implicitly followed. These include narratives, paradigms, theories, models, schemas, frames, scripts, stereotypes, and categories; they include philosophies of life, ideologies, moral systems, ethical codes, worldviews, and political, religious or cultural affiliations. These are all systems that organize human cognition and behavior by parsing, integrating, simplifying or packaging knowledge or belief. They tend to be built on loose configurations of seemingly core features, patterns, beliefs, commitments, preferences or attitudes that have a foundational and unifying quality in one's mind or in the collective behavior of a community. When they involve the perception of people (including oneself), they foster a sense of affiliation that may trump essential features or beliefs.

What changed my mind was the overwhelming evidence of biases in favor of perpetuating prior worldviews. The brain maps information onto a small set of organizing structures, which serve as cognitive lenses, skewing how we process or seek new information. These structures drive a range of phenomena, including the perception of coherent patterns (sometimes where none exists), the perception of causality (sometimes where none exists), and the perception of people in stereotyped ways.

Another family of perceptual biases stems from our being social animals (even scientists!), susceptible to the dynamics of in-group versus out-group affiliation. A well-known bias of group membership is the over-attribution effect, according to which we tend to explain the behavior of people from other groups in dispositional terms ("that's just the way they are"), but our own behavior in much more complex ways, including a greater consideration of the circumstances. Group attributions are also asymmetrical with respect to good versus bad behavior. For groups that you like, including your own, positive behaviors reflect inherent traits ("we're basically good people") and negative behaviors are either blamed on circumstances ("I was under a lot of pressure") or discounted ("mistakes were made"). In contrast, for groups that you dislike, negative behaviors reflect inherent traits ("they can't be trusted") and positive behaviors reflect exceptions ("he's different from the rest"). Related to attribution biases is the tendency (perhaps based on having more experience with your own group) to believe that individuals within another group are similar to each other ("they're all alike"), whereas your own group contains a spectrum of different individuals (including "a few bad apples"). When two groups accept bedrock commitments that are fundamentally opposed, the result is conflict — or war.

Fortunately, the brain has other systems that allow us to counteract these tendencies to some extent. This requires conscious effort, the application of critical reasoning tools, and practice. The plasticity of the brain permits change, within limits.

To assess genuine understanding of an idea one is inclined to resist, I propose a version of the Turing Test tailored to this purpose: you understand such an idea only if you can fool its proponents into thinking you get it. Few critics can pass this test. I would also propose a cross-cultural Turing Test for would-be cultural critics (a Golden Rule of cross-group understanding): before critiquing a culture or aspect thereof, you should be able to navigate seamlessly within that culture, as judged by members of that group.

By rejecting #3, you give up certainty. Certainty feels good and is a powerful force in leadership. The challenge, as Bertrand Russell puts it in A History of Western Philosophy, is "To teach how to live without certainty, and yet without being paralyzed by hesitation."