For better understanding, expose your beliefs for others to criticize

A few nights ago, I was about to make a right turn through a green light in icy conditions at a dark intersection when I suddenly noticed a woman with a baby carriage running across the intersection, right in front of my path. I slammed on my brakes, and my car slid slightly on the ice, but I avoided hitting her. I honked, rolled down my window and shouted at the woman that I could have killed her and that she shouldn’t be crossing an icy street in the dark with a “Don’t Walk” signal.

To my great surprise, a few seconds later, a voice from across the street started swearing at me. After picking up my wife at the train station, I returned to the store from which someone had shouted and told them I was that guy and wanted to hear their perspective. One of the guys wasn’t interested in having a discussion and just started cussing me out. But another guy was very reasonable and engaged me in a good discussion.

He said pedestrians in an intersection always have the right-of-way (which I knew, which is why I slammed on my brakes and didn’t enter the intersection) and said that if I had hit the woman, it would have been her fault (because she didn’t have a walk signal and was very hard to see) but that I shouldn’t have honked and shouted because she barely spoke English and because I probably scared her baby. He said I had “scared her half to death.” I explained that I had very intentionally tried to scare her because what she did was so incredibly dangerous that, if she did that ten or twenty times, some car would eventually hit and kill her and/or her baby.

After politely exchanging views, we shook hands, each the wiser after seeing the incident from the other’s perspective. If a similar situation ever arises again, I will still try to make my point forcefully, but with less honking and shouting.

I mention this in light of new research demonstrating the power of exposing your beliefs to criticism from others.

It has long been clear that people (especially older people and people with greater experience/expertise in a field) tend to suppress — by ignoring or rationalizing away — facts that don’t fit their beliefs. Effectively, we filter our observations of the world through our theories of how the world works. We often fail to detect that our theories/beliefs are incorrect because we systematically suppress information that might undermine them.

This fabulous article describes the location in our brain responsible for our intellectual blindness… a location that matures only in early adulthood. Scientists showed real and fake videos of falling balls to physics majors (who know balls fall at the same rate, regardless of their size) and non-physics majors (who generally believe larger balls fall faster):

when Dunbar monitored the subjects in an fMRI machine, he found that showing non-physics majors the correct video triggered a particular pattern of brain activation: There was a squirt of blood to the anterior cingulate cortex, a collar of tissue located in the center of the brain. The ACC is typically associated with the perception of errors and contradictions — neuroscientists often refer to it as part of the “Oh shit!” circuit — so it makes sense that it would be turned on when we watch a video of something that seems wrong… [Physics majors'] education enabled them to see the error, and for them it was the inaccurate video that triggered the ACC.

But there’s another region of the brain that can be activated as we go about editing reality. It’s called the dorsolateral prefrontal cortex, or DLPFC. It’s located just behind the forehead and is one of the last brain areas to develop in young adults. It plays a crucial role in suppressing so-called unwanted representations, getting rid of those thoughts that don’t square with our preconceptions. For scientists, it’s a problem.

When physics students saw the Aristotelian video with the aberrant balls, their DLPFCs kicked into gear and they quickly deleted the image from their consciousness. In most contexts, this act of editing is an essential cognitive skill. (When the DLPFC is damaged, people often struggle to pay attention, since they can’t filter out irrelevant stimuli.) However, when it comes to noticing anomalies, an efficient prefrontal cortex can actually be a serious liability. The DLPFC is constantly censoring the world, erasing facts from our experience. If the ACC is the “Oh shit!” circuit, the DLPFC is the Delete key. When the ACC and DLPFC “turn on together, people aren’t just noticing that something doesn’t look right,” Dunbar says. “They’re also inhibiting that information.”

The lesson is that not all data is created equal in our mind’s eye: When it comes to interpreting our experiments, we see what we want to see and disregard the rest. The physics students, for instance, didn’t watch the video and wonder whether Galileo might be wrong. Instead, they put their trust in theory, tuning out whatever it couldn’t explain. Belief, in other words, is a kind of blindness.

As the article hints, blinding ourselves to facts at odds with our existing beliefs isn’t necessarily “bad”; it helps us cope with life. If we constantly reconsidered every established belief, we wouldn’t be able to deal with day-to-day life (“Can I trust my spouse?” “What if the sun doesn’t come up tomorrow?” “Am I really allergic to peanuts?” “Why can’t I drive on the left side of the road?”). But the stickiness of our beliefs means we had better hope we have wise parents and teachers early in life, because false beliefs planted early can be hard to overcome.

The article also highlights the findings of a professor who studied how real scientists make breakthroughs and found that success comes from exposing your opinions to others' comments and criticism, especially from others who possess knowledge and backgrounds you lack:

While the scientific process is typically seen as a lonely pursuit — researchers solve problems by themselves — Dunbar found that most new scientific ideas emerged from lab meetings, those weekly sessions in which people publicly present their data. Interestingly, the most important element of the lab meeting wasn’t the presentation — it was the debate that followed. Dunbar observed that the skeptical (and sometimes heated) questions asked during a group session frequently triggered breakthroughs, as the scientists were forced to reconsider data they’d previously ignored. The new theory was a product of spontaneous conversation, not solitude; a single bracing query was enough to turn scientists into temporary outsiders, able to look anew at their own work…

This is why other people are so helpful: They shock us out of our cognitive box. “I saw this happen all the time,” Dunbar says. “A scientist would be trying to describe their approach, and they’d be getting a little defensive, and then they’d get this quizzical look on their face. It was like they’d finally understood what was important.”

Posted by James on Thursday, February 18, 2010