We like to think of ourselves as rational beings: faced with a decision or choice, we evaluate the options objectively and logically and proceed with the one that makes the most sense or is most practical. But research shows that rather than behaving rationally, we are “predictably irrational,” as behavioral economist and Duke University professor Dan Ariely puts it. What he means is that our behavior and decisions are driven far more by our emotions and values than by reasoning and logic.
One of Ariely’s experiments demonstrates this with a fake pain medication, or placebo. His research team gave participants what they described as pain relievers; in actuality, the pills were Vitamin C. One pill carried a high price tag while the other was cheap. Participants reported that the more expensive pill was more effective at combating pain than the less expensive one. Remember: both pills were Vitamin C, which is not known as a pain reliever.
Video: Dan Ariely - Why Do Placebos Work? (6:03)
Another scientist studying the role of beliefs and values in shaping perception is Dan Kahan of Yale University. He was curious about why some people are skeptical of issues on which there is scientific consensus. His study involved 1,500 participants who were asked whether “most expert scientists agree with the statement, most expert scientists disagree with the statement, or expert scientists are divided in their views” on three issues: whether global climate change is occurring, whether nuclear power waste can be disposed of safely, and whether carrying concealed handguns reduces violent crime.
What Kahan and his research group discovered was that participants’ values played the most significant role in their evaluations of scientific information. If people were uncertain about global climate change, for instance, they chose the statement that “most expert scientists are divided in their views” on climate change, and they found reasons to reject those with opposing viewpoints. Similarly, participants opposed to carrying concealed weapons rejected any information suggesting this might deter or reduce violent crime. In other words, people downplay or discount expert scientists whose views differ from theirs while seeking out experts whose views support or conform to their own.
Even one or two contrary voices are enough to convince us that the science is unsettled if that conclusion supports our values and viewpoints. As Willingham explains, “We want our beliefs to be accurate—to align with what is really true about the world—and we know that science is a reliable guide to accuracy….But this desire to be accurate conflicts with other motives, some of them unconscious. People hold beliefs to protect important values. People also hold beliefs that are rooted in their emotions. Because we want to see ourselves as rational beings, we find reasons to maintain that our beliefs are accurate” (Willingham, 2011).
Cognitive scientists have a term for this: biased assimilation. It means that we evaluate information selectively. If information agrees with what we think or believe, we tend to value it for affirming our positions; if it disagrees, we tend to dismiss it. We also hold very different standards for evaluating information that supports our positions and information that challenges them.
This is not to say that we are incapable of acting rationally or engaging in rational and logical decision-making. But it does help explain why there can be public disagreement over issues, from childhood vaccinations to climate change, where scientific consensus exists.