Why We Prefer Stories to Cold, Hard Numbers

Motivated Reasoning

by Above the Market

There is a wide body of research on what has come to be known as “motivated reasoning” and, more recognizably for those of us in the investment world, its flip side, confirmation bias. While confirmation bias is our tendency to notice and accept whatever fits within our preconceived notions and beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more critically when we disagree with them than when we agree. We are also much more likely to recall supporting rather than opposing evidence. The Semmelweis reflex, our knee-jerk rejection of new evidence that contradicts an established paradigm, is a reflection of this phenomenon. Upton Sinclair offered perhaps its most popular expression: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it!”

The classic study in this regard concerned smoking and cancer. If you are old enough, you may recall that in 1964 the U.S. Surgeon General famously issued a report linking smoking and cancer. It was a very big deal at the time and was extremely controversial. Shortly thereafter, two scientists interviewed smokers and nonsmokers alike and asked them to evaluate the Surgeon General’s conclusions. Nonsmokers generally agreed with the Surgeon General. Smokers, however, who clearly had something significant at stake, were not nearly so sanguine. In fact, they concocted a variety of dubious challenges, including “many smokers live a long time” (anecdotal evidence to the contrary does not undermine the weight of the data as a whole) and “lots of things are hazardous” (So?). Bringing true believers together in a group tends to compound the problem and ratchet up the denialism.

One study by the late Ziva Kunda should ring true to anyone who pays attention to our polarized political sphere. Subjects are brought into a lab and told that they will be playing a trivia game. Before they play, they watch someone else (the “Winner”) play in order to get the hang of the game. Half the subjects are told that the Winner will be on their team and half are told that the Winner will be on the opposing team. The game they watch is rigged, and the Winner proceeds to answer every question correctly. When asked about the Winner’s success, those who expect to play with the Winner are extremely impressed, while those who expect to play against the Winner are dismissive, attributing the good performance to luck rather than skill (self-serving bias, anyone?). Thus the exact same event receives diametrically opposed interpretations depending upon whose side you’re on. Sounds like post-debate political spin, doesn’t it?

This problem also explains why it can be so hard for us to find, admit and respond to our mistakes — why we hang on to bad trades so long and even refuse to see them as bad (evidence be damned).  As Tadas Viskanta frequently points out, investing is hard.  We will make frequent errors. And when we have something significant at stake — money, ego, family, etc. — it’s really hard to see errors, even if (when!) our positions are pretty nutty.

Some general conclusions and extrapolations should be pretty obvious and plenty familiar. We choose ideology over facts, especially when it is in our interest to do so. We all develop overarching ideologies as intellectual strategies for categorizing and navigating the world. Psychological research increasingly shows that these ideologies reflect our unconscious goals and motivations rather than anything like independent and unbiased thinking. This reality fits conveniently with our preference for stories over data and our susceptibility to the narrative fallacy: we look backward and construct a story that explains what happened and what caused it to happen, a story more consistent with what we already think and expect than with the facts, especially when it supports our overall interests.

We all like to think that our outlooks and decisions are the products of a rational process: that we examine the evidence and, only after careful evaluation, come to reasoned conclusions about what that evidence suggests or shows. But we don’t. Rather, we spend our time searching for whatever we can exploit to support our preconceived commitments, which act as pre-packaged means of interpreting the world. We like to think we’re judges of a sort, carefully weighing the alternatives and possibilities before reaching a just and true verdict. Instead, we’re much more like lawyers, grasping for anything, true or not, that might help make our case and looking for ways to suppress whatever hurts it.

We inherently prefer words to numbers and narrative to data, often to the immense detriment of our understanding. Indeed, we know from neurobiology that when we are presented with evidence that our worldviews are patently false, we tend not to engage the prefrontal cortex, the very part of the brain we most need to make sense of new information. This aspect of our make-up can damage every decision we make, especially in investing. It’s why we concoct silly conspiracy theories. We see what we want to see and act accordingly. And if it’s in our interest to see things a certain way, we almost surely will.

As always, then, information is cheap while meaning is expensive.

 
