Thursday, October 31, 2013

Science, well sort of… Eureka, I’ve figured it all out

I have to start by thanking you all for your patience. Since I began this blog a couple of years ago, I have been trying to answer what I thought was a straightforward question: How do we decide what to believe? Those of you who stuck with me watched me muddle through various theories and explorations in earlier blog posts, my core idea all along being that if I could somehow show how I came to my beliefs, it might impact the way other people arrive at theirs. …Well, I was wrong…very, very wrong. But the good news, at least as it relates to me and my fragile little ego, is that this is not all on me as an ineffectual blogger, but rather it’s simply a function of how we are wired…of how our brains work.

We think of ourselves as rational actors, using judgment, education and reason to come to our conclusions. But as I learned, this is really not the case. My good friend Anthony Pratkanis calls us “cognitive misers,” meaning that we are cheap in our use of rational reasoning. I think what he’s pointing to is what Daniel Kahneman has brilliantly summarized in his seminal work “Thinking, Fast and Slow”. Kahneman won the Nobel Prize in Economics for his work explaining how individuals evaluate risk and make decisions, often contrary to the evidence and their own self-interest. Complex, sciency stuff, but as a psychologist he was led to further explorations of the very question of how our brain works in determining our beliefs.

In his book (and check out the SALT talk here) he makes a compelling argument for how our brain functions determine these outcomes. He describes two modes of thinking, system one and system two. His best example of the difference: system one is what kicks in when we are asked, what is 2 + 2? The answer comes to us almost instantly and without struggle. But when we are asked what is 17 x 28, system two has to come into operation. We pause, our pupils dilate (literally), and we have to expend mental energy to come up with the answer. It is much slower and it takes work.
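(For the record, and to spare your pupils the dilation, here is one slow, system two way to grind it out on paper. The decomposition below is just my own scratch work, not anything from Kahneman’s book.)

```latex
% Working 17 x 28 the slow, system two way: split 28 into 30 - 2
17 \times 28 = 17 \times (30 - 2) = (17 \times 30) - (17 \times 2) = 510 - 34 = 476
```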

System one is where our brain is working almost on autopilot, navigating us through the day without conscious effort. It’s largely intuitive and enables us to assimilate vast amounts of inputs and stimuli in a way that we can process to our benefit. It instantly categorizes and associates data into frames that we are comfortable with. It is heavily influenced by our previous experience, associations, biases and ideologies, even if we are unaware of it. System two is where we slow down and do the hard work of reasoning. We use it to check and validate our system one conclusions. It takes effort, and we do it relatively rarely, even though we perceive ourselves to be using it much of the time.

And this is not a bad thing…we rely on system one to be effective, to survive, and to process the fire hose of visual, aural, and olfactory stuff that is coming at us non-stop…and generally it works really, really well. If we tried to apply the slower system two to all these inputs, we’d simply be unable to function.

But wait…if we can understand all this, can’t we simply apply system two analysis to the more complex problems - like how we come to our beliefs - taking the time to question our assumptions and regularly checking our initial system one conclusions?

Well, no - at least not often, if at all. In fact it appears the opposite is the case: the more aptitude we have for scientific reasoning (for system two), the more likely we are to be swayed by system one processes and biases…and I think it’s been proven, using…science.

Yale Law professor Dan Kahan is, in my estimation, a genius. His latest research is the best designed yet to expose the fallacy that if we just had more information, and were more skilled in our reasoning, we would rationally come up with the correct, objective answer.

His work is not at all complicated, but I won’t be able to do it justice or fully describe his methodology in the space I allow myself here. But trust me, his results show in a very compelling fashion that, contrary to what one might expect, the more sophisticated a person is, the more likely they are to misinterpret scientific data based on their political ideology. And the more information they are given, the harder they hold on to their beliefs. It should come as no surprise that this holds equally true for folks on both sides of the political spectrum. If you have any interest at all in this topic, I highly recommend you go here and here for the details.

So, problem solved…I’ve answered to my satisfaction how we decide what to believe…now off to begin my next major inquiry, seeking the answer to that age-old question…how do they make paper? I’ll report back…so what do you believe?

8 comments:

  1. We decide with emotions and justify it with reason, which Daniel Kahneman called "cognitive bias."

    Replies
    1. Yes, and I would add that we infrequently use reason to check our decisions, and even when we do, we rarely change them on that basis, even if the data contradicts them...in fact, Kahan elegantly shows the opposite occurs: we tend to harden our beliefs in the face of contrary evidence...funny old world.

  2. I think Thomas Kuhn answered this quite well in his Structure of Scientific Revolutions. We use data and "facts" to support our already internalized model of the world - not the other way around. I am a bit shocked that we still have to debate this. The more any thinking person sees how the world actually behaves, the more obvious it is that we are all acting out our internalized paradigms. Very little objective thinking going on here!! The human condition is a bit of a bummer!!

    Replies
    1. You are so right about the fact that this all seems so little understood and under-appreciated....but we're in good company...I've misplaced the exact quote, but the other day Paul Krugman (another Nobel laureate whose name starts with a K...hmmmmm) of the NYT wrote something to the effect of, "...I may be coming to this a bit late, but it seems that evidence does not change people's opinions, it hardens them..." So even the best and the brightest of us seem to miss this fundamental observation of our human condition, and frankly most often in the observation of our own personal behavior. Like I said...funny old world.

  3. And rational (evidence-based) beliefs are made even more difficult by the way we create our identities. We are our beliefs -- both individually and collectively. Hence it is no easy matter to change our beliefs, as it involves changing what we believe to be ourselves. How many of us are willing to hold beliefs that put us at odds with our political affiliates? Beliefs are ideas we are attached to... (beloved ideas, literally). So evidence-based decision-making is even more unlikely...

    Replies
    1. So true...our beliefs form our identity and affect our social world. Very tough to challenge. But a friend of mine had a good one that I think may have some promise in terms of how we frame the power of beliefs on our identity. He worked in a science center in the south, where many folks hold to the belief that the earth was created by god six thousand years ago. At the Center he would put a fossil in their hand and ask, "Now, if it were true that this fossil is actually 10 million years old, you wouldn't allow that fact to shake your faith in god, would you?" Of course everyone said no...it's a great way to re-frame, showing how faith can at once be more powerful, and yet at the same time coexist with science....I kinda liked it...

  4. It is certainly not news that our belief systems bias our interpretation of events-- that has been well understood both scientifically and even generally for a long time. But the study Alex cites (regarding the impact of party affiliation on the interpretation of a relatively simple statistical problem) IS news in two ways. First, it is striking that the bias extends even to basic math and the straightforward reading of simple ratios. Second, and even more interesting, is that the bias effect INCREASES with numeric intelligence.

    This has profound implications for many things, of course, most notably strategies for effecting change. It is apparent that attempting to persuade with "rational" argument alone is futile in many-- perhaps most-- circumstances. I have been thinking about a related problem a great deal lately-- that being why human societies universally seem to divide into conflicting sub-societies (usually two predominant ones and often based in religion). Whether it be Republicans and Democrats in the US, Protestants and Catholics in Ireland, Jews and Muslims in the Middle East, or any of a thousand other examples, it appears to be a universal phenomenon.

    My "working" theory is that this is the result of two fundamental human psychological needs. The first is to understand the world around us. (Here, I believe, lies the origin of religion-- it explains mysteries.) The second is to feel good about ourselves. Good is a relative term; one can only be good relative to someone else who is bad. Stated differently, if everyone is "good", then "good" becomes meaningless and the psychological need is unfulfilled. To meet these two basic needs, I believe we adopt a belief system (often religious) to explain the world and then adopt the view that those who share that belief system are superior to those who have a different one.

    We then interpret the world-- including apparently even basic statistics, let alone more ambiguous things-- to support our belief system and our allegiance to our "tribe."

    If any of this is true, then it suggests that effecting change must combine rational argument (which helps explain the world) with addressing these basic psychological needs in those whose views we seek to change. We need to learn to present the arguments in ways that, whenever possible, do not challenge their essential belief systems nor seek to "defeat" their "tribe." One example of this approach is Bill Clinton's speech to the Democratic Convention in 2004, in which he addressed issues by first emphasizing for each issue that Republicans and Democrats seek to achieve the same outcomes (e.g. public safety) before discussing how they differ on tactics for getting there (e.g. more police and gun control versus cutting government and more gun rights).

    Another example is the discussion above regarding the 10 million year old fossil not shaking faith in god.

    In any event, I found the study fascinating and not merely proof of the obvious.

    Replies
    1. Thanks for your thoughtful comment, Anon...my whole journey has been to seek ways to make change in beliefs possible...and as you say, Kahan's experiment is jarring, to say the least. I think at least one of the mistakes we make is to hope that once the facts are laid out, the person will have a eureka moment of their own and immediately see the situation as we would like them to...the mistake is that often these things take time...time for others in their affinity group to change their position so that it is less threatening for them to do so...a great example of this is the environmental community re-examining their opposition to nuclear power in the face of climate change. As with all change it is slow, but it is clearly happening. Funny old world.
