Thursday, October 31, 2013

Science… well, sort of. Eureka, I’ve figured it all out

I have to start by thanking you all for your patience. Since I began this blog a couple of years ago, I have been trying to answer what I thought was a straightforward question: How do we decide what to believe? Those of you who stuck with me watched me muddle through various theories and explorations in earlier blog posts, my core idea all along being that if I could somehow show how I came to my beliefs, it might impact the way other people arrive at theirs. …Well, I was wrong… very, very wrong. But the good news, at least as it relates to me and my fragile little ego, is that this is not all on me as an ineffectual blogger, but rather it’s simply a function of how we are wired… of how our brains work.

We think of ourselves as rational actors, using judgment, education and reason to come to our conclusions. But as I learned, this is really not the case. My good friend Anthony Pratkanis calls us “cognitive misers,” meaning that we are cheap in our use of rational reasoning. I think what he’s pointing to is what Daniel Kahneman has brilliantly summarized in his seminal work “Thinking, Fast and Slow.” Kahneman won the Nobel Prize in Economics for his work explaining how individuals evaluate risk and make decisions, often contrary to the evidence and their own self-interest. Complex sciency stuff, but as a psychologist it led him to further explorations of the very question of how our brains work in determining our beliefs.

In his book (and check out the SALT talk here) he makes a compelling argument for how our brain functions determine these outcomes. He describes two modes of thinking: system one and system two. His best example of the difference: system one is what kicks in when we are asked, what is 2 + 2? The answer comes to us almost instantly and without struggle. But when we are asked what is 17 x 28, system two has to come into operation. We pause, our pupils dilate (literally), and we have to expend mental energy to come up with the answer. It is much slower, and it takes work.

System one is where our brain works almost on autopilot, navigating us through the day without conscious effort. It’s largely intuitive and enables us to assimilate vast amounts of inputs and stimuli in a way that we can process to our benefit. It instantly categorizes and associates data into frames we are comfortable with. It is heavily influenced by our previous experience, associations, biases and ideologies, even if we are unaware of them. System two is where we slow down and do the hard work of reasoning. We use it to check and validate our system one conclusions. It takes effort, and we do it relatively rarely, even though we perceive ourselves to be using it much of the time.

And this is not a bad thing…we rely on system one to be effective, to survive, and to process the fire hose of visual, aural, and olfactory stuff that is coming at us non-stop…and generally it works really, really well.  If we tried to apply the slower system two to all these inputs, we’d simply be unable to function.

But wait…if we can understand all this, can’t we simply apply system two analysis to the more complex problems - like how we come to our beliefs -  taking the time to question our assumptions and regularly checking our initial system one conclusions?

Well, no - at least not often, if at all. In fact it appears the opposite is the case: the more aptitude we have for scientific reasoning (for system two), the more likely we are to be swayed by system one processes and biases… and I think it’s been proven, using… science.

Yale Law professor Dan Kahan is, in my estimation, a genius. His latest research is the best-designed demonstration yet of the fallacy that if we just had more information, and were more skilled in our reasoning, we would rationally arrive at the correct, objective answer.

While his work is not at all complicated, I won’t be able to do it justice or fully describe his methodology in the space I allow myself here. But trust me, his results show in very compelling fashion that, contrary to what one might expect, the more sophisticated a person is, the more likely they are to misinterpret scientific data based on their political ideology. And the more information they are given, the harder they hold on to their beliefs. It should come as no surprise that this holds equally true for folks on both sides of the political spectrum. If you have any interest at all in this topic, I highly recommend you go here and here for the details.

So, problem solved… I’ve answered to my satisfaction how we decide what to believe. Now off to begin my next major inquiry, seeking the answer to that age-old question: how do they make paper? I’ll report back… so, what do you believe?