Why Smart People Get Things Wrong: Cognitive Biases and Logic

 

A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?

 

Have you ever tried to figure out a problem in your head, jumped to a quick conclusion, and then later, after further thought, realised your conclusion was incorrect?

If so, you’re not alone (and if you think you haven’t, you may be lying to yourself!).

For example, you probably thought the answer to that first question was 10 cents, didn’t you? But the answer is actually 5 cents[1].
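If you want to see why, here’s a quick sketch of the algebra in Python (my own illustration, working in cents to keep the arithmetic exact):

```python
# Bat-and-ball puzzle, worked in cents to avoid floating-point fuss.
# Let b = ball's price. The bat costs b + 100, and together they cost 110:
#   b + (b + 100) = 110  =>  2b = 10  =>  b = 5.
b = (110 - 100) // 2
bat = b + 100
print(b)        # 5 (cents)
print(b + bat)  # 110 -- the prices really do add up to $1.10

# The intuitive answer, 10 cents, fails the check:
print(10 + (10 + 100))  # 120 cents, not 110
```

The intuitive answer feels right because 10 + 100 = 110, but that forgets the bat must cost a full dollar *more than* the ball.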

One of the classes I’ve been taking for my MA in Apologetics is Logic and Critical Thinking, and it’s inspired me to ask all sorts of questions about the nature of arguments and questions. Of course, when I say arguments here I don’t mean two people yelling about their coffee order, but rather a truth claim that is backed up by evidence.

Logic is integral to helping us understand the world around us. It helps to order our thinking, to be critical in how we evaluate arguments, and to be more intellectually honest with ourselves and others.

So why don’t we use it?

Is it a lack of knowledge about logic as a discipline? Laziness? A cultural landscape where science is emphasised over philosophy? The old joke that philosophy grads tend to end up just asking the question, “Do you want fries with that?”

But I want to explore one reason in particular, as it’s a problem that endangers even the brightest of minds: cognitive biases. We all have them, and they affect everyone from lay people to MIT students.

The riddle above was an example highlighting one of the cognitive biases we are subject to. Let’s try another one, and try to think about the answer quickly.

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

What’s happening here? Why do smart people make mistakes like this?

Our brain is a funny thing. It’s so good at so many different activities. But there’s a problem: pretty much all of humanity is to some extent cognitively lazy. We jump to the quickest solution by taking a mental shortcut, rather than slowly and methodically working through a problem. And it’s not just shortcuts. Our brains also trick us: confirmation bias helps us believe the things we want to be true, and cognitive dissonance pushes us to keep our beliefs consistent. This creates issues when we try to do logic, because we’re very easily influenced by these cognitive biases.

The answer to the lily pad problem, by the way, is 47 days: since the patch doubles every day, it must have covered half the lake exactly one day before it covered the whole thing.
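The same answer falls out of a tiny simulation (again my own sketch, not part of the original puzzle):

```python
# Lily pad puzzle: the patch doubles daily and covers the lake on day 48.
# Walk backwards, halving the coverage, until it's down to half the lake.
coverage = 1.0  # fraction of the lake covered on day 48
day = 48
while coverage > 0.5:
    coverage /= 2
    day -= 1
print(day)  # 47 -- one doubling before full coverage
```

The intuitive answer, 24, comes from the shortcut of halving 48, which would only be right if the patch grew linearly.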

So what are some of these biases?

 

Attribution error

You are an employer, and your employee is late to work one day each week. You think to yourself that the employee is late because they’re an untrustworthy individual. You’ve also arrived at work late in the past, when you had car problems or got stuck in traffic. But you don’t see yourself as an untrustworthy person because of those days; you see them as one-offs. You’ve just committed the fundamental attribution error: you attribute other people’s mistakes to who they are as a person, but you explain your own mistakes by pointing to external factors.

What’s the problem?

By making allowances for ourselves that we don’t make for others, we create a more negative view of the people around us than is necessary. We don’t evaluate situations carefully by considering both internal and external factors; we just jump to the conclusion that other people’s problems are internal.

 

Anchoring effect

Do you think the height of the tallest redwood tree in the world is taller than 85 feet? How tall do you think it is?

STOP.

I know you’ve been tricked once before, but that’s the whole purpose of this article. Go back and answer the question.

Because I gave you an anchor of 85 feet, your guess was far more likely to land near that anchor. Other people who were asked this question averaged 117 feet in their answers. But when they were asked whether the tree was taller than 1,000 feet, their answers averaged about 820 feet – roughly 7x higher! (In case you wanted to know, the answer is 379.7 feet.)

Basically, what’s happening here is that you’re introduced to a stimulus, and when your brain can’t work out the answer on its own, it latches onto that recent stimulus and adjusts from it – usually not far enough.

 

Cognitive dissonance

For this one, I’ll use an example very close to my heart: coffee. Say I purchase a coffee for $6 (the average cost is about $3), and all of my friends have told me this particular coffee shop has the best coffee in the world. The coffee ends up being terrible, but I try to justify the purchase because I’ve spent more than usual on it (plus I might not want my friends to think I have poor taste). I reduce the dissonance by rationalising: I change my mind about the quality of the coffee to fit my existing belief that it was worth the $6.

 

Bias blind spot

You may be starting to think about your experiences and life at this point, and starting to recognise situations where other people have fallen victim to these cognitive biases.

If you did, you probably were just subject to another cognitive bias – our tendency to see bias in others but not recognise our own[2].

 

Confirmation bias

I was going to give you an example of confirmation bias, but sometimes a comic is more fun.

[Comic illustrating confirmation bias]

 

Gambler’s fallacy

Someone flips a coin 49 times, and each time they’ve flipped the coin it has come up heads. The coin is normal, of average weight, width, and balance. What is the probability that the next coin flip will be tails?

The gambler’s fallacy is the belief that for a random event, runs of a particular outcome will be balanced by a tendency for the opposite outcome[3]. Just because you’ve flipped a coin and gotten heads a lot doesn’t mean the next flip will be tails. The probability remains the same – 50%. If you’re ever going to spend any time at all in a casino you need to know this one.
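A quick simulation makes the independence point concrete (my own sketch: forty-nine heads in a row is far too rare to simulate directly, so it conditions on runs of five heads instead):

```python
import random

# Gambler's-fallacy sketch: each fair-coin flip is independent, so the
# chance of tails after a run of heads is still 50%. We look at every
# point where the previous five flips were all heads, and count how
# often the next flip comes up tails.
random.seed(0)
flips = [random.choice("HT") for _ in range(1_000_000)]

tails_after_run = total_runs = 0
for i in range(5, len(flips)):
    if flips[i - 5:i] == ["H"] * 5:   # previous five flips were heads
        total_runs += 1
        tails_after_run += flips[i] == "T"

print(round(tails_after_run / total_runs, 2))  # ~0.5
```

The streak has no memory: the proportion of tails after a run of heads hovers around one half, exactly as it does after any other sequence.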

 

A brief excursus:

One of the things I’ve sensed as I’ve read science, logic, and philosophy over the past few years is that the human being is often portrayed as a coldly rational, logical creature who can progress and improve their lot by embracing reason and science. Science will lead to the world improving, religion will die out, and society will reach enlightenment. Of course, I’m generalising, but I’ve seen the attitude enough that I think this summary is fair. The problem, of course, is that it seems to fly in the face of a decent amount of empirical evidence (ironically, the very standard this view appeals to). Religion is increasing, not decreasing, and humans aren’t cold rational beings but complex, emotional, and sometimes irrational creatures, as some of these studies demonstrate.

 

Lessons to be learnt:

What’s the point of all this? Why does it matter if we’re irrational – we can’t remove our brains and do surgery to remove our biases, so why bother discussing this? I think there are a number of important lessons we can reflect on.

Despite all the above cognitive biases, there is hope! In one study, participants who were prompted to reflect on similar problems were able to overcome some of their biases and succeed at problems like the lily pad puzzle above[4]. Sometimes, being careful and systematic can help minimise the biases of our brain.

Learning about the occasional irrationality of the human mind has given me much more grace and patience for both myself and others. It’s helped me look more closely at myself, and be less critical of others.

The most important lessons, however, are for how we reason. We can sometimes be so intolerant towards other ideas and other arguments, especially if they oppose what we already believe. But how much of that is due to our cognitive biases? We need to treat other people’s ideas with the same respect we give our own, and not privilege our ideas just because we hold them. Intellectual fair-mindedness is one of the more important virtues, and it is vital for clear communication and respect between opposing parties.

Logic is a precise, technical field of philosophy, careful and ordered precisely to avoid these kinds of errors. And yet the brain is still involved, and we all still make mistakes. When we do logic we need to remember how easily our brains mislead us: we want to believe certain conclusions are true, we do our best to reduce cognitive dissonance, we anchor to certain ideas and don’t adjust accurately, and we are far more charitable to ourselves than to others. Being aware of our cognitive biases is vital to being a logician, and will make us more fair-minded, charitable, and intellectually honest – and may help us avoid making significant mistakes.

 

 

References

[1] http://www.businessinsider.com/question-that-harvard-students-get-wrong-2012-12

[2] West, R. F., Meserve, R. J., & Stanovich, K. E. (2012). Cognitive sophistication does not attenuate the bias blind spot. Journal of personality and social psychology, 103(3), 506.

[3] Laplace, 1796

[4] Białek, M. (2016). What Color are the Lilies? Forced Reflection Boosts Performance in the Cognitive Reflection Test. (August 30, 2016).
