We humans like to believe that we are rational beings, fully in control of our decisions and actions. But psychologists and behavioral economists disagree. They argue that we are — by nature — irrational, but that we are irrational in predictable ways.
One of the ways that we are irrational is by being susceptible to a wide range of biases in our thinking. These are known as “cognitive biases.” Now here’s the problem: everyone has a number of these biases operating at any given time, and we are all generally unaware of how our biases rule our thoughts and actions.
That’s a big issue because we feel like we are making entirely rational, well-reasoned decisions; but the reality is that we are making decisions based almost entirely on instinct and emotion… and then trying to back up those decisions with logic after the fact. It’s really only one step removed from a parent saying the classic line: “Because I said so.”
So, how do we begin the process of counteracting these biases? To quote the old G.I. Joe series (and associated merchandising) from the 1980s, “Knowing is Half the Battle.”
For an idea of just how many biases have been catalogued, have a look at this great graphic from Jeff Desjardins (actually—read the whole article. It’s great).
Don’t let the sheer number of distinct biases overwhelm you. Cognitive biases are easy to understand once you’ve seen a few good examples. So, to help you prepare to battle these biases within your own mind and to help others avoid them, let’s take a look at just a few examples, along with some tips for dealing with them. To do so, I’m going to show you how lazy I am by taking the opportunity to quote myself. :) The following excerpt on cognitive bias is taken from chapter 4 of my book, Transformational Security Awareness: What Neuroscientists, Storytellers, and Marketers Can Teach Us About Driving Secure Behaviors. Enjoy:
-------- Begin Excerpt --------
MENTAL NOTE: COGNITIVE BIAS
Our minds are a bit quirky. We aren’t as rational as we think we are. Rather, our System 1 thinking is easily led astray by heuristics and several of what psychologists and behavioral economists call cognitive biases. Framing and priming effects, as well as the mere exposure effect discussed in Chapter 3, are just a few examples of the over 100 cognitive biases that have been cataloged.
The following are a few other interesting cognitive biases:
- Confirmation bias: The tendency to search for, interpret, focus on, and remember information in a way that confirms one’s preconceptions.
- Dunning–Kruger effect: The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.
- IKEA effect: The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result.
- Restraint bias: Overestimating one’s ability to show restraint in the face of temptation.
- Sunk cost fallacy: The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. This is also known as Irrational Escalation.
With more than 100 being cataloged, there are more cognitive biases than you can easily remember. However, these biases can be grouped into a few different categories.
- Decision-making and behavioral biases
- Biases in probability and belief
- Social biases
- Memory errors
If it feels like your executive team just doesn’t “get it” or your employees continually exhibit viewpoints and priorities that feel alien to you, then you are likely dealing with one or more cognitive biases. And the reality is that these biases exist not only in the person (or people) you are communicating with but in you as well.
It’s important to acknowledge and account for cognitive biases in everyday life and in our awareness programs. These biases also factor into some of the struggles that a program manager will face whenever developing a business case.
In your program, consider these two things:
- Some information might be more readily accepted when it plays into an existing cognitive bias within your audience.
- There will be times when you need to find ways to help your audience debias (the process of actively working to avoid falling into cognitive biases) so that they can think more clearly about your information.
-------- End Excerpt --------
Want to know more about cognitive biases?
There is a ton of research and resources available about cognitive biases. As a starting point, I’d like to refer you to an article that Buster Benson created in 2016, the “Cognitive Bias Cheat Sheet.” This article has since received so much attention that he was able to grow the content into a full book about why people have such polarized opinions and how to have more productive conversations. It’s titled Why Are We Yelling? — The Art of Productive Disagreement.