The human brain is an amazing thing, capable of feats that even our most advanced robotics can’t touch. It’s also really freaking bad at perceiving reality. We tend to form a narrative of things the way we want them to be, or in many cases, in the way that best shelters us from mental or physical harm.
Consider our Cro-Magnon ancestors. If, while gathering some nuts and berries, I hear a rustling in the bushes, I have two options to consider: the first is that it was caused by some predator, and I must prepare immediately to fight or flee. The second is that it was caused by something benign like the wind, or a muskrat. Dare I take the chance that it was nothing to be worried about and go on about my business, or do I get the hell out of there? Well, those who got the hell out of there had a much higher rate of survival than those who sat and contemplated or investigated the more likely scenario before deciding. This means the types to play it safe and suspect danger were able to pass along their genes and their tales to their offspring. After many, many generations, precaution becomes a heritable trait.
However, we now live in a period of time where everyday choices don’t hang in the balance of a split-second decision. We have time to observe, question, form a hypothesis, and test it. In other words, we have time to science that shit. Yet our evolutionary, reactionary brain has not yet caught up with objective rationality, and we still cling to many cognitive biases whether we want to or not. So you could say that there are some glitches in the beta, and the result is many prejudiced decisions and reasoning errors.
Cognitive dissonance is the term psychologists use to describe the feeling when a fact and a belief clash against each other and just…rub us the wrong way. It’s dissonant, so it sucks, and we try to avoid it by keeping one or the other in our heads, but not both. The phrase “face the facts” can figuratively be interpreted to mean that facts are something we must force ourselves to let in, even when they’re scary or we don’t want them to be true.
We have many different ways that we avoid cognitive dissonance, mostly by applying filters and biases to a situation based on what will make us feel good about ourselves, and protect us from negative emotions. Here are a few of the most common ones to reflect on:
1.) Confirmation bias- In the age of Google, confirmation bias is so easy. Anyone can think they are competent so long as they’ve found an article or meme that agrees with their preconceived notions. Then if they really want to do it up, they can block anyone who disagrees or questions them. I once chimed in on a forum where a lady was asking for input on using essential oils in CF because she was thinking of adding them to her child’s regimen. Plenty of other people gave pleasant anecdotes and hopeful inspiration. She thanked them for all the great advice. I was the first person to respond who did not say what she wanted to hear. I asked if she’d talked to her child’s doctor and how she knew her source (a YouTube video) was reliable. I then put forward a few meta-analyses for her to peruse. This did not go over well. She chewed me out for being negative and rude (though I was not), and eventually deleted the thread. This woman was engaging in confirmation bias: warmly accepting the words she wanted to hear, and bitterly ignoring what she didn’t. She was not looking for input. She was looking for confirmation.
Expand upon this bias and we start to live in an echo chamber. We like to agree with those who agree with us, so we keep like-minded company. We may unfriend the people who speak from outside those beliefs, or stick to only watching a certain news medium. We think, “I don’t need that kind of negativity in my life. I deserve to live stress-free and away from the people who give me the yucks.” This is what is known as an “ingroup bias,” and it’s an extension of the innate tribalism we are wired for. (There’s a really interesting study concerning the neurotransmitter oxytocin’s role in ingroup bias, for anyone who wants to dig deeper.)
2.) Probability bias- also known as the gambler’s fallacy. In a game of chance, we like to think we have some measure of fortune that will yield a favorable result. We’re also notoriously bad at assessing risk. Take the classic example of a roulette wheel: nearly a 50% chance of red, nearly a 50% chance of black (the green zero pockets shave a sliver off both). Every time. But say, after observing 10 spins with nothing but red as a result, we decide to place our bet. Either we will think, “There’s something about red, it hasn’t failed yet, I’m putting my money on red.” Or we will think, “10 reds in a row? Surely, black is due to make an appearance this time.” Have the odds changed? Of course not, they’re the same as they always were, but our perception of the odds has changed.
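If you’d rather science that shit than take my word for it, here’s a quick sanity check sketched in Python. It simulates a European-style wheel (18 red, 18 black, 1 green zero), finds every moment that follows 10 straight reds, and checks how often red comes up next. The spin count and random seed are arbitrary choices for illustration; the point is that the conditional frequency stays right around the unconditional 18/37.

```python
import random

# A European roulette wheel: 18 red, 18 black, 1 green zero pocket.
# Each spin is independent -- a streak of reds tells us nothing
# about the next spin. We check that claim empirically.
POCKETS = ["red"] * 18 + ["black"] * 18 + ["green"]

random.seed(42)  # arbitrary seed, just for reproducibility

streak = 0          # current run of consecutive reds
after_streak = []   # outcomes observed right after 10+ straight reds
for _ in range(2_000_000):
    outcome = random.choice(POCKETS)
    if streak >= 10:
        after_streak.append(outcome)
    streak = streak + 1 if outcome == "red" else 0

reds_after = after_streak.count("red") / len(after_streak)
print(f"P(red | 10 reds in a row) ~= {reds_after:.3f}")  # hovers near 18/37, about 0.486
```

Neither superstition survives the simulation: red is not “on a hot streak,” and black is not “due.” The wheel has no memory.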
Similarly, we regularly justify risky behavior when it’s something we are accustomed to. For example, statistically speaking, one is more likely to be killed by (or as) a drunk driver than in a terrorist attack (at least in most major countries). Yet, it’s the fear of terrorism that is more likely to alter our behavior, not the fairly routine death toll of intoxicated driving tragedies.
3.) Observation bias- Ever get an idea in your head, and then suddenly you start seeing examples of it everywhere? And you think, “What is going on? Why does this keep happening? It can’t be coincidence. There must be something in the water.” We are pattern-seeking people. We tend to find what we are searching for, even if it’s in a sea of random and unconnected data. This subconscious habit of seeing whatever is already on our minds has led to a lot of fallacious reasoning about the correlation of two unrelated events.
Another related bias is known as the negativity bias, or as I sometimes put it, the “bad news” bias. As the name suggests, humans tend to focus on the bad news. There’s more drama in the negative, and it evokes more emotion, which is then expressed through gossip, extended news coverage, and likes and shares on social media. Because of this, it may seem that a series of tragedies or outrageous rumors is happening more frequently and that everything is going to hell in a handbasket.
4.) Self-serving bias- This is when our thoughts tend to favor our own ego in order to maintain or raise self-esteem. We reject negative criticism and focus on how brilliant we are. We may attribute successes to our hard work and ingenuity, while failures are explained away as others’ incompetence or malevolence. Accordingly, we might rationalize a bad purchase in order to avoid the guilt or embarrassment we feel upon realizing our mistake. We can go to great lengths to convince ourselves that it was a good idea and a necessary expense.
5.) Status-quo bias- The preference that things remain the same as they’ve always been, because if it ain’t broke, don’t fix it. Thing is, a lot of the time it is broken, or could be improved with changes. Many people are resistant to change, especially when it concerns something familiar to them and a belief they’ve held onto for a long time. This is also related to the first-impression bias, which generally states that the first time you encounter a viewpoint, it’s likely to stick, and even when faced with compelling arguments, you are less likely to move from that vantage point, simply because it’s uncomfortable to change your mind.
These are just a few of the many biases our brains perform involuntarily on a daily basis. Nobody is immune to them. We are all guilty, and I’ve been using “we” throughout the post because you do these things, I do these things, your friends do these things, your doctors do these things. We are human. For example, my “post-purchase rationalization” is off the charts! (I have never made a bad financial decision 😉 ). Evolutionarily speaking, cognitive biases have their place, and we are not likely to outgrow them any time soon. But familiarizing yourself with the ones you are most often guilty of can help keep you in check when processing new information. Old habits die hard, but with training and reinforcement exercises, we can start to power up the logical part of our brain (without fear of a dire wolf lunging from the brush if we hesitate).
Haselton, Martie G., and Daniel Nettle. “The Paranoid Optimist: An Integrative Evolutionary Model of Cognitive Biases.” 2005. www.sscnet.ucla.edu/comm/haselton/webdocs/HaseltonNettle.pdf
Wikipedia has a very thorough page dedicated to naming and describing cognitive biases that is well sourced and interesting for anyone wanting to learn more.