5 types of brain bias that mess with your mind
The Confirmation Bias Effect
We tend to favour evidence that backs up ideas we already believe in.
You know that turbo shout session you had with [insert person] at [insert place] at [insert time], after which you furiously whipped out the internet to find any article that backed you up? Yep, I’ve been there. Opinions make us. Our beliefs, inextricably shaped by those we spend time with, define who we believe we are. They give us meaning, so it’s no wonder we’re so precious about them.
It makes sense that we prefer the company of people whose opinions match our own; although an echo chamber of agreement doesn’t always make for the most stimulating debate, it’s a good source of comfort to find out that your fellow folks have got your back. (Tribalism for sure comes into it, but that’s for another blog.) Birds of a feather flock together, which makes it easy to feel righteous. Even when actively trying to research against belief, we overlook evidence that challenges our opinion in favour of that which backs it up—and that’s Confirmation Bias.
The Illusory Correlation also comes into play here: we perceive relationships between variables where none actually exists, especially when it props up our argument. If you find yourself auto-knee-jerking in the face of confrontation, there’s a chance you’re ignoring the facts. Confirmation Bias leads us to favour data that backs up our prior beliefs, even in the face of contradictory evidence.
An offshoot of this that goes even deeper is the proposition of identity-protective cognition: the idea that we can’t trust our own opinions, because our brains are biased towards protecting them. Think gun control, conspiracy theories, and climate change: compelling data seems only to result in heels, not heads, being driven further into the ground, especially since those topics are so politically charged.
The Barnum Effect
Generalities become prophetic and profound.
An offshoot of Confirmation Bias, named after the infamous hoaxer P. T. Barnum. Here we have something a little more specific: people rate descriptions of their personality as highly accurate when they believe those descriptions were tailored to them personally, even though the “personality traits” are general enough to apply to anyone.
It’s the equivalent of “yeeesss, I’d love an omelette right about now!” (Alright, maybe that was a stretch, but it’s one of the best Simpsons quotes.) You only hear what you want to hear, and you only read what you want to read between the lines.
Take horoscopes: the premonitions wrapped around each piece of “wisdom” aren’t written obliquely by accident. Because people want to believe in the advice they’ve received, they’ll look back through their memories favourably to make it so, only counting the bits where the story fits.
In the same way that our brains are programmed to recognise faces (which is why people might see Jesus on a grilled cheese), we are programmed to find evidence that favours our own beliefs. It’s the definition of wishful thinking, but think about this: even Charlie Brooker won’t lay claim to being a fortune teller. Maybe it was a sixth sense.
The Backfire Effect
Even in the face of hard evidence (sometimes especially so), we won’t change our minds.
Remember identity-protective cognition? This is pretty similar. Given that the press presents most information as a 50/50 argument in an effort to appear objective, we tend to pick the side we already lean towards, just as with Confirmation Bias. And, yes, for a lot of media it’s barely even a facade of objectivity.
This ambiguous way of delivering information means that we’re well trained to think in polar opposites (it’s either right or it’s wrong), which isn’t easy to escape and isn’t much use in finding truth. (Let’s ignore the philosophy of what ‘truth’ really is, for now.) It stands to reason that many debates have grey areas, so misplaced objectivity is worth bearing in mind. TL;DR: Just because a person disagrees with you, don’t assume they’re wrong. Especially when you consider this:
When confronted with solid evidence that counters strongly held beliefs, the overwhelming reaction is to get defensive. This is known as the “backfire effect”, as studied by researchers at Dartmouth, who found that “citizens are likely to resist or reject arguments and evidence contradicting their opinions”, even when the evidence comes from a supposedly omniscient source: a single person laying out the facts.
For example, if you argue against someone who believes that cannabis cures cancer, and offer data to support your case, chances are “big pharma” will get the blame. The classic response of “well, I’ve not read enough about it” (read: “I am not going to change my stance based on what you’ve said”) can be quite a conversation stopper and a means to hold position: the very definition of shutting it down. Sound familiar? You’re probably treading on toes too. That being said, bringing up The Backfire Effect probably won’t be helpful.
The Not Invented Here Syndrome
We tend to be more critical of someone else’s ideas than our own.
This goes beyond the shutdown to a whole new level of “why don’t we just do what you wanna do?”. Even though everyone knows that “two minds are better than one”, having one’s ideas criticised is a hard pill to swallow. Not to get all sonder-philosophical, but it’s easy to see why it feels unreasonable to ignore your own wealth of data, anecdotal or otherwise.
Empathetic or not, the main character syndrome is real. Sure, this one’s more philosophy than syndrome, but we’ve all been there. Let’s say someone rudely stomped over your very legitimate idea to replace currency with interpretive dance styled solely on Wacky Waving Inflatable Arm-Flailing Tubeman—plus they did it with the remnants of slug on their Crocs.
The reaction is pretty automatic: You assume they are entirely incorrect. Whether you consider it an inherent form of tribalism, like being unwilling to adopt a foreign culture, or simply rejection based on a lack of understanding, we’ve all had moments of NIH. “Let’s not reinvent the wheel”, they said.
But radial tyres made it over eventually. Although it’s hard not to hold everyone else’s ideas to a harsher standard than your own, it doesn’t hurt to have a quiet word… with yourself. Humility is rad, and dismissing change just to hold on to your own ideas (or the whole “but we’ve always done it this way” thing) can only get in the way of personal growth. Because, Luddites.
The Gambler’s Fallacy
We misunderstand the maths.
Oh, you thought you had the odds, but it’s a dicey bet. (Ugh, sorry.) We’ve all done it, and it’s on account of our inability to understand the complex maths of chance. We put extra weight on previous outcomes as if they’ll affect future ones because it just feels right. Consider this: you’ve just thrown four heads in a row. It feels as though that means it’s less likely you’ll throw another.
But each throw still has a 50% chance of either outcome: a past coin toss does not affect the future outcome of a fair coin toss. Why? Because the slim chance of five heads in a row only existed before you started tossing: before your five coin tosses, it was a 1/32 probability. Once four heads are already in the bag, each individual time you throw the coin there’s still a 50/50 chance of heads or tails.
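If you’d rather not take the maths on faith, here’s a quick simulation of my own (not from anything above): fake a couple of hundred thousand five-toss sequences and count both numbers. Roughly 1/32 of sequences come up all heads, yet among the sequences that start with four heads, the fifth toss still lands heads about half the time.

```python
import random

# A throwaway simulation: toss a fair coin five times, over and over,
# and count two things.
trials = 200_000
five_heads = 0          # sequences that are five heads in a row
four_heads_starts = 0   # sequences whose first four tosses are heads
four_then_heads = 0     # ...and whose fifth toss is also heads

for _ in range(trials):
    tosses = [random.random() < 0.5 for _ in range(5)]  # True = heads
    if all(tosses):
        five_heads += 1
    if all(tosses[:4]):
        four_heads_starts += 1
        if tosses[4]:
            four_then_heads += 1

# Five heads in a row really is rare: about 1/32 of all sequences.
print(f"P(five heads in a row) ~ {five_heads / trials:.3f} (1/32 = 0.031)")
# But once four heads have already happened, the fifth toss is still ~50/50.
print(f"P(heads on toss 5, given four heads so far) ~ {four_then_heads / four_heads_starts:.3f}")
```

The rarity lives in the whole sequence, not in any single toss; that’s the whole fallacy in two print statements.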
We’re hardwired to see patterns. We find meaning where there is none and use it as evidence for our beliefs. It seems instinctive to assume previous coin tosses must have an effect. Most of us don’t truly understand the concept of randomness; we even demanded that iTunes’ shuffle mode be made less random so that it would feel more random.
True randomness meant that two songs from the same album could well crop up next to each other, just as the same song could be repeated over and over. What we really wanted was to hear each song on a playlist played once, in a completely shuffled order, with maybe the odd repeat here and there to throw us, so it was changed. Spotify had the same problem. Seeing patterns helped us become so ruddy clever, but it also built in a few flaws along the way. TL;DR: Gambling is addictive.
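To see why a genuinely uniform shuffle feels “broken” (again, a rough sketch of my own, not how iTunes or Spotify actually implement anything), shuffle a small three-album playlist a few thousand times and count how often two tracks from the same album land next to each other:

```python
import random

# Hypothetical playlist: three albums of five tracks each.
playlist = [("Album A", i) for i in range(5)] + \
           [("Album B", i) for i in range(5)] + \
           [("Album C", i) for i in range(5)]

def has_adjacent_same_album(order):
    # True if any two neighbouring tracks come from the same album.
    return any(a[0] == b[0] for a, b in zip(order, order[1:]))

trials = 10_000
clumped = sum(
    has_adjacent_same_album(random.sample(playlist, len(playlist)))
    for _ in range(trials)
)

# A truly uniform shuffle puts at least one same-album pair next to each
# other in the vast majority of shuffles, which is what listeners heard
# as "not random".
print(f"Shuffles with same-album neighbours: {clumped / trials:.0%}")
```

A perfectly fair shuffle clumps things together far more often than our pattern-hungry brains expect, which is why the “fix” was to make it less random on purpose.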
Being a human is fascinating.