The human brain is capable of 10^16 processes per second,
which makes it far more powerful than any computer currently in
existence. But that doesn't mean our brains don't have major
limitations. The lowly calculator can do math thousands of times better
than we can, and our memories are often less than useless — plus, we're
subject to cognitive biases, those annoying glitches in our thinking
that cause us to make questionable decisions and reach erroneous
conclusions. Here are a dozen of the most common and pernicious
cognitive biases that you need to know about.
Before
we start, it's important to distinguish between cognitive biases and
logical fallacies. A logical fallacy is an error in logical
argumentation (e.g. ad hominem attacks, slippery slopes, circular
arguments, appeal to force, etc.). A cognitive bias, on the other hand,
is a genuine deficiency or limitation in our thinking — a flaw in
judgment that arises from errors of memory, social attribution, and
miscalculations (such as statistical errors or a false sense of
probability).
Some
social psychologists believe our cognitive biases help us process
information more efficiently, especially in dangerous situations. Still,
they lead us to make grave mistakes. We may be prone to such errors in
judgment, but at least we can be aware of them. Here are some important
ones to keep in mind.
Confirmation Bias
We
love to agree with people who agree with us. It's why we only visit
websites that express our political opinions, and why we mostly hang
around people who hold similar views and tastes. We tend to be put off
by individuals, groups, and news sources that make us feel uncomfortable
or insecure about our views — the discomfort the psychologist Leon
Festinger called cognitive dissonance.
It's this preferential mode of behavior that leads to the confirmation
bias — the often unconscious act of referencing only those perspectives
that fuel our pre-existing views, while at the same time ignoring or
dismissing opinions — no matter how valid — that threaten our world
view. And paradoxically, the internet has only made this tendency even
worse.
Ingroup Bias
Somewhat
similar to the confirmation bias is the ingroup bias, a manifestation
of our innate tribalistic tendencies. And strangely, much of this effect
may have to do with oxytocin — the so-called "love molecule." This
neurotransmitter, while
helping us to forge tighter bonds with people in our ingroup, performs
the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others.
Ultimately, the ingroup bias causes us to overestimate the abilities
and value of our immediate group at the expense of people we don't
really know.
Gambler's Fallacy
It's
called a fallacy, but it's more a glitch in our thinking. We tend to
put a tremendous amount of weight on previous events, believing that
they'll somehow influence future outcomes. The classic example is
coin-tossing. After flipping heads, say, five consecutive times, our
inclination is to predict an increased likelihood that the next coin
toss will be tails — that the odds must surely now favor tails. But in
reality, the odds are still 50/50. As statisticians say,
the outcomes in different tosses are statistically independent and the probability of any outcome is still 50%.
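If you want to see that independence in action, here's a quick back-of-the-envelope simulation — a rough sketch in Python, not part of any study cited here — that flips a fair coin a million times and looks only at the flips that immediately follow a run of five heads. The next flip still comes up heads about half the time.

```python
import random

# Illustrative sketch: simulate a fair coin and check the flip that
# immediately follows every run of five consecutive heads. Because the
# tosses are independent, that next flip is still heads ~50% of the time.
random.seed(42)
flips = [random.choice("HT") for _ in range(1_000_000)]

after_streak = [
    flips[i + 5]                      # the flip right after a 5-heads run
    for i in range(len(flips) - 5)
    if flips[i:i + 5] == ["H"] * 5
]

heads_rate = after_streak.count("H") / len(after_streak)
print(f"Flips following five heads in a row: {len(after_streak)}")
print(f"Share that came up heads anyway:     {heads_rate:.3f}")  # ~0.500
```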
Relatedly, there's also the positive expectation bias — which often fuels gambling addictions. It's the sense that our luck has to
eventually change and that good fortune is on the way. It also
contributes to the "hot hand" misconception. It's also the feeling we
get when we start a new relationship that leads us to believe it will be
better than the last one.
Post-Purchase Rationalization
Remember
that time you bought something totally unnecessary, faulty, or overly
expensive, and then you rationalized the purchase to such an extent that
you convinced yourself it was a great idea all along? Yeah, that's
post-purchase rationalization in action — a kind of built-in mechanism
that makes us feel better after we make crappy decisions, especially at
the cash register. Also known as Buyer's Stockholm Syndrome, it's a way
of subconsciously justifying our purchases — especially expensive ones.
Social psychologists say it stems from the principle of commitment, our
psychological desire to stay consistent and avoid a state of cognitive dissonance.
Neglecting Probability
Very
few of us have a problem getting into a car and going for a drive, but
many of us experience great trepidation about stepping inside an
airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly
unnatural and seemingly hazardous activity. Yet virtually all of us
know and acknowledge the fact that the probability of dying in an auto
accident is significantly greater than that of getting killed in a plane
crash — but our brains refuse to act on this crystal-clear logic
(statistically, we have a 1 in 84 chance of dying in a vehicular
accident, as compared to a 1 in 5,000 chance of dying in a plane crash
[other sources indicate odds as high as 1 in 20,000]).
It's the same phenomenon that makes us worry about getting killed in an
act of terrorism as opposed to something far more probable, like
falling down the stairs or accidental poisoning.
This is what the legal scholar Cass Sunstein calls probability neglect — our inability to properly grasp peril and risk —
which often leads us to overstate the risks of relatively harmless
activities while underrating the more dangerous ones.
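Taking the odds quoted above at face value, a quick calculation makes the gap concrete — a rough illustrative sketch in Python; the figures are the ones cited here, and real risk estimates vary widely.

```python
# Rough comparison using the odds quoted above (illustrative only).
car_odds   = 1 / 84      # quoted lifetime odds of dying in a car accident
plane_odds = 1 / 5_000   # quoted lifetime odds of dying in a plane crash

print(f"Car accident: {car_odds:.3%}")    # ~1.190%
print(f"Plane crash:  {plane_odds:.3%}")  # ~0.020%
print(f"Driving is roughly {car_odds / plane_odds:.0f}x riskier by these "
      "numbers, yet flying is what scares us.")
```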
Observational Selection Bias
This
is the effect of suddenly noticing things we didn't notice much
before — and wrongly assuming that their frequency has increased. A
perfect example is what happens after we buy a new car and we
inexplicably start to see the same car virtually
everywhere. A similar effect happens to pregnant women who suddenly
notice a lot of other pregnant women around them. Or it could be a
unique number or song. It's not that these things are appearing more
frequently, it's that we've (for whatever reason) selected the item in
our mind, and in turn, are noticing it more often. Trouble is, most
people don't recognize this as a selection bias, and actually believe
these items or events are happening with increased frequency — which can
be a very disconcerting feeling. It's also a cognitive bias that
contributes to the feeling that the appearance of certain things or
events couldn't possibly be a coincidence (even though it is).
Status-Quo Bias
We
humans tend to be apprehensive of change, which often leads us to make
choices that guarantee that things remain the same, or change as little
as possible. Needless to say, this has ramifications in everything from
politics to economics. We like to stick to our routines, political
parties, and our favorite meals at restaurants. Part of the
perniciousness of this bias is the unwarranted assumption that another
choice will be inferior or make things worse. The status-quo bias can be
summed up with the saying, "If it ain't broke, don't fix it" — an adage
that fuels our conservative tendencies. And in fact, some commentators
say this is why the U.S. hasn't been able to enact universal health care, despite the fact that most individuals support the idea of reform.
Negativity Bias
People
tend to pay more attention to bad news — and it's not just because
we're morbid. Social scientists theorize that it's on account of our
selective attention and that, given the choice, we perceive negative
news as being more important or profound. We also tend to give more
credibility to bad news, perhaps because we're suspicious (or bored) of
proclamations to the contrary. More evolutionarily, heeding bad news may
be more adaptive than ignoring good news (e.g. "saber tooth tigers
suck" vs. "this berry tastes good"). Today, we run the risk of dwelling
on negativity at the expense of genuinely good news. Steven Pinker, in
his book The Better Angels of Our Nature: Why Violence Has Declined,
argues that crime, violence, war, and other injustices are steadily
declining, yet most people would argue that things are getting worse —
a perfect example of the negativity bias at work.
Bandwagon Effect
Though
we're often unconscious of it, we love to go with the flow of the
crowd. When the masses start to pick a winner or a favorite, that's when
our individualized brains start to shut down and enter into a kind of
"groupthink" or hivemind mentality. But it doesn't have to be a large
crowd or the whims of an entire nation; it can include small groups,
like a family or even a small group of office co-workers. The bandwagon
effect is what often causes behaviors, social norms, and memes to
propagate among groups of individuals — regardless of the evidence or
motives in support. This is why opinion polls are often maligned, as
they can steer the perspectives of individuals accordingly. Much of this
bias has to do with our built-in desire to fit in and conform, as
famously demonstrated by the Asch Conformity Experiments.
Projection Bias
As
individuals trapped inside our own minds 24/7, it's often difficult for
us to project outside the bounds of our own consciousness and
preferences. We tend to assume that most people think just like us —
though there may be no justification for it. This cognitive shortcoming
often leads to a related effect known as the false consensus bias, where
we tend to believe that people not only think like us, but that they
also agree with us. It's a bias where we overestimate how typical and
normal we are, and assume that a consensus exists on matters when there
may be none. Moreover, it can also create the effect where the members
of a radical or fringe group assume that more people on the outside
agree with them than is actually the case. It's also behind the exaggerated confidence we have when predicting the winner of an election or sports match.
The Current Moment Bias
We
humans have a really hard time imagining ourselves in the future and
altering our behaviors and expectations accordingly. Most of us would
rather experience pleasure in the current moment, while leaving the pain
for later. This is a bias of particular concern to economists
(i.e. our unwillingness to curb spending and save money) and health
practitioners. Indeed, a 1998 study showed that,
when making food choices for the coming week, 74% of participants chose
fruit. But when the food choice was for the current day, 70% chose
chocolate.
Anchoring Effect
Also
known as the relativity trap, this is the tendency we have to compare
and contrast only a limited set of items. It's called the anchoring effect because
we tend to fixate on a value or number that in turn gets compared to
everything else. The classic example is an item at the store that's on
sale; we tend to see (and value) the difference in price, but not the
overall price itself. This is why some restaurant menus feature very
expensive entrees, while also including more (apparently) reasonably
priced ones. It's also why, when given a choice, we tend to pick the middle option — not too expensive, and not too cheap.