58 Cognitive Biases That Are Screwing Up Everything You Do

We like to think we’re rational human beings.

In reality, we are prone to hundreds of documented biases that cause us to think and act irrationally. Even believing we’re rational despite evidence of irrationality in others is itself a bias, known as the bias blind spot.

The study of how often human beings do irrational things was enough for psychologist Daniel Kahneman to win the Nobel Prize in Economics, and it opened the rapidly expanding field of behavioral economics. Similar insights are also reshaping everything from marketing to criminology.

Hoping to clue you — and ourselves — into the biases that frame our decisions, we’ve collected a long list of the most notable ones.

This is an update of a previously published article, with additional contributions by Drake Baer and Gus Lubin.

The affect heuristic describes how humans sometimes make decisions based on emotion.

The psychologist Paul Slovic coined this term to describe the way people let their feelings color their beliefs about the world. For example, your political affiliation often determines which arguments you find persuasive.

Our emotions also affect the way we perceive the risks and benefits of different activities. For example, people tend to dread developing cancer, so they see activities related to cancer as much more dangerous than those linked to less dreaded forms of death, illness, and trauma, such as accidents.

Anchoring bias describes how people rely too heavily on the first piece of information they hear when making decisions.

People are over-reliant on the first piece of information they hear.

In a wage negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person’s mind. Any counteroffer will naturally react to, or be anchored by, that opening offer.

“Most people come with the very strong belief they should never make an opening offer,” said Leigh Thompson, a professor at Northwestern University’s Kellogg School of Management. “Our research and lots of corroborating research shows that’s wholly backwards. The guy or gal who makes a first offer is better off.”

Availability heuristic describes a shortcut where people make decisions based on information that’s easier to remember .

In one experiment, a professor asked students to list either two or 10 ways to improve his class. Students who had to come up with 10 ways gave the class much higher ratings, likely because they had a harder time thinking of what was wrong with the class.

This phenomenon could easily apply in the case of job interviews. If you have a hard time recalling what a candidate did wrong during an interview, you’ll likely rate him higher than if you can recall those things easily.

The bandwagon effect describes when people do something simply because others are also doing it.

The probability of one person adopting a belief increases based on the number of people who already hold that belief. This is a powerful form of groupthink — and it’s a reason meetings are often so unproductive.

The bias blind spot describes how individuals can see bias in others, but struggle to see their own biases.

Failing to recognize your cognitive biases is a bias in itself.

Notably, Princeton psychologist Emily Pronin has found that “individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves.”

Choice-supportive bias describes the tendency to feel positive about the things we choose, even when they are flawed.

When you choose something, you tend to feel positive about it, even if the choice has flaws. You think that your dog is awesome — even if it bites people every once in a while — and that other dogs are stupid, since they’re not yours.

The clustering illusion happens when we see trends in random events that happen close together.

This is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette wheel after a string of reds.
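A quick simulation makes the point concrete: long streaks show up reliably in purely random data, so a run of reds is not evidence of a pattern. (This is an illustrative sketch; the run length of 5 and the sequence length of 100 are arbitrary choices.)

```python
import random

random.seed(42)

def longest_run(flips):
    """Return the length of the longest run of identical outcomes."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

# 1,000 sequences of 100 fair coin flips each (1 = "red", 0 = "black").
runs = [longest_run([random.randint(0, 1) for _ in range(100)])
        for _ in range(1000)]

# Fraction of sequences containing a streak of 5 or more identical flips.
print(sum(r >= 5 for r in runs) / len(runs))
```

In practice, nearly all such sequences contain a streak of five or more, even though every flip is independent.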

Confirmation bias describes the tendency to only listen to information that confirms our preconceptions .

We tend to listen only to information that confirms our preconceptions. Once you’ve formed an initial opinion about a person, it’s hard to change your mind.

For example, researchers had participants watch a video of a student taking an academic test. Some participants were told that the student came from a high socioeconomic background; others were told the student came from a low socioeconomic background. Those in the first condition believed the student’s performance was above grade level, while those in the second condition believed it was below.

If you know some information about a job candidate’s background, you might be inclined to use that information to make false judgments about his or her ability.

Conformity describes how people tend to behave similarly to other people .

This is the tendency of people to conform to other people. It is so powerful that it can lead people to do ridiculous things, as shown by the following experiment by Solomon Asch.

Ask one subject and several fake subjects (who are actually working with the experimenter) which of lines B, C, D, and E is the same length as A. If all of the fake subjects say that D is the same length as A, the real subject will agree with this objectively false answer a shocking three-quarters of the time.

“That we have found the tendency to conformity in our society so strong that reasonably intelligent and well-meaning young people are willing to call white black is a matter of concern,” Asch wrote. “It raises questions about our ways of education and about the values that guide our conduct.”

Conservatism bias occurs when people believe prior evidence more than new evidence.

Conservatism bias is where people believe prior evidence more than new evidence or information that has emerged. People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.

Curse of knowledge means that when people know something, it’s hard to imagine not knowing it .

People who are well-informed find it hard to imagine not knowing what they know. For instance, in the TV show “The Big Bang Theory,” it’s difficult for scientist Sheldon Cooper to understand his waitress neighbor Penny.

The decoy effect is a phenomenon in marketing where customers change their preference between two choices after being presented with a third choice.

In his TED Talk, behavioral economist Dan Ariely explains the “decoy effect” using an old Economist ad as an example.

The ad featured three subscription levels: $59 for online only, $159 for print only, and $159 for online and print. Ariely figured out that the option to pay $159 for print only exists so that it makes the option to pay $159 for online and print look more tempting than it would if it were just paired with the $59 option.

The denomination effect is when people are less likely to spend large bills than their equivalent value in small bills or coins.

The phenomenon is typically seen with currency.

Duration neglect occurs when the duration of an event doesn’t factor enough into the way we consider it .

For instance, we remember momentary pain just as strongly as long-term pain.

Kahneman and colleagues tracked patients’ pain during colonoscopies (they used to be more uncomfortable) and found that the end of the procedure largely determined patients’ evaluations of the entire experience. One set of patients underwent a shorter procedure in which the end was relatively painful. The other set of patients underwent a longer procedure in which the end was less painful.

Results showed that the second set of patients (who had the longer colonoscopy) rated the procedure as less painful overall.

Empathy gap occurs when people in one state of mind fail to understand people in another state of mind .

If you are happy, you can’t imagine why people would be unhappy. When you are not sexually aroused, you can’t understand how you act when you are sexually aroused.

The frequency illusion occurs when a word, name, or thing you just learned about suddenly seems to appear everywhere.

Now that you know what that SAT word means, you see it in so many places!

The fundamental attribution error is where you attribute a person’s behavior to an intrinsic quality of her identity rather than the situation she’s in.

For instance, you might think your colleague is an angry person, when she is really just upset because she stubbed her toe.

The Galatea effect occurs when people succeed — or underperform — because they think they should.

Call it a self-fulfilling prophecy. For example, in schools it describes how students who are expected to succeed tend to excel and students who are expected to fail tend to do poorly.

The halo effect is when we take one positive attribute of someone and associate it with everything else about that person.

It helps explain why we often presume highly attractive individuals are also good people, why they tend to get hired more easily, and why they earn more money.

The hard-easy effect occurs when individuals underestimate their ability to perform easy tasks, yet overestimate their ability to perform more difficult ones.

In other words, people tend to be overconfident on hard problems and not confident enough on easy ones.

Herding occurs when individuals mirror the sometimes irrational actions of a larger group.

People tend to flock together, especially in difficult or uncertain times .

Hindsight bias is when people claim to have predicted an outcome that was impossible to predict at the time .

Of course Apple and Google would become the two most important companies in telephones — but tell that to Nokia, circa 2003.

One classic experiment on hindsight bias took place in the 1970s, when President Richard Nixon was about to depart for trips to China and the Soviet Union. Researchers asked participants to predict various outcomes. After the trips, researchers asked participants to recall the likelihoods they had initially assigned to each outcome.

Results showed that participants misremembered having rated events as unlikely if they had not come to pass, and as likely if they had occurred.

Hyperbolic discounting happens when people opt for a smaller reward sooner rather than a greater reward later.

Hyperbolic discounting is the tendency for people to want an immediate payoff rather than a greater gain later on.
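One common way to model this is the hyperbolic discount function V = A / (1 + kD), where A is the reward, D is the delay, and k measures impatience. The sketch below (with an arbitrary, illustrative k of 0.05 per day) shows the characteristic preference reversal: the smaller-sooner reward wins when it is immediate, but the larger-later reward wins when both are far off.

```python
def hyperbolic_value(amount, delay_days, k=0.05):
    """Perceived present value under hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1 + k * delay_days)

# $100 now vs. $110 in a week: the immediate reward feels bigger...
print(hyperbolic_value(100, 0) > hyperbolic_value(110, 7))      # True

# ...but push the same $10 gap a year out, and the preference flips.
print(hyperbolic_value(100, 365) < hyperbolic_value(110, 372))  # True
```

An exponential discounter would never reverse like this, because the ratio between the two values stays constant as both delays shift; the reversal is the signature of hyperbolic discounting.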

The ideomotor effect occurs when the body reacts to ideas alone.

An idea can cause you to have an unconscious physical reaction, like a sad thought that makes your eyes tear up. It’s also how Ouija boards seem to have minds of their own.

Illusion of control is when people overestimate how much control they have over certain situations .

Illusion of control is the tendency for people to overrate their ability to control events, like when a sports fan believes his thoughts or actions had an effect on the game.

Information bias is the tendency to seek information when it does not affect action .

More information is not always better. Indeed, with less information, people can often make more accurate predictions.

In one study, people who knew the names of basketball teams as well as their performance records made less accurate predictions about the outcome of NBA games than people who only knew the teams’ performance records. However, most people believed that knowing the team names was helpful in making their predictions.

In-group bias is when we view people in our group differently from how we view people in another group.

This bias helps illuminate the origins of racism and discrimination.

Unfortunately, researchers say we aren’t always aware of our preference for people in our social group.

Irrational escalation is when people make irrational decisions based on past rational decisions.

It may happen in an auction, when a bidding war spurs two bidders to offer more than they would otherwise be willing to pay.

Negativity bias is the tendency to put more emphasis on negative experiences than on positive ones.

People with this bias feel that “bad is stronger than good” and will perceive threats more than opportunities in a given situation.

Psychologists argue it’s an evolutionary adaptation: it’s better to mistake a rock for a bear than a bear for a rock.

In modern times, the negativity bias has meaningful implications for our relationships. John Gottman, a relationship expert, found that a stable relationship requires that good experiences occur at least five times more often than bad experiences.

The observer-expectancy effect is when a researcher’s expectations impact the outcome of an experiment.

A cousin of confirmation bias, here our expectations unconsciously influence how we perceive an outcome. Researchers looking for a certain result in an experiment, for example, may unknowingly manipulate or interpret the results to reveal their expectations.

That’s why the “double-blind” experimental design was created for scientific research.

Omission bias is the tendency to prefer inaction to action, in ourselves and even in politics.

Psychologist Art Markman gave a great example back in 2010:

The omission bias creeps into our judgment calls on domestic debates, work mishaps, and even national policy discussions. In March, President Obama pushed Congress to enact sweeping healthcare reforms. Republicans hope that voters will blame Democrats for any problems that arise after the law is enacted. But since there were problems with healthcare already, can they really expect that future outcomes will be blamed on Democrats, who passed new laws, rather than Republicans, who opposed them? Yes, they can — the omission bias is on their side.

The ostrich effect is the decision to ignore dangerous or negative information by “burying” one’s head in the sand, like an ostrich.

Research suggests that investors check the value of their holdings significantly less often during bad markets.

But there’s an upside to behaving like a big bird, at least for investors. When you have limited knowledge about your holdings, you’re less likely to trade, which generally translates to higher returns in the long run.

Outcome bias refers to judging a decision based on the outcome, rather than on how the decision was made in the moment.

Just because you won a lot in Vegas doesn’t mean gambling your money was a smart decision.

Research illustrates the power of the outcome bias on the way we evaluate decisions.

In one study, students were asked whether a particular city should have paid for a full-time bridge monitor to protect against debris getting caught and blocking the flow of water. Some students saw only the information that was available at the time of the city’s decision; others saw the information that became available after the decision was already made: debris had blocked the river and caused flood damage.

As it turns out, 24% of students in the first group (with limited information) said the city should have paid for the bridge monitor, compared to 56% of students in the second group (with all the information). Hindsight had affected their judgment.

Overconfidence is when some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives .

Perhaps surprisingly, experts are more prone to this bias than laypeople. An expert might make the same inaccurate prediction as someone unfamiliar with the topic — but the expert will probably be convinced that he’s right.

Overoptimism occurs when individuals believe they are less likely to encounter negative events .

When we believe the world is a better place than it is, we aren’t prepared for the hazard and violence we may encounter. The inability to accept the full breadth of human nature leaves us vulnerable.

On the flip side, overoptimism may have some benefits — hopefulness tends to improve physical health and reduce stress. In fact, researchers say we’re basically hardwired to underestimate the probability of negative events — meaning this bias is especially hard to overcome.

Pessimism bias occurs when individuals overestimate how often negative things will happen to them .

This is the opposite of the overoptimism bias. Pessimists overweight the negative consequences of their own and others’ actions.

Those who are depressed are more likely to exhibit the pessimism bias.

The placebo effect is when simply believing that something will have a certain effect on you causes it to have that effect.

This is a basic principle of stock market cycles, as well as a supporting feature of medical treatment in general. People given “fake” pills often experience the same physiological effects as people given the real thing.

The planning fallacy is the tendency to underestimate how much time it will take to complete a task.

According to Kahneman, people generally think they’re more capable than they actually are and have greater power to influence the future than they truly do. For example, even if you know that writing a project report typically takes your coworkers several hours, you might believe that you can finish it in under an hour because you’re especially skilled.

Post-purchase rationalization is when we overlook an expensive item’s flaws to justify the purchase .

Post-purchase rationalization is when we make ourselves believe that a purchase was worth the money after the fact.

Priming is when you more readily identify ideas related to a previously introduced idea.

Let’s take an experiment as an example, from Less Wrong:

Suppose you ask subjects to press one button if a string of letters forms a word, and another button if the string does not form a word. (E.g., “banack” vs. “banner.”) Then you show them the string “water.” Later, they will more quickly identify the string “drink” as a word. This is known as “cognitive priming” …

Priming also reveals the massive parallelism of spreading activation: if seeing “water” activates the word “drink,” it probably also activates “river,” or “cup,” or “splash.”

Pro-innovation bias occurs when a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations.

Sound familiar, Silicon Valley?

Procrastination occurs when you decide to act in favor of the present moment over investing in the future .

For example, even if your goal is to lose weight, you might still go for a thick slice of cake today and say you’ll start your diet tomorrow.

That happens largely because, when you set the weight-loss goal, you don’t take into account that there will be many instances when you’re faced with cake, and you don’t have a plan for managing your future impulses.

Reactance refers to the desire to do the opposite of what someone wants you to do, in order to prove your freedom of choice .

One study found that when people saw a sign that read, “Do not write on these walls under any circumstances,” they were more likely to deface the walls than when they saw a sign that read, “Please don’t write on these walls.” The study authors say that’s partly because the first sign posed a greater perceived threat to people’s freedom.

Recency is the tendency to weigh the latest information more heavily than older data .

As financial planner Carl Richards writes in The New York Times, investors often believe the market will always look the way it looks today, and therefore make unwise decisions: “When the market is down we become convinced that it will never climb out, so we cash out our portfolios and stick the money in a mattress.”

Reciprocity is the belief that fairness should trump other values, even when it’s not in our economic or other interests.

We learn the reciprocity norm from a young age, and it affects all kinds of interactions. One study found that when restaurant waiters gave customers extra mints, the customers upped their tips. That’s likely because the customers felt obligated to return the favor.

Regression bias occurs when people take action in response to extreme situations. When the situations become less extreme, they take credit for causing the change, when a more likely explanation is that the situation was reverting to the mean .

In “Thinking, Fast and Slow,” Kahneman gives an example of how the regression bias plays out in real life. An instructor in the Israeli Air Force asserted that when he chided cadets for bad performance, they always did better on their second try. The instructor believed that his reprimands were the cause of the improvement.

Yet Kahneman told him he was really observing regression to the mean, or random variations in the quality of performance. If you perform really badly one time, it’s highly probable that you’ll do better the next time, even if you do nothing to try to improve.
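A small simulation, with made-up numbers, shows the flight-instructor effect with no reprimand at all: if observed scores mix stable skill with random noise, the worst performers on one trial improve on the next purely by chance.

```python
import random

random.seed(0)

NOISE = 10   # day-to-day luck dominates...
SPREAD = 5   # ...stable differences in skill

def performance(skill):
    """Observed score = stable skill + random luck on the day."""
    return skill + random.gauss(0, NOISE)

skills = [random.gauss(50, SPREAD) for _ in range(10_000)]
first = [performance(s) for s in skills]
second = [performance(s) for s in skills]

# Select the 10% of "cadets" who scored worst on the first trial.
worst = sorted(range(len(first)), key=lambda i: first[i])[:1000]

avg_first = sum(first[i] for i in worst) / len(worst)
avg_second = sum(second[i] for i in worst) / len(worst)

# With no reprimand and no extra training, their average improves anyway.
print(avg_second > avg_first)  # True
```

The larger the noise relative to true skill differences, the stronger the regression, and the easier it is to mistake chance for the effect of an intervention.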

Restraint bias occurs when we overestimate our capacity for impulse control .

With restraint bias, one overrates one’s ability to show restraint in the face of temptation.

Salience is our tendency to focus on the most easily recognizable features of a person or concept .

For example, research suggests that when there’s only one member of a racial minority on a business team, other members use that individual’s performance to predict how any member of that racial group would perform.

Scope insensitivity is where your willingness to pay for something doesn’t correlate with the scale of the outcome.

From Less Wrong:

Once upon a time, three groups of subjects were asked how much they would pay to save 2,000 / 20,000 / 200,000 migrating birds from drowning in uncovered oil ponds. The groups respectively answered $80, $78, and $88. This is scope insensitivity or scope neglect: the number of birds saved — the scope of the altruistic action — had little effect on willingness to pay.

The seersucker illusion is the over-reliance on expert advice.

The seersucker illusion has to do with the avoidance of responsibility. We call in “experts” to forecast when typically they have no greater chance of predicting an outcome than the rest of the population. In other words, “for every seer there’s a sucker.”

Selective attention occurs when we allow our expectations to influence how we perceive the world .

The classic study on selective attention is called the “invisible gorilla” experiment. Psychologists Christopher Chabris and Daniel Simons created a short film in which a team wearing white and a team wearing black pass basketballs. Participants are asked to count the number of passes made by either the white or the black team. Halfway through the video, a woman wearing a gorilla suit crosses the court, thumps her chest, and walks off screen. She’s on screen for a total of nine seconds.

About half of the thousands of people who have watched the video don’t notice the gorilla, presumably because they’re so wrapped up in counting the basketball passes.

Of course, when asked if they would notice the gorilla in this situation, nearly everyone says they would.

Self-enhancing transmission bias occurs when people share their successes more than their failures.

Self-enhancing transmission bias leads to a false perception of reality and inability to accurately assess situations.

Status quo bias is the tendency to prefer things to stay the same.

This is similar to loss-aversion bias, where people prefer to avoid loss instead of acquiring gains.

Stereotyping occurs when people generalize characteristics about others based on the groups they belong to .

Stereotyping occurs when we expect a group or person to have certain qualities without having real information about the individual.

There may be some value to stereotyping because it allows us to quickly identify strangers as friends or enemies. But people tend to overuse it.

For example, one study found that people were more likely to hire a hypothetical male candidate over a female candidate to perform a mathematical task, even when they learned that the candidates would perform equally well.

Survivorship bias occurs when individuals concentrate on successful outcomes and overlook failures.

Survivorship bias is an error that comes from focusing only on surviving examples, causing us to misjudge different situations. For instance, we might think that being an entrepreneur is easy because we haven’t heard of all of the entrepreneurs who have failed.

It can also cause us to assume that survivors are inordinately better than failures, without regard for the importance of luck or other factors.
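A toy simulation (the 90% failure rate and the payoff range are invented for illustration) shows how looking only at survivors inflates the apparent payoff of, say, starting a company:

```python
import random

random.seed(1)

# Model: 90% of ventures fail outright (payoff 0); the rest return 1x-10x.
outcomes = [random.uniform(1, 10) if random.random() < 0.1 else 0.0
            for _ in range(100_000)]

survivors = [o for o in outcomes if o > 0]

avg_all = sum(outcomes) / len(outcomes)          # what entrants actually face
avg_survivors = sum(survivors) / len(survivors)  # what the success stories suggest

print(avg_survivors > avg_all)  # True
```

Averaged over everyone who tried, the expected payoff is a fraction of what the visible survivors suggest.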

Tragedy of the commons occurs when individuals use public resources in their own self interest rather than for the common good .

We overuse common resources because it’s not in any individual’s interest to conserve them. This explains the overuse of natural resources, opportunism, and any acts of self-interest over collective interest.

Unit bias occurs when people think a particular size is the optimal amount.

We believe that there is an optimal unit size, or a universally acknowledged amount of a given item that is perceived as appropriate. This explains why we eat more when served larger portions.

Zero-risk bias occurs when we choose to eliminate a risk entirely in one area, rather than reduce more risk spread across different areas.

Sociologists have found that we love certainty — even if it’s counterproductive.

Thus the zero-risk bias.

In general, people tend to prefer approaches that eliminate some risks totally, as opposed to approaches that reduce all risks — even though the second option would produce a greater overall decrease in risk.

Read the original article on Business Insider. Copyright 2019.
