How Common Biases Trick Us Into Foolish Trade-offs


To avoid difficult trade-offs and minimize effort, we tend to rely heavily on experience, gut feeling, impulse, and rules of thumb. But sometimes the very things we rely on to make decisions trick us in unexpected ways. Shortcuts are helpful, but they tend to thwart rational thinking.

In Simple Rules: How to Thrive in a Complex World, Donald Sull and Kathleen Eisenhardt say:

“rather than expending conscious cognitive effort,” we “adopt universal heuristics that are cognitively easy, like representativeness (Pick what is usual) and availability (Pick what first comes to mind).”

When we lack time and have insufficient information, we either adopt automatic, obvious rules or no rules at all to get things done.

For example, in social situations when we are meeting someone for the first time, the first impression, which is based on existing experiences and biases, is often the one that counts. We are often too busy to go deeper, which takes hard work. How many opportunities do we miss because we misjudge someone or misinterpret what they are saying?

The truth is we are human. And that is also the beauty of it; we can learn to think better and improve our odds of success. In decision-making situations, biases encourage us to overestimate our abilities and underestimate the challenges we face. This is called overconfidence.

When we recognize the symptoms, we can do something about the known cause.

Last week, Learning Habit included a link to an article with examples and ideas for becoming more effective by being more aware of our biases and triggers. It included a list of common biases that blindside us in our organizations.


Things change all the time. As technology fades into the background and learns from our behaviors, our use of and involvement with it accelerate how it changes us. So much so that we often need to inject some personal agency into what we're doing and think it through, or we dive headfirst into dire consequences.

We go with what our gut tells us for expediency's sake. The “good enough” option. We carry the habit into business, and here's what it does to us:

Belief Bias: Deciding whether an argument is strong or weak on the basis of whether you agree with its conclusion. (“This logic can’t be right; it would lead us to make that investment I don’t like.”)

Confirmation Bias: Seeking and finding evidence that confirms your beliefs and ignoring evidence that does not. (“I trust only one news channel; it tells the truth about the political party I despise.”)

Availability Bias: Making a decision based on the information that comes to mind most quickly, rather than on more objective evidence. (“I’m not worried about heart disease, but I live in fear of shark attacks because I saw one on the news.”)

Anchoring Bias: Relying heavily on the first piece of information offered (the “anchor”) when considering a decision. (“First they offered to sell the car for $35,000. Now they’re asking $30,000. It must be a good deal.”)

Base Rate Fallacy: When judging how probable something is, ignoring the base rate (the overall rate of occurrence). (“I know that only a small percentage of startups succeed, but ours is a sure thing.”)

Planning Fallacy: Underestimating how long it will take to complete a task, how much it will cost, and its risks, while overestimating its benefits. (“Trust me, we can finish this project in just three weeks.”)

Representativeness Bias: Believing that something that is more representative is necessarily more prevalent. (“There may be more qualified programmers in the rest of the world, but we’re staffing our software design group from Silicon Valley.”)

Hot Hand Fallacy: Believing that someone who was successful in the past has a greater chance of achieving further success. (“Bernard Madoff has had an unbroken winning streak; I’m reinvesting.”)

Halo Effect: Letting someone’s positive qualities in one area influence overall perception of that individual. (“He may not know much about people, but he’s a great engineer and a hard-working guy; let’s put him in charge of the team.”)
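Some of these biases yield to simple arithmetic. The base rate fallacy in particular can be made concrete with Bayes' theorem, which shows how far off a judgment can be once the base rate is ignored. The numbers below are hypothetical, chosen only to illustrate the calculation, not drawn from any real startup data.

```python
# Hypothetical illustration of the base rate fallacy using Bayes' theorem.
# Suppose 1 in 1,000 startups achieves a large exit (base rate = 0.1%),
# and a pitch that "looks like a sure thing" is judged correctly 90% of
# the time, with a 10% false-positive rate.
base_rate = 0.001        # P(success)
true_positive = 0.90     # P(looks like a winner | success)
false_positive = 0.10    # P(looks like a winner | failure)

# P(looks like a winner), summed across both outcomes
evidence = true_positive * base_rate + false_positive * (1 - base_rate)

# Bayes' theorem: P(success | looks like a winner)
posterior = true_positive * base_rate / evidence
print(round(posterior, 4))  # 0.0089 -- still under 1%
```

Even a pitch that "looks like a winner" nine times out of ten implies less than a one percent chance of success once the base rate is factored back in.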


When we're close to something, we perceive it as more important. It's natural to feel stronger ownership of what is near and dear. Yet it plays a reality-distortion trick on us. Short-term thinking overwhelms long-term consequences:

Endowment Effect: Expecting others to pay more for something than you would pay yourself. (“This is sure to fetch thousands at the auction.”)

Affective Forecasting: Judging your future emotional states based on how you feel now. (“I feel miserable about it, and I always will.”)

Temporal Discounting: Placing less value on rewards as they move further into the future. (“They made a great offer, but they can’t pay me for five weeks, so I’m going with someone else.”)
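Temporal discounting can also be put in numbers. Under exponential discounting, a future payment is worth its face value divided by (1 + r) raised to the delay. The amounts and the weekly rate below are hypothetical, picked only to show that a short delay rarely justifies walking away from a better offer.

```python
# Hypothetical comparison: $1,000 today versus $1,050 in five weeks,
# discounted exponentially at an assumed weekly rate of 0.2%.
def present_value(amount, weeks, weekly_rate=0.002):
    """Value today of a payment received `weeks` weeks from now."""
    return amount / (1 + weekly_rate) ** weeks

now = present_value(1000, 0)    # 1000.00
later = present_value(1050, 5)  # about 1039.56
print(later > now)              # True: the delayed offer is still worth more
```

At any plausible discount rate, the five-week wait costs far less than the extra $50 is worth, yet our gut often rejects the delayed option anyway.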


We see the world as we are, and we tend to agree with ourselves, so much so that we end up hiring copies of who we are as part of our teams. The lack of distinct points of view blinds us to the many other options we might have. It's difficult, but not impossible, to overcome our tendency to see the fault in others' reasoning but not in our own.

For example, gathering feedback from units other than our own, or from customers, helps us remain aware of:

Blind Spot: Identifying biases in other people but not in yourself. (“She always judges people much too harshly.”)

False Consensus Effect: Overestimating the universality of your own beliefs, habits, and opinions. (“Of course I hate broccoli; doesn’t everyone?”)

Fundamental Attribution Error: Believing that your own errors or failures are due to external circumstances, but others’ errors are due to intrinsic factors like character. (“I made a mistake because I was having a bad day; you made a mistake because you’re not very smart.”)

Hindsight Bias: Seeing past events as having been predictable in retrospect. (“I knew the financial crisis was coming.”)

Illusion of Control: Overestimating your influence over external events. (“If I had just left the house a minute earlier, I wouldn’t have gotten stuck at this traffic light.”)

Illusion of Transparency: Overestimating the degree to which your mental state is accessible to others. (“Everyone in the room could see what I was thinking; I didn’t have to say it.”)

Egocentric Bias: Weighing information about yourself disproportionately in making judgments and decisions — for example, about communications strategy. (“There’s no need for a discussion of these legal issues; I understood them easily.”)


Every single business (and person) is in the risk management business. We're loss-averse: we'd rather have a penny today than invest in a pound for tomorrow. This series of biases prevents us from expanding our opportunities and putting ourselves in the path of luck. It's a frame problem, but we mistake it for an information problem.

Loss Aversion: Making a risk-averse choice if the expected outcome is positive, but making a risk-seeking choice to avoid negative outcomes. (“We have to take a chance and invest in this, or our competitors will beat us to it.”)

Framing Effect: Basing a judgment on whether a decision is presented as a gain or as a loss, rather than on objective criteria. (“I hate this idea now that I see our competitors walking away from it.”)

Sunk Costs: Having a hard time giving up on something (a strategy, an employee, a process) after investing time, money, or training, even though the investment can’t be recovered. (“I’m not shutting this project down; we’d lose everything we’ve invested in it.”)


The trickiest of all, because it's based on many of the emotional data points that make up our identity. We're quite attached to it. It's not just skin- or gender-deep, either. Our perception of what a particular team member or partner should look like on paper, based on how we've solved the problem before, keeps us from higher levels of achievement.

This is the case of us cutting off our nose to spite our face:

Ingroup Bias: Perceiving people who are similar to you (in ethnicity, religion, socioeconomic status, profession, etc.) more positively. (“We can trust her; her hometown is near mine.”)

Outgroup Bias: Perceiving people who are different from you more negatively. (“We can’t trust him; look where he grew up.”)

Thinking in binary mode can get us into trouble. Every time we go on autopilot in decision making, relying on experience, similarity, or closeness to the issue alone; every time we try to shortcut solutions to complex problems for the sake of expediency or lean too heavily on rules and policies without challenging our assumptions, we trick ourselves into foolish trade-offs.

What we need is a way to hedge our bets. Slowing down and taking time to think is a luxury many of us don't have or can't rely on consistently. We can, however, create a reliable framework and system that allows us to manage ourselves better as we deal with issues as they arise, so we can free up more cognitive capacity to find the opportunities they contain.