The Complete Guide to Cognitive Biases: 25 That Control Your Life

Your brain takes shortcuts. Most of the time those shortcuts work fine — but they also produce systematic errors called cognitive biases. Psychologists have catalogued hundreds of them. Here are 25 that show up in everyday decisions, relationships, and work.

Part of our Mental Models Guide.

What Is a Cognitive Bias?

A cognitive bias is a repeatable pattern of deviation from rational judgment. Biases arise from heuristics — mental rules of thumb the brain uses to process information quickly. The problem is that speed trades off against accuracy.

Key insight from my own life: I spent three years teaching high school and watched students (and myself) fall into the same traps year after year. Naming the bias is the first step to catching it.

The 25 Cognitive Biases

1. Confirmation Bias

We seek information that confirms what we already believe and ignore contradictory evidence. Fix: Deliberately seek one strong counter-argument before deciding. Research by Nickerson (1998) in Review of General Psychology confirms this is one of the most pervasive biases.

2. Dunning-Kruger Effect

Beginners overestimate their competence; experts underestimate it. The less you know, the less you know what you don’t know. Fix: Track your predictions and compare them to outcomes.

3. Availability Heuristic

Events that come to mind easily feel more probable. Plane crashes feel more dangerous than car crashes because they make the news. Fix: Look up base rates before estimating probability.

4. Anchoring Bias

The first number you hear disproportionately influences every number that follows. Salary negotiations, price negotiations, and even medical diagnoses are affected. Fix: Name your own anchor first.

5. Sunk Cost Fallacy

Past investment (money, time, emotion) makes us continue bad projects. The past cost is gone; only future value matters. Fix: Ask “Would I start this today if I had no prior investment?”

6. Survivorship Bias

We study successful companies, athletes, and strategies while ignoring the failed ones that no longer exist. Fix: Actively hunt for examples of failure in any domain you study.

7. Framing Effect

“90% survival rate” and “10% death rate” are identical facts — but people respond very differently to each. Fix: Restate any important statistic in the opposite frame.

8. Hindsight Bias

After an event, we believe we “knew it all along.” This distorts how we learn from the past. Fix: Write predictions before events happen and review them honestly afterward.

9. Status Quo Bias

We prefer the current state of affairs even when change would be objectively better. Inertia masquerades as preference. Fix: Set a review date for every default you live with.

10. Fundamental Attribution Error

We attribute others’ failures to character and our own failures to circumstances — and vice versa for success. Fix: When judging someone, ask what situational forces might explain their behavior.

11. Bandwagon Effect

We adopt beliefs because many others hold them. Social proof is powerful but often misleading. Fix: Separate “popular” from “true” in every important decision.

12. Negativity Bias

Bad events have more psychological impact than equally good ones. One criticism outweighs five compliments. Fix: Consciously log positive events daily to rebalance the ledger.

13. Optimism Bias

We expect better-than-average outcomes for ourselves: fewer setbacks, lower costs, more success than the evidence warrants. Fix: Use the “outside view” and ask how similar situations actually turned out.

14. Planning Fallacy

A specific form of optimism bias: our plans ignore past experience and focus only on best-case scenarios. Fix: Add 30–50% buffer to every timeline estimate.
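The buffer rule above is simple arithmetic; a minimal sketch (the function name and the 40% default are my own choices, sitting in the middle of the suggested 30–50% range):

```python
def buffered_estimate(base_days: float, buffer: float = 0.4) -> float:
    """Apply a pessimism buffer to a raw timeline estimate.

    buffer=0.4 is the midpoint of the 30-50% range suggested above.
    """
    return base_days * (1 + buffer)

# A task you would naively estimate at 10 days:
print(buffered_estimate(10))        # roughly 14 days
print(buffered_estimate(10, 0.5))   # roughly 15 days at the top of the range
```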

15. Halo Effect

One positive trait (attractiveness, fluency, confidence) causes us to assume other positive traits. Fix: Evaluate attributes independently, one at a time.

16. In-Group Bias

We favor members of our own group — school, nationality, sports team — over outsiders, often unconsciously. Fix: Ask whether you would judge the same action differently if an out-group member did it.

17. Decoy Effect

Adding a third, inferior option shifts preference toward one of the original two. Pricing pages exploit this constantly. Fix: Evaluate each option against your own criteria, not against the decoy.

18. Bystander Effect

The more people present, the less likely any individual acts. Responsibility diffuses. Fix: In emergencies, assign help to a specific person by name.

19. Curse of Knowledge

Once we know something, we cannot imagine not knowing it — which makes teaching and communication hard. Fix: Explain ideas to a newcomer and refine based on their questions.

20. Narrative Fallacy

We construct stories to explain random events. A string of bad luck becomes “the universe testing me.” Fix: Ask what the data looks like without the story attached.

21. Recency Bias

Recent events feel more important than older ones. A market crash last month feels more real than a crash from 2008. Fix: Deliberately look at long-run data before making any forecast.

22. Overconfidence Bias

Most people rate themselves above average in driving, intelligence, and ethics — yet more than half of any group cannot sit above its median. Fix: Score your confidence and track calibration over time.
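One standard way to score calibration is the Brier score: the mean squared gap between the probabilities you stated and what actually happened. A minimal sketch — the journal entries below are invented for illustration:

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes.

    forecasts: list of (probability, outcome) pairs, where outcome is
    1 if the event happened and 0 if it did not. 0.0 is perfect
    calibration; 0.25 is what always answering 50% earns.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Hypothetical journal: (confidence you stated, what actually happened)
journal = [(0.9, 1), (0.8, 0), (0.7, 1), (0.95, 1), (0.6, 0)]
print(brier_score(journal))
```

Logging a few predictions per week and re-scoring monthly is enough to see whether your stated confidence matches reality.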

23. Illusion of Control

We believe we influence outcomes that are actually random — dice rolls, stock ticks, slot machines. Fix: Separate skill domains from chance domains rigorously.

24. Zero-Risk Bias

We prefer eliminating a small risk entirely over reducing a large risk substantially, even when the math favors the latter. Fix: Calculate expected value instead of focusing on zero.
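The expected-value comparison takes one multiplication per option. A short sketch with hypothetical numbers (the 1%/20% risks and $1,000 loss are invented for illustration):

```python
def expected_loss(prob: float, loss: float) -> float:
    """Expected loss = probability of the bad event times its cost."""
    return prob * loss

# Hypothetical choice: eliminate a 1% risk of a $1,000 loss entirely,
# or halve a 20% risk of the same loss. Zero-risk bias favors the
# first option; expected value favors the second.
option_a_saving = expected_loss(0.01, 1000)                               # $10 saved
option_b_saving = expected_loss(0.20, 1000) - expected_loss(0.10, 1000)   # $100 saved
print(option_a_saving, option_b_saving)
```

Halving the large risk saves ten times as much in expectation, even though it leaves some risk on the table.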

25. Choice-Supportive Bias

After making a decision, we remember it more positively than it deserved. This protects ego but prevents learning. Fix: Keep a decision journal and revisit choices six months later.

How to Use This List

Pick three biases that feel most personally relevant. Design one concrete “fix” for each and practice for 30 days. You cannot eliminate biases — but you can build systems that slow you down at exactly the right moments.

Citations

  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
  • Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
  • Dunning, D., & Kruger, J. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.

