I learned the most not when I was right, but when I was wrong. Scott Alexander (Slate Star Codex, Astral Codex Ten) reviews his predictions every year and publicly shares the ones he got wrong [1]. What strikes me is not the mistakes themselves — it is the discipline of the review. Most people avoid thinking about their errors. Alexander builds a system around them.
This article unpacks Alexander’s approach, explains the research behind why it works, and gives you a practical protocol you can start this month.
Why Most People Don’t Learn from Mistakes
The psychology of error avoidance is well-documented. When we are wrong, our brains experience something akin to social pain — the same neural circuits that process physical hurt light up when we face embarrassment or failure. The natural response is avoidance: minimize the mistake, externalize the blame, move on quickly.
This is not weakness. It is a cognitive survival mechanism. But it is catastrophic for growth. Every avoided mistake is a missed calibration opportunity. And miscalibrated thinking compounds — the same error shows up again in a different form, costing you again and again.
Philip Tetlock’s landmark research on forecasting accuracy found that the single strongest predictor of good judgment was not intelligence, domain expertise, or access to information. It was actively open-minded thinking — the willingness to seek out disconfirming evidence and update beliefs when wrong (Tetlock & Gardner, 2015). Superforecasters did not avoid mistakes. They hunted them.
Scott Alexander’s Mistake Taxonomy
Alexander’s framework is useful because it distinguishes between types of mistakes — and different types require different fixes.
1. Honest Mistakes (Information Gaps)
You were wrong because you lacked relevant information that was not reasonably available to you at the time. Example: you predicted a drug trial would succeed, but an unpublished negative result from a related compound would have changed your view — and you had no access to it.
Fix: Improve your information sources. These mistakes are forgivable and only fixable at the input level.
2. Motivated Reasoning (Belief Protection)
You were wrong because you wanted a particular answer to be true. You unconsciously weighted evidence that confirmed your existing belief and discounted evidence that challenged it. This is confirmation bias in action.
Fix: Pre-mortem analysis. Before committing to a belief, ask: “If I am wrong about this, what is the most likely reason?” Steel-manning the opposing view forces you out of motivated reasoning.
3. Predictable Errors (Systematic Bias)
These are the most valuable to identify. You were wrong in a way you could have predicted — because you consistently make the same type of mistake. You overestimate how quickly people change their behavior. You underestimate how long institutional processes take. You chronically overcommit to optimistic timelines.
Fix: Pattern recognition across your mistake log. Once you identify a systematic bias, you can apply a correction factor. Superforecasters do this explicitly — they know their personal biases and adjust their estimates accordingly.
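To make the log concrete, here is a minimal sketch in Python. It is not Alexander’s or Tetlock’s published method; the field names, category labels, threshold, and the 1.5x padding factor are illustrative assumptions you would replace with whatever patterns your own log reveals.

```python
from dataclasses import dataclass, field
from collections import Counter

# Categories from the taxonomy above.
CATEGORIES = ("information_gap", "motivated_reasoning", "systematic_bias")

@dataclass
class MistakeEntry:
    date: str
    what_i_believed: str
    what_was_true: str
    category: str          # one of CATEGORIES
    bias_label: str = ""   # e.g. "optimistic timelines"; only for systematic_bias

@dataclass
class MistakeLog:
    entries: list[MistakeEntry] = field(default_factory=list)

    def add(self, entry: MistakeEntry) -> None:
        if entry.category not in CATEGORIES:
            raise ValueError(f"unknown category: {entry.category}")
        self.entries.append(entry)

    def recurring_biases(self, min_count: int = 3) -> list[tuple[str, int]]:
        """Bias labels that recur are candidates for an explicit correction factor."""
        counts = Counter(e.bias_label for e in self.entries
                         if e.category == "systematic_bias" and e.bias_label)
        return [(label, n) for label, n in counts.most_common() if n >= min_count]

# Illustrative correction factor: if "optimistic timelines" keeps recurring,
# pad raw estimates before committing to them.
def corrected_estimate(raw_days: float, padding: float = 1.5) -> float:
    return raw_days * padding
```

The threshold in recurring_biases reflects the point of the exercise: a one-off error is noise, but the same label appearing again and again is a systematic bias worth correcting for.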
Tetlock’s Forecasting Accuracy Research
Philip Tetlock spent 20 years studying what separates accurate forecasters from poor ones; his 2015 book Superforecasting, written with Dan Gardner, is the definitive synthesis. The core finding is the one described above: accuracy comes less from intelligence or expertise than from actively open-minded thinking, the habit of seeking out disconfirming evidence, tracking your record, and updating when you are wrong.
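Tetlock’s forecasting tournaments scored participants with Brier scores, essentially the mean squared error between the probability you stated and what actually happened. Here is a minimal sketch of the binary form of that scoring rule; the three example forecasts are made up for illustration:

```python
def brier_score(forecasts: list[tuple[float, int]]) -> float:
    """Mean squared error between stated probabilities and outcomes (lower is better, 0 is perfect).

    Each item is (probability assigned to the event, outcome: 1 if it happened, else 0).
    """
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical yearly review: three predictions and how they turned out.
review = [
    (0.90, 1),  # said 90% likely, and it happened
    (0.70, 0),  # said 70% likely, and it did not
    (0.60, 1),  # said 60% likely, and it happened
]
print(round(brier_score(review), 3))  # 0.22
```

A forecaster who always says 50% scores 0.25 no matter what happens, so beating that number is the first bar to clear; tracking your own score year over year is one quantitative way to run the kind of yearly review described above.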
Your Next Steps
- Today: Start a mistake log. Write down one recent prediction or decision that turned out wrong, and classify it as an information gap, motivated reasoning, or a systematic bias.
- This week: Add entries as you notice them. Even a simple notes app works.
- Next 30 days: Review the log for recurring patterns, pick the bias that shows up most often, and decide on a correction factor to apply to future estimates.
References
- Alexander, S. (2013). Reactionary Philosophy & The Reactionary Mind. Slate Star Codex.
- Alexander, S. (2026). Shameless Guesses, Not Hallucinations. Astral Codex Ten.
- Alexander, S. (2026). AMA (Ask Machines Anything). Astral Codex Ten.
- Aaronson, S. (2026). January 2026 Posts. Shtetl-Optimized.
- Gelman, A. (2026). Don’t get any on you. Statistical Modeling, Causal Inference, and Social Science.
- Byrnes, S. (2023). My goodbye post for LessWrong. LessWrong.
- Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. Crown.
Frequently Asked Questions
What is the key takeaway from Scott Alexander’s mistakes frame?
Do not avoid your mistakes; build a system around them. Review what you got wrong, classify each error as an information gap, motivated reasoning, or a systematic bias, and apply the fix that matches the type.
How should beginners apply the mistakes frame?
Start a simple mistake log this month. Record errors as you notice them, and after a few weeks look for recurring patterns. Small, consistent reviews compound faster than ambitious plans that never start.