Thinking in Bets: How Poker Players Make Better Decisions

You make a decision. It turns out badly. You conclude you made a bad decision. This is one of the most common — and most costly — reasoning errors humans make. It’s called resulting, and poker players learned to root it out long before behavioral economists gave it a name.

Having sat with Duke's argument for a while, a few things stand out to me — and each comes with something concrete you can practice.

The Core Insight From Annie Duke

Annie Duke’s 2018 book Thinking in Bets argues that most of us conflate the quality of a decision with the quality of its outcome. A bad outcome from a good decision is just bad luck. A good outcome from a bad decision is also just luck — and it’s more dangerous, because it reinforces the wrong process.


Professional poker is a forcing function for this distinction. In the long run, good decisions produce better results than bad ones. But over any short series of hands, luck dominates. The discipline of separating process from outcome — “did I make a good bet given what I knew?” rather than “did I win?” — is what separates winning players from losing ones over thousands of hands.
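That long-run/short-run gap is easy to see in a simulation. The sketch below uses made-up numbers (a 55% win rate on even-money bets, i.e. a clearly positive-expectation bet) and counts how often a good bet still loses money over a short session versus a long one:

```python
import random

def simulate(n_bets, p_win=0.55, stake=1.0, seed=0):
    """Total profit from n_bets independent even-money bets, each won with probability p_win."""
    rng = random.Random(seed)
    return sum(stake if rng.random() < p_win else -stake for _ in range(n_bets))

# A 55% edge on even-money bets has an expected value of +0.10 per unit staked,
# yet short sessions often end in the red while very long ones almost never do.
losing_short = sum(simulate(20, seed=s) < 0 for s in range(1000))
losing_long = sum(simulate(10_000, seed=s) < 0 for s in range(1000))
print(f"losing 20-bet sessions:     {losing_short} of 1000")
print(f"losing 10,000-bet sessions: {losing_long} of 1000")
```

Roughly a quarter of the 20-bet sessions lose money despite the edge; essentially none of the 10,000-bet sessions do. Judging the bet by a single short session — resulting — would get it exactly wrong.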

Resulting in Real Life

Consider a driver who runs a red light and makes it through safely. They conclude: not a big deal, I do this sometimes. This is resulting — using the outcome to evaluate the decision. The decision (run a red light) was poor regardless of outcome.

Or a student who crammed the night before an exam and happened to get a good score. They conclude: cramming works. This is also resulting. They got lucky on what appeared on the test. Their study process was still low-quality relative to distributed practice.

I’ve made this error teaching. Early in my career I tried an unstructured discussion format with a class that happened to go brilliantly. I repeated it with different classes and it bombed repeatedly. The first success was partly luck — that class was unusually engaged. I had to learn to evaluate the method, not the result. [1]

What Calibration Means

Duke emphasizes calibration — the alignment between your expressed confidence and your actual accuracy. A well-calibrated person who says they’re 80% sure about something is right about 80% of the time across many such claims. Most people are dramatically overconfident.

Philip Tetlock’s decades of research on expert forecasting (summarized in Superforecasting, 2015) found that calibration is learnable. The key practices: express beliefs in probabilities rather than certainties, keep score on your predictions, and update beliefs when evidence changes rather than when you feel embarrassed to have been wrong.
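Keeping score can be as simple as logging (stated confidence, came true) pairs and comparing each confidence level with its actual hit rate. A minimal sketch, using a made-up track record:

```python
from collections import defaultdict

def calibration_report(predictions):
    """predictions: list of (stated_confidence, came_true) pairs.
    Groups claims by stated confidence and compares it to the actual hit rate."""
    buckets = defaultdict(list)
    for confidence, came_true in predictions:
        buckets[round(confidence, 1)].append(came_true)
    return {
        conf: (sum(outcomes) / len(outcomes), len(outcomes))
        for conf, outcomes in sorted(buckets.items())
    }

# Hypothetical record: "80% sure" claims that came true only half the time.
record = [(0.8, True), (0.8, False), (0.8, True), (0.8, False),
          (0.6, True), (0.6, True), (0.6, False)]
for conf, (hit_rate, n) in calibration_report(record).items():
    print(f"said {conf:.0%} sure -> right {hit_rate:.0%} of the time (n={n})")
```

A well-calibrated record shows hit rates tracking stated confidence; a gap like "said 80%, right 50%" is exactly the overconfidence Duke and Tetlock describe, made visible.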


Three Tools to Think Better About Decisions

1. The 10-10-10 Frame

Before any significant decision: how will I feel about this in 10 minutes, 10 months, 10 years? This expands the time horizon and reduces the weight of immediate emotion on the decision.

2. Pre-Mortem Analysis

Before executing a decision, imagine it’s a year from now and things have gone badly. Work backward: what went wrong? This surfaces hidden risks without the ego defense mechanisms that activate after failure.


3. Decision Journaling

Write down significant decisions, your reasoning, and your confidence level at the time. Review periodically. This creates an honest record that can’t be revised by hindsight bias — and it’s the fastest way to identify your actual patterns of systematic error.
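A journal entry only needs a handful of fields. The sketch below (the field names are my own, not Duke's) writes entries to an append-only log, which is what keeps hindsight from quietly rewriting the record:

```python
import json
from dataclasses import dataclass, asdict
from datetime import date
from typing import Optional

@dataclass
class DecisionEntry:
    """One journal entry, captured before the outcome is known."""
    decided_on: str
    decision: str
    reasoning: str
    confidence: float              # 0.0-1.0, stated at decision time
    outcome: Optional[str] = None  # filled in later; earlier fields are never edited

entry = DecisionEntry(
    decided_on=str(date.today()),
    decision="Switch the intro course to an unstructured discussion format",
    reasoning="One pilot class went brilliantly; risk: that class was unusually engaged",
    confidence=0.7,
)

# Append-only: old entries are never rewritten, so the record stays honest.
with open("decisions.jsonl", "a") as f:
    f.write(json.dumps(asdict(entry)) + "\n")
```

Reviewing the log later, you can feed the (confidence, outcome) pairs straight into a calibration check and see where your stated confidence and your actual accuracy diverge.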

Last updated: 2026-03-28

Your Next Steps

  • Today: Write down one pending decision, your reasoning, and a confidence level — your first journal entry.
  • This week: Run a pre-mortem on your biggest upcoming decision and list three concrete ways it could fail.
  • Next 30 days: Review your journal entries and check whether your stated confidence matched how things actually turned out.

About the Author

Written by the Rational Growth editorial team. Our health and psychology content is informed by peer-reviewed research, clinical guidelines, and real-world experience. We follow strict editorial standards and cite primary sources throughout.

References

  1. Duke, A. (2018). Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts. Portfolio/Penguin.
  2. Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. Crown.
  3. Finck, C. Thinking in Bets by Annie Duke | A Guide to Smarter Decisions. YouTube.
  4. Duke, A. Beyond Luck—Behavioral Science and the Art of Decision Making. T. Rowe Price, The Angle Podcast.
  5. The Exceptional Skills. Book Summary: Thinking in Bets by Annie Duke.
  6. Duke, A. (2023). Thinking in Bets for Engineers — with Annie Duke. Refactoring.fm.


Frequently Asked Questions

What is the key takeaway about thinking in bets?

Judge decisions by the quality of the process and the information available at the time, not by the outcome. A good decision can produce a bad result through bad luck, and a lucky result can reinforce a bad process — that second case is the more dangerous one.

How should beginners approach thinking in bets?

Start a decision journal: record significant decisions, your reasoning, and your confidence before you know the outcome. Review it periodically to spot where your stated confidence and your actual accuracy diverge.

