Rational Growth

Lost $2,847 in 1 Trade—Probability Thinking Fixed It

I lost $2,847 on a single stock because I was certain it would go up. Tuesday morning, I’d read one positive earnings report and convinced myself the decision was obvious. No nuance, no doubt, no consideration of alternative outcomes. It wasn’t until later that year—after watching my account balance shrink—that I realized my mistake wasn’t ignorance. It was thinking in binaries: right or wrong, yes or no, guaranteed or impossible. The moment I learned to think in probabilities instead, everything changed.

If that story sounds familiar, you’re not alone. Most of us were taught to think in absolutes. A student either passes or fails. A business idea either works or doesn’t. You’re either healthy or sick. But the real world doesn’t operate in binaries. It operates in probabilities—ranges of likelihood, degrees of confidence, and conditional outcomes that shift as new information arrives.

This is where Bayesian thinking comes in. It’s not complicated mathematics or abstract philosophy. It’s a practical framework for making better decisions with incomplete information. And unlike binary thinking, it actually reflects how reality works.

Why Binary Thinking Fails Us

Last week, I watched a colleague present a business proposal. She’d done solid research—market analysis, competitive positioning, financial projections. But then she concluded: “This will succeed.” Not “it has a strong probability of success” or “the odds favor this outcome.” She said it like it was certain.


This happens constantly in boardrooms, coffee shops, and personal decisions. We see evidence and collapse it into certainty. We take one data point—one friend’s recommendation, one article, one bad experience—and treat it as truth.

Binary thinking is appealing because it’s simple. It requires no math. No uncertainty. No uncomfortable middle ground. You make a decision and feel confident. The problem? When you ignore probability, you ignore risk. You also ignore opportunity (Kahneman, 2011).

Here’s the damage binary thinking does: You overestimate how likely rare events are. You underestimate how often you’re wrong. You miss information that contradicts your initial view. You make decisions too quickly because you’re not updating your beliefs as new evidence arrives. The stock I bought dropped 60% over the next three months. But I’d stopped looking for contrary evidence once I’d decided.

Understanding Probability: The Foundation

Let’s start simple. A probability is just the likelihood something will happen, expressed as a number between 0 and 1. Zero means impossible. One means certain. Everything else lives in between.

When I say there’s a 70% probability it rains tomorrow, I’m saying: if we had 100 days with identical weather conditions, it would rain on about 70 of them. That’s it. No magic. No special knowledge required.

The problem is that most people avoid thinking in actual numbers. We use vague language instead: “probably,” “likely,” “might.” These words feel safer than committing to a specific probability. But that vagueness is exactly why we make poor decisions.

Research shows that when people are forced to assign actual probabilities to outcomes, they make better predictions and better decisions (Tetlock & Gardner, 2015). Not perfect predictions—nobody’s crystal ball works. But better ones.

Here’s a concrete example. Imagine you’re deciding whether to ask your boss for a raise. In binary thinking, you either succeed or you don’t. In probabilistic thinking, you ask: “What’s the actual likelihood?” Maybe it’s 55%. Not certain, but better than coin-flip odds. That changes what you do next. You might prepare more. You might research salary data. You might choose better timing. You’re optimizing for the most likely outcome while accepting the genuine risk of failure.

What Bayesian Thinking Actually Is

Bayes’ theorem sounds intimidating. It looks like math: P(A|B) = P(B|A) × P(A) / P(B). Forget the formula. The idea is simple and practical.

Bayesian thinking is about updating your beliefs when you get new information. It’s a formal way to answer: “Given what I thought before, and given this new evidence, what should I think now?”

Let me show you how I use this every morning. I wake up and assess the day’s probability of being productive. Let’s say I’ve historically been productive 60% of the time, so that’s my starting point. But then I notice: I slept poorly. That’s new evidence. It pushes my probability down—maybe to 45%. But then I check my calendar and see I have a focused work block with zero meetings. That pushes it back up to 65%. I’m not being random. I’m systematically updating based on evidence.

The Bayesian approach has three steps. First, you start with a prior belief—what you already think, based on past experience. Second, you encounter new evidence. Third, you calculate a posterior belief—your updated view after incorporating that evidence (Spiegelhalter, 2019).
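The three steps above can be sketched in a few lines of Python, using the binary-hypothesis form of Bayes’ rule. The numbers loosely echo the morning-productivity example but are illustrative only, and the `bayes_update` name is mine, not from any library.

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H|E) for a binary hypothesis H.

    prior: P(H), your belief before seeing the evidence.
    p_evidence_given_h: P(E|H), how likely the evidence is if H is true.
    p_evidence_given_not_h: P(E|not H), how likely it is if H is false.
    """
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Illustrative numbers: you're productive 60% of days (the prior), and
# poor sleep is assumed twice as likely before an unproductive day.
posterior = bayes_update(prior=0.60,
                         p_evidence_given_h=0.30,
                         p_evidence_given_not_h=0.60)
print(round(posterior, 2))  # 0.43 -- poor sleep pushes 60% down toward 43%
```

The useful part isn’t the arithmetic; it’s that the function forces you to state how diagnostic the evidence actually is before you let it move your belief.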

This is exactly how successful decision-makers operate. They don’t change their minds randomly. They change their minds systematically, incorporating new data into their existing framework. That’s what thinking in probabilities means.

From Theory to Practice: Real-World Decisions

Six months ago, I was deciding whether to switch careers. It felt like a binary choice: stay or leave. But Bayesian thinking forced me to be more precise.

I started with my prior: based on my experience in education and observing others, I estimated a 50% probability that career switching would improve my happiness and income within two years. That’s my baseline, honest assessment.

Then I gathered new evidence. I talked to five people who’d made similar switches. Four of them reported positive outcomes. That’s 80% success—higher than my prior. I researched salary data for my target field. It showed 35% higher average pay. More positive evidence, nudging my estimate higher. I took an online course in the new skill to test my interest. I got excited and completed 95% of it. Another positive signal.

After each piece of evidence, I updated my probability. My prior of 50% gradually shifted upward. By the end, I was estimating 72% probability of success. Not certain. But substantially more optimistic than my starting point.
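That running update can be sketched with a Beta distribution, a standard way to carry a probability estimate forward as evidence accumulates. Only the “four of five switchers” data point is modeled here; the Beta(5, 5) prior is my stand-in for the article’s 50% starting estimate, and the salary and course evidence that produced the final 72% is left out.

```python
# Beta prior expressed as pseudo-counts: 5 prior "successes", 5 "failures",
# giving a mean of 0.5 -- an assumed encoding of the 50% starting belief.
prior_a, prior_b = 5, 5
successes, failures = 4, 1       # four of five career switchers reported success

# Beta-Binomial updating is just adding the observed counts to the prior counts.
post_a = prior_a + successes
post_b = prior_b + failures
posterior_mean = post_a / (post_a + post_b)
print(round(posterior_mean, 2))  # 0.6 -- the estimate shifts up from 0.5
```

Note how the prior’s pseudo-counts control how much five conversations can move you: a Beta(50, 50) prior would barely budge, which is exactly the “self-selected sample” caution made quantitative.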

This process has a hidden benefit. Because I’m explicitly tracking my reasoning, I can explain my decision to others. “Here’s what I thought before. Here’s the evidence I found. Here’s how I updated my thinking.” That transparency helps catch blind spots. A friend pointed out that my sample of five people was self-selected—career switchers are more likely to talk about their success. So I adjusted downward slightly, to 68%. Still optimistic, but more realistic.

You can apply this framework to any decision. Job offer. Investment. Relationship. Medical treatment. The structure is always the same: prior → evidence → update → decide.

Common Pitfalls in Probabilistic Thinking

Learning to think in probabilities doesn’t mean you’ll stop making mistakes. But you’ll make different ones. And you can learn to avoid the most common traps.

The first trap is confirmation bias. You gather evidence that supports your prior and ignore evidence against it. If you’ve decided a person is untrustworthy, you remember their mistakes and forget their kindnesses. Bayesian thinking requires actively seeking disconfirming evidence. When deciding to hire someone, don’t just ask “Why would they be great?” Also ask “What could go wrong? What mistakes might they make?”

The second trap is overconfidence. Research on expert prediction shows that people are systematically overconfident. They assign higher probabilities to outcomes than are actually justified (Taleb, 2007). A simple fix: whenever you estimate a probability above 80%, ask yourself “What would I see if I were wrong?” That creates psychological space to acknowledge genuine uncertainty.

The third trap is not updating fast enough. You calculate a probability, make a decision, and then ignore new evidence. Markets crash, and you hold the stock because your original thesis seemed sound. A partnership isn’t working, but you stay because you committed to it initially. Bayesian thinking demands that you continuously update. At least monthly, review your major decisions and ask: “Given everything I now know, what would I decide today?” If the answer is different, you might need to change course.

The fourth trap is confusing probability with predictability. Just because something is 80% likely doesn’t mean it will definitely happen. On the flip side, just because something is 20% likely doesn’t mean it won’t. Probability is about frequencies over many events, not individual outcomes.
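A quick simulation makes that frequency interpretation concrete: events that are each 80% likely still fail about one time in five. This is a minimal sketch using Python’s standard library; the trial count and seed are arbitrary.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable
trials = 10_000

# Simulate 10,000 independent events, each with an 80% chance of happening.
happened = sum(random.random() < 0.80 for _ in range(trials))

# The long-run frequency lands near 80%, but roughly 2,000 individual events
# still fail -- the probability says nothing certain about any single one.
print(f"{happened / trials:.1%} happened, {trials - happened} did not")
```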

Building Your Bayesian Intuition

You don’t need calculus to think like a Bayesian. You need practice. Here are concrete ways to build this skill.

Keep a probability journal. For decisions you’re facing, write down your prior probability. “I think there’s a 65% chance this project succeeds.” Then, over time, write down the evidence you encounter and how it updates your thinking. At the end, compare your updated probability to what actually happened. Over dozens of decisions, you’ll calibrate your intuition.
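One way to score such a journal is the Brier score: the mean squared gap between the probabilities you wrote down and what actually happened. The entries below are hypothetical, purely to show the arithmetic.

```python
# A hypothetical probability journal: (stated probability, what happened).
journal = [
    (0.65, True),   # "65% chance this project succeeds" -- it did
    (0.80, True),
    (0.30, False),
    (0.90, False),  # an overconfident miss, which the score punishes hardest
    (0.55, True),
]

# Brier score: 0.0 is perfect calibration; always guessing 50% scores 0.25.
brier = sum((p - float(actual)) ** 2 for p, actual in journal) / len(journal)
print(round(brier, 3))  # 0.253 -- barely better than guessing, in this toy set
```

Tracked over dozens of real entries, a falling Brier score is direct evidence that your intuition is calibrating.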

Practice with sports and news. Before a game, estimate the probability of each outcome. Check your prediction afterward. This low-stakes practice builds your probability muscles. Over time, you’ll get better at estimating the true likelihood of events.

Use betting to test your confidence. Don’t actually gamble, but mentally bet. When you’re 70% sure about something, would you bet $10 to win $15? If not, you’re not really 70% confident. This exercise reveals the gap between how confident you feel and how confident you actually are.
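The mental bet has a precise form: expected value. A short sketch with the $10-to-win-$15 numbers from above; the function name is mine.

```python
def expected_value(p_win, win_amount, lose_amount):
    """Expected profit of a bet: win `win_amount` with probability p_win,
    otherwise lose `lose_amount`."""
    return p_win * win_amount - (1 - p_win) * lose_amount

# Risk $10 to win $15 at 70% confidence.
ev = expected_value(0.70, 15, 10)
print(round(ev, 2))  # 7.5 -- positive, so a true 70% belief should take it

# Break-even confidence for these odds: you must win 10 of every 25 at stake.
break_even = 10 / (10 + 15)
print(break_even)  # 0.4
```

If the bet feels uncomfortable even though the expected value is clearly positive, your real confidence is closer to the 40% break-even point than to the 70% you claimed.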

Find the base rate. Before updating based on new information, always ask: “What’s the baseline? How often does this happen in general?” If you’re deciding whether a symptom indicates disease, the base rate of that disease matters enormously. If it affects 1 in 1,000 people and you have a symptom, your prior probability is low. A positive test result updates it upward, but not as dramatically as most people think. This is why understanding base rates prevents panic and unnecessary medical procedures.
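Here is the base-rate example with concrete numbers. The 1-in-1,000 prevalence is from the text; the 99% sensitivity and 95% specificity are my illustrative assumptions, since no specific test is named.

```python
# Bayes' rule applied to a positive test for a 1-in-1,000 condition.
prevalence = 1 / 1000          # prior: P(disease)
sensitivity = 0.99             # assumed: P(positive | disease)
false_positive_rate = 0.05     # assumed: 1 - specificity = P(positive | healthy)

# Total probability of testing positive, sick or not.
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))

# Posterior: of all positives, how many actually have the disease?
p_disease_given_positive = sensitivity * prevalence / p_positive
print(f"{p_disease_given_positive:.1%}")  # about 1.9% -- still low
```

The intuition: in a group of 1,000 people, roughly 1 true positive is swamped by about 50 false positives, so a positive result mostly means “get a follow-up test,” not “you’re sick.”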

When Certainty Is an Illusion

The shift from binary to Bayesian thinking is fundamentally about intellectual humility. It’s admitting that almost nothing is certain. And that’s actually liberating.

In my teaching, I’ve noticed that the most effective learners aren’t the ones who are certain they understand. They’re the ones who hold their ideas lightly, ready to update as they learn more. The same applies to work. The best analysts I know don’t project confidence. They project calibrated uncertainty. They say things like “I’m 70% confident in this forecast, and here’s what would change that.”

This might seem less decisive than binary thinking. It’s not. It’s more decisive because it’s more aligned with reality. You can commit fully to a decision while simultaneously holding genuine uncertainty about the outcome. “I’m going all-in on this strategy. I believe it has a 75% probability of success. And I’m prepared for the 25% chance it doesn’t work.”

That’s not wishy-washy. That’s mature decision-making.

Conclusion: Your Next Decision

The good news is you don’t need to master Bayesian statistics to benefit from probabilistic thinking. You just need to stop collapsing uncertainty into false certainty. You need to start tracking your beliefs and updating them systematically.

Pick one major decision you’re facing right now. Estimate your prior probability—what you currently think is most likely to happen. Write it down. Then, over the next week, actively gather evidence. What would you see if you were right? What would you see if you were wrong? How does each piece of evidence update your thinking?

By the end, you won’t have perfect information. But you’ll have thought more carefully than 90% of decision-makers. You’ll have a transparent, updatable framework. And you’ll have built a habit of thinking in probabilities—the same habit that separates good decision-makers from great ones.


Last updated: 2026-05-11

About the Author

Published by Rational Growth. Our health, psychology, education, and investing content is reviewed against primary sources, clinical guidance where relevant, and real-world testing. See our editorial standards for sourcing and update practices.




References

Clear, J. (2018). Atomic Habits. Avery.

Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.

Newport, C. (2016). Deep Work. Grand Central Publishing.

Spiegelhalter, D. (2019). The Art of Statistics: How to Learn from Data. Pelican Books.

Taleb, N. N. (2007). The Black Swan: The Impact of the Highly Improbable. Random House.

Tetlock, P. E., & Gardner, D. (2015). Superforecasting: The Art and Science of Prediction. Crown.

Published by

Seokhui Lee

Science teacher and Seoul National University graduate publishing evidence-based articles on health, psychology, education, investing, and practical decision-making through Rational Growth.
