“How do you think the exam will go?” a student asked. The old me would have said “It’ll be fine” or “You need to study harder.” Binary answers.
Now I think differently. “Based on how much you’ve been studying, there’s about a 60% chance you’ll score 80 or above. If you study this section more, that could go up to 70%.” That’s probabilistic thinking.
Why Binary Thinking Is a Problem
Humans naturally tend to think in binaries. Success or failure. Right or wrong. Possible or impossible.
The problem with this mindset is that it distorts reality. Reality is continuous. There’s a spectrum of possibilities. Binary thinking collapses that spectrum into two extremes.
Daniel Kahneman explains this as a feature of System 1.[1] Fast thinking loves categorization. Calculating probabilities is the work of slow System 2.
What Is Bayesian Thinking?
Bayes’ Theorem mathematically defines how to update existing beliefs when new evidence appears.[2]
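In symbols, with H the hypothesis and E the new evidence, the theorem reads:

```latex
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
```

P(H) is the prior, P(E | H) the likelihood of seeing the evidence if the hypothesis is true, and P(H | E) the updated (posterior) belief.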
In practical terms:
- Estimate the initial probability that something is true. (Prior probability)
- New evidence emerges.
- Update the probability by weighing how likely that evidence would be if the belief were true against how likely it would be if the belief were false. (Posterior probability)
Example: a student suddenly does well on an exam. How do you judge “the probability this student’s skills genuinely improved” vs. “the probability they got lucky”? Bayesian thinking combines the student’s historical performance pattern (prior probability) with this result (new evidence) to make a judgment.
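The exam example can be run through Bayes' Theorem directly. The numbers below (a 30% prior, an 80% chance of a high score given real improvement, a 15% chance by luck) are illustrative assumptions, not measured values:

```python
# Did the student's skills genuinely improve, or did they get lucky?
# All probabilities below are illustrative assumptions.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis given new evidence."""
    numerator = p_evidence_if_true * prior
    total_evidence = numerator + p_evidence_if_false * (1 - prior)
    return numerator / total_evidence

# Prior: from the student's historical performance pattern, a 30% chance
# their skills genuinely improved.
prior = 0.30
# Likelihoods: a high score is likely (80%) if skills improved,
# but still possible by luck alone (15%) if they did not.
posterior = bayes_update(prior, p_evidence_if_true=0.80, p_evidence_if_false=0.15)
print(f"P(skills improved | high score) = {posterior:.2f}")  # → 0.70
```

Notice how a single strong result moves the belief from 30% to about 70%, but not to certainty; that remaining 30% is exactly the room the next piece of evidence will fill.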
Using Probabilistic Language
Philip Tetlock found that superforecasters use probabilistic language.[3] Instead of vague expressions like “probably,” “definitely,” or “likely,” they use numbers: “70% probability,” “almost certain (95%+),” “fifty-fifty (50%).”
The benefits of this habit:
- You explicitly acknowledge your own uncertainty.
- You can verify your predictions later.
- You become more open to new evidence.
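The second benefit, verifying predictions later, has a standard tool: the Brier score, which Tetlock's tournaments used to grade forecasters. Here is a minimal sketch; the logged predictions are made-up examples in the spirit of the ones above:

```python
# Log predictions as (stated_probability, outcome) pairs, where outcome is
# 1 if the event happened and 0 if it did not, then score them with the
# Brier score. Lower is better; always saying "fifty-fifty" scores 0.25.
# The sample log below is illustrative.

def brier_score(predictions):
    """Mean squared distance between stated probabilities and outcomes."""
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

log = [
    (0.70, 1),  # "70% the meeting goes well" — it did
    (0.95, 1),  # "almost certain (95%+)" — correct
    (0.50, 0),  # "fifty-fifty" — did not happen
    (0.40, 1),  # "40% this idea will be effective" — it worked
]
print(f"Brier score: {brier_score(log):.3f}")  # → 0.176
```

Vague words like "probably" can never be scored this way; numbers can, and that is what makes the habit self-correcting.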
Probabilistic Thinking in Teaching
I now apply probabilistic thinking to student assessment. Instead of “this student understands this concept,” I say: “The probability that this student actually understands this concept is about 70%. I’ll verify it with a deeper question in the next lesson.”
Why this approach works: because I leave room to be wrong, I update my judgment when new evidence arrives (the student’s responses, the next exam result). With binary thinking, I would have concluded “this student understands it” and ignored the next piece of evidence.
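This updating loop can be sketched as repeated Bayesian updates, where each posterior becomes the prior for the next observation. The likelihoods below are illustrative assumptions, not a real grading rubric:

```python
# Sequentially update "this student understands the concept" as each new
# piece of evidence (responses, the next exam) arrives. The posterior from
# one observation becomes the prior for the next.
# All probabilities are illustrative assumptions.

def update(prior, lik_if_true, lik_if_false):
    """One Bayesian update of a binary belief."""
    numerator = lik_if_true * prior
    return numerator / (numerator + lik_if_false * (1 - prior))

belief = 0.70  # initial estimate: about 70% that the student understands

# Each observation: (P(observation | understands), P(observation | does not))
observations = [
    (0.90, 0.30),  # answered a deeper question correctly in the next lesson
    (0.40, 0.70),  # struggled on a transfer problem on the next exam
]
for lik_true, lik_false in observations:
    belief = update(belief, lik_true, lik_false)
    print(f"updated belief: {belief:.2f}")  # → 0.88, then 0.80
```

A binary "this student understands it" would have ignored both observations; the probabilistic version absorbs each one and moves accordingly.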
On LessWrong, Eliezer Yudkowsky views probabilistic thinking as the core of rationality.[4] Our beliefs should be proportional to the evidence. Weak evidence — weak belief. Strong evidence — strong belief. That’s Bayesian rationality.
Starting today, once a day, try assigning a probability to one of your beliefs or predictions. “I think this meeting will go well (70%).” “I think this idea will be effective (40%).” It feels awkward at first. But this habit will fundamentally change how you think.
References
- Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
- Bayes, T. (1763). An essay towards solving a problem in the doctrine of chances. Philosophical Transactions, 53, 370-418.
- Tetlock, P., & Gardner, D. (2015). Superforecasting. Crown Publishers.
- Yudkowsky, E. (2015). Rationality: From AI to Zombies. MIRI.
- Taleb, N. N. (2007). The Black Swan. Random House.