Proportionality Bias and Conspiracy Thinking: Why Our Brains Demand Big Causes for Big Events




On September 11, 2001, terrorists attacked the United States, killing nearly 3,000 people. It was a tragedy of enormous scale. Within weeks, alternative explanations began circulating—theories that the U.S. government orchestrated the attacks, that controlled demolitions brought down the towers, that missiles hit the Pentagon. These weren’t fringe whispers; millions of intelligent people entertained them.

Why? Because our brains have a powerful—and usually invisible—cognitive bias that demands a sense of proportionality between cause and effect. When something massive happens, we struggle to accept that it emerged from relatively ordinary circumstances. This tendency, known as proportionality bias, is one of the most important psychological mechanisms driving conspiracy thinking in the modern world.

I’ve spent years teaching critical thinking and researching how intelligent people fall into conspiratorial rabbit holes. The pattern is almost always the same: a big event occurs, and our intuition rebels against the idea that it could result from accident, incompetence, or small-scale human action. We demand that huge consequences must flow from huge, deliberate causes. Understanding this bias—how it works, why evolution gave it to us, and how to counteract it—is essential for anyone who wants to think more clearly about the world.

What is Proportionality Bias?

Proportionality bias is the cognitive tendency to believe that big events must have big, important, or complex causes (Sunstein & Vermeule, 2009). In essence, our brains assume a rough equivalence between the magnitude of an outcome and the magnitude of what caused it. A small cause produces a small effect. A massive effect must have a massive cause. [2]


This seems rational on the surface. In many physical domains, it is rational—the energy released in an earthquake is proportional to the magnitude of tectonic movement, and the impact of a car depends largely on its mass and speed. But in complex systems—politics, epidemiology, security, finance—proportionality often breaks down entirely.

Consider: A single person with a box cutter and basic flight training can kill thousands. A single misplaced decimal point in code can crash a space probe worth billions. A single mutation in a virus, transmitted between a few people in a wet market, can trigger a global pandemic. A founder's strategy memo, emailed to a handful of employees, can set a course that eventually destroys billions in shareholder value.

In each case, the effect is massive, but the initiating cause is trivial—almost embarrassingly small. Our brains find this deeply unsatisfying. We want proportionality. And when the world doesn’t provide it, we invent causes large and complex enough to match the outcome we observe (Swami et al., 2011).
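The epidemic example above can be made concrete with a toy calculation. This is a minimal sketch, not an epidemiological model: it simply assumes each infection seeds a fixed number of new cases per generation, with illustrative numbers.

```python
# Toy illustration of small cause, massive effect (not a real
# epidemiological model; numbers are illustrative assumptions).

def cases_after(generations: int, r: float, initial: int = 1) -> int:
    """Cumulative infections after `generations` rounds of spread,
    assuming each current case produces `r` new cases per generation."""
    total, current = initial, initial
    for _ in range(generations):
        current = round(current * r)  # new cases this generation
        total += current
    return total

# One infected person, 3 new cases per case, 10 generations:
print(cases_after(10, 3.0))  # 88573 cumulative cases from a single seed
```

The point of the sketch is the shape of the curve, not the specific numbers: in multiplicative processes, the initiating cause can be arbitrarily small relative to the outcome.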

Why Evolution Gave Us This Bias

Before you dismiss proportionality bias as a design flaw, consider when it was useful. For most of human history, we lived in small groups with transparent cause-and-effect relationships. A lion attack (massive effect) came from a hungry lion (substantial cause). A harvest failure (significant loss) came from drought or disease (observable cause). A leader’s downfall (major event) came from visible rebellion or defeat in battle (proportional cause).

In ancestral environments, the world operated more or less according to proportionality principles. If something huge happened, you wanted to find the huge underlying cause—because that cause was likely still around and still dangerous. The bias helped keep us alive.

The problem is that modern society is nothing like the ancestral environment. We live in complex systems with non-linear dynamics, technological infrastructure, global supply chains, and institutions so intricate that a single point of failure can cascade into civilization-scale consequences. Yet our brains are still searching for proportionality, still demanding that big effects must have big causes.

This mismatch creates a perfect breeding ground for conspiracy thinking. When something genuinely terrible happens—a financial crisis, a terrorist attack, a pandemic—and our intuitions about proportionality aren’t satisfied by the official explanation, we become vulnerable to alternative narratives that do satisfy them. And conspiracy theories, by design, always offer proportionally large causes: shadowy elites, coordinated networks, deliberate malice. [4]

Proportionality Bias and Conspiracy Thinking: The Connection

The relationship between proportionality bias and conspiracy thinking is so strong that researchers now consider it one of the central psychological drivers of conspiratorial belief (Leman & Cinnirella, 2013). Here’s how the mechanism works: [3]

• A major event occurs: A crisis, tragedy, or market collapse that affects many people and dominates media coverage.
• The official explanation feels inadequate: It attributes the outcome to accident, negligence, complexity, or ordinary human greed and shortsightedness. These feel too small.
• Proportionality bias triggers: The brain rejects the explanation as disproportionate. It demands a cause equal to the effect.
• Conspiracy theory provides relief: An alternative narrative emerges that attributes the event to intentional, coordinated action by powerful actors. This feels proportional. The sense of cognitive dissonance resolves.

Importantly, this isn’t about stupidity or education level. Highly intelligent people are just as susceptible to proportionality bias and conspiracy thinking as anyone else—sometimes more so, because they’re skilled at post-hoc rationalization and can construct more elaborate justifications for their conspiratorial beliefs (Sunstein & Vermeule, 2009). I’ve seen this firsthand when teaching critical thinking to graduate students and professionals; the most articulate conspiracy theorists are often the brightest people in the room. [1]

The bias doesn’t require active deception or intentional reasoning. It operates as an intuitive pull—a feeling that something doesn’t add up. And feelings are powerful. A person can intellectually accept that accidents happen, that complex systems fail in unintuitive ways, that small causes can have large effects. But if the emotional sense of proportionality isn’t satisfied, they’ll keep searching for an explanation that is. [5]

Real-World Examples: From Markets to Pandemics

Let’s examine how proportionality bias and conspiracy thinking manifest across different domains:

The 2008 Financial Crisis

Millions of people lost homes and savings. It was the worst economic catastrophe since the Great Depression. Surely, the thinking went, something enormous must have caused it—a coordinated conspiracy by bankers, the Federal Reserve, or the Illuminati.

The actual cause was far more mundane: mortgage-backed securities created through a complex, poorly understood financial engineering process; rating agencies that had perverse incentives to rate garbage as gold; a cultural assumption that housing prices could only go up; and individuals making locally rational decisions that collectively created systemic risk. Incompetence and incentive misalignment, not conspiracy. But it doesn’t feel proportional to the disaster that resulted.

COVID-19 and the Lab-Leak Theory

The COVID-19 pandemic killed millions and disrupted civilization. Early official explanations attributed it to zoonotic transmission—a virus jumping from animals to humans, probably in a market, in a process that has happened many times throughout history. To many people, this felt absurdly insufficient. Millions had died; surely the cause must be equally immense: a bioweapon, a lab leak, a coordinated cover-up.

Here’s where it gets interesting: later investigations suggested a lab leak was plausible, and institutions may have been less than fully transparent. But even when skepticism about official narratives was warranted, proportionality bias shaped which skepticism arose. People didn’t ask, “Could this be a spillover with unusual characteristics?” They demanded, “Wasn’t this deliberately released?”

Unexpected Stock Market Crashes

When the market drops 10% in a day with no obvious trigger, proportionality bias activates immediately. A drop that size must reflect something huge—a hidden economic calamity, a government conspiracy, market manipulation by insiders. Often, investigations later reveal the cause was algorithmic, technical, or based on the simple mechanics of options expiration and margin calls. Small cause, enormous effect. It never quite feels adequate.

How to Recognize Proportionality Bias in Your Own Thinking

Self-awareness is the first step. When you find yourself attracted to a conspiracy theory, ask yourself these diagnostic questions:

• Am I rejecting the official explanation primarily because it feels too small? Notice when your objection is emotional (“This doesn’t feel right, it doesn’t add up”) rather than evidential (“Here’s data contradicting the account”).
• Am I assuming that a massive effect must have a massive cause? Remember that complex systems routinely produce enormous consequences from minor initiating events.
• Am I drawn to explanations that feature intentional, coordinated action by powerful actors? These feel proportional, but they’re not necessarily true. Many disasters involve no one deliberately causing them.
• Would I believe the official explanation if the outcome had been smaller? If a financial downturn of 2% wouldn’t trigger conspiracy theories but one of 30% does, that’s proportionality bias at work, not a change in evidence.
• Am I pattern-matching the current situation to historical conspiracies that were real? The CIA did conduct COINTELPRO, governments have lied about wars, and elites have coordinated against public interest. These facts make us understandably skeptical—but they don’t make current claims proportionally more likely to be true.

The goal isn’t to become naive or credulous of official narratives. Institutions do lie; powerful people do coordinate; transparency is rare. The goal is to evaluate claims on evidence rather than on whether they satisfy your proportionality intuitions.

Practical Strategies to Counteract the Bias

Once you recognize proportionality bias, you can implement evidence-based strategies to think more clearly:

Actively Seek Small-Cause, Large-Effect Examples

Train your intuition by studying cases where minor causes produced massive consequences. The Deepwater Horizon explosion traced back to a failed cement seal and a blowout preventer that didn’t engage. The Fukushima disaster followed a design decision not to build the seawall high enough. Chernobyl resulted from a poorly run safety test on a reactor with a known design flaw. The 1995 Oklahoma City bombing was carried out by two men with a truck full of fertilizer. Build a mental library of these examples so your brain becomes less surprised when they occur.

Distinguish Between “Possible” and “Proportional”

Something can be theoretically possible while remaining extraordinarily unlikely. A government could have orchestrated a terrorist attack, but doing so would require an implausibly vast conspiracy of silence, and the political incentives don’t align with the claimed benefits. Distinguishing between logical possibility and actual likelihood is crucial. Most conspiracy theories fail this test—they’re possible in theory, but they require so many people to keep secrets so successfully that their prior probability is vanishingly small.
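The secret-keeping problem can be put into rough numbers. This is a back-of-the-envelope sketch, assuming each conspirator independently stays silent at a fixed annual rate; the figures below are illustrative assumptions, not empirical estimates.

```python
# Back-of-the-envelope sketch: probability a conspiracy stays secret,
# assuming each of N insiders independently stays silent with
# probability p per year. All numbers are illustrative assumptions.

def secrecy_probability(n_people: int, p_silent_per_year: float, years: int) -> float:
    """P(no one ever talks) = p ** (N * years) under independence."""
    return p_silent_per_year ** (n_people * years)

# 500 insiders, each 99.9% reliable per year, over 10 years:
print(secrecy_probability(500, 0.999, 10))  # ≈ 0.0067 — exposure is near-certain
```

Even with wildly generous reliability assumptions, the probability of sustained silence collapses as the required number of conspirators grows, which is why "possible" and "plausible" come apart so sharply here.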

Demand Better Evidence for Larger Claims

Carl Sagan’s principle applies: extraordinary claims require extraordinary evidence. If you’re going to believe that an event was caused by a vast, coordinated conspiracy rather than by accident, incompetence, or complex-system dynamics, the evidence standard should be very high. A feeling of disproportionality is not evidence. Circumstantial connections are not evidence. The burden of proof increases with the implausibility of the claim.
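Sagan's principle is essentially Bayes' rule in plain language. A minimal sketch with illustrative probabilities (the specific values are assumptions for demonstration) shows why weak, circumstantial evidence barely moves a claim that starts with a tiny prior:

```python
# Minimal Bayes'-rule sketch of "extraordinary claims require
# extraordinary evidence". All probabilities are illustrative assumptions.

def posterior(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """P(claim | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# A vast-conspiracy claim with a 0.1% prior, supported by circumstantial
# evidence that is only twice as likely if the claim is true:
print(round(posterior(0.001, 0.8, 0.4), 4))  # ≈ 0.002 — still very unlikely
```

Doubling the likelihood of the evidence roughly doubles a tiny prior, which still leaves the claim very improbable; only evidence that is many orders of magnitude more likely under the conspiracy than under the mundane explanation would justify belief.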

Practice Comfort with Unsatisfying Explanations

This might be the most important strategy. Train yourself to sit with explanations that feel too small for the outcome they produced. Notice the discomfort, name it, and don’t let it drive you toward false narratives. Our emotions are useful signals, but they’re not truth-detection instruments. An explanation can be true even if it feels inadequate to you emotionally.

Conclusion

Proportionality bias and conspiracy thinking are deeply intertwined—and they’re not character flaws or signs of stupidity. They’re natural outputs of our evolved cognitive architecture confronting a world far more complex and non-linear than the environments in which our brains developed. Understanding this bias is essential for anyone who wants to navigate information landscapes responsibly, maintain epistemic integrity, and avoid being drawn into narratives that feel psychologically satisfying but aren’t evidentially supported.

The path forward isn’t to distrust your intuitions entirely—they contain useful information. Rather, it’s to recognize when you’re being driven by proportionality bias, to actively seek out examples that violate your proportionality expectations, and to demand evidence proportional to the implausibility of the claims you’re evaluating. In doing so, you’ll build resilience against conspiracy thinking and develop a more accurate model of how the world actually works.

Last updated: 2026-03-24




References

  1. Douglas, K. M., et al. (2017). Understanding Conspiracy Theories. Political Psychology.
  2. Van Prooijen, J.-W., & Douglas, K. M. (2018). Belief in Conspiracy Theories: Basic Principles of an Emerging Research Domain. European Journal of Social Psychology.
  3. Douglas, K. M., et al. (2019). Countering Disinformation and Building Trust Online. Nature Reviews Psychology.
  4. Swami, V., et al. (2014). Analytic Thinking Reduces Belief in Conspiracy Theories. Cognition.
  5. Brotherton, R., French, C. C., & Pickering, L. E. (2013). Measuring Belief in Conspiracy Theories: The Generic Conspiracist Beliefs Scale. Frontiers in Psychology.
  6. Lantian, A., et al. (2018). From Proportionality Bias to Conspiracy Theories: The Dual Role of Need for Uniqueness. Social Influence.


Published by

Rational Growth Editorial Team

Evidence-based content creators covering health, psychology, investing, and education. Writing from Seoul, South Korea.
