mental-model · generic · path, force, scale · cause, accumulate, prevent · cycle

Escalation of Commitment


Decision-makers increase investment in a failing course of action to justify prior decisions, creating a ratchet that deepens the trap.

Transfers

  • predicts that decision-makers who have publicly committed to a course of action will increase their investment after receiving negative feedback, because abandonment would require admitting the original decision was wrong
  • identifies a self-reinforcing trap where each incremental investment raises the psychological cost of quitting, creating a ratchet effect in which the decision-maker is progressively locked in by their own prior choices

Limits

  • cannot reliably distinguish irrational escalation from rational persistence -- some failing projects genuinely need more investment to reach the threshold where returns materialize, and the model provides no criterion for telling the two apart from the inside
  • focuses on individual psychology (self-justification, ego defense) while underweighting structural forces -- organizational incentives, political commitments, contractual lock-in, and audience costs often drive escalation more powerfully than any cognitive bias

Structural neighbors

Beliefs Are Beings with a Life Cycle life-course · path, scale, cause
Perfectionism psychology · path, scale, cause
The Cure Is Worse Than the Disease medicine · path, force, cause
Analysis Paralysis medicine · path, force, cause
Natural Capital ecology · scale, cause
Sunk Cost Fallacy related
Loss Aversion related
Full commentary & expressions

Transfers

Escalation of commitment describes the tendency to continue investing in a decision after receiving evidence that it is failing. Unlike the sunk cost fallacy, which focuses on the irrationality of considering past costs, escalation of commitment emphasizes the dynamic — the way each new investment makes the next one feel more necessary, creating a self-reinforcing trap.

Key structural parallels:

  • Self-justification as engine — Barry Staw’s original formulation (1976) identified the core mechanism: decision-makers who are personally responsible for the initial choice feel compelled to vindicate it. Admitting failure means admitting that the original decision was wrong, which threatens self-concept and professional reputation. The easier path is to invest more, reframe the setbacks as temporary, and wait for the eventual payoff. Each new investment deepens the commitment to the narrative that the original decision was sound.
  • The ratchet effect — escalation is not a single bad decision but a series of incrementally rational ones. At each stage, the decision-maker compares the cost of continuing (one more increment) against the cost of stopping (admitting total loss plus reputational damage). The incremental cost always looks manageable; the total cost only becomes visible in retrospect. This is why escalation is so difficult to detect from the inside: no single step is obviously irrational.
  • Public commitment amplifies the trap — Staw and Ross’s research showed that escalation intensifies when the decision is public, when the decision-maker is personally identified with the project, and when external audiences are watching. A CEO who announces a strategic initiative at an investor conference has staked their credibility on it. A politician who promises a war will be short has staked their electoral future. The audience creates accountability that makes reversal feel like betrayal rather than learning.
  • Organizational embedding — escalation is not just a cognitive bias; it is an organizational process. Failing projects accumulate staff, budgets, political sponsors, and institutional defenders. Canceling a project means firing the team, writing off the budget, and embarrassing the sponsors. These are not sunk costs (they are future costs of stopping), and they create structural barriers to de-escalation that have nothing to do with individual psychology.
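The incremental comparison behind the ratchet effect can be made concrete with a toy simulation. This is an illustrative sketch only, not a model from the escalation literature; the function name, parameters, and decision rule are assumptions. At each stage the decision-maker weighs one more increment against conceding the whole accumulated loss plus a fixed reputational penalty, so once the penalty exceeds a single increment the comparison never flips:

```python
# Toy model of the ratchet effect (illustrative; names and the decision
# rule are assumptions, not taken from Staw's experimental paradigm).

def escalation_trajectory(increment: float, admit_penalty: float,
                          max_stages: int) -> list[float]:
    """Track sunk investment while the incremental comparison favors continuing.

    Perceived cost of stopping = accumulated loss + reputational penalty.
    Perceived cost of continuing = just one more increment.
    """
    sunk = 0.0
    history: list[float] = []
    for _ in range(max_stages):
        cost_to_continue = increment           # one more manageable step
        cost_to_stop = sunk + admit_penalty    # admit total loss + reputation hit
        if cost_to_continue >= cost_to_stop:
            break                              # stopping finally looks cheaper
        sunk += increment
        history.append(sunk)
    return history

if __name__ == "__main__":
    # With any admit_penalty larger than one increment, the loop never
    # breaks: each step looks cheaper than conceding, so losses pile up.
    print(escalation_trajectory(increment=1.0, admit_penalty=5.0, max_stages=10))
```

Note that in this sketch the only exit is lowering the penalty relative to the increment, which loosely mirrors Staw's finding that decision-makers who inherit a project (and so bear no admission cost) de-escalate far more readily.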

Limits

  • The model cannot distinguish perseverance from pathology — this is the fundamental limitation. Many successful endeavors look like escalation of commitment during the difficult middle: Amazon lost money for years before becoming profitable; SpaceX nearly failed three times before achieving orbit. From the inside, rational persistence and irrational escalation feel identical. The model names the pattern but provides no reliable decision rule for when to stop. It is diagnostic after the fact and largely useless in the moment.
  • Structural forces dwarf cognitive ones — the psychological literature focuses on self-justification and ego defense, but in organizational settings, escalation is often driven by incentive structures (managers rewarded for project continuation, not project cancellation), political dynamics (no one wants to be the person who killed the project), and contractual obligations (termination penalties exceed continuation costs). Treating escalation as a cognitive bias when it is structurally incentivized leads to solutions aimed at the wrong level.
  • De-escalation is not always available — the model implicitly assumes that stopping is an option. In many situations, it is not. Military engagements cannot be unilaterally abandoned without consequences. Infrastructure projects cannot be half-completed. Mergers cannot be easily reversed. The model’s prescription — “recognize the trap and stop” — assumes a freedom of action that decision-makers often do not have.
  • Cultural variation in the meaning of persistence — Western rational-actor models frame continued investment after negative feedback as a bias. But in contexts where perseverance is a core value (Japanese corporate culture, military honor codes, many religious traditions), what the model calls “escalation” may be experienced as faithfulness, duty, or moral consistency. The model’s normative frame (escalation is always bad) is culturally specific.
  • The model is retrospective — we only call it escalation when the project ultimately fails. When the project succeeds, we call it vision, persistence, or leadership. This creates a systematic bias in the literature: only failed escalations are studied, which inflates the apparent irrationality of the behavior.

Expressions

  • “We’ve invested too much to stop now” — the canonical escalation justification, which conflates past investment with future value
  • “Doubling down” — the gambling metaphor for increasing commitment after losses
  • “Throwing good money after bad” — the folk critique, shared with the sunk cost fallacy
  • “Stay the course” — the political version, typically deployed when negative evidence accumulates
  • “We just need to push through” — the startup version, framing escalation as resilience
  • “Quagmire” — the geopolitical archetype, epitomized by Vietnam, where escalation of military commitment continued for years past the point where strategic objectives had become unattainable
  • “Project death march” — the software engineering term for a project everyone knows will fail but no one will cancel

Origin Story

Barry Staw introduced the concept in his 1976 paper “Knee-Deep in the Big Muddy,” which used an experimental paradigm where MBA students made investment decisions in a simulated company. Students who were personally responsible for the initial investment allocated significantly more funds after receiving negative feedback than those who inherited someone else’s decision. The title referenced the Pete Seeger anti-Vietnam War song, connecting the laboratory finding to the most visible escalation of the era.

Staw and his collaborator Jerry Ross developed the model through the 1980s, identifying four categories of escalation determinants: project (objective features of the investment), psychological (self-justification, sunk cost reasoning), social (peer pressure, audience effects), and organizational (institutional momentum, political dynamics). Joel Brockner’s parallel work on “entrapment” explored similar dynamics in interpersonal and auction settings.

The concept gained renewed attention after large-scale organizational failures — the Concorde program, the Sydney Opera House, the FBI’s Virtual Case File system, the Iraq War — each of which exhibited the classic escalation pattern: initial commitment, negative feedback, increased investment, further deterioration, eventual abandonment at vastly greater cost than early cancellation would have required.

References

  • Staw, B.M. “Knee-Deep in the Big Muddy: A Study of Escalating Commitment to a Chosen Course of Action” (1976) — Organizational Behavior and Human Performance, the foundational paper
  • Staw, B.M. & Ross, J. “Understanding Behavior in Escalation Situations” (1989) — Science, the mature four-factor model
  • Brockner, J. “The Escalation of Commitment to a Failing Course of Action” (1992) — Academy of Management Review, a major review and synthesis
  • Sleesman, D.J. et al. “Cleaning Up the Big Muddy: A Meta-Analytic Review of the Determinants of Escalation of Commitment” (2012) — Academy of Management Journal, the most comprehensive quantitative review

Contributors: agent:metaphorex-miner