Decision Intel Taxonomy
Cognitive Bias Taxonomy
20 cognitive biases with stable, citeable identifiers (DI-B-001 through DI-B-020). Each bias includes academic grounding, real-world case studies, and debiasing techniques. These IDs are permanent and can be referenced in research, compliance audits, and regulatory filings.
DI-B-001
Confirmation Bias
Difficulty: hard
Case Study | Kodak (1975–2012)
Kodak and Digital Photography
Kodak engineer Steve Sasson invented the digital camera in 1975, but leadership dismissed evidence that digital would replace film — selectively citing data showing film was still profitable. By the time they pivoted, it was too late.
Quick Tip: Before deciding, write down what evidence would change your mind — then go look for it.
Detection & Debiasing
- Assign a "Devil's Advocate" to challenge the dominant hypothesis before any major decision.
- Use a structured "Consider the Opposite" exercise: list 3 reasons your conclusion might be wrong.
- Seek out disconfirming evidence before finalizing — specifically look for data that contradicts your position.
Wason, P.C. (1960). "On the failure to eliminate hypotheses in a conceptual task." Quarterly Journal of Experimental Psychology, 12(3), 129–140.
DI-B-002
Anchoring Bias
Difficulty: moderate
Case Study | University of Arizona Study (1987)
Real Estate Pricing Experiments
In Northcraft & Neale's landmark study, real estate agents were given different listing prices for identical properties. Even experienced agents' valuations were heavily influenced by the arbitrary initial price, despite claiming they ignored it.
Quick Tip: Always ask: "Would I reach the same conclusion if the first number I saw was different?"
Detection & Debiasing
- Generate your own estimate BEFORE seeing any reference numbers or proposals.
- Use multiple independent anchors — get 3+ data points from different sources before averaging.
- Explicitly challenge the first number you encounter: ask "Why is this the right starting point?"
Tversky, A. & Kahneman, D. (1974). "Judgment under Uncertainty: Heuristics and Biases." Science, 185(4157), 1124–1131.
DI-B-003
Availability Heuristic
Difficulty: moderate
Case Study | U.S. Transportation Data (2001–2002)
Post-9/11 Driving Deaths
After 9/11, Americans drove instead of flying due to vivid fear of terrorism. This led to an estimated 1,595 additional traffic deaths in the year following — far exceeding the risk of another attack. Vivid, recent events distorted risk assessment.
Quick Tip: When a risk feels scary, look up the actual statistics before adjusting your plans.
Detection & Debiasing
- Always check base rates: ask "How often does this actually happen?" before making probability judgments.
- Use statistical data instead of examples — replace "I heard about a case where…" with "The data shows…"
- Create a "recency check": is this top of mind because it's truly important or because it happened recently?
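A minimal base-rate check can be sketched in a few lines of Python. The event counts below are illustrative placeholders, not real statistics — the point is to compare frequencies per exposure rather than vividness:

```python
def base_rate(events: int, opportunities: int) -> float:
    """Empirical base rate: how often the event actually occurs."""
    return events / opportunities

# Illustrative (hypothetical) numbers: a vivid, headline-grabbing risk
# vs. a mundane one, measured over the same number of exposures.
vivid_risk = base_rate(events=50, opportunities=10_000_000)
mundane_risk = base_rate(events=40_000, opportunities=10_000_000)

# The mundane risk occurs orders of magnitude more often, even though
# the vivid one dominates memory and media coverage.
print(f"vivid: {vivid_risk}, mundane: {mundane_risk}")
```

Replacing "I heard about a case where…" with a computed rate like this is the debiasing move: the comparison is made on frequencies, not on ease of recall.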
Tversky, A. & Kahneman, D. (1973). "Availability: A heuristic for judging frequency and probability." Cognitive Psychology, 5(2), 207–232.
DI-B-004
Groupthink
Case Study | U.S. Government (1961)
Bay of Pigs Invasion
President Kennedy's advisory team unanimously supported the 1961 invasion of Cuba despite serious flaws in the plan. Advisors suppressed doubts to maintain group harmony. After the disaster, Kennedy restructured his decision-making process to explicitly encourage dissent.
Quick Tip: If everyone agrees too quickly, that's a red flag — not a green light.
Detection & Debiasing
- Use anonymous voting or written submissions before group discussion to capture independent opinions.
- Rotate a formal "dissenter" role — someone whose job is to find flaws in the consensus position.
- Split into independent sub-groups that analyze the problem separately before reconvening.
Janis, I.L. (1972). Victims of Groupthink: A Psychological Study of Foreign-Policy Decisions and Fiascoes. Boston: Houghton Mifflin.
DI-B-005
Authority Bias
Difficulty: moderate
Case Study | Yale University (1963)
Milgram Obedience Experiments
Stanley Milgram's experiments showed that 65% of participants administered what they believed were dangerous, potentially lethal electric shocks (the maximum 450 volts) when instructed by an authority figure in a lab coat. Authority cues can override independent moral judgment.
Quick Tip: Separate the argument from the arguer — evaluate evidence, not credentials.
Detection & Debiasing
- Evaluate arguments on evidence quality, not the seniority of who's making them.
- Use blind review processes where the source identity is hidden during evaluation.
- Ask: "Would I accept this argument if an intern presented it?"
Milgram, S. (1963). "Behavioral Study of Obedience." Journal of Abnormal and Social Psychology, 67(4), 371–378.
DI-B-006
Bandwagon Effect
Difficulty: easy
Case Study | NASDAQ Market (1997–2002)
Dot-Com Bubble
In the late 1990s, investors poured money into internet companies with no revenue models because "everyone else was investing." The NASDAQ lost 78% of its value when the bubble burst in 2000–2002, wiping out $5 trillion in market value.
Quick Tip: Popularity is not proof. Ask: "What's the evidence this works, separate from who else is doing it?"
Detection & Debiasing
- Before following a trend, write down your independent rationale — would you still do this if nobody else was?
- Track the base rate of trend adoption vs. actual success rates in your industry.
- Implement a "cooling off" period: wait 48 hours before joining any fast-moving consensus.
Leibenstein, H. (1950). "Bandwagon, Snob, and Veblen Effects in the Theory of Consumers' Demand." Quarterly Journal of Economics, 64(2), 183–207.
DI-B-007
Overconfidence Bias
Difficulty: hard
Case Study | LTCM (1998)
Long-Term Capital Management Collapse
Nobel Prize-winning economists running LTCM were so confident in their models that they leveraged $4.8B in capital into more than $125B in assets, with derivatives exposure far larger still. When markets moved against them in 1998, the fund collapsed so catastrophically it threatened the global financial system.
Quick Tip: Add 30% to your worst-case estimate — that's probably closer to realistic.
Detection & Debiasing
- Use calibration training: practice estimating ranges and track your accuracy over time.
- Replace point estimates with ranges (e.g., "between 2-4 months" instead of "3 months exactly").
- Ask: "What's the probability I'm wrong?" — if you can't say at least 10%, you're likely overconfident.
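Calibration tracking can be sketched with a small scorer: record your stated ranges alongside actual outcomes, then compare the hit rate to the confidence you claimed. The prediction history below is hypothetical:

```python
def interval_hit_rate(predictions):
    """Fraction of true values that fell inside the stated ranges.

    predictions: list of (low, high, actual) tuples for, say, 90%
    confidence intervals. A well-calibrated forecaster's hit rate
    should come close to the stated confidence (here, 0.9)."""
    hits = sum(low <= actual <= high for low, high, actual in predictions)
    return hits / len(predictions)

# Hypothetical track record of "90% confident" range estimates:
history = [
    (2, 4, 5),     # miss: actual fell outside the range
    (1, 3, 2),     # hit
    (10, 20, 12),  # hit
    (5, 8, 9),     # miss
    (3, 6, 4),     # hit
]
rate = interval_hit_rate(history)
print(f"hit rate: {rate:.0%}")  # 60%: well below the claimed 90%
```

A hit rate far below the claimed confidence, as here, is the overconfidence signature: the stated ranges were too narrow.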
Moore, D.A. & Healy, P.J. (2008). "The trouble with overconfidence." Psychological Review, 115(2), 502–517.
DI-B-008
Hindsight Bias
Difficulty: moderate
Case Study | Global Financial Markets (2008)
2008 Financial Crisis "Predictors"
After the 2008 crash, countless analysts claimed they "saw it coming." However, pre-crisis records show most were bullish. Hindsight bias rewrites memory to make past events seem predictable, which prevents learning the real lessons.
Quick Tip: Before reviewing what happened, write down what you expected — then compare honestly.
Detection & Debiasing
- Keep a decision journal: record your predictions AND your confidence BEFORE outcomes are known.
- In post-mortems, start by asking "What did we believe at the time?" before discussing what happened.
- Use pre-registration: document your analysis and predictions before events unfold.
Fischhoff, B. (1975). "Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty." Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288–299.
DI-B-009
Planning Fallacy
Difficulty: moderate
Case Study | NSW Government, Australia (1957–1973)
Sydney Opera House
Originally estimated at $7M and 4 years, the Sydney Opera House took 16 years and cost $102M — a 1,357% cost overrun. This pattern repeats across roughly 90% of large infrastructure projects worldwide.
Quick Tip: How long did similar projects actually take? Use that as your baseline, not your optimism.
Detection & Debiasing
- Use "reference class forecasting": compare to similar past projects rather than building bottom-up estimates.
- Apply a "planning fallacy multiplier" — historical data suggests multiplying time estimates by 1.5–2x.
- Break projects into small milestones and estimate each independently, then add buffer time.
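Reference class forecasting can be sketched in Python. This is a deliberately simplified illustration, not the full method, and the past-project ratios are hypothetical:

```python
def reference_class_forecast(base_estimate: float, past_ratios: list,
                             percentile: float = 0.8) -> float:
    """Adjust a bottom-up estimate using actual/estimated ratios from
    similar past projects (reference class forecasting, simplified).

    percentile=0.8 means: choose an uplift large enough that 80% of
    comparable past projects would have fit within the forecast."""
    ratios = sorted(past_ratios)
    idx = min(int(percentile * len(ratios)), len(ratios) - 1)
    return base_estimate * ratios[idx]

# Hypothetical reference class: actual duration / estimated duration
# for eight comparable past projects.
past = [1.1, 1.3, 1.4, 1.5, 1.6, 1.8, 2.0, 2.4]
forecast = reference_class_forecast(6.0, past)
print(forecast)  # a 6-month bottom-up estimate becomes 12.0 months
```

The point of the technique is that the uplift comes from the measured history of similar projects, not from the planner's optimism about this one.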
Kahneman, D. & Tversky, A. (1979). "Intuitive prediction: biases and corrective procedures." TIMS Studies in Management Science, 12, 313–327.
DI-B-010
Loss Aversion
Difficulty: moderate
Case Study | Coca-Cola (1985)
New Coke Reversal
In 1985, Coca-Cola replaced its formula with New Coke after winning blind taste tests. Despite better taste-test results, consumers revolted — not because New Coke was bad, but because losing the original felt like a loss. Coca-Cola reversed the decision within 79 days.
Quick Tip: Ask: "If I didn't already have this, would I pay to get it?" That reveals whether you're protecting a loss or making a smart choice.
Detection & Debiasing
- Reframe decisions in terms of total outcomes rather than gains vs. losses — focus on the end state.
- Use the "10/10/10 rule": How will you feel about this in 10 minutes, 10 months, and 10 years?
- Calculate the expected value objectively: multiply probability × magnitude for both gain and loss scenarios.
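The expected-value calculation in the last technique is a one-liner; the probabilities and payoffs below are purely illustrative:

```python
def expected_value(scenarios):
    """Expected value of a decision: sum of probability x payoff over
    all outcomes, with losses entered as negative payoffs."""
    return sum(p * payoff for p, payoff in scenarios)

# Hypothetical choice: keep a product line (certain, no change) vs.
# replace it (60% chance of a $50k gain, 40% chance of a $30k loss).
keep = [(1.0, 0)]
replace = [(0.6, 50_000), (0.4, -30_000)]

ev_replace = expected_value(replace)
print(ev_replace)  # positive EV, even though the possible
                   # loss may loom larger psychologically
```

Loss aversion predicts the $30k downside will feel heavier than the $50k upside; computing the EV explicitly keeps both scenarios on the same scale.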
Kahneman, D. & Tversky, A. (1979). "Prospect Theory: An Analysis of Decision under Risk." Econometrica, 47(2), 263–291.
DI-B-011
Sunk Cost Fallacy
Difficulty: moderate
Case Study | British/French Governments (1962–2003)
Concorde Supersonic Jet
The British and French governments continued funding Concorde for decades despite clear evidence it would never be commercially viable — because billions had already been spent. The project became the textbook "Concorde fallacy" example.
Quick Tip: Money already spent is gone. The only question is: "What's the best use of the NEXT dollar?"
Detection & Debiasing
- Apply the "clean slate test": If you were starting fresh today with no prior investment, would you still choose this path?
- Separate past costs from future decisions — what's spent is gone regardless of what you decide next.
- Set pre-defined "kill criteria" at the start of projects and honor them when triggered.
Arkes, H.R. & Blumer, C. (1985). "The psychology of sunk cost." Organizational Behavior and Human Decision Processes, 35(1), 124–140.
DI-B-012
Status Quo Bias
Difficulty: easy
Case Study | Blockbuster (2000–2010)
Blockbuster vs. Netflix
Blockbuster had the chance to buy Netflix for $50M in 2000 but chose to protect its existing store model. The preference for the status quo — 9,000 physical stores — led to bankruptcy in 2010, while Netflix grew to a $150B company.
Quick Tip: Inaction is also a decision. Ask: "What is the cost of NOT changing?"
Detection & Debiasing
- Periodically conduct "zero-base" reviews: justify every ongoing process as if starting from scratch.
- Assign someone to champion change — a specific person whose role is to argue for alternatives.
- Reframe the question: instead of "Should we change?" ask "If we were starting new, would we choose this current setup?"
Samuelson, W. & Zeckhauser, R. (1988). "Status quo bias in decision making." Journal of Risk and Uncertainty, 1(1), 7–59.
DI-B-013
Framing Effect
Difficulty: easy
Case Study | Stanford/Princeton Study (1981)
Asian Disease Problem
Tversky & Kahneman showed that when a medical program was framed as "saving 200 out of 600 people" vs. "400 will die," participants reversed their preferences — despite identical outcomes. How information is presented changes decisions.
Quick Tip: Flip the frame: if the data was presented oppositely, would you still reach the same conclusion?
Detection & Debiasing
- Reframe every proposal in at least two ways (positive and negative framing) before deciding.
- Convert relative numbers to absolute: "20% improvement" → "improves from 50 to 60 successes out of 300 attempts."
- Ask the presenter: "Can you show me this data framed differently?" — then compare your reactions.
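The relative-to-absolute conversion can be automated with a small helper that always prints both framings side by side; a minimal Python sketch:

```python
def both_frames(before: int, after: int, total: int) -> str:
    """Express the same change both as a relative percentage and in
    absolute counts, so neither framing can dominate on its own."""
    rel = (after - before) / before
    return (f"relative: {rel:+.0%} | "
            f"absolute: {before} -> {after} out of {total}")

print(both_frames(50, 60, 300))
# relative: +20% | absolute: 50 -> 60 out of 300
```

Forcing both frames into the same line makes it harder for a presenter's chosen framing to steer the decision.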
Tversky, A. & Kahneman, D. (1981). "The framing of decisions and the psychology of choice." Science, 211(4481), 453–458.
DI-B-014
Selective Perception
Difficulty: hard
Case Study | Hastorf & Cantril Study (1954)
Dartmouth vs. Princeton Football Study
After a rough 1951 football game, students from each school watched identical film footage but "saw" completely different games — each side counted more fouls by the opposing team. Prior allegiances filtered perception of identical evidence.
Quick Tip: Ask someone who disagrees with you to read the same document — compare what each of you noticed.
Detection & Debiasing
- Use structured evaluation rubrics that force attention to all aspects, not just salient ones.
- Have someone with a different perspective review the same information independently.
- Practice "steel-manning": articulate the strongest version of the opposing view before critiquing it.
Hastorf, A.H. & Cantril, H. (1954). "They saw a game: A case study." Journal of Abnormal and Social Psychology, 49(1), 129–134.
DI-B-015
Recency Bias
Case Study | Deloitte Research (2015)
Performance Review Season
Research by Deloitte found that 62% of a performance rating is driven by events in the last 3 months, despite reviews covering a full year. A strong Q4 erases a weak Q1 in most managers' evaluations.
Quick Tip: Before making a judgment based on recent data, check whether the longer-term trend tells a different story.
Detection & Debiasing
- Keep running logs of events/data throughout the evaluation period rather than relying on memory.
- Weight time periods explicitly: assign equal importance to each quarter/month in your analysis.
- When making forecasts, compare current trends against 5-year and 10-year averages.
Murdock, B.B. (1962). "The serial position effect of free recall." Journal of Experimental Psychology, 64(5), 482–488.
DI-B-016
Cognitive Misering
Difficulty: moderate
Case Study | Theranos (2003–2018)
Theranos Due Diligence Failures
Investors poured $700M into Theranos based on Elizabeth Holmes' compelling narrative, without conducting basic technical due diligence. Board members — including former secretaries of state — relied on surface impressions rather than verifying the technology actually worked.
Quick Tip: If a high-stakes decision took less than an hour, you probably didn't think hard enough.
Detection & Debiasing
- Implement mandatory "verification checkpoints" — specific stages where claims must be independently confirmed.
- Use the "5 Whys" technique: ask "why?" five times to push past superficial reasoning.
- Set a minimum deliberation time proportional to decision stakes — high-stakes decisions get at least 48 hours.
Stanovich, K.E. & West, R.F. (2000). "Individual differences in reasoning: Implications for the rationality debate?" Behavioral and Brain Sciences, 23(5), 645–665.
DI-B-017
Halo Effect
Difficulty: moderate
Case Study | Enron (1985–2001)
Enron and the "Smartest Guys in the Room"
Enron's early success in energy trading created a halo that blinded analysts, rating agencies, and investors to massive accounting fraud. The company's prestige and charismatic leadership made stakeholders assume competence across all operations, even as internal controls collapsed.
Quick Tip: If you can't name a specific weakness in the option you favor, you're probably under a halo.
Detection & Debiasing
- Evaluate each dimension of a decision independently — score financial, operational, and strategic merits separately before combining.
- Use blind evaluation where possible: strip names, brands, and reputations from materials before assessment.
- Ask "Would I still rate this highly if the brand/person/track record were unknown?"
Thorndike, E.L. (1920). "A constant error in psychological ratings." Journal of Applied Psychology, 4(1), 25–29.
DI-B-018
Gambler's Fallacy
Difficulty: moderate
Case Study | Monte Carlo Casino (1913)
Monte Carlo Casino, 1913
At the Monte Carlo Casino, the roulette ball landed on black 26 times in a row. Gamblers lost millions betting on red, convinced that a "correction" was due — but each spin was independent. The same fallacy appears in investment committees after a string of successful deals: "We're due for a miss."
Quick Tip: Past outcomes only predict future ones when there is a genuine causal link — not just a pattern.
Detection & Debiasing
- Explicitly state the base rate for each outcome before evaluating a sequence of events.
- Ask "Is there a causal mechanism linking past outcomes to this decision, or am I pattern-matching on randomness?"
- Use pre-commitment: decide your criteria before seeing the outcome sequence.
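The independence point can be checked empirically. A minimal Python simulation of a fair two-color wheel (ignoring the green zero for simplicity) shows that the probability of red immediately after a streak of blacks stays at the base rate:

```python
import random

def red_after_black_streak(spins: int, streak_len: int = 5,
                           seed: int = 42) -> float:
    """Simulate fair two-color spins and measure P(red) on the spin
    immediately following a run of `streak_len` blacks. If streaks
    made red "due", this would come out above 0.5."""
    rng = random.Random(seed)
    run, hits, trials = 0, 0, 0
    for _ in range(spins):
        color = rng.choice(("red", "black"))
        if run >= streak_len:          # we are right after a black streak
            trials += 1
            hits += (color == "red")
        run = run + 1 if color == "black" else 0
    return hits / trials

p = red_after_black_streak(1_000_000)
print(round(p, 3))  # ~0.5: the streak does not make red "due"
```

Each spin is independent, so conditioning on the previous five outcomes changes nothing — exactly the causal-mechanism question from the second technique, answered with data.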
Tversky, A. & Kahneman, D. (1971). "Belief in the law of small numbers." Psychological Bulletin, 76(2), 105–110.
DI-B-019
Zeigarnik Effect
Difficulty: hard
Case Study | Yahoo (2002–2013)
Yahoo's Incomplete Acquisitions
Yahoo famously passed on acquiring Google (2002) and Facebook (2006). In subsequent acquisition decisions, the unfinished business of those missed deals heavily influenced strategy — leading to the overpriced $1.1B acquisition of Tumblr, partly driven by the psychological weight of previous "ones that got away."
Quick Tip: If a past missed opportunity keeps coming up in your current deliberation, name it and set it aside.
Detection & Debiasing
- Explicitly list all "open loops" (unresolved past decisions) before starting a new evaluation — then consciously set them aside.
- Use a decision journal: closing out past decisions in writing reduces their psychological pull.
- Ask "Am I evaluating this opportunity on its own merits, or trying to close an old chapter?"
Zeigarnik, B. (1927). "Das Behalten erledigter und unerledigter Handlungen" [On finished and unfinished tasks]. Psychologische Forschung, 9, 1–85.
DI-B-020
Paradox of Choice
Difficulty: easy
Case Study | Columbia University / Enterprise Procurement (2000)
Jam Study and Enterprise Software Selection
Sheena Iyengar's famous study showed customers were 10x more likely to purchase when offered 6 jam varieties vs. 24. The same effect plagues enterprise procurement: when evaluating too many vendor options, committees often default to the incumbent or delay decisions indefinitely.
Quick Tip: If your team has been evaluating options for weeks without converging, you probably have too many options.
Detection & Debiasing
- Limit initial screening to 3–5 options maximum using pre-agreed criteria before deep evaluation.
- Use "satisficing" deliberately: define your minimum acceptable criteria upfront and choose the first option that meets them.
- Implement a two-stage process: broad screening with hard cutoffs, then deep comparison of the shortlist only.
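The two-stage process in the last technique can be sketched in Python; the vendor records, cutoff rules, and scoring function below are all hypothetical:

```python
def two_stage_select(options, hard_cutoffs, score, shortlist_size=4):
    """Two-stage selection: first apply pre-agreed hard cutoffs, then
    rank only the survivors with a deeper scoring function."""
    screened = [o for o in options if all(rule(o) for rule in hard_cutoffs)]
    return sorted(screened, key=score, reverse=True)[:shortlist_size]

# Hypothetical vendor records:
vendors = [
    {"name": "A", "price": 90,  "uptime": 99.90, "support": 8},
    {"name": "B", "price": 150, "uptime": 99.50, "support": 9},
    {"name": "C", "price": 60,  "uptime": 98.00, "support": 5},
    {"name": "D", "price": 80,  "uptime": 99.95, "support": 7},
]
# Pre-agreed hard cutoffs: affordable AND reliable enough.
cutoffs = [lambda v: v["price"] <= 100, lambda v: v["uptime"] >= 99.0]

best = two_stage_select(vendors, cutoffs,
                        score=lambda v: v["support"], shortlist_size=2)
print([v["name"] for v in best])  # ['A', 'D']
```

B fails the price cutoff and C fails the uptime cutoff before any deep comparison happens, so the committee only has to deliberate over two options instead of four.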
Iyengar, S.S. & Lepper, M.R. (2000). "When choice is demotivating: Can one desire too much of a good thing?" Journal of Personality and Social Psychology, 79(6), 995–1006.
Taxonomy IDs are permanent and will never change. Cite as: "Decision Intel Bias Taxonomy, 2026. [DI-B-XXX]."