Decision Tree Analysis
Input: $ARGUMENTS
Overview
Systematic procedure for structuring complex decisions with multiple branches, probabilities, and outcomes.
Step 0: Context Assessment
| Factor | Value | Notes |
|---|---|---|
| Stakes | HIGH / MED / LOW | |
| Probability confidence | HIGH / MED / LOW | How confident are probability estimates? |
| Reversibility | EASY / HARD / IMPOSSIBLE | |
If LOW probability confidence + HIGH stakes: Include empirical validation of key probabilities before making the final decision.
Steps
Step 1: Define the decision problem
Clearly articulate what decision needs to be made:
- State the core decision question
- Identify who is making the decision
- Determine the time horizon and decision deadline
- Clarify the objective (maximize profit, minimize cost, etc.)
- Identify constraints on available options
Step 2: Identify decision points and options
Map all points where the decision maker must choose:
- List the initial decision (root node)
- Identify any subsequent decisions that depend on earlier outcomes
- For each decision point, list all available options
- Ensure options are mutually exclusive
- Include “do nothing” or “wait” where applicable
Step 3: Identify uncertainty and chance events
Map all points where chance affects outcomes:
- List key uncertainties that influence results
- For each uncertainty, define possible outcomes
- Ensure outcomes at each chance node are exhaustive (cover all possibilities)
- Determine where in the tree each chance event occurs
- Identify any dependencies between chance events
Step 4: Construct the tree structure
Build the decision tree connecting all nodes:
- Start with root decision node (first choice)
- Add branches for each option
- After each branch, add relevant chance nodes or subsequent decisions
- Continue until all paths reach terminal nodes
- Review tree for completeness and logical consistency
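A minimal sketch of one way to represent the tree before solving it, assuming Python; the node layout, option names, and all payoff/probability figures below are illustrative placeholders:

```python
def terminal(value):
    """Terminal node with a known payoff."""
    return {"type": "terminal", "value": value}


# Root decision node with one chance node and a "do nothing" baseline
tree = {
    "type": "decision",  # first choice the decision maker faces
    "options": [
        ("launch now", {
            "type": "chance",  # market demand is uncertain
            "branches": [
                (0.6, terminal(500_000)),   # strong demand
                (0.4, terminal(-200_000)),  # weak demand
            ],
        }),
        ("do nothing", terminal(0)),  # baseline option
    ],
}
```

The probabilities and terminal payoffs shown here are stand-ins; they are estimated properly in Steps 5 and 6.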
Step 5: Assign probabilities to chance events
Estimate probability for each branch at chance nodes:
- Gather available data on historical frequencies
- Consult experts for informed estimates
- Use base rates from similar situations
- Assign probability to each branch (0 to 1)
- Verify probabilities sum to 1.0 at each chance node
- Document rationale and uncertainty in estimates
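A small validity check catches probability errors early; a minimal sketch, assuming each chance node's branches are collected as (probability, label) pairs (the tolerance and example outcomes are illustrative):

```python
def check_chance_node(branches, tol=1e-9):
    """Verify branch probabilities are in [0, 1] and sum to 1.0."""
    for p, label in branches:
        if not 0.0 <= p <= 1.0:
            raise ValueError(f"Probability {p} for '{label}' is outside [0, 1]")
    total = sum(p for p, _ in branches)
    if abs(total - 1.0) > tol:
        raise ValueError(f"Probabilities sum to {total}, not 1.0")


# Example: demand outcomes at one chance node
check_chance_node([(0.6, "strong demand"), (0.3, "moderate demand"), (0.1, "weak demand")])
```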
Step 6: Assign values to terminal nodes
Determine the payoff or value at each end state:
- Identify all terminal nodes (end points)
- Calculate or estimate value for each outcome
- Use consistent units (dollars, utility points, etc.)
- Include all relevant costs and benefits in each path
- Account for time value of money if relevant
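Where the time value of money matters, terminal values can be discounted to a common present-value basis. A minimal sketch with placeholder cash flows and an assumed 8% discount rate:

```python
def npv(cash_flows, discount_rate):
    """Discount a path's yearly net cash flows (year 0 = today) to present value."""
    return sum(cf / (1 + discount_rate) ** year for year, cf in cash_flows.items())


# Example terminal value: -100k invested now, +60k in each of years 1 and 2
terminal_value = npv({0: -100_000, 1: 60_000, 2: 60_000}, discount_rate=0.08)
print(round(terminal_value))  # ~6996
```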
Step 7: Calculate expected values (fold back the tree)
Solve the tree working from terminal nodes back to root:
- Start at rightmost nodes (terminal values are known)
- At each chance node: EV = sum(probability x value) for all branches
- At each decision node: choose option with highest EV
- Record the optimal choice at each decision node
- Continue until root node has expected value
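A compact fold-back (rollback) sketch over the dict representation from Step 4; the example tree and all figures are placeholders:

```python
def fold_back(node):
    """Return (expected_value, optimal_option) for a node.

    Nodes are plain dicts as in Step 4:
      {"type": "terminal", "value": v}
      {"type": "chance", "branches": [(probability, child), ...]}
      {"type": "decision", "options": [(label, child), ...]}
    """
    if node["type"] == "terminal":
        return node["value"], None
    if node["type"] == "chance":
        # EV = sum(probability x value) over all branches
        ev = sum(p * fold_back(child)[0] for p, child in node["branches"])
        return ev, None
    # Decision node: choose the option with the highest expected value
    best_label, best_ev = None, float("-inf")
    for label, child in node["options"]:
        child_ev, _ = fold_back(child)
        if child_ev > best_ev:
            best_label, best_ev = label, child_ev
    return best_ev, best_label


tree = {
    "type": "decision",
    "options": [
        ("launch now", {"type": "chance", "branches": [
            (0.6, {"type": "terminal", "value": 500_000}),
            (0.4, {"type": "terminal", "value": -200_000}),
        ]}),
        ("do nothing", {"type": "terminal", "value": 0}),
    ],
}
ev, choice = fold_back(tree)
print(ev, choice)  # 220000.0 launch now
```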
Step 8: Perform sensitivity analysis
Test robustness of optimal strategy to assumptions:
- Identify probabilities with highest uncertainty
- Vary each probability within reasonable range
- Recalculate optimal strategy for each variation
- Find “crossover points” where optimal decision changes
- Assess whether crossover points are plausible
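One way to locate a crossover point is a simple sweep over the uncertain probability. This sketch reuses the two-option example above, with placeholder payoffs:

```python
def launch_ev(p_strong):
    """EV of 'launch now' as the strong-demand probability varies (placeholder payoffs)."""
    return p_strong * 500_000 + (1 - p_strong) * -200_000


# Sweep the uncertain probability and report where the optimal choice flips
# from "do nothing" (EV = 0) to "launch now".
previous = None
for i in range(101):
    p = i / 100
    best = "launch now" if launch_ev(p) > 0 else "do nothing"
    if previous is not None and best != previous:
        print(f"Crossover near p = {p:.2f}: optimal choice becomes '{best}'")
    previous = best
# Analytically, the crossover is at p = 200,000 / 700,000 ≈ 0.286
```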
Step 9: Calculate value of information (optional)
Determine if gathering more information is worthwhile:
- Identify information that could reduce uncertainty
- Calculate EV with perfect information (EVPI)
- Calculate EV with imperfect information if applicable
- Value of information = EV(with info) - EV(without info)
- Compare to cost of obtaining information
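A worked EVPI sketch for a two-option, two-outcome payoff table (all figures are placeholders):

```python
# Payoff table: payoffs[option][outcome]
payoffs = {
    "launch now": {"strong demand": 500_000, "weak demand": -200_000},
    "do nothing": {"strong demand": 0,       "weak demand": 0},
}
prior = {"strong demand": 0.6, "weak demand": 0.4}

# EV without information: commit to the single best option under the prior
ev_without = max(
    sum(prior[s] * payoffs[opt][s] for s in prior) for opt in payoffs
)

# EV with perfect information: pick the best option for each outcome,
# weighted by how likely that outcome was a priori
ev_with = sum(
    prior[s] * max(payoffs[opt][s] for opt in payoffs) for s in prior
)

evpi = ev_with - ev_without
print(ev_without, ev_with, evpi)  # 220000.0 300000.0 80000.0
```

If the cost of obtaining the information exceeds the EVPI, gathering it cannot be worthwhile.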
Step 10: Empirical Validation of Key Probabilities (HIGH stakes)
When to include: HIGH stakes + LOW confidence in probability estimates.
Before committing to the optimal strategy:
- Identify critical probabilities:
  - Which probabilities most affect the optimal decision? (From sensitivity analysis: the crossover points.)
  - These are the ones worth validating.
- Design probability validation:
  - Historical data: What base rates exist?
  - Expert elicitation: What do domain experts estimate?
  - Small-scale tests: Can we observe a sample?
  - Reference class forecasting: What happened in similar situations?
- Log predictions for calibration:
  - Probability estimate: [event] has [X]% chance
  - Confidence in estimate: HIGH/MED/LOW
  - How we'd know: [observation that confirms/disconfirms]
  - Review date: [when we'll know]
→ INVOKE: /empirical_validation [critical probability estimates]
- Update tree if estimates change significantly:
  - Recalculate expected values
  - Check if optimal strategy changes
  - Document the update
When to Use
- Decision has multiple sequential stages or phases
- Uncertain events will influence optimal choices
- Need to compare options with different risk-return profiles
- Want to calculate expected value of different strategies
- Stakeholders need visual representation of decision structure
- Decisions depend on outcomes of intermediate chance events
- Evaluating whether to gather information before deciding
- Comparing “act now” vs “wait and see” strategies
Verification
- Tree structure is complete with all paths reaching terminal nodes
- Probabilities sum to 1.0 at each chance node
- Terminal values are consistently calculated
- Expected values correctly computed by folding back
- Sensitivity analysis performed on key probability estimates
- Optimal strategy clearly identified with decision rule
- Assumptions and limitations documented
- If HIGH stakes + LOW confidence: Probability validation completed
- Key predictions logged for calibration
Integration Points
- Often invoked from: /procedure_engine, /comparison
- Routes to: /selection (final choice), /empirical_validation (probability validation)
- Related: /expected_value, /risk_assessment, /probabilistic_reasoning