Tier 4

pcef

Procedure Effectiveness

Input: $ARGUMENTS


Overview

Unified framework for procedure effectiveness tracking. Combines two approaches:

  • OPERATIONAL: Usage logging, value ratings, tier assignments, action items
  • EMPIRICAL: Correlation analysis, outcome tracking, statistical thresholds

Use operational tracking after each significant procedure use. Use empirical analysis when you have 5+ projects with structured data.

Steps

Step 1: Operational Tracking — Log Usage

After each significant procedure use (>15 minutes):

PROCEDURE USE LOG:
Procedure: [name/abbreviation]
Date: [when]
Context: [what problem/goal]
Duration: [how long]
Output quality: [1-5, where 5 = highly valuable output]
Insight generated: [Y/N — did it reveal something you didn't know?]
Action items produced: [N items]
Would use again for similar problem: [Y/N]
Notes: [anything unusual about this use]
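For append-only tracking, each use of the template above can be stored as one JSON line. A minimal sketch, assuming a hypothetical JSONL file path and field names mirroring the template:

```python
import json
from datetime import date

def log_procedure_use(path, procedure, context, duration_min,
                      quality, insight, action_items, reuse, notes=""):
    """Append one procedure-use record as a JSON line (hypothetical schema)."""
    record = {
        "procedure": procedure,
        "date": date.today().isoformat(),
        "context": context,
        "duration_min": duration_min,
        "output_quality": quality,     # 1-5, 5 = highly valuable output
        "insight_generated": insight,  # bool: revealed something unknown?
        "action_items": action_items,  # count of items produced
        "would_use_again": reuse,      # bool
        "notes": notes,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
```

A JSONL file keeps each entry independent, so later steps (rating averages, tier assignment) can stream the log without parsing a monolithic document.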

Step 2: Operational Tracking — Rate Value

After logging, assign a value rating:

| Rating | Criteria | Description |
|---|---|---|
| 5 — Essential | Changed the outcome | Without this procedure, the result would be significantly worse |
| 4 — Valuable | Improved efficiency or quality | Could have gotten there without it, but slower/messier |
| 3 — Useful | Provided structure | Helped organize thinking, but didn't change direction |
| 2 — Marginal | Added little beyond what you'd do naturally | Procedure was overhead, not value |
| 1 — Wasteful | Consumed time without benefit | Would actively avoid next time |

Step 3: Operational Tracking — Assign Tier

Based on accumulated usage data:

| Tier | Criteria | Action |
|---|---|---|
| A — Core | Average rating ≥ 4, used 5+ times | Maintain, refine, promote |
| B — Useful | Average rating ≥ 3, used 3+ times | Keep, improve weak areas |
| C — Situational | Average rating ≥ 3, used 1-2 times | Keep, monitor for more use |
| D — Underperforming | Average rating < 3 | Revise or archive |
| U — Untested | Never used | Test in next relevant situation |
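The tier rules are mechanical enough to sketch in code. The thresholds and tier letters below come directly from the table; the function name is illustrative:

```python
def assign_tier(ratings):
    """Map a procedure's rating history (list of 1-5 scores) to a tier."""
    uses = len(ratings)
    if uses == 0:
        return "U"  # Untested: never used
    avg = sum(ratings) / uses
    if avg >= 4 and uses >= 5:
        return "A"  # Core: maintain, refine, promote
    if avg >= 3 and uses >= 3:
        return "B"  # Useful: keep, improve weak areas
    if avg >= 3:
        return "C"  # Situational: keep, monitor (1-2 uses)
    return "D"      # Underperforming: revise or archive
```

Note the checks run top-down, so a procedure with a high average but few uses falls through to C rather than A — more uses are needed before promoting it to Core.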

Step 4: Empirical Analysis — Correlate with Outcomes

When you have 5+ completed projects:

  1. Gather project data:

    • Project success score (1-5)
    • Which procedures were used
    • How many procedures used
    • Time from start to completion
  2. Calculate correlations:

    • Does using procedure X correlate with higher project success?
    • Does using MORE procedures correlate with success (or are there diminishing returns)?
    • Do certain COMBINATIONS of procedures predict success?
  3. Statistical thresholds:

    • Correlation ≥ 0.3: Weak positive relationship (investigate)
    • Correlation ≥ 0.5: Moderate positive relationship (likely useful)
    • Correlation ≥ 0.7: Strong positive relationship (definitely useful)
    • Note: With small samples, these should be interpreted cautiously
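A minimal sketch of the correlation step: plain Pearson correlation over paired lists. With a binary "procedure used" vector against 1-5 success scores, this is the point-biserial correlation. The data below is hypothetical:

```python
import math

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    if sx == 0 or sy == 0:
        return 0.0  # no variance in one variable: correlation undefined
    return cov / (sx * sy)

# Hypothetical data: was procedure X used, and how did the project score?
used_x  = [1, 1, 0, 1, 0, 0]
success = [5, 4, 2, 4, 3, 2]
r = correlation(used_x, success)  # ~0.90: strong positive relationship
```

With only 5-6 projects a single outlier can swing r dramatically, which is why the note about small samples matters: treat these numbers as prompts to investigate, not as proof.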

Step 5: Empirical Analysis — Identify Patterns

Look for:

| Pattern | What It Means | Action |
|---|---|---|
| Procedure X always precedes success | X is likely valuable | Ensure X is used |
| Procedure X sometimes helps, sometimes doesn't | Context-dependent | Identify which contexts |
| Procedure X correlates with FAILURE | X may be misapplied or flawed | Investigate |
| Procedures X+Y together predict success | Synergy | Use them together |
| More procedures ≠ more success | Quality over quantity | Focus on high-value procedures |
| Certain procedures are never used | May be unnecessary | Archive or promote |
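The X+Y synergy pattern can be checked mechanically. A sketch that compares average success when a procedure pair was used together versus otherwise, assuming a hypothetical project record shape (`{"procedures": set, "success": 1-5}`):

```python
from itertools import combinations

def pair_success_rates(projects):
    """For each procedure pair, return (avg success with both used,
    avg success otherwise). Pairs that never co-occur are omitted."""
    procs = sorted({p for prj in projects for p in prj["procedures"]})
    results = {}
    for a, b in combinations(procs, 2):
        both = [p["success"] for p in projects if {a, b} <= p["procedures"]]
        rest = [p["success"] for p in projects if not {a, b} <= p["procedures"]]
        if both and rest:
            results[(a, b)] = (sum(both) / len(both), sum(rest) / len(rest))
    return results
```

A large gap between the two averages flags a candidate synergy; with small samples, treat it as a lead to investigate, not a conclusion.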

Step 6: Generate Action Items

From the analysis:

PROCEDURE EFFECTIVENESS REVIEW:
Period: [date range]
Projects analyzed: [N]
Procedures tracked: [N]

Tier assignments:
A (Core): [list]
B (Useful): [list]
C (Situational): [list]
D (Underperforming): [list]
U (Untested): [list]

Key findings:
1. [finding] — action: [what to do]

Revisions needed:
1. [procedure] — issue: [what's wrong] — fix: [how to improve]

Archive candidates:
1. [procedure] — reason: [why it's not working]

Promotion candidates:
1. [procedure] — reason: [why it deserves more use]

When to Use

  • Operational: After completing any significant procedure (>15 minutes)
  • Empirical: After completing 5+ projects with structured data
  • Quarterly procedure review
  • When deciding to keep/archive procedures
  • When procedure effectiveness is disputed
  • → INVOKE: /cppd (cross-project pattern detection) for system-level patterns
  • → INVOKE: /prr (procedure review) for individual procedure improvement

Verification

  • Usage logged with consistent format
  • Value ratings assigned honestly (not inflated)
  • Tier assignments based on data (not assumptions)
  • Correlations calculated with appropriate caution about sample size
  • Action items are specific (not just “improve procedure X”)
  • Archive/promotion decisions justified