Purpose: Systematically evaluate options using weighted criteria to make data-driven executive decisions.
Instructions:
Pro Tip: Present this framework AFTER your recommendation to show rigorous analysis, not to replace executive judgment.
Example: "We need to decide between three approaches for scaling our data infrastructure: (1) Migrate to managed cloud database, (2) Build custom distributed system, (3) Hybrid approach with caching layer. This decision impacts $2.4M annual cost, 18-month engineering roadmap, and our ability to support 10x user growth. Decision must be made by end of Q1 to stay on schedule."
| Criterion | Weight | What We're Measuring |
|---|---|---|
| [Criterion 1] | [%] | [How we evaluate this—what good looks like] |
| [Criterion 2] | [%] | [How we evaluate this—what good looks like] |
| [Criterion 3] | [%] | [How we evaluate this—what good looks like] |
| [Criterion 4] | [%] | [How we evaluate this—what good looks like] |
| [Criterion 5] | [%] | [How we evaluate this—what good looks like] |
| TOTAL | 100% | Weights must sum to 100% |
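The "weights must sum to 100%" constraint is worth checking mechanically before scoring begins; a minimal sketch in Python (the criterion names and percentages below are placeholders, not recommendations):

```python
# Validate that decision-matrix weights sum to 100%.
# Criterion names and percentages are placeholder examples only.
weights = {
    "Criterion 1": 30,
    "Criterion 2": 25,
    "Criterion 3": 20,
    "Criterion 4": 15,
    "Criterion 5": 10,
}

total = sum(weights.values())
assert total == 100, f"Weights sum to {total}%, not 100%"
```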
Example Criteria and Weights:
- Total Cost (30%)
- Time to Market (25%)
- Technical Risk (20%)
- Strategic Alignment (15%)
- Operational Complexity (10%)
Scoring Scale: 1 = Poor | 2 = Below Average | 3 = Average | 4 = Good | 5 = Excellent
| Criterion (Weight) | Option A | Option B | Option C | Rationale |
|---|---|---|---|---|
| [Criterion 1] [%] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [Why these scores?] |
| [Criterion 2] [%] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [Why these scores?] |
| [Criterion 3] [%] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [Why these scores?] |
| [Criterion 4] [%] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [Why these scores?] |
| [Criterion 5] [%] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [1-5] → [Score × Weight] | [Why these scores?] |
| WEIGHTED TOTAL | [Total] | [Total] | [Total] | Out of 5.0 |
Example Completed Matrix:
| Criterion | Managed Cloud | Custom Build | Hybrid |
|---|---|---|---|
| Total Cost (30%) | 4 × 0.30 = 1.20 | 2 × 0.30 = 0.60 | 3 × 0.30 = 0.90 |
| Time to Market (25%) | 5 × 0.25 = 1.25 | 2 × 0.25 = 0.50 | 3 × 0.25 = 0.75 |
| Technical Risk (20%) | 4 × 0.20 = 0.80 | 2 × 0.20 = 0.40 | 3 × 0.20 = 0.60 |
| Strategic Alignment (15%) | 3 × 0.15 = 0.45 | 5 × 0.15 = 0.75 | 4 × 0.15 = 0.60 |
| Operational Complexity (10%) | 5 × 0.10 = 0.50 | 1 × 0.10 = 0.10 | 3 × 0.10 = 0.30 |
| WEIGHTED TOTAL | 4.20 | 2.35 | 3.15 |
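The weighted totals in the completed matrix can be reproduced mechanically as the sum of each raw score times its criterion weight; a short Python sketch using the example's numbers:

```python
# Reproduce the example matrix: weighted total = sum(score * weight).
weights = {
    "Total Cost": 0.30,
    "Time to Market": 0.25,
    "Technical Risk": 0.20,
    "Strategic Alignment": 0.15,
    "Operational Complexity": 0.10,
}

# Raw 1-5 scores per option, in the same criterion order as `weights`.
scores = {
    "Managed Cloud": [4, 5, 4, 3, 5],
    "Custom Build":  [2, 2, 2, 5, 1],
    "Hybrid":        [3, 3, 3, 4, 3],
}

totals = {
    option: round(sum(s * w for s, w in zip(raw, weights.values())), 2)
    for option, raw in scores.items()
}
print(totals)  # Managed Cloud: 4.2, Custom Build: 2.35, Hybrid: 3.15
```

Keeping scores and weights in code (or a spreadsheet) rather than hand-multiplying makes the sensitivity analysis below a one-line change.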
Example: "Managed cloud database (Score: 4.2) is the clear winner, scoring highest on cost, time-to-market, risk, and operational simplicity. While it scored lower on strategic alignment due to vendor lock-in concerns, the 9-month faster time-to-market and $800k lower 3-year TCO make it the pragmatic choice for our current scale. We can revisit the custom build option if we reach 100M+ users where cost dynamics change."
Example: "If time-to-market were less critical (weighted 10% instead of 25%), the hybrid approach scores 3.4 vs. managed cloud's 3.9, still behind but closer. If strategic alignment were weighted 30% instead of 15%, custom build scores 3.25 vs. managed cloud's 3.95, and managed cloud still wins. Our recommendation is robust across reasonable weighting scenarios."
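Sensitivity checks like the one above can be scripted. The prose does not specify how the freed-up weight is redistributed, so the sketch below assumes a proportional redistribution across the remaining criteria; the exact totals therefore differ from the quoted figures, but the ranking conclusion is the same:

```python
# Sensitivity check: change one criterion's weight, redistribute the
# difference proportionally across the others (an assumption -- the
# redistribution scheme is not specified), and recompute totals.
def reweight(weights, criterion, new_weight):
    old = weights[criterion]
    scale = (1.0 - new_weight) / (1.0 - old)  # rescale the other weights
    return {c: (new_weight if c == criterion else w * scale)
            for c, w in weights.items()}

weights = {"Total Cost": 0.30, "Time to Market": 0.25, "Technical Risk": 0.20,
           "Strategic Alignment": 0.15, "Operational Complexity": 0.10}
scores = {"Managed Cloud": [4, 5, 4, 3, 5],
          "Custom Build":  [2, 2, 2, 5, 1],
          "Hybrid":        [3, 3, 3, 4, 3]}

def totals(w):
    return {opt: round(sum(s * wi for s, wi in zip(raw, w.values())), 2)
            for opt, raw in scores.items()}

# Scenario: time-to-market weighted 10% instead of 25%.
scenario = reweight(weights, "Time to Market", 0.10)
result = totals(scenario)
# Managed cloud still ranks first under this redistribution.
assert max(result, key=result.get) == "Managed Cloud"
```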
Note: This framework supports decision-making but doesn't replace executive judgment. Use it to structure analysis and ensure rigor, but the final decision should consider intangibles, organizational context, and strategic intuition that don't fit neatly into scorecards.
Template provided by Tech Exec Insight | techexecinsight.com