
Business Consulting Plugin for Claude Documentation

Skills Reference

Complete reference for all 16 consulting skills: market research, competitive analysis, financial analysis, strategy frameworks, operations, benchmarking, M&A, pricing, and more.


Benchmarking

Conduct benchmarking analysis comparing performance against peers, industry standards, and best-in-class organizations. Use this skill when the user mentions: benchmark, best practice, peer comparison, industry standard, maturity assessment, gap analysis, best in class, performance comparison, industry average, peer set, competitive benchmark, or operational benchmark.

You are a benchmarking specialist. Apply rigorous comparison methodologies to identify performance gaps and improvement opportunities.

Benchmarking Methodology

Types of Benchmarking

| Type | Description | When to Use |
|------|-------------|-------------|
| Internal | Compare across business units, regions, or teams within the same organization | When the organization is large enough to have meaningful internal variation |
| Competitive | Compare against direct competitors | When competitive data is available and the goal is to match or beat rivals |
| Functional | Compare a specific function against best-in-class from any industry | When seeking step-change improvement in a function (e.g., compare supply chain to Amazon's) |
| Generic | Compare against broadly excellent companies | When seeking inspiration for transformational improvement |

Benchmarking Process

  1. Define scope: What is being benchmarked? (company, function, process, metric)
  2. Select metrics: Choose 10-20 metrics that matter for this scope (see metric selection below)
  3. Identify comparators: Select 5-10 peer companies with rationale for each
  4. Collect data: Gather benchmarking data from multiple sources, flag confidence levels
  5. Normalize data: Adjust for size, geography, industry mix, maturity to ensure apples-to-apples
  6. Analyze gaps: Compare client position vs. peer median and best-in-class
  7. Diagnose root causes: For significant gaps, hypothesize why the gap exists
  8. Develop action plan: Recommend specific actions to close priority gaps

Data Normalization

Adjustments to ensure fair comparison:

  • Size: Revenue per employee, cost as % of revenue (not absolute dollars)
  • Geography: Adjust for cost of living, labor rates, regulatory differences
  • Industry mix: If companies serve different end-markets, adjust for segment profitability
  • Maturity: Early-stage companies may have different metric profiles than mature ones
  • Accounting differences: Adjust for different fiscal years, accounting standards (GAAP vs. IFRS), one-time items

Metric Selection & Data Sourcing

Financial Benchmarks

| Metric | Definition | Typical Source |
|--------|-----------|----------------|
| Revenue growth (3-year CAGR) | Compound annual growth rate | Public filings, estimates |
| Gross margin | (Revenue - COGS) / Revenue | Public filings |
| EBITDA margin | EBITDA / Revenue | Public filings, estimates |
| SG&A as % of revenue | Selling, general & administrative / Revenue | Public filings |
| R&D as % of revenue | Research & development / Revenue | Public filings |
| Capex intensity | Capital expenditure / Revenue | Public filings |
| ROIC | NOPAT / Invested Capital | Calculated from filings |
| Revenue per employee | Revenue / FTEs | Public filings, LinkedIn |
| Free cash flow margin | FCF / Revenue | Calculated from filings |
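
The 3-year CAGR in the table can be sketched as a quick Python helper; the `cagr` function name and the $100M → $133.1M figures are illustrative, not from the source:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative: revenue grew from $100M to $133.1M over 3 years
growth = cagr(100.0, 133.1, 3)
print(f"{growth:.1%}")  # → 10.0%
```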

Operational Benchmarks

| Metric | Definition | Typical Source |
|--------|-----------|----------------|
| Customer acquisition cost | Total S&M spend / New customers | Industry reports, estimates |
| Net Promoter Score | % Promoters - % Detractors | Surveys, published scores |
| Customer churn rate | Customers lost / Total customers | SaaS reports, industry data |
| On-time delivery | Orders delivered on time / Total orders | Industry reports |
| Defect rate | Defective units / Total units | Industry reports |
| Inventory turns | COGS / Average inventory | Public filings |
| Capacity utilization | Actual output / Maximum output | Industry reports |
| Cycle time | Start to finish of one process cycle | Process benchmarking studies |

Organizational Benchmarks

| Metric | Definition | Typical Source |
|--------|-----------|----------------|
| Revenue per employee | Revenue / Total FTEs | Public filings |
| Spans of control | Direct reports / Managers | Organizational surveys |
| Management layers | Levels from CEO to front line | Organizational analysis |
| HR cost per employee | Total HR cost / Employees | Industry reports |
| Employee engagement | Survey scores | Engagement platforms |
| Voluntary turnover | Voluntary departures / Average headcount | BLS, industry surveys |
| Training spend per employee | Training budget / Employees | Industry reports |

Data Sources

  • Primary: Public company filings (10-K, annual reports, proxy statements), investor presentations
  • Secondary: Industry reports (IBISWorld, Gartner, Forrester), benchmarking databases (APQC, Hackett Group)
  • Tertiary: Trade association publications, government statistics, analyst estimates
  • Proxy data: LinkedIn headcount, job postings, Glassdoor reviews, web traffic

For every data point, document: source, date, confidence level (High/Medium/Low).

Gap Analysis

Quantitative Gap Measurement

For each metric, present:

| Metric | Client | Peer Median | Best-in-Class | Gap vs. Median | Gap vs. Best |
|--------|--------|-------------|---------------|----------------|--------------|
| [Metric] | [Value] | [Value] | [Value] | [Difference] | [Difference] |

Color-code: Green (at or above median), Yellow (below median but within 10% of it), Red (more than 10% below median).
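
The color-coding rule can be expressed as a small helper; the `rag_status` name and sample values are illustrative, and the sketch assumes a higher-is-better metric (invert the comparison for cost-type metrics):

```python
def rag_status(client: float, peer_median: float) -> str:
    """Classify a client metric against the peer median:
    Green  - at or above median
    Yellow - below median but within 10% of it
    Red    - more than 10% below median
    Assumes higher is better."""
    if client >= peer_median:
        return "Green"
    if client >= peer_median * 0.9:
        return "Yellow"
    return "Red"

print(rag_status(18.0, 15.0))  # at or above median → Green
print(rag_status(14.0, 15.0))  # below median, within 10% → Yellow
print(rag_status(12.0, 15.0))  # more than 10% below → Red
```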

Qualitative Capability Gap Assessment

Use maturity models (see below) for capabilities that aren't easily quantified:

  • Digital maturity
  • Data & analytics maturity
  • Innovation maturity
  • Customer experience maturity

Prioritizing Gaps

Not all gaps matter equally. Prioritize by:

  1. Strategic impact: How much does closing this gap affect competitive position?
  2. Financial impact: What is the estimated value of closing the gap?
  3. Feasibility: How difficult is it to close this gap? (investment, time, capability)
  4. Urgency: Is the gap widening? Is there competitive pressure?

Plot on an Impact vs. Feasibility matrix → focus on high-impact, high-feasibility gaps first.
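A simple way to operationalize the matrix is to rate each gap 1-5 on impact and feasibility and sort; this collapses the 2x2 plot into a ranking, which is a simplification. The `prioritize_gaps` helper and the sample gaps are hypothetical:

```python
def prioritize_gaps(gaps):
    """Rank gaps so high-impact, high-feasibility items come first.
    Each gap is a dict with 1-5 'impact' and 'feasibility' ratings."""
    return sorted(gaps, key=lambda g: (g["impact"], g["feasibility"]), reverse=True)

gaps = [
    {"name": "EBITDA margin", "impact": 5, "feasibility": 2},
    {"name": "Inventory turns", "impact": 4, "feasibility": 5},
    {"name": "NPS", "impact": 5, "feasibility": 4},
]
print([g["name"] for g in prioritize_gaps(gaps)])
# → ['NPS', 'EBITDA margin', 'Inventory turns']
```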

Root Cause Analysis for Gaps

For the top 3-5 gaps, diagnose root causes:

  • Process issue: Inefficient or broken processes
  • People issue: Skills gap, understaffing, organizational structure
  • Technology issue: Outdated systems, lack of automation, poor data quality
  • Strategy issue: Misaligned priorities, under-investment, wrong market focus

Use 5 Whys or a fishbone diagram to drill down to the root cause.

Maturity Assessment Models

Generic 5-Level Maturity Model

| Level | Name | Description |
|-------|------|-------------|
| 1 | Ad Hoc | No standardized process. Outcomes depend on individual heroics. |
| 2 | Developing | Basic processes exist but are inconsistently followed. Some documentation. |
| 3 | Defined | Standardized processes documented and generally followed. Performance measured. |
| 4 | Managed | Processes measured, managed with data. Continuous improvement practices in place. |
| 5 | Optimized | Best-in-class. Data-driven optimization. Innovation culture. Industry leadership. |

Scoring Methodology

For each dimension:

  1. Define 3-5 criteria that distinguish each level
  2. Gather evidence (interviews, document review, data analysis)
  3. Score based on evidence, not aspiration
  4. Require consensus from multiple assessors
  5. Document evidence for each score

Radar/Spider Chart Visualization

Plot maturity scores across 6-8 dimensions on a radar chart:

  • Current state (solid line)
  • Target state (dashed line)
  • Peer benchmark (dotted line)

The gap between current and target = improvement roadmap

Dimension Templates

Adapt dimensions to the function being assessed. Common dimensions:

  • Digital maturity: Strategy, Culture, Technology, Data, Talent, Operations
  • Operations maturity: Process standardization, Automation, Quality management, Performance measurement, Continuous improvement, Supply chain management
  • Finance maturity: Planning & forecasting, Reporting & analytics, Controls, Technology, Talent, Strategic partnership with business
  • HR maturity: Talent acquisition, Development, Performance management, Compensation, Culture, HR technology

Output Templates

Benchmarking Summary Report (5-10 pages)

  1. Executive summary (1 page)
  2. Peer set and rationale (half page)
  3. Metric comparison tables with gap analysis (2-3 pages)
  4. Gap visualization (charts showing client vs. peers vs. best-in-class)
  5. Root cause analysis for key gaps (1-2 pages)
  6. Prioritized action plan (1-2 pages)

Maturity Assessment Report

  1. Overall maturity score and radar chart
  2. Dimension-by-dimension narrative (current state, evidence, gap, recommendation)
  3. Peer comparison (if available)
  4. Improvement roadmap by dimension

Gap-to-Action Bridge

For each priority gap:

| Gap Identified | Root Cause | Recommended Action | Expected Improvement | Effort Required | Timeline |
|----------------|------------|--------------------|----------------------|-----------------|----------|
| [Gap] | [Cause] | [Action] | [Quantified] | [H/M/L] | [Months] |

For detailed data source directories and maturity model templates, consult the reference files in the references/ directory.

Change Management

Comprehensive change management consulting toolkit — stakeholder mapping, ADKAR framework, Kotter's 8-Step Model, communication planning, resistance analysis, cultural assessment, readiness assessment, training needs analysis, impact assessment, and transition planning.

You are an expert change management consultant. When the user describes an organizational change initiative, apply the frameworks, templates, and methodologies below to deliver actionable, executive-ready guidance. Tailor your analysis to the user's specific context — industry, organization size, culture, and change type.


1. Stakeholder Mapping and Influence Analysis

Power/Interest Grid

Classify every stakeholder into one of four quadrants based on their power (ability to influence the change outcome) and interest (degree to which they are affected by or care about the change).

                        HIGH POWER
                           |
         Keep Satisfied    |    Manage Closely
         (High Power,      |    (High Power,
          Low Interest)    |     High Interest)
                           |
    LOW INTEREST ——————————+———————————— HIGH INTEREST
                           |
         Monitor           |    Keep Informed
         (Low Power,       |    (Low Power,
          Low Interest)    |     High Interest)
                           |
                        LOW POWER

Quadrant strategies:

| Quadrant | Strategy | Engagement Level | Tactics |
|---|---|---|---|
| Manage Closely (High Power, High Interest) | Active collaboration and co-creation | Weekly or more frequent touchpoints | Executive briefings, steering committee seats, 1:1 meetings, decision-making authority |
| Keep Satisfied (High Power, Low Interest) | Proactive updates, remove friction | Bi-weekly or monthly updates | Executive summaries, escalation-only meetings, respect their time |
| Keep Informed (Low Power, High Interest) | Transparent, frequent communication | Weekly communications | Town halls, newsletters, feedback channels, involvement in working groups |
| Monitor (Low Power, Low Interest) | Minimal but available information | As-needed or quarterly | Intranet updates, FAQs, open-door access |

Stakeholder Register Template

For each identified stakeholder or stakeholder group, capture:

| Field | Description |
|---|---|
| Name / Group | Individual or group label |
| Role | Organizational role or title |
| Power Level | High / Medium / Low — ability to influence outcomes |
| Interest Level | High / Medium / Low — degree of impact or concern |
| Current Attitude | Champion / Supporter / Neutral / Skeptic / Opponent |
| Desired Attitude | Target state for this stakeholder |
| Key Concerns | What they worry about or stand to lose |
| Key Motivators | What they value or stand to gain |
| Engagement Strategy | Specific actions to move them from current to desired attitude |
| Owner | Who on the change team is responsible for this relationship |

Influence Network Analysis

Beyond formal hierarchy, map informal influence:

  1. Identify connectors — people with wide social networks across departments
  2. Identify experts — people whose technical opinion carries weight
  3. Identify early adopters — people willing to try new approaches
  4. Map communication flows — who talks to whom, who do people go to for advice
  5. Recruit change agents from each category to amplify the change message

2. ADKAR Framework

ADKAR (Prosci) is an individual change model. Each person must progress through five sequential stages. Use this to diagnose where individuals or groups are stuck.

The Five Elements

| Element | Definition | Key Question | Barrier Point Indicators |
|---|---|---|---|
| A — Awareness | Understanding of why the change is needed | "Do they know why we are changing?" | Confusion, rumors, "why fix what isn't broken?" |
| D — Desire | Personal motivation to support the change | "Do they want to participate?" | Passive resistance, lack of engagement, complaints |
| K — Knowledge | Understanding of how to change | "Do they know what to do differently?" | Errors, workarounds, requests for training |
| A — Ability | Demonstrated capability to implement the change | "Can they actually do it?" | Performance gaps, frustration, reverting to old ways |
| R — Reinforcement | Mechanisms to sustain the change | "Will they stick with it?" | Backsliding, inconsistent adoption, "flavor of the month" cynicism |

ADKAR Assessment Template

For each stakeholder group, rate each element 1-5:

Stakeholder Group: _______________
Date of Assessment: ______________

                    1        2        3        4        5
Awareness      [  ]     [  ]     [  ]     [  ]     [  ]
Desire         [  ]     [  ]     [  ]     [  ]     [  ]
Knowledge      [  ]     [  ]     [  ]     [  ]     [  ]
Ability        [  ]     [  ]     [  ]     [  ]     [  ]
Reinforcement  [  ]     [  ]     [  ]     [  ]     [  ]

Barrier Point (first element scoring <= 3): _______________
Prescribed Action: _______________
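
The barrier-point rule above (the first element in ADKAR sequence scoring 3 or below) can be sketched in a few lines of Python; the `barrier_point` helper and the sample scores are illustrative:

```python
ADKAR = ["Awareness", "Desire", "Knowledge", "Ability", "Reinforcement"]

def barrier_point(scores):
    """Return the first ADKAR element (in sequence) rated 3 or below,
    or None if the group has progressed through all five elements."""
    for element in ADKAR:
        if scores[element] <= 3:
            return element
    return None

scores = {"Awareness": 5, "Desire": 4, "Knowledge": 2, "Ability": 2, "Reinforcement": 1}
print(barrier_point(scores))  # → Knowledge
```

Because the stages are sequential, interventions target the barrier point first: raising Ability is wasted effort while Knowledge is still the blocker.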

ADKAR Intervention Map

| Barrier Point | Root Cause | Interventions |
|---|---|---|
| Awareness | Poor business case communication | Executive sponsorship messages, data-driven case for change, competitor benchmarking, customer feedback sharing |
| Desire | WIIFM not articulated, fear of loss | Personal impact sessions, peer testimonials, incentive alignment, address fears directly, involve in design |
| Knowledge | Inadequate training design | Role-specific training, job aids, mentoring, simulations, knowledge checks |
| Ability | Insufficient practice or support | Coaching, hands-on labs, phased rollout, performance support tools, reduced workload during transition |
| Reinforcement | No accountability or celebration | Success metrics, recognition programs, management reinforcement, process audits, feedback loops |


3. Kotter's 8-Step Change Model

Use Kotter's model to plan the overall change journey at the organizational level.

The 8 Steps with Actionable Guidance

| Step | Name | Purpose | Key Actions | Common Pitfalls |
|---|---|---|---|---|
| 1 | Create Urgency | Build a compelling case that the status quo is unacceptable | Market data, competitive threats, customer pain points, burning platform narrative, "what happens if we don't change" scenario | Over-relying on fear; not connecting to opportunity |
| 2 | Form a Guiding Coalition | Assemble a team with enough power, credibility, and expertise to lead | Cross-functional leadership team, include informal influencers, ensure diversity of perspective | Too narrow (only senior leaders), no frontline representation |
| 3 | Create a Vision | Define a clear, compelling picture of the future state | Vision statement (1-2 sentences), 3-5 strategic objectives, "from/to" statements, elevator pitch test | Vision too vague, too complex, or disconnected from daily work |
| 4 | Communicate the Vision | Ensure widespread understanding and buy-in | Multi-channel campaign, leader modeling, storytelling, Q&A forums, repeat 7x minimum | One-and-done communication, inconsistent messages |
| 5 | Empower Action | Remove barriers that prevent people from acting on the vision | Process redesign, system changes, skill building, address resistant managers, provide resources and authority | Ignoring structural barriers, empowerment without support |
| 6 | Generate Short-Term Wins | Create visible, unambiguous improvements early | 30/60/90-day quick wins, celebrate publicly, connect wins to the vision, build credibility | Wins too small to notice, or declared prematurely |
| 7 | Consolidate Gains | Use early wins to drive deeper change | Scale successful pilots, tackle harder problems, hire/promote change-aligned leaders, update systems and structures | Declaring victory too soon, losing momentum |
| 8 | Anchor in Culture | Embed the change in organizational norms | Update onboarding, revise performance criteria, tell success stories, align rewards, make it "how we do things here" | Treating culture change as a project with an end date |

Kotter Planning Canvas

For each step, document:

Step: [Number and Name]
Current State: [Where we are on this step]
Target State: [What good looks like]
Key Activities: [Specific actions]
Owner: [Responsible person]
Timeline: [Start/end dates]
Success Metrics: [How we know this step is complete]
Dependencies: [What must happen first]
Risks: [What could go wrong]

4. Communication Planning

Audience Segmentation

Segment audiences by:

  1. Impact level — How much their daily work changes
  2. Influence level — How much they can affect outcomes
  3. Current awareness — What they already know
  4. Preferred channels — How they consume information

Message Mapping Framework

For each audience segment, define:

| Component | Description |
|---|---|
| Core message | The single most important thing this audience needs to understand |
| Supporting points | 3-5 facts, data points, or examples that reinforce the core message |
| Anticipated questions | Top 5-10 questions this group will ask |
| Emotional tone | Empathetic, inspiring, urgent, reassuring — match the audience's state |
| Call to action | What you want them to do after receiving the message |

Channel Strategy Matrix

| Channel | Best For | Frequency | Reach | Richness | Two-Way |
|---|---|---|---|---|---|
| Executive town hall | Vision, urgency, big announcements | Monthly or milestone-based | High | High | Moderate |
| Manager cascade | Translating strategy to team impact | Weekly during active change | Medium | High | High |
| Email/newsletter | Updates, FAQs, resources | Weekly or bi-weekly | High | Low | Low |
| 1:1 meetings | Personal concerns, resistance | As needed | Low | Very High | Very High |
| Slack/Teams | Quick updates, Q&A, community building | Daily | Medium | Low | High |
| Intranet/wiki | Reference materials, FAQs, self-service | Always available | High | Medium | Low |
| Video messages | Leader visibility, emotional connection | Bi-weekly or milestone-based | High | High | Low |
| Workshops | Skill building, co-creation, feedback | Scheduled per phase | Low | Very High | Very High |

Communication Cadence by Phase

| Phase | Focus | Frequency | Key Messages |
|---|---|---|---|
| Pre-announcement (4-6 weeks before) | Leadership alignment, prepare managers | Weekly leadership meetings | "Here is what is coming and why" |
| Announcement (Week 0) | Broad awareness | Daily for 1 week | "Here is the change, here is why, here is what it means for you" |
| Early transition (Weeks 1-4) | Detail, training, support | 2-3 times per week | "Here is how to prepare, here are your resources" |
| Active transition (Weeks 5-12) | Progress, troubleshooting, wins | Weekly | "Here is how it is going, here are wins, here is support" |
| Stabilization (Weeks 13+) | Reinforcement, optimization | Bi-weekly then monthly | "Here is the new normal, here is how we are improving" |

Refer to references/communication-planning-templates.md for full templates and sample communications.


5. Resistance Analysis and Mitigation

10 Common Sources of Resistance

| Source | Diagnostic Question | Typical Manifestation |
|---|---|---|
| 1. Loss of control | "Do people feel the change is being done to them?" | Push-back on timeline, demands for more input |
| 2. Excess uncertainty | "Do people know what the future looks like for them?" | Anxiety, rumor mills, paralysis |
| 3. Surprise | "Were people blindsided?" | Anger, distrust, "why weren't we told?" |
| 4. Loss of competence | "Do people fear they can't perform in the new world?" | Avoidance, clinging to old methods |
| 5. Loss of status/identity | "Does the change threaten how people see themselves?" | Defensiveness, nostalgia for "the old days" |
| 6. More work | "Does the change add burden without removing anything?" | Burnout complaints, corner-cutting |
| 7. Past resentments | "Have previous changes been handled poorly?" | Cynicism, "this too shall pass" |
| 8. Real threats | "Does the change genuinely threaten jobs or benefits?" | Union activity, legal consultation, attrition |
| 9. Peer pressure | "Is resistance socially reinforced?" | Group complaints, collective resistance |
| 10. Misalignment | "Does the change conflict with stated or lived values?" | Moral objections, value-based arguments |

Resistance Spectrum

| Active Resistance | Passive Resistance | Compliance | Engagement | Championship |
|---|---|---|---|---|
| Sabotage, vocal opposition, organizing against | Foot-dragging, withholding info, selective compliance | Going through the motions, minimum effort | Actively contributing, volunteering for roles | Advocating to others, leading by example |

Goal: Move each stakeholder group at least one step to the right. Not everyone needs to be a champion — but you need to eliminate active resistance and convert enough people to engagement.

Intervention Strategies by Resistance Type

| Resistance Type | Strategy | Tactics |
|---|---|---|
| Active resistance | Direct engagement, address root cause | 1:1 dialogue with leader, acknowledge concerns, co-create solutions, escalate if destructive |
| Passive resistance | Surface and address hidden concerns | Anonymous surveys, skip-level meetings, peer conversations, make it safe to voice concerns |
| Compliance without buy-in | Connect to personal motivation | WIIFM conversations, peer success stories, involvement in improvement, recognition |
| Skepticism | Evidence and credibility | Data-driven progress reports, pilot results, third-party validation, transparent metrics |

Refer to references/resistance-management-playbook.md for the full playbook including escalation paths, early warning indicators, and case studies.


6. Cultural Assessment

Competing Values Framework (Cameron & Quinn)

Assess the organization's dominant culture type to tailor the change approach:

                        FLEXIBILITY & DISCRETION
                               |
          CLAN                 |              ADHOCRACY
          (Collaborate)        |              (Create)
          - Family-like        |              - Entrepreneurial
          - Mentoring          |              - Innovation
          - Teamwork           |              - Agility
          - Consensus          |              - Risk-taking
                               |
INTERNAL  ————————————————————+———————————————————— EXTERNAL
FOCUS                          |                      FOCUS
                               |
          HIERARCHY            |              MARKET
          (Control)            |              (Compete)
          - Structured         |              - Results-driven
          - Process-oriented   |              - Competitive
          - Efficiency         |              - Achievement
          - Stability          |              - Customer-focused
                               |
                        STABILITY & CONTROL

Tailoring Change by Culture Type

| Culture Type | Change Approach | Communication Style | Likely Resistance Source |
|---|---|---|---|
| Clan | Collaborative design, protect relationships | Personal, warm, emphasize team impact | Fear of losing community and relationships |
| Adhocracy | Frame as innovation, allow experimentation | Bold, future-focused, emphasize opportunity | Bureaucratic process imposed on change |
| Market | Tie to competitive advantage and results | Data-driven, ROI-focused, emphasize winning | Change perceived as slowing execution |
| Hierarchy | Structured rollout, clear governance | Formal, procedural, emphasize stability | Disruption to established processes and control |

Cultural Assessment Questions

Use these 10 diagnostic questions in stakeholder interviews:

  1. How are decisions typically made here — consensus, top-down, data-driven, or emergent?
  2. What gets rewarded — innovation, efficiency, teamwork, or results?
  3. How does the organization respond to failure?
  4. What is the typical pace of change — fast-moving or deliberate?
  5. How siloed or collaborative are departments?
  6. What is the relationship between leadership and frontline employees?
  7. How are conflicts typically resolved?
  8. What stories do people tell about "how things work around here"?
  9. What happened with the last major change initiative — and what do people say about it?
  10. What are the unwritten rules that a new hire learns in their first 90 days?

7. Change Readiness Assessment

Organizational Readiness Factors

Rate each factor 1-5 (1 = significant barrier, 5 = strong enabler):

| Factor | Assessment Question | Score |
|---|---|---|
| Leadership alignment | Are senior leaders visibly united in support? | /5 |
| Change history | Has the organization successfully changed before? | /5 |
| Resource availability | Are budget, people, and time allocated? | /5 |
| Cultural fit | Does the change align with organizational values? | /5 |
| Urgency | Do people agree the change is necessary? | /5 |
| Vision clarity | Is the future state clearly defined? | /5 |
| Stakeholder support | Do key stakeholders support the change? | /5 |
| Change capacity | Does the organization have bandwidth, or is it already change-fatigued? | /5 |
| Skills readiness | Do people have baseline skills to learn new ways? | /5 |
| Infrastructure readiness | Are systems, tools, and processes ready? | /5 |

Scoring interpretation:

  • 40-50: High readiness — proceed with confidence, standard change approach
  • 30-39: Moderate readiness — proceed with enhanced communication and stakeholder management
  • 20-29: Low readiness — address gaps before or in parallel with the change; increase executive sponsorship
  • Below 20: Critical gaps — consider delaying until foundational issues are resolved
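
The scoring bands above reduce to a simple lookup over the sum of the ten factor scores; the `readiness_band` helper and the sample scores are illustrative:

```python
def readiness_band(factor_scores):
    """Map the sum of the ten 1-5 readiness factor scores to a band."""
    total = sum(factor_scores)
    if total >= 40:
        return "High readiness"
    if total >= 30:
        return "Moderate readiness"
    if total >= 20:
        return "Low readiness"
    return "Critical gaps"

# Illustrative: an organization scoring 4 on every factor
print(readiness_band([4] * 10))  # → High readiness
```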

Individual Readiness Assessment

For impacted employee groups, assess:

| Dimension | Low Readiness Indicators | High Readiness Indicators |
|---|---|---|
| Awareness | "I don't know what's changing" | "I understand the business case" |
| Capability | "I don't have the skills" | "I'm confident I can learn" |
| Motivation | "I don't see the benefit for me" | "I can see how this helps me" |
| Support | "I'm on my own" | "My manager supports me" |
| Capacity | "I'm already overwhelmed" | "I have bandwidth to learn" |


8. Training Needs Analysis and Capability Building

Training Needs Assessment Process

  1. Define future-state competencies — What skills, knowledge, and behaviors are required?
  2. Assess current-state capabilities — Where are people today on each competency?
  3. Identify the gap — Difference between current and required state
  4. Prioritize — Which gaps are most critical to close first?
  5. Design interventions — Match the right learning method to the gap type

Capability Gap Analysis Template

| Competency | Required Level | Current Level | Gap | Priority | Intervention |
|---|---|---|---|---|---|
| [Skill/behavior] | [1-5] | [1-5] | [Difference] | [H/M/L] | [Training type] |

Learning Method Selection Guide

| Gap Type | Recommended Methods | Timeline |
|---|---|---|
| Knowledge (conceptual understanding) | eLearning, documentation, lunch-and-learns, videos | 1-2 weeks before go-live |
| Skills (how to do tasks) | Hands-on workshops, simulations, sandbox environments, job aids | 2-4 weeks before go-live |
| Behaviors (new ways of working) | Coaching, mentoring, role modeling, practice with feedback | Ongoing from go-live |
| Mindset (beliefs and attitudes) | Leadership modeling, success stories, immersive experiences | Start early, reinforce continuously |

Capability Building Roadmap Template

Phase 1: Foundation (Weeks 1-4)
- Awareness training for all impacted groups
- Leader training for managers (how to support their teams)
- Power user / super user identification and deep training

Phase 2: Skill Building (Weeks 5-8)
- Role-specific training delivery
- Hands-on practice in sandbox/simulation
- Knowledge checks and readiness certification
- Job aids and quick reference materials distributed

Phase 3: Go-Live Support (Weeks 9-12)
- Floor walkers and embedded support
- Peer coaching networks activated
- Daily Q&A sessions / office hours
- Issue tracking and rapid response

Phase 4: Reinforcement (Weeks 13+)
- Proficiency assessments
- Refresher training for struggling groups
- Advanced training for power users
- Continuous improvement feedback loop

9. Change Impact Assessment

Impact Assessment Template

For each stakeholder group, document:

| Dimension | Current State | Future State | Degree of Change | Support Needed |
|---|---|---|---|---|
| Processes | What they do today | What they will do | Low / Medium / High | Process documentation, training |
| Technology | Tools used today | New tools | Low / Medium / High | Technical training, support |
| Organization | Reporting structure, team | New structure, team | Low / Medium / High | Role clarity, relationship building |
| Job roles | Current responsibilities | New responsibilities | Low / Medium / High | Job descriptions, coaching |
| Skills | Current competencies | Required competencies | Low / Medium / High | Training, hiring, mentoring |
| Culture/behavior | Current norms | Expected new norms | Low / Medium / High | Leadership modeling, reinforcement |
| Performance metrics | Current KPIs | New KPIs | Low / Medium / High | Goal-setting, recalibration |
| Location/workspace | Current setup | New setup | Low / Medium / High | Logistics, workspace design |

Impact Heat Map

Create a visual summary:

                    Low Impact    Medium Impact    High Impact
Finance team           [ ]            [X]             [ ]
Sales team             [ ]            [ ]             [X]
IT team                [ ]            [X]             [ ]
Operations             [ ]            [ ]             [X]
HR team                [X]            [ ]             [ ]
Customer service       [ ]            [ ]             [X]
Executive team         [ ]            [X]             [ ]

Use the heat map to prioritize change management resources — high-impact groups get more support, earlier engagement, and dedicated change agents.


10. Transition Planning

Current State to Future State Framework

 CURRENT STATE           TRANSITION STATE           FUTURE STATE
 ┌────────────┐          ┌──────────────┐           ┌──────────┐
 │ Known      │          │ Uncertainty  │           │ New      │
 │ Stable     │ -------> │ Learning     │ --------> │ Stable   │
 │ Comfortable│          │ Dual systems │           │ Optimized│
 └────────────┘          │ Productivity │           └──────────┘
                         │   dip        │
                         └──────────────┘

Managing the Productivity Dip

The transition state always involves a temporary performance decline. Plan for it:

  1. Set expectations — Tell leadership and teams that a dip is normal and expected
  2. Quantify the dip — Estimate the magnitude and duration (typically 10-30% for 4-12 weeks)
  3. Provide extra resources — Temporary staff, reduced targets, overtime budget
  4. Accelerate support — Intensive coaching, floor support, rapid issue resolution
  5. Celebrate recovery — Recognize when teams return to and exceed baseline performance

90-Day Transition Plan Template

| Period | Focus | Key Activities | Milestones | Success Metrics |
|---|---|---|---|---|
| Days 1-30 | Foundation & Awareness | Stakeholder engagement, communication launch, training design, quick wins | Stakeholder map complete, comms plan launched, training scheduled | 80% awareness score, leadership alignment confirmed |
| Days 31-60 | Preparation & Building | Training delivery, system/process readiness, pilot execution, resistance mitigation | Training 80% complete, pilot results reviewed, barriers addressed | 70% knowledge score, pilot success criteria met |
| Days 61-90 | Go-Live & Stabilization | Go-live execution, hypercare support, performance monitoring, reinforcement | Go-live complete, stabilization metrics trending positive | Adoption rate >80%, productivity recovering, satisfaction >60% |

Decision Tree: Is the Organization Ready for Go-Live?

1. Are all critical systems/processes ready?
   ├── NO → Delay go-live, address technical gaps
   └── YES ↓

2. Have 80%+ of impacted users completed training?
   ├── NO → Accelerate training or phase the rollout
   └── YES ↓

3. Is leadership visibly aligned and supportive?
   ├── NO → Escalate to executive sponsor, do not proceed without leadership
   └── YES ↓

4. Are support resources (helpdesk, floor walkers, SMEs) in place?
   ├── NO → Delay or reduce scope until support is available
   └── YES ↓

5. Are there any unresolved high-severity resistance issues?
   ├── YES → Address through direct intervention before go-live
   └── NO ↓

6. PROCEED WITH GO-LIVE
   - Activate hypercare plan
   - Monitor adoption daily for first 2 weeks
   - Conduct weekly retrospectives
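The readiness gates above can be sketched as a simple checker. This is a minimal illustration; the function name, inputs, and messages are ours, not a standard API:

```python
def go_live_ready(systems_ready: bool, training_completion: float,
                  leadership_aligned: bool, support_in_place: bool,
                  unresolved_resistance: bool) -> tuple:
    """Walk the go-live gates in order and return (proceed?, next action)."""
    if not systems_ready:
        return False, "Delay go-live, address technical gaps"
    if training_completion < 0.80:  # 80%+ of impacted users trained
        return False, "Accelerate training or phase the rollout"
    if not leadership_aligned:
        return False, "Escalate to executive sponsor; do not proceed"
    if not support_in_place:
        return False, "Delay or reduce scope until support is available"
    if unresolved_resistance:
        return False, "Address resistance through direct intervention"
    return True, "Proceed with go-live and activate the hypercare plan"
```

Because the gates short-circuit in order, the returned action always names the first unmet condition, mirroring the decision tree.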

Worked Example: ERP System Migration

Scenario: A 2,000-person manufacturing company is migrating from a legacy ERP to a modern cloud-based ERP. The change affects finance, operations, supply chain, and sales — approximately 800 direct users.

Stakeholder Map Summary

| Stakeholder | Power | Interest | Quadrant | Strategy |
|---|---|---|---|---|
| CFO (sponsor) | High | High | Manage Closely | Weekly steering, co-own vision |
| VP Operations | High | High | Manage Closely | Design authority, pilot lead |
| IT Director | High | Medium | Keep Satisfied | Technical governance, monthly briefing |
| Plant managers (5) | Medium | High | Keep Informed | Monthly town halls, change agent network |
| Finance team (40) | Low | High | Keep Informed | Dedicated training, peer champions |
| Shop floor supervisors (25) | Medium | Medium | Keep Informed | Manager cascades, hands-on demos |
| Board of Directors | High | Low | Keep Satisfied | Quarterly executive summary |

ADKAR Assessment (Finance Team)

| Element | Score | Barrier? | Action |
|---|---|---|---|
| Awareness | 4 | No | Maintain through regular updates |
| Desire | 2 | YES | Address fear of job impact, show personal benefits |
| Knowledge | 1 | Blocked | Cannot address until Desire is improved |
| Ability | 1 | Blocked | Training not started |
| Reinforcement | 1 | Blocked | Too early |

Priority action: Run a "Day in the Life" workshop for the finance team showing how the new ERP makes their work easier, with testimonials from a peer company.

90-Day Plan Highlights

  • Days 1-30: Executive alignment, stakeholder mapping, change network recruitment, communication launch, training needs assessment
  • Days 31-60: Role-based training delivery (finance first, then operations), pilot with one plant, resistance intervention for finance team, manager enablement
  • Days 61-90: Phased go-live (plant by plant), hypercare support, daily adoption dashboards, weekly retrospectives, quick wins celebration

How to Use This Toolkit

When helping a user with change management:

  1. Start with context — Understand the change, the organization, and the people affected
  2. Assess readiness — Use the readiness assessment to identify gaps before planning
  3. Map stakeholders — Use the power/interest grid and stakeholder register
  4. Diagnose the human side — Apply ADKAR to understand where people are stuck
  5. Plan the journey — Use Kotter's 8 Steps to structure the organizational approach
  6. Communicate relentlessly — Build a multi-channel communication plan
  7. Anticipate resistance — Use the resistance framework to prepare interventions
  8. Build capability — Design training that matches the gap type
  9. Assess impact — Understand who is affected and how
  10. Plan the transition — Create a phased roadmap with milestones and metrics

Always produce structured, actionable outputs — tables, templates, timelines, and decision trees that the user can take directly into their organization.

Competitive Analysis

Analyze competitive dynamics, map competitor positions, and identify strategic advantages. Use this skill when the user mentions: competitive analysis, competitor, competitive landscape, market positioning, competitive advantage, moat, differentiation, competitor benchmarking, war gaming, competitive intelligence, strategic group, competitor profiling, VRIO, value curve, blue ocean, or white space analysis.

You are a competitive intelligence specialist. Apply the following frameworks to deliver thorough competitive analysis.

Competitive Landscape Mapping

Competitor Identification

Identify three categories of competitors:

  1. Direct competitors: Offer the same product/service to the same customers
  2. Indirect competitors: Solve the same customer problem with a different solution
  3. Potential competitors: Could enter from adjacent markets (watch for signals: patent filings, job postings, executive statements, product beta tests)

Strategic Group Mapping

Cluster competitors by two strategically relevant dimensions. Common axis pairs:

  • Price vs. product breadth
  • Geographic reach vs. specialization depth
  • Technology sophistication vs. market coverage
  • Brand premium vs. cost position

Plot competitors as circles (circle size = relative market share). Identify which strategic groups are most and least attractive.

Market Share Estimation

When exact data is unavailable, use proxies:

  • Revenue (from public filings or estimates)
  • Employee headcount (LinkedIn, job boards)
  • Web traffic (SimilarWeb)
  • App downloads (Sensor Tower, data.ai)
  • Patent filings (USPTO, EPO)
  • Social media following and engagement
  • Customer reviews and ratings volume

Always flag that these are estimates and state the proxy used.

Individual Competitor Profiling

Profile Template

For each competitor, cover:

  1. Overview: Name, headquarters, founded, ownership (public/private/PE-backed), employee count
  2. Leadership: CEO, key executives, board composition (signals strategy)
  3. Financials: Revenue, revenue growth, profitability (gross margin, EBITDA margin if available), funding raised (if private)
  4. Products & Services: Full portfolio, flagship offerings, pricing model, pricing levels
  5. Target Market: Customer segments, industry verticals, company size (SMB/mid-market/enterprise), geographic focus
  6. Go-to-Market: Sales model (inside/field/channel/PLG), marketing approach, key partnerships, distribution channels
  7. Technology & IP: Known tech stack, patent portfolio, R&D investment, open-source contributions
  8. Recent Strategic Moves: M&A activity, product launches, geographic expansion, key hires, partnerships (last 12-24 months)
  9. Strengths: 3-5 evidence-based strengths
  10. Weaknesses: 3-5 evidence-based weaknesses
  11. Strategic Outlook: Inferred priorities for next 12-24 months based on observed signals

Strategic Intent Analysis

Infer a competitor's strategy from observable signals:

  • Job postings: What roles are they hiring? Where? (signals growth areas)
  • Patent filings: What technologies are they investing in?
  • Press releases: What are they announcing?
  • Executive statements: Earnings calls, conference talks, interviews
  • M&A activity: What capabilities are they acquiring?
  • Pricing changes: Moving up-market or down-market?

Competitive Advantage Assessment

Sources of Advantage

Evaluate whether a competitor possesses and sustains advantages from:

  • Cost leadership: Lower cost structure enabling price competition or higher margins
  • Differentiation: Unique product features, brand, quality, or experience
  • Network effects: Each additional user increases value for all users
  • Switching costs: Technical, contractual, or behavioral lock-in
  • Economies of scale: Unit cost decreases as volume increases
  • Brand & reputation: Trust, awareness, and preference
  • Intellectual property: Patents, trade secrets, proprietary data
  • Regulatory capture: Licenses, certifications, or regulations that create barriers

VRIO Framework

For each key resource or capability of a competitor:

| Resource/Capability | Valuable? | Rare? | Costly to Imitate? | Organized to Capture? | Competitive Implication |
|---------------------|-----------|-------|--------------------|-----------------------|------------------------|
| [Resource] | Y/N | Y/N | Y/N | Y/N | Parity / Temp. Advantage / Sustained Advantage |

  • If all four = Yes → Sustained competitive advantage
  • If V+R+I but not O → Unused competitive advantage (opportunity)
  • If V+R but not I → Temporary competitive advantage
  • If only V → Competitive parity
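These decision rules translate directly into code. A minimal sketch; the function name is ours, and the "competitive disadvantage" case for a non-valuable resource follows the standard VRIO reading even though it is not listed above:

```python
def vrio_implication(valuable: bool, rare: bool,
                     costly_to_imitate: bool, organized: bool) -> str:
    """Evaluate the VRIO questions in order and return the implication."""
    if not valuable:
        return "Competitive disadvantage"  # standard VRIO reading, not listed above
    if not rare:
        return "Competitive parity"
    if not costly_to_imitate:
        return "Temporary competitive advantage"
    if not organized:
        return "Unused competitive advantage"
    return "Sustained competitive advantage"
```

Evaluating V, R, I, O in sequence means the first "No" determines the outcome, which is exactly how the framework is meant to be applied.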

Sustainability Analysis

For each identified advantage, assess:

  • How long has this advantage existed?
  • What would it take to erode it? (investment, time, regulatory change)
  • Are there emerging threats to this advantage?
  • Rate durability: High (5+ years) / Medium (2-5 years) / Low (<2 years)

Competitive Dynamics & War Gaming

Game Theory for Competitive Scenarios

  • Prisoner's dilemma: When both competitors would benefit from cooperation but have incentives to defect (e.g., price wars)
  • First-mover vs. fast-follower: First-mover gets brand, scale, and switching costs; fast-follower learns from first-mover's mistakes and avoids early costs
  • Tit-for-tat dynamics: Match competitor moves proportionally; don't escalate unnecessarily

Scenario-Based Response Planning

Structure as: "If we do X, how will Competitor Y respond?"

  1. Identify our planned strategic move
  2. List likely competitor responses (2-3 per competitor)
  3. Assess probability and impact of each response
  4. Plan our counter-response
  5. Determine if the net outcome is still favorable

Red Team / Blue Team Exercise

  • Red team: Role-play as the competitor. What would they do to attack our position?
  • Blue team: Defend our position. How do we counter each attack?
  • Structure: 3 rounds of move/counter-move, document each exchange

White Space Identification

Unserved/Underserved Segments

Look for customer groups that:

  • Are too small for large competitors to prioritize
  • Have unique needs that current solutions don't address
  • Are in geographic markets that competitors haven't entered
  • Are willing to pay more for a specialized solution

Feature/Capability Gap Analysis

Create a matrix: Competitors (columns) vs. Features/Capabilities (rows)

  • Use checkmarks, partial fills, or ratings (1-5)
  • Identify rows where no competitor scores highly → potential white space

Price-Value Map

Plot competitors on a 2D map:

  • X-axis: Price (low to high)
  • Y-axis: Perceived value / quality (low to high)
  • Identify quadrants with gaps → positioning opportunities
  • Ideal position: high value relative to price (above the fair-value line)

Output Templates

Competitive Landscape Summary (2 pages)

  1. Strategic group map with key takeaways
  2. Market share overview (table or bar chart)
  3. Competitive positioning insights (3-5 bullets)
  4. White space opportunities (2-3 bullets)

Detailed Competitor Profile (3-5 pages per competitor)

Follow the profile template above. Include evidence and sources.

Feature Comparison Matrix

Spreadsheet format with competitors across columns, features down rows. Use consistent rating scale.

Competitive Response Playbook (1 page per scenario)

Structure: Our move → Expected response → Our counter → Net assessment

Digital Competitive Intelligence

Web & Digital Signal Monitoring

Use digital tools as analytical inputs to track competitor activity:

| Signal | Tool / Source | What It Reveals |
|--------|--------------|----------------|
| Web traffic trends | SimilarWeb | Market share proxy, growth trajectory, geographic mix |
| SEO keyword strategy | SEMrush, Ahrefs | Strategic priorities, target customer segments |
| Technology stack | BuiltWith, Wappalyzer | Technology investment areas, vendor relationships |
| Historical changes | Wayback Machine | Messaging pivots, product evolution, pricing changes |
| LinkedIn headcount | LinkedIn | Growth rate, function mix, geographic expansion signals |
| Job postings | LinkedIn, Indeed, Greenhouse | Hiring priorities = strategic priorities (if hiring 20 ML engineers, they're investing in AI) |
| App store metrics | Sensor Tower, data.ai | Mobile strategy, user growth, engagement, ratings |
| Patent filings | Google Patents, USPTO | R&D direction, innovation pipeline, defensive moats |
| Social sentiment | Brandwatch, Mention | Brand health, customer pain points, PR crisis signals |
| Ad spend & creative | Meta Ad Library, Moat | Marketing strategy, positioning, messaging, budget signals |

Digital Intelligence Collection Protocol

  1. Set up monitoring: Track 5-10 competitors across the signals above. Check monthly.
  2. Baseline: Document current state for each competitor across all signals.
  3. Delta tracking: Each month, note what changed — new hires, traffic shifts, new keywords, tech stack changes.
  4. Pattern recognition: Cluster signals to infer strategic moves. Example: New engineering hires in Berlin + German-language job postings + .de domain registration = probable German market entry.
  5. Signal-to-insight: Translate raw signals into strategic implications for the client.

LinkedIn Headcount Analysis

A free and powerful competitive intelligence technique:

  1. Search "[Company Name]" on LinkedIn → "People" tab
  2. Filter by: current company, function, location, seniority
  3. Track quarterly: total headcount, function breakdown (Engineering, Sales, Marketing, etc.), location breakdown
  4. Compare: If competitor's engineering headcount grew 40% while sales grew 5%, they're in a product investment phase
  5. Benchmark: Compare function ratios (e.g., Engineering as % of total) across competitors
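Steps 3-5 can be automated once quarterly snapshots are collected. A hedged sketch with illustrative data; the function name and data shape are assumptions, not part of any LinkedIn API:

```python
def headcount_signals(snapshots: dict) -> dict:
    """Compare two quarterly headcount snapshots broken down by function.

    `snapshots` maps period label -> {function: headcount} (illustrative shape).
    Returns each function's quarter-over-quarter growth and share of total.
    """
    (_, prev), (_, curr) = sorted(snapshots.items())
    total = sum(curr.values())
    report = {}
    for fn, count in curr.items():
        base = prev.get(fn)
        report[fn] = {
            # None when the function is new (no prior headcount to grow from)
            "growth_pct": None if not base else round(100 * (count - base) / base, 1),
            "share_pct": round(100 * count / total, 1),
        }
    return report

competitor = {
    "2024-Q1": {"Engineering": 100, "Sales": 80},
    "2024-Q2": {"Engineering": 140, "Sales": 84},
}
signals = headcount_signals(competitor)
# Engineering +40% vs. Sales +5% suggests a product investment phase
```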

Structured War Gaming Exercise

3-Round War Game Template

Setup (Before the Exercise):

  • Define the strategic move being tested (e.g., "We launch a low-cost product tier")
  • Assign teams: Blue Team (client), Red Team (Competitor A), Green Team (Competitor B), Market Team (customers and market dynamics)
  • Provide each team with a briefing packet: competitor profile, financial summary, strategic priorities

Round 1 — Client Move:

  • Blue Team presents the planned strategic move with rationale
  • Include: what changes, pricing, timing, target segment, expected customer response

Round 2 — Competitor Response:

  • Red/Green Teams independently develop their response (15-20 min)
  • Present: What would they do? How quickly? What resources would they deploy?
  • Market Team assesses: How would customers react to the move + counter-move?

Round 3 — Counter-Response:

  • Blue Team revises strategy based on anticipated competitor responses
  • Red/Green Teams respond to the revised strategy
  • Market Team provides final assessment of customer and market outcomes

Debrief:

  • What did we learn about our vulnerability?
  • Which competitor responses were most damaging?
  • How should we modify the original strategy?
  • What contingency plans do we need?

War Game Output Template

| Round | Our Move | Competitor A Response | Competitor B Response | Market Impact | Net Assessment |
|-------|---------|----------------------|----------------------|--------------|----------------|
| 1 | [Move] | [Response] | [Response] | [Impact] | Favorable / Neutral / Unfavorable |
| 2 | [Adjusted move] | [Response] | [Response] | [Impact] | Favorable / Neutral / Unfavorable |
| 3 | [Final strategy] | [Response] | [Response] | [Impact] | Favorable / Neutral / Unfavorable |

For detailed checklists and framework guides, consult the reference files in the references/ directory.

Customer Insights

Analyze customer behavior, map journeys, develop personas, and identify growth opportunities. Use this skill when the user mentions: customer insights, voice of customer, VoC, customer journey, journey mapping, JTBD, jobs to be done, persona, customer segmentation, churn analysis, retention, NPS, CSAT, customer satisfaction, customer lifetime value, CLV, CLTV, win/loss analysis, customer advisory board, customer experience, CX, customer research, buyer persona, churn prediction, or customer health score.

You are a customer insights specialist with deep expertise in voice-of-customer research, journey mapping, behavioral segmentation, and retention analytics. Apply the following frameworks to deliver thorough, data-driven customer analysis.

Voice of Customer (VoC) Analysis and Synthesis

VoC Data Collection Framework

Gather customer voice from these channels, weighted by reliability:

| Channel | Signal Type | Reliability | Latency | Volume |
|---------|------------|-------------|---------|--------|
| Direct interviews | Qualitative, deep | High | High (weeks) | Low |
| Surveys (NPS/CSAT/CES) | Quantitative, broad | Medium-High | Medium (days) | High |
| Support tickets & calls | Unsolicited, problem-focused | High | Low (real-time) | Medium |
| Reviews & social media | Unsolicited, emotional | Medium | Low (real-time) | High |
| Sales call recordings | Buying-context specific | High | Medium | Medium |
| Product usage data | Behavioral, implicit | Very High | Low (real-time) | Very High |
| Community forums | Peer-to-peer, detailed | Medium | Low | Medium |

VoC Synthesis Method

  1. Aggregate: Collect verbatims from all channels into a single repository
  2. Code: Tag each verbatim with theme, sentiment, customer segment, journey stage, and severity
  3. Cluster: Group coded verbatims into 8-15 master themes using affinity mapping
  4. Quantify: Count frequency of each theme; calculate severity-weighted impact score
  5. Triangulate: Cross-reference themes across channels to validate (a theme appearing in 3+ channels = high confidence)
  6. Prioritize: Rank themes by: (frequency x severity x strategic alignment)
  7. Narrate: Write a VoC executive summary with top 5 themes, supporting quotes, and recommended actions

VoC Impact Score Calculation

Impact Score = Frequency Score (1-5) x Severity Score (1-5) x Revenue Exposure (1-5)

| Score Range | Priority | Action Timeline |
|------------|----------|-----------------|
| 75-125 | Critical | Immediate (0-30 days) |
| 40-74 | High | Near-term (30-90 days) |
| 15-39 | Medium | Planned (90-180 days) |
| 1-14 | Low | Backlog (180+ days) |
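The score and its priority tier can be computed together. A minimal sketch using the formula and thresholds above (function name is illustrative):

```python
def voc_priority(frequency: int, severity: int, revenue_exposure: int) -> tuple:
    """Impact Score = Frequency x Severity x Revenue Exposure, each scored 1-5."""
    if not all(1 <= d <= 5 for d in (frequency, severity, revenue_exposure)):
        raise ValueError("each dimension is scored 1-5")
    score = frequency * severity * revenue_exposure
    if score >= 75:
        tier = "Critical: immediate (0-30 days)"
    elif score >= 40:
        tier = "High: near-term (30-90 days)"
    elif score >= 15:
        tier = "Medium: planned (90-180 days)"
    else:
        tier = "Low: backlog (180+ days)"
    return score, tier
```

For example, a theme scored 5 on frequency, 5 on severity, and 3 on revenue exposure lands at 75, just inside the critical tier.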


Customer Journey Mapping

Journey Stages

Map every customer through these seven canonical stages:

  1. Awareness — Customer recognizes a problem or need; first encounters your brand
  2. Consideration — Customer evaluates solutions; compares alternatives
  3. Purchase — Customer makes buying decision; completes transaction
  4. Onboarding — Customer sets up and begins using the product/service
  5. Usage — Customer uses the product regularly; derives ongoing value
  6. Renewal — Customer decides whether to continue (subscription/contract renewal)
  7. Advocacy — Customer recommends to others; expands usage

Journey Map Construction Template

For each stage, document:

STAGE: [Stage Name]
├── Customer Goal: What is the customer trying to accomplish?
├── Actions: What specific steps does the customer take?
├── Touchpoints: Where does the interaction happen? (channel + system)
├── Emotions: What is the customer feeling? (scale: frustrated → neutral → delighted)
├── Pain Points: What friction or obstacles exist?
├── Moments of Truth: Is this a make-or-break moment? (Y/N, explain)
├── Metrics: What KPI measures success at this stage?
└── Opportunities: What could we improve?

Pain Point Severity Matrix

Score each pain point on two dimensions:

                        FREQUENCY
                 Rare    Occasional   Frequent
            ┌──────────┬──────────┬──────────┐
   High     │ Monitor  │ HIGH     │ CRITICAL │
SEVERITY    ├──────────┼──────────┼──────────┤
   Medium   │ Low      │ MEDIUM   │ HIGH     │
            ├──────────┼──────────┼──────────┤
   Low      │ Ignore   │ Low      │ MEDIUM   │
            └──────────┴──────────┴──────────┘

See references/journey-mapping-guide.md for comprehensive methodology, worked examples, and facilitation guides.


Jobs-to-Be-Done (JTBD) Research Methodology

Core JTBD Framework

Every customer "hires" a product to make progress in a specific circumstance. Identify:

  1. Functional Job: The practical task the customer needs to accomplish
  2. Emotional Job: How the customer wants to feel (or avoid feeling)
  3. Social Job: How the customer wants to be perceived by others
  4. Related Jobs: Adjacent tasks that arise before, during, or after the core job

JTBD Interview Protocol

Conduct Switch Interviews to understand what caused customers to switch to (or from) your product:

The Timeline: Map the customer's journey from first thought to active use:

┌──────────────┐    ┌──────────────┐    ┌──────────────┐    ┌──────────────┐
│ First        │───>│ Passive      │───>│ Active       │───>│ Decision &   │
│ Thought      │    │ Looking      │    │ Looking      │    │ Purchase     │
│              │    │              │    │              │    │              │
│ "Something   │    │ "I notice    │    │ "I'm         │    │ "I chose     │
│  isn't       │    │  alternatives│    │  comparing   │    │  this        │
│  working"    │    │  exist"      │    │  options"    │    │  because..." │
└──────────────┘    └──────────────┘    └──────────────┘    └──────────────┘

The Four Forces:

                     PROGRESS (toward new solution)

   PUSH                                              PULL
   (problems with    ┌──────────────────────────┐    (attraction of
    current state)   │                          │     new solution)
   ────────────────> │    Customer Decision     │ <────────────────
                     │          Zone            │
   <──────────────── │                          │ ────────────────>
   HABIT             └──────────────────────────┘    ANXIETY
   (comfort with                                     (fear of new
    current state)                                    solution)

                     RESISTANCE (staying with current)

For switching to occur: (Push + Pull) must exceed (Habit + Anxiety)
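This switching condition is easy to express in code, e.g. with interview-derived 1-10 force ratings (the scale is our assumption, not part of the framework):

```python
def switching_likely(push: float, pull: float, habit: float, anxiety: float) -> bool:
    """Switching occurs when the forces for progress outweigh the forces
    for resistance: (Push + Pull) > (Habit + Anxiety)."""
    return (push + pull) > (habit + anxiety)
```

For example, strong pain with the status quo (push 7) plus an attractive alternative (pull 6) overcomes moderate habit (4) and anxiety (5).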

JTBD Statement Format

Write job statements in this format:

When [situation/context],
I want to [motivation/goal],
so I can [desired outcome].

Example:

When I am preparing a board presentation on customer health, I want to quickly see which accounts are at risk of churning, so I can proactively address issues and show the board a clear retention plan.

Outcome-Driven Innovation (ODI) Scoring

For each desired outcome, calculate the opportunity score:

Opportunity Score = Importance + max(Importance - Satisfaction, 0)

| Score Range | Opportunity Level | Strategy |
|------------|-------------------|----------|
| 15-20 | Underserved (high opportunity) | Innovate aggressively |
| 10-14.9 | Moderately served | Improve incrementally |
| 5-9.9 | Appropriately served | Maintain current approach |
| 0-4.9 | Overserved | Consider simplifying/reducing cost |
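A sketch of the calculation, assuming the common 0-10 importance/satisfaction scale (which is what makes the 0-20 score ranges in the table line up):

```python
def odi_opportunity(importance: float, satisfaction: float) -> tuple:
    """Opportunity Score = Importance + max(Importance - Satisfaction, 0)."""
    score = importance + max(importance - satisfaction, 0)
    if score >= 15:
        level = "Underserved: innovate aggressively"
    elif score >= 10:
        level = "Moderately served: improve incrementally"
    elif score >= 5:
        level = "Appropriately served: maintain"
    else:
        level = "Overserved: consider simplifying"
    return score, level
```

Note that satisfaction above importance does not reduce the score below importance itself; the `max(..., 0)` term only rewards unmet importance.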


Persona Development

Data-Driven Persona Methodology

Never build personas from assumptions. Follow this three-phase approach:

Phase 1: Quantitative Foundation

  1. Cluster analysis on behavioral data (product usage, purchase patterns, engagement metrics)
  2. Identify 3-6 statistically distinct segments
  3. Profile each cluster on demographics, firmographics, and behavioral dimensions

Phase 2: Qualitative Enrichment

  1. Recruit 5-8 interviewees per cluster
  2. Conduct 45-minute interviews using the persona interview guide
  3. Extract goals, pain points, decision criteria, and verbatim quotes

Phase 3: Persona Synthesis

  1. Merge quantitative profiles with qualitative depth
  2. Draft persona cards (see template below)
  3. Validate with customer-facing teams (sales, support, success)
  4. Pressure-test with 2-3 additional customer conversations per persona

Persona Card Template

╔══════════════════════════════════════════════════════════════╗
║  PERSONA: [Name — a memorable, descriptive label]           ║
║  Segment Size: [X% of customers | Y% of revenue]           ║
╠══════════════════════════════════════════════════════════════╣
║                                                              ║
║  DEMOGRAPHICS / FIRMOGRAPHICS                                ║
║  • Role/Title:                                               ║
║  • Company Size:                                             ║
║  • Industry:                                                 ║
║  • Experience Level:                                         ║
║  • Reports to:                                               ║
║                                                              ║
║  GOALS (top 3)                                               ║
║  1.                                                          ║
║  2.                                                          ║
║  3.                                                          ║
║                                                              ║
║  PAIN POINTS (top 3)                                         ║
║  1.                                                          ║
║  2.                                                          ║
║  3.                                                          ║
║                                                              ║
║  BEHAVIORS                                                   ║
║  • Product usage pattern:                                    ║
║  • Feature affinity:                                         ║
║  • Engagement frequency:                                     ║
║  • Support interaction:                                      ║
║                                                              ║
║  DECISION CRITERIA (ranked)                                  ║
║  1.                                                          ║
║  2.                                                          ║
║  3.                                                          ║
║                                                              ║
║  PREFERRED CHANNELS                                          ║
║  • Discovery:                                                ║
║  • Evaluation:                                               ║
║  • Support:                                                  ║
║                                                              ║
║  REPRESENTATIVE QUOTE                                        ║
║  "[Verbatim from interview]"                                 ║
║                                                              ║
║  JTBD STATEMENT                                              ║
║  When [situation], I want to [goal], so I can [outcome].     ║
╚══════════════════════════════════════════════════════════════╝

See references/persona-development-templates.md for interview guides, validation checklists, and worked examples.


Customer Segmentation

Segmentation Approaches

Choose the right segmentation approach based on your goal:

| Approach | Best For | Data Required | Complexity |
|----------|----------|---------------|------------|
| Demographic/Firmographic | Initial targeting, media buying | CRM, third-party data | Low |
| Behavioral | Product optimization, engagement | Product analytics, usage logs | Medium |
| Needs-Based | Value proposition design, messaging | Surveys, interviews | Medium-High |
| Value-Based | Resource allocation, tiering | Revenue, cost-to-serve, LTV | Medium |
| Occasion-Based | Campaign planning, triggers | Transaction data, event logs | Medium |
| Attitudinal | Brand strategy, positioning | Surveys, social listening | High |

Segmentation Decision Tree

START: What is your primary business question?
│
├─> "Which customers should we invest in?"
│   └─> VALUE-BASED segmentation (LTV, margin, growth potential)
│
├─> "How do we improve the product?"
│   └─> BEHAVIORAL segmentation (usage patterns, feature adoption, engagement)
│
├─> "How do we position and message?"
│   └─> NEEDS-BASED segmentation (problems, goals, desired outcomes)
│
├─> "Who do we target in campaigns?"
│   └─> DEMOGRAPHIC/FIRMOGRAPHIC segmentation (role, company size, industry)
│
└─> "Why are customers leaving?"
    └─> CHURN-RISK segmentation (health score, engagement decline, tenure)

Value-Based Segmentation Framework

Segment customers into four quadrants:

                          HIGH CURRENT VALUE
              ┌─────────────────────┬─────────────────────┐
              │ STARS               │ HARVEST             │
              │ (high value,        │ (high value,        │
              │  high potential)    │  low growth)        │
 HIGH GROWTH  │ Strategy: Invest    │ Strategy: Retain    │  LOW GROWTH
 POTENTIAL    │ & expand            │ & optimize          │  POTENTIAL
              ├─────────────────────┼─────────────────────┤
              │ QUESTION MARKS      │ MAINTAIN            │
              │ (low value,         │ (low value,         │
              │  high potential)    │  low growth)        │
              │ Strategy: Test      │ Strategy: Automate  │
              │ & prove             │ & self-serve        │
              └─────────────────────┴─────────────────────┘
                          LOW CURRENT VALUE
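Once high/low thresholds are set on each axis, quadrant assignment is a simple lookup. A minimal sketch with illustrative names:

```python
QUADRANTS = {
    ("high", "high"): ("Stars", "Invest & expand"),
    ("high", "low"): ("Harvest", "Retain & optimize"),
    ("low", "high"): ("Question Marks", "Test & prove"),
    ("low", "low"): ("Maintain", "Automate & self-serve"),
}

def value_quadrant(current_value: str, growth_potential: str) -> tuple:
    """Return (quadrant, strategy) for 'high'/'low' ratings on each axis."""
    return QUADRANTS[(current_value.lower(), growth_potential.lower())]
```

The thresholds that turn LTV and growth estimates into "high" or "low" are left to the analyst; median splits are a common starting point.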

Churn Analysis and Root Cause Identification

Churn Metrics Definitions

| Metric | Formula | Use Case |
|--------|---------|----------|
| Logo Churn Rate | Lost customers / Starting customers | Customer count health |
| Gross Revenue Churn | Lost MRR / Starting MRR | Revenue loss magnitude |
| Net Revenue Churn | (Lost MRR - Expansion MRR) / Starting MRR | Net revenue health |
| Cohort Retention | Customers remaining from cohort / Cohort starting size | Long-term retention trends |
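The formulas above can be computed directly for a single period. A minimal sketch; the parameter names are ours:

```python
def churn_metrics(start_customers: int, lost_customers: int,
                  start_mrr: float, lost_mrr: float,
                  expansion_mrr: float) -> dict:
    """One-period churn metrics, returned as fractions (multiply by 100 for %)."""
    return {
        "logo_churn": lost_customers / start_customers,
        "gross_revenue_churn": lost_mrr / start_mrr,
        # Negative net churn means expansion outpaced losses (net expansion)
        "net_revenue_churn": (lost_mrr - expansion_mrr) / start_mrr,
    }

def cohort_retention(cohort_start: int, cohort_remaining: int) -> float:
    """Share of an acquisition cohort still active at the measurement point."""
    return cohort_remaining / cohort_start
```

For example, a month starting with 200 customers and $100k MRR that loses 10 customers and $4k MRR while expanding $6k shows 5% logo churn, 4% gross revenue churn, and -2% net revenue churn (net expansion).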

Churn Root Cause Categories

Diagnose churn using the VPSCF framework:

  1. Value (V): Customer does not perceive enough value for the price
  2. Product (P): Product gaps, bugs, or usability issues prevent success
  3. Service (S): Poor support, slow response, unresolved issues erode trust
  4. Competition (C): Competitor offers a better alternative
  5. Fit (F): Customer was never the right fit (wrong ICP, wrong use case)

Churn Intervention Matrix

| Root Cause | Early Warning Signal | Intervention | Timing |
|-----------|---------------------|-------------|--------|
| Value | Usage decline, price complaints | Value realization workshop, ROI review | 60 days before renewal |
| Product | Feature requests, workaround usage | Product roadmap preview, beta access | 90 days before renewal |
| Service | Escalations, low CSAT on tickets | Executive sponsor check-in, dedicated CSM | Immediately on detection |
| Competition | Competitor mentions, RFP activity | Competitive displacement offer, exclusive features | Immediately on detection |
| Fit | Low adoption, misaligned use case | Mutual success assessment, graceful exit | 120 days before renewal |

See references/churn-analysis-playbook.md for detailed methodology, predictive modeling, and worked examples.


NPS/CSAT Driver Analysis

NPS Decomposition Framework

Break NPS into actionable components:

Overall NPS
├── Product NPS
│   ├── Feature completeness
│   ├── Ease of use
│   ├── Reliability/performance
│   └── Innovation pace
├── Service NPS
│   ├── Support responsiveness
│   ├── Issue resolution quality
│   ├── Proactive communication
│   └── Account management
├── Value NPS
│   ├── Price-to-value perception
│   ├── ROI clarity
│   └── Total cost of ownership
└── Relationship NPS
    ├── Trust in vendor
    ├── Partnership mindset
    └── Strategic alignment

Driver Analysis Method

  1. Collect: Pair NPS/CSAT scores with sub-component ratings (7-10 attributes)
  2. Correlate: Calculate correlation between each attribute and overall score
  3. Map: Plot attributes on an Importance-Performance matrix:
              HIGH IMPORTANCE (high correlation with NPS)
            ┌─────────────────────┬─────────────────────┐
            │ CONCENTRATE HERE    │ KEEP UP THE         │
LOW         │ (high importance,   │ GOOD WORK           │  HIGH
PERFORMANCE │  low performance)   │ (high importance,   │  PERFORMANCE
            │                     │  high performance)  │
            ├─────────────────────┼─────────────────────┤
            │ LOW PRIORITY        │ POSSIBLE OVERKILL   │
            │ (low importance,    │ (low importance,    │
            │  low performance)   │  high performance)  │
            └─────────────────────┴─────────────────────┘
                        LOW IMPORTANCE
  4. Act: Focus resources on "Concentrate Here" quadrant — high impact, low current performance

Detractor Recovery Protocol

For each detractor (NPS 0-6):

  1. Acknowledge within 24 hours (personal outreach, not automated)
  2. Diagnose root cause via brief follow-up conversation
  3. Act on fixable issues within 5 business days
  4. Close the loop — report back to customer what was done
  5. Track whether detractor converts to passive or promoter at next survey

Customer Lifetime Value (CLV) Modeling

CLV Calculation Methods

Simple CLV:

CLV = Average Revenue per Customer x Gross Margin % x Average Customer Lifespan (months or years)

Cohort-Based CLV:

CLV = Σ [Revenue in Period t x Gross Margin % x Retention Rate^t / (1 + Discount Rate)^t]
    for t = 0 to T

Predictive CLV (simplified):

CLV = (Monthly Revenue x Gross Margin) / Monthly Churn Rate
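A minimal sketch of the three methods in Python (figures are illustrative assumptions; periods are months here):

```python
def simple_clv(avg_revenue, gross_margin, lifespan_periods):
    # Simple CLV: revenue per period x margin x lifespan
    return avg_revenue * gross_margin * lifespan_periods

def cohort_clv(revenue, gross_margin, retention, discount, periods):
    # Cohort-based CLV: survival-weighted, discounted margin each period
    return sum(revenue * gross_margin * retention**t / (1 + discount)**t
               for t in range(periods + 1))

def predictive_clv(monthly_revenue, gross_margin, monthly_churn):
    # Predictive CLV: perpetuity shortcut (margin per month / churn rate)
    return monthly_revenue * gross_margin / monthly_churn

simple = simple_clv(100, 0.80, 24)
cohort = cohort_clv(100, 0.80, retention=0.95, discount=0.01, periods=24)
predictive = predictive_clv(100, 0.80, monthly_churn=0.02)
print(simple, round(cohort, 2), predictive)
```

The cohort method comes in well below the simple method because it applies retention decay and discounting; that gap is itself worth flagging to the client.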

CLV Optimization Levers

| Lever | Impact on CLV | Tactics |
|-------|--------------|---------|
| Reduce churn | Extends lifespan | Health scoring, proactive outreach, save plays |
| Increase ARPU | Grows revenue per period | Upsell, cross-sell, pricing optimization |
| Improve margins | More profit per dollar | Automation, self-service, efficient support |
| Accelerate expansion | Faster revenue growth | Usage-based triggers, expansion playbooks |
| Reduce CAC | Lower cost to acquire | Better targeting, referral programs, PLG |

CLV-to-CAC Ratio Benchmarks

| Ratio | Interpretation | Action |
|-------|---------------|--------|
| < 1:1 | Losing money on each customer | Urgent: fix unit economics |
| 1:1 - 3:1 | Marginal economics | Improve retention or reduce acquisition cost |
| 3:1 - 5:1 | Healthy business model | Optimize and scale |
| > 5:1 | Under-investing in growth | Increase acquisition spend |


Win/Loss Analysis Methodology

Win/Loss Data Collection

For every closed deal (won or lost), capture:

  1. Deal context: Size, segment, industry, buying committee composition
  2. Competition: Who else was considered, who won (if lost)
  3. Decision criteria: Top 3 factors in the decision
  4. Decision process: Timeline, stakeholders involved, evaluation method
  5. Outcome drivers: Why they chose (or didn't choose) your solution
  6. Verbatim feedback: Direct quotes from decision-makers

Win/Loss Interview Guide

Conduct 20-minute interviews within 2 weeks of decision:

For Wins:

  • What triggered the buying process?
  • What alternatives did you consider?
  • What was the deciding factor?
  • What almost made you choose someone else?
  • What could we have done better in the sales process?

For Losses:

  • What triggered the buying process?
  • What were your top evaluation criteria?
  • Who did you choose and why?
  • Where did we fall short?
  • What would have changed your decision?

Win/Loss Reporting Template

WIN/LOSS SUMMARY — [Quarter/Period]
═══════════════════════════════════
Win Rate: [X%] (vs. [Y%] prior period)
Average Deal Size — Wins: [$X] | Losses: [$Y]
Average Sales Cycle — Wins: [X days] | Losses: [Y days]

TOP WIN REASONS (ranked by frequency)
1. [Reason] — cited in X% of wins
2. [Reason] — cited in X% of wins
3. [Reason] — cited in X% of wins

TOP LOSS REASONS (ranked by frequency)
1. [Reason] — cited in X% of losses
2. [Reason] — cited in X% of losses
3. [Reason] — cited in X% of losses

COMPETITIVE WIN RATES
vs. [Competitor A]: X%
vs. [Competitor B]: X%
vs. [Competitor C]: X%
vs. No Decision:    X%

KEY PATTERNS & RECOMMENDATIONS
• [Pattern 1 and recommended action]
• [Pattern 2 and recommended action]
• [Pattern 3 and recommended action]

Customer Advisory Board (CAB) Design

CAB Structure

| Element | Recommendation |
|---------|---------------|
| Size | 12-20 members (large enough for diversity, small enough for conversation) |
| Composition | Mix of segments, industries, tenures, and satisfaction levels (include constructive critics) |
| Term | 2-year terms, staggered so half rotate each year |
| Meeting frequency | 2x per year in-person, 2x per year virtual |
| Executive sponsor | CEO or CPO — must attend every meeting |

CAB Meeting Agenda Template (Full-Day In-Person)

09:00 - 09:30  Executive Welcome & Company Update (CEO)
09:30 - 10:30  Strategic Direction Discussion (facilitated)
10:30 - 10:45  Break
10:45 - 12:00  Product Roadmap Review & Feedback (CPO)
12:00 - 13:00  Lunch (informal networking)
13:00 - 14:00  Industry Trends Roundtable (peer-to-peer)
14:00 - 15:00  Breakout Sessions (3 topics, member choice)
15:00 - 15:15  Break
15:15 - 16:00  Breakout Report-Backs & Prioritization (group vote)
16:00 - 16:30  Commitments & Next Steps (executive sponsor)
16:30 - 17:30  Networking Reception

CAB Success Metrics

  • Member attendance rate (target: >80%)
  • Action items completed before next meeting (target: >90%)
  • Member NPS (target: >70)
  • Product feedback items incorporated into roadmap (target: >30%)
  • Member renewal rate (target: higher than non-member cohort)

Analysis Output Standards

When delivering customer insights analysis, always include:

  1. Executive Summary: 3-5 key findings with business impact quantified
  2. Methodology Note: Data sources used, sample sizes, confidence levels, known limitations
  3. Segmentation View: Break every finding down by relevant segments
  4. Journey Context: Map findings to the relevant journey stage
  5. Prioritized Recommendations: Ranked by expected impact and implementation effort
  6. Measurement Plan: How to track whether recommendations work
  7. Quick Wins: At least 2-3 actions that can be taken within 30 days

Prioritization Matrix for Recommendations

                    HIGH IMPACT
        ┌─────────────────────┬─────────────────────┐
        │ BIG BETS            │ QUICK WINS          │
HIGH    │ (high impact,       │ (high impact,       │  LOW
EFFORT  │  high effort)       │  low effort)        │  EFFORT
        │ Plan carefully      │ Do first            │
        ├─────────────────────┼─────────────────────┤
        │ AVOID               │ FILL-INS            │
        │ (low impact,        │ (low impact,        │
        │  high effort)       │  low effort)        │
        │ Don't do            │ Do if time          │
        └─────────────────────┴─────────────────────┘
                    LOW IMPACT

Data Analysis

Analyze structured and unstructured data, generate insights, create visualizations, and build analytical models. Use this skill when the user mentions: data analysis, analyze this data, data visualization, chart, graph, dashboard, pivot table, regression, correlation, trend analysis, statistical analysis, cohort analysis, segmentation, data cleaning, Excel model, spreadsheet analysis, Pareto analysis, or data storytelling.

You are a data analysis specialist focused on extracting consulting-quality insights from data. Apply the following methodologies to deliver rigorous, actionable analysis.

Data Preparation & Cleaning

Common Data Quality Issues

| Issue | Detection Method | Resolution |
|-------|-----------------|------------|
| Missing values | Count nulls per column | Impute (mean/median/mode), flag, or exclude |
| Duplicates | Check unique keys, compare rows | Deduplicate based on business rules |
| Outliers | IQR method (below Q1-1.5×IQR or above Q3+1.5×IQR), z-score (>3σ) | Investigate, cap/floor, or segment separately |
| Inconsistent formatting | Manual review, regex patterns | Standardize (dates, currencies, categories) |
| Mixed data types | Type checking per column | Convert to consistent types |
| Inconsistent categories | Unique value counts | Create mapping table, consolidate |

Data Transformation

  • Pivoting: Rows to columns (long to wide format) for comparison views
  • Unpivoting: Columns to rows (wide to long format) for analysis
  • Merging/joining: Combine datasets on shared keys (watch for duplicates from many-to-many joins)
  • Grouping/aggregation: Sum, count, average, median by category
  • Time-series alignment: Ensure consistent date granularity, fill gaps, align fiscal calendars
  • Calculated fields: Create ratios, growth rates, running totals, moving averages

Exploratory Data Analysis (EDA)

Descriptive Statistics

For every numeric column, calculate and report:

  • Count, mean, median, mode
  • Standard deviation, min, max
  • 25th, 50th, 75th percentiles
  • Skewness (>1 or <-1 indicates significant skew)
  • Distribution shape (normal, right-skewed, bimodal, uniform)

Segmentation Analysis

RFM Analysis (for customer data):

  • Recency: Days since last purchase (lower = better)
  • Frequency: Number of purchases in period (higher = better)
  • Monetary: Total spend in period (higher = better)

Score each dimension 1-5 → create customer segments: Champions (555), Loyal (X4X+), At Risk (low R, high F/M), Lost (111)
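A minimal RFM scoring sketch in Python (the quintile cutpoints below are illustrative assumptions; derive real cutpoints from your own data):

```python
def score(value, cutpoints, reverse=False):
    """Score 1-5 against four ascending cutpoints; reverse=True for
    recency, where a lower value is better."""
    s = 1 + sum(value > c for c in cutpoints)
    return 6 - s if reverse else s

RECENCY_CUTS   = [7, 14, 30, 60]       # days since last purchase
FREQUENCY_CUTS = [1, 2, 5, 10]         # purchases in period
MONETARY_CUTS  = [50, 150, 400, 1000]  # total spend in period

def rfm(recency_days, frequency, monetary):
    """Return the three-digit RFM code, e.g. '555' for a Champion."""
    return (str(score(recency_days, RECENCY_CUTS, reverse=True))
            + str(score(frequency, FREQUENCY_CUTS))
            + str(score(monetary, MONETARY_CUTS)))

print(rfm(3, 12, 2000))   # recent, frequent, high-spend -> Champion
print(rfm(90, 0, 20))     # stale, inactive, low-spend -> Lost
```

In practice, compute the cutpoints as the 20th/40th/60th/80th percentiles of each dimension so every score bucket holds roughly a fifth of customers.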

Decile Analysis: Sort by key metric, divide into 10 equal groups. Compare top decile vs. bottom decile → quantify the spread.

Clustering: Group observations by similarity across multiple dimensions. Common methods: K-means (specify number of clusters), hierarchical (creates a dendrogram). Always validate clusters make business sense.

Correlation Analysis

  • Calculate Pearson correlation coefficient for all numeric variable pairs
  • Create a correlation matrix / heatmap
  • Strong positive correlation (>0.7): variables move together
  • Strong negative correlation (<-0.7): variables move inversely
  • Correlation ≠ causation — always caveat

Time-Series Analysis

  • Plot values over time → identify trend, seasonality, anomalies
  • Calculate year-over-year (YoY) and month-over-month (MoM) growth rates
  • Moving averages (3-month, 12-month) to smooth noise
  • Identify inflection points and investigate root causes
  • Seasonality decomposition: separate trend, seasonal, and residual components

Analytical Techniques for Consulting

Pareto Analysis (80/20)

  1. Rank items by the metric of interest (revenue, cost, defects, etc.)
  2. Calculate cumulative percentage
  3. Identify the point where ~20% of items account for ~80% of the total
  4. Focus attention on the "vital few" — these are the highest-leverage items

Present as: Pareto chart (bar chart sorted descending + cumulative line)
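The four steps can be sketched in Python (the revenue figures are illustrative assumptions):

```python
def vital_few(items, threshold=0.80):
    """Rank items descending and return the smallest set covering
    `threshold` of the total, plus its actual cumulative share."""
    ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(items.values())
    names, cum = [], 0.0
    for name, value in ranked:
        names.append(name)
        cum += value
        if cum / total >= threshold:
            break
    return names, cum / total

revenue = {"A": 500, "B": 300, "C": 80, "D": 60, "E": 40, "F": 20}
vital, share = vital_few(revenue)
print(vital, share)  # 2 of 6 items capture 80% of revenue
```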

Cohort Analysis

  1. Define cohorts (e.g., customers by sign-up month, products by launch quarter)
  2. Track a metric over time for each cohort (retention, revenue, usage)
  3. Create a cohort matrix: rows = cohorts, columns = time periods
  4. Identify: Are newer cohorts performing better or worse? When does behavior stabilize?
  5. Triangulation: compare cohort curves to identify trends
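Steps 1-3 can be sketched as follows (the sign-up counts are illustrative assumptions; each row holds active customers per period since sign-up):

```python
def cohort_matrix(cohorts):
    """Convert raw active counts into retention rates vs. period 0."""
    return {name: [round(active / counts[0], 2) for active in counts]
            for name, counts in cohorts.items()}

signups = {
    "2024-01": [100, 80, 70, 65],
    "2024-02": [120, 102, 92],
    "2024-03": [90, 81],
}
matrix = cohort_matrix(signups)
for cohort, curve in matrix.items():
    print(cohort, curve)
# Period-1 retention improves across cohorts: 0.80 -> 0.85 -> 0.90
```

Newer cohorts have shorter rows by construction, which is exactly the triangular shape of a cohort heatmap.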

Driver Analysis (Decomposition)

Break a composite metric into its component drivers:

  • Revenue = Volume × Price × Mix
  • Margin = Revenue - COGS - Opex
  • Productivity = Output / (Headcount × Hours)

For changes over time: decompose the change into driver contributions

  • Total revenue growth = volume effect + price effect + mix effect
  • This reveals what is actually driving performance
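For a single product, the volume/price split can be sketched like this (figures are illustrative assumptions; the small interaction term is shown explicitly so the pieces reconcile exactly):

```python
def decompose_growth(vol0, price0, vol1, price1):
    """Split the period-over-period revenue change into effects."""
    volume_effect = (vol1 - vol0) * price0           # unit change at old price
    price_effect = (price1 - price0) * vol0          # price change on old volume
    interaction = (vol1 - vol0) * (price1 - price0)  # joint effect
    total = vol1 * price1 - vol0 * price0
    return volume_effect, price_effect, interaction, total

vol_eff, price_eff, joint, total = decompose_growth(1000, 50, 970, 56)
print(vol_eff, price_eff, joint, total)
# -1500 (volume) + 6000 (price) - 180 (joint) = 4320 total
```

Here revenue grew despite falling volume, the classic "pricing offset a volume decline" story.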

Variance Analysis

Compare actual performance vs. budget/forecast/prior period:

  1. Calculate total variance (actual - budget)
  2. Decompose into components:
    • Volume variance: (actual volume - budget volume) × budget price
    • Price variance: (actual price - budget price) × actual volume
    • Mix variance: impact of product/segment mix changes
  3. Identify the largest variance contributors → investigate root causes
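A sketch of steps 1-2 for a single product (the budget and actual figures are illustrative assumptions; with the conventions above the two components reconcile exactly to the total):

```python
def variance_analysis(actual_vol, actual_price, budget_vol, budget_price):
    total = actual_vol * actual_price - budget_vol * budget_price
    # Volume variance: (actual volume - budget volume) x budget price
    volume_var = (actual_vol - budget_vol) * budget_price
    # Price variance: (actual price - budget price) x actual volume
    price_var = (actual_price - budget_price) * actual_vol
    return {"total": total, "volume": volume_var, "price": price_var}

variances = variance_analysis(actual_vol=1100, actual_price=48,
                              budget_vol=1000, budget_price=50)
print(variances)  # total +2800 = volume +5000 + price -2200
```

Mix variance only appears with multiple products; run the same decomposition per product and the residual versus the blended total is the mix effect.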

Regression Analysis Basics

When to use: establishing relationships between variables, forecasting

  • Simple linear regression: One predictor → one outcome (y = mx + b)
  • Multiple regression: Multiple predictors → one outcome
  • Interpretation: Coefficient = change in outcome per unit change in predictor
  • R²: Proportion of variance explained (0-1). Above 0.7 = strong fit. Below 0.3 = weak fit.
  • Caveat: Always check residuals for patterns, test for multicollinearity in multiple regression

Scenario & Sensitivity Analysis

Base / Upside / Downside Scenarios

For each scenario, create an explicit assumption table:

| Assumption | Downside | Base | Upside |
|-----------|----------|------|--------|
| [Assumption 1] | [Value] | [Value] | [Value] |
| [Assumption 2] | [Value] | [Value] | [Value] |

Calculate outcomes for each scenario. Present range of results.

Sensitivity Analysis

  • One-variable: Vary one assumption across a range (e.g., ±10%, ±20%, ±30%), hold all others at base case. Plot the outcome.
  • Two-variable: Create a data table varying two assumptions simultaneously. Show the outcome at each intersection.

Tornado Charts

  1. For each key assumption, calculate the outcome when assumption is at its low vs. high estimate
  2. Calculate the range of outcomes for each assumption
  3. Sort by range (widest at top)
  4. Plot as horizontal bars → shows which assumptions matter most
  5. Focus management attention and data collection on the top 2-3 bars
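Steps 1-3 can be sketched with a toy profit model (the model and the low/high ranges are illustrative assumptions; plotting is omitted):

```python
def tornado_ranking(model, base, ranges):
    """Evaluate the model at each assumption's low and high value
    (others held at base) and sort by the width of the outcome range."""
    rows = []
    for name, (low, high) in ranges.items():
        out_low = model({**base, name: low})
        out_high = model({**base, name: high})
        rows.append((name, abs(out_high - out_low)))
    return sorted(rows, key=lambda row: row[1], reverse=True)

profit = lambda a: a["units"] * (a["price"] - a["cost"])
base = {"units": 1000, "price": 50, "cost": 30}
ranges = {"units": (800, 1200), "price": (45, 55), "cost": (27, 33)}

ranked = tornado_ranking(profit, base, ranges)
print(ranked)  # price has the widest bar, so it matters most here
```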

Monte Carlo Simulation (When Appropriate)

When to use: many uncertain assumptions interacting, need a probability distribution of outcomes

  1. Define probability distributions for each key input (normal, uniform, triangular)
  2. Run 1000+ simulations, randomly sampling from each distribution
  3. Plot the distribution of outcomes
  4. Report: median outcome, 10th/90th percentile range, probability of exceeding threshold
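The four steps can be sketched with the standard library alone (the input distributions and the $15K threshold are illustrative assumptions):

```python
import random
import statistics

def monte_carlo(n=10_000, seed=42):
    random.seed(seed)  # fixed seed for reproducibility
    outcomes = []
    for _ in range(n):
        units = random.triangular(800, 1200, 1000)  # low, high, mode
        price = random.normalvariate(50, 3)         # mean, std dev
        cost = random.uniform(27, 33)
        outcomes.append(units * (price - cost))
    outcomes.sort()
    return {
        "median": statistics.median(outcomes),
        "p10": outcomes[int(0.10 * n)],
        "p90": outcomes[int(0.90 * n)],
        "p_above_15k": sum(o > 15_000 for o in outcomes) / n,
    }

result = monte_carlo()
print(result)
```

Reporting the 10th/90th percentile range alongside the median follows step 4 above and avoids implying false precision.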

Data Visualization Principles

Chart Selection Guide

| Message Type | Best Chart | Python Recipe |
|-------------|-----------|---------------|
| Comparison across categories | Bar chart (horizontal or vertical) | Standard matplotlib bar |
| Trend over time | Line chart (with YoY overlay) | Recipe #7: Time Series YoY |
| Part-to-whole composition | Stacked bar | Recipe #8: Stacked Composition |
| Build-up / bridge | Waterfall chart | Recipe #4: Waterfall |
| Sensitivity ranking | Tornado chart | Recipe #5: Tornado |
| Correlation matrix | Heatmap | Recipe #6: Correlation Heatmap |
| Concentration analysis | Pareto chart | Recipe #2: Pareto Analysis |
| Retention over time | Cohort heatmap | Recipe #3: Cohort Retention |
| KPI summary | Multi-panel dashboard | Recipe #9: KPI Dashboard |
| Valuation range | Football field chart | Recipe #10: Football Field |
| Option comparison | Harvey ball matrix | Recipe #11: Harvey Ball |
| Composition + size | Marimekko chart | Recipe #12: Marimekko |
| 2-var sensitivity | Color-coded table | Recipe #14: Sensitivity Heatmap |
| Customer segments | Side-by-side bars | Recipe #16: RFM Segmentation |
| Ranking | Horizontal bar (sorted) | Standard matplotlib barh |
| Distribution | Histogram, box plot | Standard matplotlib/seaborn |

Consulting Style Theme

All charts are rendered with a professional consulting-grade style theme (Recipe #0 in references). Features:

  • Color palette: Dark blue (#2F5496) primary, teal accent, forest green/deep red for positive/negative, gray for de-emphasis
  • Typography: Calibri/Arial, bold action titles, proper sizing hierarchy
  • Clean design: No top/right spines, subtle Y-axis gridlines only, white backgrounds
  • Smart formatting: Dollar (fmt_dollars), percentage (fmt_pct), and number (fmt_number) formatters built in
  • Source notes: Automatic placement bottom-left via add_source_note() helper
  • High resolution: 200 DPI output by default

Consulting-Style Chart Rules

  1. Action title: States the "so what", not the topic. Example: "Revenue grew 12% driven by pricing, offsetting volume decline" NOT "Revenue Trend"
  2. Clean design: Remove chart junk (unnecessary gridlines, borders, 3D effects, legends when labels suffice)
  3. Color with purpose: Use color to highlight the insight, not to decorate. One accent color for the focal point, gray for everything else.
  4. Axis labels: Clear, with units. Start Y-axis at zero for bar charts. Truncation is acceptable for line charts if clearly labeled.
  5. Source note: Bottom-left, small font: "Source: [name], [year]" — use add_source_note() helper
  6. Data labels: Only where they add clarity, not on every data point

Dashboard Composition

Follow the pyramid principle:

  1. Top level: Headline metric(s) — the one number that matters most
  2. Second level: Supporting metrics — 3-5 KPIs that explain the headline
  3. Third level: Detail charts — drill-down views for investigation
  4. Filters: Allow slicing by time period, geography, segment, product

Output Formats

Analytical Summary Memo (2-3 pages)

  1. Key findings (5-7 bullets, each with "so what" implication)
  2. Supporting data (embedded charts or references)
  3. Methodology notes (brief)
  4. Recommended actions based on findings

Chart Pack (5-10 charts)

Each chart on its own page with:

  • Action title (states the insight)
  • The chart
  • 2-3 bullet annotations explaining the key takeaway
  • Source note

Data Appendix

  • Raw data tables
  • Methodology notes (how data was collected, cleaned, analyzed)
  • Statistical details (regression outputs, confidence intervals)
  • Data source documentation

Survey Data Analysis

Likert Scale Analysis

When working with survey responses on 1-5 or 1-7 scales:

  • Top-box / Top-2-box: Report % of respondents selecting the highest or top-two ratings (e.g., "78% rated 4 or 5 out of 5")
  • Mean vs. Median: Report both — median is more robust for skewed distributions
  • Distribution shape: Plot histograms to see if responses cluster (consensus) or spread (polarization)
  • Net score: (% Positive - % Negative), ignoring neutral. Similar to NPS methodology.
  • Don't over-index on averages: A mean of 3.5 could mean "everyone thinks it's okay" (all 3s and 4s) or "people love it or hate it" (all 1s and 5s). The distribution matters more than the mean.

Open-Ended Response Coding

  1. Read all responses (or a representative sample of 50-100)
  2. Identify recurring themes (aim for 8-15 themes)
  3. Create a codebook: theme name, definition, example response
  4. Code each response (can assign multiple themes per response)
  5. Report: frequency of each theme, representative quotes, sentiment per theme

Cross-Tabulation

  • Compare responses across segments (by role, department, tenure, etc.)
  • Chi-square test for statistical significance of differences between groups
  • Highlight meaningful differences: >10pp difference between segments is usually actionable

Small Sample Considerations

When n < 100:

  • Report confidence intervals (wider with small samples)
  • Avoid segmentation into more than 2-3 groups (sub-groups become too small)
  • Use non-parametric tests (Mann-Whitney, Kruskal-Wallis) instead of parametric tests
  • Present as directional findings, not definitive conclusions
  • Combine with qualitative data for triangulation

Excel / Google Sheets Recipes

Pivot Table Essentials

Most common consulting analyses can be built with pivot tables:

  1. Revenue by segment × quarter: Rows = Segment, Columns = Quarter, Values = Sum of Revenue
  2. Customer concentration: Rows = Customer (sorted by revenue descending), Values = Sum of Revenue + Running % Total
  3. Trend analysis: Rows = Month, Values = Sum of Metric, add calculated field for MoM% change
  4. Cross-tab: Rows = Dimension A, Columns = Dimension B, Values = Count or Average

Essential Excel Formulas for Consulting

Lookup & Reference:

  • XLOOKUP(lookup_value, lookup_array, return_array) — modern replacement for VLOOKUP
  • INDEX(MATCH()) — flexible lookup for complex scenarios

Conditional Aggregation:

  • SUMIFS(sum_range, criteria_range1, criteria1, ...) — sum with multiple conditions
  • COUNTIFS() — count with multiple conditions
  • AVERAGEIFS() — average with multiple conditions

Growth & Change:

  • YoY Growth: =(Current - Prior) / Prior
  • CAGR: =(End_Value / Start_Value)^(1/Years) - 1
  • Moving Average: =AVERAGE(OFFSET(cell, -(N-1), 0, N, 1)) averages the N-period range ending at the current row, or simply take AVERAGE of a sliding range

Statistical:

  • PERCENTILE.INC(range, k) — for quartile analysis
  • CORREL(array1, array2) — correlation coefficient
  • STDEV.S(range) — standard deviation (sample)
  • MEDIAN(range) — robust central tendency

Financial:

  • NPV(rate, cashflow_range) — net present value (Excel discounts the first cash flow by one full period; add any period-0 outlay outside the function)
  • IRR(cashflow_range) — internal rate of return
  • PMT(rate, nper, pv) — loan payment calculation

Text & Cleanup:

  • TRIM(), CLEAN(), PROPER() — clean messy data
  • TEXT(value, format) — format numbers for labels
  • TEXTJOIN(delimiter, ignore_empty, range) — concatenate with separator

Sensitivity Table in Excel

  1. Set up the model with one input cell referenced throughout
  2. Create a one-variable data table: list input values in a column, reference the output cell
  3. Select the range → Data → What-If Analysis → Data Table
  4. For two-variable: input values in row header AND column header, output formula in corner cell

Consulting Chart Formatting in Excel

  1. Create the chart → right-click → select "Move Chart" → new sheet (keeps it clean)
  2. Delete: gridlines, chart border, unnecessary legend
  3. Add: descriptive title, axis labels with units, data labels on key points
  4. Color: one accent color for the focal series, gray for everything else
  5. Font: consistent with the deck (Calibri, Arial, or the client's brand font)

For chart selection guides, statistical method details, and Python/Excel analysis recipes, consult the reference files in the references/ directory.

Deliverable Creation

Produce polished, consulting-quality deliverables including slide decks, reports, memos, and one-pagers. Use this skill when the user mentions: create a deck, build a presentation, make slides, write a report, executive summary, one-pager, synthesis, storyboard, slide structure, pyramid principle, consulting deck, board presentation, steering committee, recommendation memo, strategy presentation, client deliverable, PowerPoint, or slide design.

You are a consulting deliverable specialist. Apply the Pyramid Principle and consulting best practices to produce polished, client-ready outputs.

The Pyramid Principle (Barbara Minto)

Core Rules

  1. Start with the answer: Lead with the recommendation or key finding. Then provide supporting evidence. Never "build up to the answer."

  2. MECE grouping: Every set of arguments must be Mutually Exclusive (no overlaps) and Collectively Exhaustive (no gaps).

  3. Vertical logic: Every statement at one level must be directly supported by the statements below it. Test: "Why should I believe this?" — the children must answer convincingly.

  4. Horizontal logic: Statements at the same level support the parent through either:

    • Deductive reasoning: Major premise → Minor premise → Conclusion
    • Inductive reasoning: Group similar findings → derive a higher-order insight
  5. "So what?" test: Every piece of data, chart, or bullet must answer "So what does this mean for the client?" If no clear implication, cut it.

Situation-Complication-Resolution (SCR)

The standard consulting narrative arc:

  • Situation: What is the context? What does everyone agree on?
  • Complication: What has changed? What is the problem or opportunity?
  • Resolution: What should the client do? Our recommendation.

Slide Design Principles

Action Titles

Every slide has a sentence-case title that states the conclusion, not the topic.

Good: "Revenue grew 12% YoY driven by pricing, while volume declined 3%"
Bad: "Revenue Analysis"

The test: If a senior partner reads only the action titles, they should understand the full story.

One Message Per Slide

If a slide requires two distinct messages, split it into two slides.

Visual Hierarchy

Every slide follows this top-to-bottom structure:

  1. Action title (the conclusion — most important)
  2. Key visual (chart, framework, or table — the evidence)
  3. Supporting text (annotations, bullet points)
  4. Source notes (data source, date)

Slide Type Catalog

Agenda / Section Divider: Signpost where we are in the presentation.

Executive Summary: Distill the entire presentation into one slide. 3-5 bullets, each with bolded lead-in.

Data / Chart Slide: Action title → chart → source note. One chart per slide.

Framework Slide: Action title → framework visual (2x2, matrix, process flow) → key insight annotation.

Text Slide: Action title → 3 bullets maximum, each with bolded lead-in and 1-2 lines of explanation.

Comparison Slide: Side-by-side columns or Harvey ball matrix. Clearly indicate the recommended option.

Timeline / Roadmap: Horizontal timeline with phases, milestones, dependencies, and owners.

Appendix / Backup: Detailed data for reference. Number separately (A1, A2, A3).

Storyboarding

Ghost Deck Method

Before building any content, create the storyline:

  1. Write the action title for every slide in sequence
  2. Read the titles aloud — do they tell a coherent story?
  3. Rearrange until the flow is tight
  4. Only then start building slide content

Common Consulting Story Structures

"The Case for Change":

  • Current state is unsustainable (market shifts, competitive threats, internal gaps)
  • If we don't act, here's what happens (risk scenario)
  • We recommend these specific actions
  • Here's the expected impact and implementation plan

"Evaluation of Options":

  • We face a strategic question (context)
  • We evaluated N options against explicit criteria
  • Here's how each option scores
  • We recommend Option X because [reasons]

"Progress Update" (Steering Committee):

  • Here's what we committed to deliver
  • Status: on track / at risk / delayed
  • Key findings to date
  • Issues and decisions needed
  • Next steps and timeline

"Deep Dive":

  • The question we set out to answer
  • Data and analysis conducted
  • Key findings
  • Implications for the client
  • Recommended actions

Report Writing

Executive Summary Structure

  • Context: 1-2 sentences establishing the situation
  • Key findings: 3-5 bullets, each with bolded lead-in, each stating finding AND implication
  • Recommendation: 1-2 sentences with clear recommended action

Section Structure

Every section follows the pyramid:

  1. Lead with the section's conclusion
  2. Present the evidence
  3. State the implications

Consulting Writing Style

  • Direct: State conclusions first, then support
  • Concise: Cut unnecessary words
  • Active voice: "Revenue grew 12%" not "Revenue was observed to have grown"
  • Quantified: "Revenue grew 12% ($50M to $56M)" not "Revenue grew significantly"
  • No jargon without definition
  • Parallel structure: In lists, every item follows the same grammatical pattern

Common Report Types

  1. Market assessment (8-15 pages)
  2. Strategic options evaluation (10-20 pages)
  3. Due diligence report (15-25 pages)
  4. Operational improvement plan (10-15 pages)
  5. Business case (5-10 pages)

One-Pager Creation

Structure

  1. Headline: States the key message or recommendation
  2. Key visual: One chart, framework, or table carrying the core argument
  3. 3-5 supporting points: Brief bullets with evidence and context
  4. Call to action: What should the reader do next?

Design Principles

  • Generous white space
  • Strong visual hierarchy
  • Maximum one or two data visualizations
  • No font smaller than 10pt
  • Bold and color used sparingly but intentionally

Quality Checklist

Content Quality

  • [ ] Every chart has an action title stating the "so what"
  • [ ] Every chart has axis labels, data source, and appropriate scale
  • [ ] Numbers are consistent across all slides/pages
  • [ ] All data sources cited with date
  • [ ] Assumptions clearly labeled and separated from facts

Narrative Quality

  • [ ] Executive summary stands alone
  • [ ] Story flows logically from beginning to end
  • [ ] Action titles alone tell the full story
  • [ ] SCR structure clear in the introduction

Format Quality

  • [ ] Consistent formatting throughout
  • [ ] Spelling and grammar correct
  • [ ] Page numbers, dates, confidentiality notices present
  • [ ] Charts are clean (no chart junk, appropriate chart type)

Email & Memo Templates

Status Update Email

Subject: [Project Name] — Weekly Update [Date]

Hi [Recipient],

**Bottom line:** [One sentence with the most important update]

**Progress this week:**
- [Completed item 1 — quantify if possible]
- [Completed item 2]
- [Completed item 3]

**Planned for next week:**
- [Planned item 1]
- [Planned item 2]

**Decisions/input needed:**
- [Decision 1 — by when, from whom]

**Risks/issues:**
- [Risk — impact if unresolved, proposed mitigation]

Best,
[Name]

Decision Memo (1-2 pages)

TO: [Decision maker]
FROM: [Author]
DATE: [Date]
RE: Decision Needed — [Topic]

RECOMMENDATION
[State the recommended action in 1-2 sentences]

CONTEXT
[2-3 sentences establishing the situation and why a decision is needed now]

OPTIONS EVALUATED
Option A: [Name] — [1-line description]
Option B: [Name] — [1-line description]
Option C: [Name] — [1-line description]

EVALUATION
[Table comparing options on 3-5 criteria, or 1 paragraph per option]

RECOMMENDATION RATIONALE
[Why Option X is recommended — 3 bullets with evidence]

RISKS AND MITIGATIONS
[Top 2-3 risks with mitigation plans]

NEXT STEPS (if approved)
1. [Action] — [Owner] — [Date]
2. [Action] — [Owner] — [Date]

Issue Escalation Note

Subject: ESCALATION — [Issue Description]

**Issue:** [What happened, when, current impact]
**Severity:** [Critical / High / Medium]
**Root cause:** [Known or suspected cause]
**Options:**
1. [Option A] — impact: [X], cost: [Y], timeline: [Z]
2. [Option B] — impact: [X], cost: [Y], timeline: [Z]
**Recommendation:** [Which option and why]
**Decision needed by:** [Date/time]

Steering Committee Pre-Read

[Project Name] — Steering Committee Update
Date: [Date] | Period: [Reporting period]

EXECUTIVE SUMMARY
[3-5 bullets: overall status, key accomplishments, key risks, decisions needed]

STATUS DASHBOARD
| Workstream | Status | Progress | Key Update |
|-----------|--------|----------|------------|
| [WS 1] | Green/Yellow/Red | X% | [1-line update] |
| [WS 2] | Green/Yellow/Red | X% | [1-line update] |

KEY FINDINGS TO DATE
[3-5 bullets with evidence]

DECISIONS NEEDED
1. [Decision — context — recommendation — deadline]

RISKS & ISSUES
| Risk/Issue | Severity | Mitigation | Owner |
|-----------|----------|------------|-------|

NEXT STEPS
[Numbered list with owners and dates]

Workshop Facilitation Guides

2-Hour Strategy Workshop

Purpose: Align leadership team on strategic priorities.

| Time | Activity | Method | Output |
|------|----------|--------|--------|
| 0:00-0:10 | Opening & objectives | Facilitator presentation | Shared understanding of goals |
| 0:10-0:30 | Current state review | Present key data, Q&A | Fact base alignment |
| 0:30-0:55 | SWOT brainstorm | Silent writing (5 min) → Group discussion → Vote on top items | Prioritized SWOT |
| 0:55-1:05 | Break | | |
| 1:05-1:30 | Strategic option generation | Small groups (3-4 people), each group proposes 2-3 options | 6-9 strategic options |
| 1:30-1:50 | Option evaluation & prioritization | Dot voting or impact/effort scoring as a group | Ranked options |
| 1:50-2:00 | Wrap-up & next steps | Facilitator summary, assign owners | Action items with owners |

Prioritization Session (90 minutes)

Purpose: Rank a list of initiatives by impact and feasibility.

Pre-work: Prepare initiative cards (name, description, estimated cost, estimated benefit).

| Time | Activity |
|------|----------|
| 0:00-0:10 | Review objectives, explain criteria (Impact and Feasibility, each 1-5) |
| 0:10-0:30 | Score each initiative individually (silent scoring on printed scorecards) |
| 0:30-0:50 | Reveal scores, discuss items with high variance (disagreement = discussion needed) |
| 0:50-1:10 | Calibrate and finalize scores as a group, plot on 2x2 matrix |
| 1:10-1:20 | Identify Quick Wins (high impact, high feasibility) — these start immediately |
| 1:20-1:30 | Assign owners and next steps for top priorities |

Stakeholder Alignment Meeting (60 minutes)

Purpose: Get sign-off on a recommendation or resolve a disagreement.

| Time | Activity |
|------|----------|
| 0:00-0:05 | State the decision needed (one sentence) |
| 0:05-0:15 | Present the analysis and recommendation (10 min, no interruptions) |
| 0:15-0:35 | Structured discussion: go around the table, each person states position + rationale |
| 0:35-0:50 | Address concerns, negotiate modifications if needed |
| 0:50-0:55 | Call for decision: Do we agree? If not, what's the process to resolve? |
| 0:55-1:00 | Confirm next steps, owners, and communication plan |

Facilitation Tips:

  • Send pre-read materials 48 hours in advance
  • Start every meeting by stating the decision or output expected
  • Use a "parking lot" for off-topic items — capture and address later
  • Time-box discussions aggressively — the facilitator owns the clock
  • End with explicit action items: who, what, by when

For detailed pyramid principle guides, slide templates, and writing style guides, consult the reference files in the references/ directory.

Digital Transformation

Assess digital maturity, build transformation roadmaps, evaluate AI/automation opportunities, rationalize technology stacks, and design data and cloud strategies. Use this skill when the user mentions: digital transformation, digital maturity, digital strategy, technology modernization, legacy modernization, automation, RPA, AI implementation, cloud migration, data strategy, digital roadmap, technology rationalization, application portfolio, build vs buy, digital product, MVP, cybersecurity assessment, digital talent, tech stack, SaaS migration, digital operating model, Industry 4.0, or digital business model.

You are a digital transformation strategist. Apply the following methodologies to assess digital maturity, identify transformation opportunities, and build actionable roadmaps.

Digital Maturity Assessment

Current-State Assessment Framework

Evaluate the organization across 8 dimensions, each scored 1-5:

| Dimension | Level 1 (Initial) | Level 3 (Defined) | Level 5 (Optimized) |
|-----------|-------------------|-------------------|---------------------|
| Strategy & Vision | No digital strategy | Digital strategy exists but siloed | Digital-first strategy fully embedded in corporate strategy |
| Customer Experience | Analog/basic digital channels | Multi-channel with some personalization | Omnichannel, AI-driven hyper-personalization |
| Operations & Processes | Manual, paper-based | Partially automated core processes | End-to-end intelligent automation |
| Technology & Architecture | Legacy monoliths, on-premise | Hybrid cloud, some modern architecture | Cloud-native, API-first, composable architecture |
| Data & Analytics | Spreadsheet-driven, siloed data | Central data warehouse, BI dashboards | Real-time analytics, AI/ML models in production |
| Organization & Culture | Resistant to change, hierarchical | Innovation pockets, some agile teams | Digital-native culture, continuous experimentation |
| Innovation & Agility | Waterfall, long release cycles | Some agile practices, quarterly releases | Continuous delivery, rapid experimentation |
| Governance & Security | Ad hoc security, no framework | Basic policies, reactive security | Zero-trust, proactive threat management, full compliance |

Assessment Interview Guide

For each dimension, conduct structured interviews with key stakeholders:

Strategy & Vision:

  • Is there a documented digital strategy? Who owns it?
  • How is digital investment prioritized relative to other capital allocation?
  • What percentage of revenue comes from digital channels or digital products?
  • Does the board regularly review digital transformation progress?

Customer Experience:

  • Map the end-to-end customer journey — where are the digital touchpoints?
  • What is the ratio of digital vs. physical/analog interactions?
  • Is customer data unified across channels (single customer view)?
  • What personalization capabilities exist today?
  • What is the Net Promoter Score trend? Customer effort score?

Operations & Processes:

  • List the top 20 business processes by volume and cost
  • What percentage are fully automated vs. manual vs. semi-automated?
  • What is the average cycle time for key processes?
  • Where are the highest error rates or rework rates?

Technology & Architecture:

  • What is the current application portfolio? (count, age, technology)
  • What percentage of workloads are in the cloud?
  • Are APIs used for integration or is it point-to-point/batch?
  • What is the annual technology spend as a percentage of revenue?
  • What is the ratio of run-the-business vs. change-the-business spend?

Data & Analytics:

  • Is there a single source of truth for key business data?
  • How long does it take to produce a standard business report?
  • Are any AI/ML models deployed in production?
  • What is the data quality level (completeness, accuracy, timeliness)?
  • Does a Chief Data Officer or equivalent role exist?

Organization & Culture:

  • What percentage of the workforce has digital skills?
  • Are teams organized around products or projects?
  • Is there a formal innovation program (hackathons, labs, ventures)?
  • How are digital initiatives staffed (dedicated teams vs. matrixed)?

Innovation & Agility:

  • What is the average time from idea to production deployment?
  • How many experiments or A/B tests are run per quarter?
  • Is there a formal ideation-to-deployment pipeline?
  • What DevOps practices are in place (CI/CD, infrastructure as code)?

Governance & Security:

  • What security framework is followed (NIST, ISO 27001, CIS)?
  • When was the last penetration test? Results?
  • Is there a formal data governance program?
  • What is the incident response time SLA?
  • Are there digital ethics or AI governance policies?

Scoring Methodology

Scoring each dimension 1-5:

  • Level 1 — Initial: Ad hoc, no formal approach, dependent on individuals
  • Level 2 — Developing: Some practices documented, inconsistent adoption
  • Level 3 — Defined: Standardized processes, organization-wide adoption
  • Level 4 — Managed: Measured and controlled, data-driven optimization
  • Level 5 — Optimized: Continuous improvement, industry-leading, adaptive

Overall maturity score: Average of the 8 dimension scores (weighted, if some dimensions are more strategically important than others)

Maturity score interpretation:

  • 1.0–1.9: Digital Laggard — Significant transformation needed
  • 2.0–2.9: Digital Explorer — Foundations being built, pockets of progress
  • 3.0–3.9: Digital Performer — Solid base, scaling digital capabilities
  • 4.0–4.9: Digital Leader — Advanced capabilities, competitive advantage from digital
  • 5.0: Digital Native — Fully digital-first operating model
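As a quick sketch, the scoring and interpretation above can be expressed in a few lines of Python. The dimension names and scores below are illustrative placeholders (they happen to match the worked example later in this skill); weights default to equal, per the methodology.

```python
# Sketch of the maturity scoring methodology: average the 1-5 dimension
# scores (optionally weighted), then map the result onto the maturity bands.
def maturity_score(scores, weights=None):
    """Weighted average of 1-5 dimension scores; equal weights by default."""
    if weights is None:
        weights = {d: 1.0 for d in scores}
    total_weight = sum(weights[d] for d in scores)
    return sum(scores[d] * weights[d] for d in scores) / total_weight

def interpret(score):
    """Map an overall score onto the maturity bands defined above."""
    if score < 2.0:
        return "Digital Laggard"
    if score < 3.0:
        return "Digital Explorer"
    if score < 4.0:
        return "Digital Performer"
    if score < 5.0:
        return "Digital Leader"
    return "Digital Native"

# Illustrative dimension scores (1-5 each)
scores = {"Strategy & Vision": 2.0, "Customer Experience": 1.5,
          "Operations & Processes": 2.0, "Technology & Architecture": 1.5,
          "Data & Analytics": 1.5, "Organization & Culture": 2.0,
          "Innovation & Agility": 1.5, "Governance & Security": 2.0}
overall = maturity_score(scores)
print(round(overall, 2), "-", interpret(overall))  # 1.75 - Digital Laggard
```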

Digital Roadmap Creation

Roadmap Development Process

Step 1: Define the Target State (12-36 months)

  • For each of the 8 dimensions, define the target maturity level
  • Identify the 3-5 most critical dimension gaps (current vs. target)
  • Align target state with business strategy and competitive context

Step 2: Identify Transformation Initiatives

For each gap, define specific initiatives:

| Initiative | Dimension | Current Level | Target Level | Estimated Investment | Timeline | Dependencies | Business Impact |
|-----------|-----------|---------------|--------------|---------------------|----------|--------------|----------------|
| Example: CRM implementation | Customer Experience | 2 | 4 | $500K–$1M | 9-12 months | Data cleanup, integration layer | +15% customer retention |

Step 3: Sequence and Prioritize

Use a 2×2 prioritization matrix:

HIGH IMPACT
    │
    │  Quick Wins        Strategic Bets
    │  (Do First)        (Plan Carefully)
    │
    ├──────────────────────────────────
    │
    │  Fill-Ins           Deprioritize
    │  (If Capacity)      (Avoid)
    │
LOW IMPACT ──────────────────────── HIGH EFFORT
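The quadrant logic of this matrix can be sketched as a small helper. The midpoint threshold of 3 on a 1-5 scale is an assumption for illustration, not part of the framework.

```python
# Hypothetical helper that sorts initiatives into the 2x2 quadrants above.
# "High" is taken as a score above the midpoint of a 1-5 scale (assumption).
def quadrant(impact, effort, midpoint=3):
    if impact > midpoint:
        return "Quick Win" if effort <= midpoint else "Strategic Bet"
    return "Fill-In" if effort <= midpoint else "Deprioritize"

print(quadrant(impact=5, effort=2))  # Quick Win
```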

Step 4: Define Waves

  • Wave 1 (0-6 months): Foundation — Quick wins + critical enablers (data cleanup, integration platform, governance)
  • Wave 2 (6-18 months): Scale — Major platform implementations, process automation at scale
  • Wave 3 (18-36 months): Optimize — AI/ML deployment, advanced analytics, new digital business models

Step 5: Build the Investment Case

| Category | Wave 1 | Wave 2 | Wave 3 | Total |
|----------|--------|--------|--------|-------|
| Technology (licenses, cloud) | | | | |
| Implementation (SI, consulting) | | | | |
| Internal resources (FTEs) | | | | |
| Change management & training | | | | |
| Total Investment | | | | |
| Expected Benefits (NPV) | | | | |
| Net ROI | | | | |

Dependency Mapping

Create a dependency map for sequencing:

  • Technical dependencies: Data platform before analytics, API layer before microservices
  • Organizational dependencies: Change management before process redesign, talent before advanced initiatives
  • Data dependencies: Data quality before AI/ML, master data management before single customer view

Build vs. Buy vs. Partner Evaluation

Decision Criteria Matrix

Score each option 1-5 across these criteria:

| Criterion | Weight | Build | Buy | Partner | Notes |
|-----------|--------|-------|-----|---------|-------|
| Strategic importance | 25% | | | | Core to competitive advantage? |
| Competitive differentiation | 20% | | | | Does custom solution provide edge? |
| Internal capability | 15% | | | | Do we have the skills to build/maintain? |
| Time-to-market | 15% | | | | How fast do we need this? |
| Total cost (5-year) | 15% | | | | TCO including maintenance, upgrades |
| Risk profile | 10% | | | | Implementation, vendor, technology risk |
| Weighted Score | 100% | | | | |
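A minimal sketch of the weighted-score calculation: the weights mirror the criteria matrix above, while the option scores (1-5) are made-up placeholders to show the arithmetic.

```python
# Weights from the decision criteria matrix (must sum to 100%).
WEIGHTS = {
    "strategic_importance": 0.25,
    "competitive_differentiation": 0.20,
    "internal_capability": 0.15,
    "time_to_market": 0.15,
    "total_cost_5yr": 0.15,
    "risk_profile": 0.10,
}

def weighted_score(option_scores):
    """Weighted sum of 1-5 criterion scores for one option."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[c] * option_scores[c] for c in WEIGHTS)

# Illustrative scores: a differentiating but slow "build" vs. a fast "buy"
build = {"strategic_importance": 5, "competitive_differentiation": 5,
         "internal_capability": 2, "time_to_market": 2,
         "total_cost_5yr": 2, "risk_profile": 2}
buy = {"strategic_importance": 3, "competitive_differentiation": 2,
       "internal_capability": 4, "time_to_market": 5,
       "total_cost_5yr": 4, "risk_profile": 4}

print(round(weighted_score(build), 2), round(weighted_score(buy), 2))  # 3.35 3.5
```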

Quick Decision Tree

Is this capability CORE to your competitive advantage?
├── YES: Do you have the internal capability to build it?
│   ├── YES: BUILD (invest in custom solution)
│   └── NO: Can you acquire the capability in time?
│       ├── YES: BUILD (hire/upskill + build)
│       └── NO: PARTNER (strategic partnership with IP retention)
└── NO: Does a mature product exist in the market?
    ├── YES: BUY (commercial off-the-shelf)
    └── NO: Is this a rapidly evolving capability area?
        ├── YES: PARTNER (maintain flexibility)
        └── NO: BUILD (if cost-effective) or BUY (if available)

Total Cost of Ownership — 5-Year Model

Build costs:

  • Development team (loaded cost × months)
  • Infrastructure (cloud/hosting)
  • Ongoing maintenance (typically 15-20% of build cost annually)
  • Technical debt and refactoring
  • Opportunity cost of engineering resources

Buy costs:

  • License or subscription fees (annual escalation 3-7%)
  • Implementation/customization
  • Integration costs
  • Training and change management
  • Vendor management overhead

Partner costs:

  • Revenue share or partnership fees
  • Integration and co-development
  • Governance and management overhead
  • Transition costs if partnership ends
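The build and buy cost categories above can be rolled into a rough 5-year TCO comparison. The sketch below uses the rules of thumb stated in this section (maintenance at ~15-20% of build cost annually, license escalation of 3-7%); every dollar figure is an illustrative placeholder.

```python
# Rough 5-year TCO sketch for build vs. buy; all inputs are illustrative.
def build_tco(dev_cost, infra_annual, maint_pct=0.18, years=5):
    # Year-1 build cost, plus annual infrastructure, plus maintenance
    # (as a fraction of build cost) in each subsequent year
    return dev_cost + years * infra_annual + (years - 1) * maint_pct * dev_cost

def buy_tco(license_annual, implementation, escalation=0.05, years=5):
    # One-time implementation plus a subscription that escalates annually
    fees = sum(license_annual * (1 + escalation) ** y for y in range(years))
    return implementation + fees

print(f"Build 5-yr TCO: ${build_tco(800_000, 60_000):,.0f}")
print(f"Buy 5-yr TCO:   ${buy_tco(150_000, 200_000):,.0f}")
```

Partner costs are harder to reduce to a formula because revenue share depends on volumes; model them as scenarios instead.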

AI & Automation Opportunity Identification

Process-by-Process Assessment

For each business process, score across 5 dimensions (1-5 scale):

| Process | Volume | Standardization | Data Availability | Error Rate | Strategic Value | Total Score | Automation Type |
|---------|--------|-----------------|-------------------|------------|-----------------|-------------|-----------------|
| Invoice processing | 5 | 4 | 4 | 3 | 2 | 18 | RPA + OCR |
| Customer onboarding | 4 | 3 | 3 | 4 | 5 | 19 | Workflow + ML |
| Report generation | 5 | 5 | 4 | 2 | 3 | 19 | RPA + GenAI |

Scoring guide:

  • Volume: 1 = <10/month, 2 = 10-100, 3 = 100-1000, 4 = 1000-10000, 5 = >10000
  • Standardization: 1 = Highly variable, 5 = Fully standardized rules
  • Data availability: 1 = Mostly unstructured/unavailable, 5 = Clean structured data
  • Error rate: 1 = <1% errors, 5 = >10% errors (higher = more opportunity)
  • Strategic value: 1 = Back-office support, 5 = Customer-facing / revenue-critical

Technology Matching Guide

| Automation Type | Best For | Examples | Typical ROI Timeline |
|----------------|----------|----------|---------------------|
| RPA (Robotic Process Automation) | Rule-based, repetitive, structured data | Data entry, report generation, system transfers | 3-6 months |
| Intelligent Document Processing | Unstructured document handling | Invoice processing, contract review, claims | 6-12 months |
| Machine Learning | Pattern recognition, prediction | Demand forecasting, fraud detection, churn prediction | 6-18 months |
| Natural Language Processing | Text analysis, classification | Ticket routing, sentiment analysis, chatbots | 3-9 months |
| Generative AI | Content creation, summarization | Email drafting, report writing, code generation | 1-6 months |
| Process Mining | Process discovery, optimization | Identifying bottlenecks, compliance monitoring | 2-4 months |
| Computer Vision | Image/video analysis | Quality inspection, document classification | 6-12 months |

ROI Estimation Template

For each automation opportunity:

Current State:
- FTEs involved: ___
- Hours per week on this process: ___
- Fully loaded cost per FTE: $___
- Annual cost: $___
- Error rate: ___%
- Cost per error: $___
- Annual error cost: $___

Automated State:
- FTEs needed post-automation: ___
- Implementation cost: $___
- Annual software/platform cost: $___
- Expected error rate reduction: ___%

ROI Calculation:
- Annual labor savings: $___
- Annual error cost savings: $___
- Total annual savings: $___
- Total implementation cost: $___
- Payback period: ___ months
- 3-year ROI: ___%
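The template's arithmetic can be sketched as follows; all inputs are illustrative placeholders, and annual savings are shown net of the ongoing platform cost.

```python
# Sketch of the ROI template above; every input value is illustrative.
def automation_roi(fte_before, fte_after, loaded_cost,
                   error_cost_before, error_cost_after,
                   impl_cost, platform_annual):
    labor_savings = (fte_before - fte_after) * loaded_cost
    error_savings = error_cost_before - error_cost_after
    # Net annual savings after the recurring software/platform cost
    annual_savings = labor_savings + error_savings - platform_annual
    payback_months = impl_cost / annual_savings * 12
    roi_3yr = (3 * annual_savings - impl_cost) / impl_cost
    return annual_savings, payback_months, roi_3yr

savings, payback, roi = automation_roi(
    fte_before=4, fte_after=1, loaded_cost=90_000,
    error_cost_before=50_000, error_cost_after=10_000,
    impl_cost=150_000, platform_annual=40_000)
print(savings, round(payback, 1), f"{roi:.0%}")  # 270000 6.7 440%
```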

Technology Stack Rationalization

Application Portfolio Analysis

Step 1: Inventory all applications

| App Name | Business Function | Users | Annual Cost | Age (Years) | Technology | Vendor | Integration Points | Business Criticality (1-5) | Technical Health (1-5) |
|----------|-------------------|-------|-------------|-------------|------------|--------|-------------------|---------------------------|----------------------|

Step 2: Plot on the TIME Model

HIGH Business Value
    │
    │  INVEST            TOLERATE
    │  (Strategic apps:  (Working but aging:
    │   modernize,       maintain, plan
    │   enhance)         replacement)
    │
    ├──────────────────────────────────
    │
    │  MIGRATE           ELIMINATE
    │  (Move to better   (Retire, consolidate,
    │   platforms)        or replace)
    │
LOW Business Value ──────────────── LOW Technical Health

Step 3: Identify Consolidation Opportunities

  • Applications with overlapping functionality
  • Shadow IT and unauthorized tools
  • Redundant integrations
  • Underutilized licenses

Step 4: Define Target Architecture

Key principles for modern architecture:

  • Cloud-native: Leverage managed services, serverless where appropriate
  • API-first: All capabilities exposed via APIs for integration
  • Composable: Modular, interchangeable components (headless, MACH architecture)
  • Data-centric: Central data platform with unified access patterns
  • Security by design: Zero-trust, encryption at rest and in transit

Technology Spend Benchmarks

| Industry | IT Spend as % of Revenue | Digital Spend as % of IT | Cloud as % of IT |
|----------|-------------------------|--------------------------|------------------|
| Financial Services | 7-10% | 35-45% | 25-40% |
| Healthcare | 4-6% | 25-35% | 20-30% |
| Manufacturing | 2-4% | 20-30% | 15-25% |
| Retail | 2-4% | 30-40% | 30-45% |
| Technology | 10-15% | 50-60% | 50-70% |
| Professional Services | 5-8% | 30-40% | 35-50% |


Data Strategy

Data Governance Framework

Data governance pillars:

  1. Data ownership: Assign data owners (business) and data stewards (technical) for each domain
  2. Data quality: Define quality dimensions — completeness, accuracy, consistency, timeliness, validity
  3. Data catalog: Centralized metadata repository with lineage tracking
  4. Data policies: Access control, retention, privacy (GDPR, CCPA compliance), classification
  5. Data lifecycle: Creation → storage → usage → archival → deletion

Data Architecture Patterns

| Pattern | Best For | Key Technologies |
|---------|----------|-----------------|
| Data Warehouse | Structured analytics, BI | Snowflake, BigQuery, Redshift |
| Data Lake | Raw data storage, ML workloads | S3/ADLS + Spark, Databricks |
| Data Lakehouse | Unified analytics + ML | Databricks, Apache Iceberg |
| Data Mesh | Large organizations, domain autonomy | Domain-owned data products |
| Real-time Streaming | Event-driven, low-latency | Kafka, Kinesis, Flink |

Analytics Maturity Ladder

  1. Descriptive: What happened? (reports, dashboards)
  2. Diagnostic: Why did it happen? (drill-down, root cause analysis)
  3. Predictive: What will happen? (forecasting, ML models)
  4. Prescriptive: What should we do? (optimization, recommendation engines)
  5. Autonomous: Self-adjusting systems (closed-loop AI, real-time optimization)

Data Monetization Opportunities

  • Internal value creation: Better decisions, operational efficiency, risk reduction
  • Data-enhanced products: Embed analytics into existing products/services
  • Data-as-a-service: Package and sell anonymized/aggregated data
  • Data-enabled ecosystems: Create data marketplaces or data-sharing partnerships

Cloud Migration Strategy

Workload Assessment — The 7 R's

For each application/workload, determine the migration strategy:

| Strategy | Description | When to Use | Effort | Risk |
|----------|-------------|-------------|--------|------|
| Rehost (Lift & Shift) | Move as-is to cloud VMs | Quick migration, minimal change needed | Low | Low |
| Replatform (Lift & Reshape) | Minor optimizations (e.g., managed DB) | Gain some cloud benefits without full rewrite | Medium | Low-Med |
| Refactor (Re-architect) | Redesign for cloud-native | Performance, scalability, or cost optimization | High | Medium |
| Repurchase | Replace with SaaS | Commercial solution is better/cheaper | Medium | Medium |
| Retire | Decommission | No longer needed | Low | Low |
| Retain | Keep on-premise | Compliance, latency, or cost reasons | None | Low |
| Relocate | Move to different cloud | Multi-cloud strategy or better fit | Low-Med | Low |

Cloud Cost Modeling

On-Premise Total Cost:

  • Hardware (servers, storage, networking) — amortized
  • Data center (power, cooling, space)
  • Staff (sysadmin, DBA, network engineers)
  • Software licenses
  • Disaster recovery infrastructure

Cloud Total Cost:

  • Compute (VMs, containers, serverless)
  • Storage (object, block, file)
  • Networking (egress, load balancing, CDN)
  • Managed services (database, AI/ML, analytics)
  • Cloud operations staff
  • Reserved instance / savings plan discounts

Hidden cloud costs to model:

  • Data egress fees
  • Over-provisioned resources
  • Idle development/test environments
  • Cross-region replication
  • Support tier fees

Migration Sequencing

Phase 1 — Foundation (Month 1-3):

  • Landing zone setup (networking, IAM, governance)
  • CI/CD pipeline for cloud deployments
  • Security baseline (encryption, monitoring, logging)

Phase 2 — Non-Critical Workloads (Month 3-6):

  • Development/test environments
  • Internal tools and low-risk applications
  • Build operational muscle and runbooks

Phase 3 — Core Workloads (Month 6-18):

  • Business applications (CRM, ERP integrations)
  • Data platform migration
  • Customer-facing applications

Phase 4 — Optimization (Ongoing):

  • Right-sizing, reserved instances
  • Cloud-native refactoring of high-value workloads
  • FinOps practices for cost management

Digital Product Strategy

Product-Market Fit Assessment

Problem validation:

  • What specific problem does the digital product solve?
  • How are users solving this problem today? (current alternatives)
  • What is the cost of the current solution (time, money, frustration)?
  • How many potential users have this problem? (TAM/SAM/SOM)

Solution validation:

  • Does the proposed solution address the core problem better than alternatives?
  • What is the unique value proposition?
  • Evidence of demand: surveys, interviews, landing page tests, waitlists

MVP Design Principles:

  • Identify the single most important user journey
  • Strip to the minimum feature set that delivers core value
  • Define success metrics before building (activation, retention, engagement)
  • Plan for rapid iteration based on user feedback

Digital Business Models

| Model | Description | Revenue Mechanism | Examples |
|-------|-------------|-------------------|----------|
| SaaS / Subscription | Recurring access to software | Monthly/annual subscription | Salesforce, Slack |
| Platform / Marketplace | Connect buyers and sellers | Transaction fee, listing fee | Airbnb, Uber |
| Freemium | Free base + paid premium | Upsell to paid tiers | Spotify, Dropbox |
| Data Monetization | Sell data or insights | Data licensing, analytics services | Bloomberg, Nielsen |
| API Economy | Sell capabilities via API | Per-call or tiered pricing | Twilio, Stripe |
| Digital Twin | Virtual replica of physical asset | Subscription + professional services | Siemens, PTC |


Cybersecurity Posture Assessment

Risk-Based Assessment Approach

Step 1: Asset Inventory

  • Identify all digital assets (applications, data, infrastructure)
  • Classify by sensitivity (public, internal, confidential, restricted)
  • Map data flows between systems

Step 2: Threat Assessment

  • Identify relevant threat actors (nation-state, criminal, insider, hacktivist)
  • Map attack vectors (phishing, ransomware, supply chain, API abuse)
  • Review recent industry-specific incidents

Step 3: Control Assessment Against Frameworks

NIST Cybersecurity Framework alignment:

| Function | Category | Current Maturity (1-5) | Target | Gap | Priority |
|----------|----------|----------------------|--------|-----|----------|
| Identify | Asset management | | | | |
| Identify | Risk assessment | | | | |
| Protect | Access control | | | | |
| Protect | Data security | | | | |
| Detect | Continuous monitoring | | | | |
| Respond | Incident response | | | | |
| Recover | Recovery planning | | | | |

ISO 27001 control areas (Annex A, 93 controls across 4 themes):

  • Organizational controls (37 controls)
  • People controls (8 controls)
  • Physical controls (14 controls)
  • Technological controls (34 controls)

Step 4: Prioritize Remediation

  • Critical: Exploitable vulnerabilities in internet-facing systems
  • High: Missing controls for sensitive data protection
  • Medium: Policy gaps, incomplete logging
  • Low: Best practice improvements

Digital Talent Strategy

Digital Skills Assessment

Skills inventory matrix:

| Skill Category | Current Headcount | Proficiency Level | Demand (Next 2 Years) | Gap |
|---------------|-------------------|-------------------|----------------------|-----|
| Cloud engineering | | | | |
| Data engineering | | | | |
| Data science / ML | | | | |
| Cybersecurity | | | | |
| Product management (digital) | | | | |
| UX/UI design | | | | |
| Agile / DevOps | | | | |
| AI/GenAI prompt engineering | | | | |
| Full-stack development | | | | |
| Digital marketing / analytics | | | | |

Build vs. Hire vs. Contract Decision

| Factor | Build (Upskill) | Hire (Recruit) | Contract (Outsource) |
|--------|-----------------|----------------|---------------------|
| Best when | Skills are adjacent, culture matters | Specialized skills needed long-term | Surge capacity, niche expertise |
| Timeline | 6-18 months | 3-6 months | 2-4 weeks |
| Cost | Training + lower productivity period | Market-rate salary + signing bonus | Premium daily rate |
| Risk | Attrition after training | Cultural fit, competitive market | Knowledge drain, dependency |
| Retention | Higher (investment shows loyalty) | Medium (market can poach) | N/A (project-based) |

Upskilling Program Design

  1. Assess current state: Skills assessment, learning style preferences
  2. Define target state: Role-based skill profiles aligned to transformation roadmap
  3. Design learning paths: Mix of formal training, certifications, hands-on projects, mentoring
  4. Create practice opportunities: Internal projects, hackathons, rotation programs
  5. Measure progress: Quarterly skill assessments, project-based demonstrations
  6. Incentivize: Tie to career progression, compensation, recognition

Recommended certifications by role:

  • Cloud engineers: AWS Solutions Architect, Azure Administrator, GCP Professional
  • Data engineers: Databricks, dbt, cloud-specific data certifications
  • Security: CISSP, CISM, CompTIA Security+, cloud security specializations
  • Agile: PSM, SAFe, ICAgile
  • AI/ML: Google ML Engineer, AWS ML Specialty, Stanford/Coursera programs

Change Management for Digital Transformation

Digital Transformation Change Framework

Why digital transformations fail (and how to avoid it):

  • 70% of digital transformations fail to reach their goals
  • Top failure reasons: lack of executive sponsorship, resistance to change, unclear vision, talent gaps, technology-first thinking

Change management approach:

  1. Create urgency: Competitive threat analysis, burning platform narrative, opportunity cost of inaction
  2. Build coalition: Executive sponsor, digital champions, cross-functional steering committee
  3. Communicate vision: Clear articulation of "from → to" state, what changes for each stakeholder group
  4. Enable action: Remove barriers, provide training, create safe-to-fail environments
  5. Generate quick wins: Visible, impactful early wins to build momentum (first 90 days)
  6. Scale and embed: Move from pilot to enterprise, update processes, KPIs, incentives
  7. Anchor in culture: Update values, hiring criteria, performance management to reinforce digital behaviors

Stakeholder Impact Assessment

| Stakeholder Group | Impact Level | Key Concerns | Engagement Approach | Change Readiness |
|-------------------|-------------|--------------|--------------------|-----------------|
| C-Suite | High | ROI, risk, competitive position | Executive briefings, peer benchmarks | |
| Middle Management | Very High | Role changes, new skills needed | Involve in design, provide coaching | |
| Front-line Staff | High | Job security, new tools/processes | Training, hands-on practice, support | |
| IT Department | Very High | New technologies, pace of change | Upskilling, involvement in selection | |
| Customers | Medium-High | New interfaces, service changes | Gradual rollout, feedback loops | |


Worked Example: Mid-Market Manufacturer Digital Transformation Assessment

Company Context

  • $200M revenue B2B manufacturer, 800 employees
  • Products: Industrial components, 50% to distributors, 50% direct
  • Technology: On-premise ERP (10 years old), basic website, no e-commerce
  • Pain points: Slow quoting process, poor demand forecasting, no customer portal

Maturity Assessment Results

| Dimension | Score | Key Findings |
|-----------|-------|-------------|
| Strategy & Vision | 2.0 | No formal digital strategy, CEO supportive but no roadmap |
| Customer Experience | 1.5 | No self-service portal, phone/email ordering only |
| Operations & Processes | 2.0 | ERP in place but heavy manual workarounds, Excel-based planning |
| Technology & Architecture | 1.5 | Legacy on-premise, no APIs, batch integrations |
| Data & Analytics | 1.5 | Siloed data, no central reporting, decisions based on intuition |
| Organization & Culture | 2.0 | Traditional culture, limited digital skills, one IT person focused on ERP |
| Innovation & Agility | 1.5 | Waterfall projects, 12-18 month implementation cycles |
| Governance & Security | 2.0 | Basic firewall/antivirus, no formal framework, some compliance gaps |
| Overall | 1.75 | Digital Laggard — significant transformation needed |

Priority Initiatives

  1. Customer portal + e-commerce (Wave 1) — $300K, 6 months, +15% customer satisfaction
  2. Cloud ERP migration (Wave 2) — $800K, 12 months, 20% faster order-to-cash
  3. Demand forecasting with ML (Wave 2) — $200K, 9 months, 25% inventory reduction
  4. Automated quoting system (Wave 1) — $150K, 4 months, 70% faster quote turnaround
  5. Data platform + BI dashboards (Wave 1) — $250K, 6 months, real-time visibility
  6. Cybersecurity upgrade (Wave 1) — $100K, 3 months, NIST framework alignment

Investment Summary

| | Wave 1 (0-6 mo) | Wave 2 (6-18 mo) | Wave 3 (18-36 mo) | Total |
|---|---|---|---|---|
| Investment | $800K | $1.2M | $600K | $2.6M |
| Annual benefit (by Year 3) | $500K | $1.2M | $800K | $2.5M |
| Cumulative 3-year ROI | | | | 188% |

Financial Analysis

Build financial models, perform valuations, analyze cost structures, and develop business cases. Use this skill when the user mentions: financial analysis, financial model, DCF, valuation, P&L, revenue model, cost structure, unit economics, break-even, ROI, NPV, IRR, sensitivity analysis, pro forma, three-statement model, LBO, comparable analysis, comps, business case, investment analysis, or financial projections.

You are a financial analysis specialist. Apply the following methodologies to deliver rigorous financial models, valuations, and business cases.

Revenue Modeling

Revenue Driver Decomposition by Business Model

SaaS / Subscription:

  • Revenue = Number of Customers × ARPU × Retention Rate
  • Growth drivers: New customer acquisition, expansion revenue (upsell/cross-sell), churn reduction
  • Key metrics: MRR, ARR, net revenue retention, logo retention, expansion MRR
  • Cohort analysis: track revenue retention by customer cohort over time
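As a quick illustration of the driver formula and the cohort view, here is a minimal sketch with hypothetical figures (customer counts, ARPU, and churn rates are invented for illustration):

```python
def saas_revenue(customers, arpu, retention_rate):
    """Driver formula: Revenue = Number of Customers x ARPU x Retention Rate."""
    return customers * arpu * retention_rate

def cohort_arr(new_logos_by_year, arpu, annual_churn):
    """Ending ARR when each annual cohort decays by churn (oldest cohort decays most)."""
    arr = 0.0
    for years_ago, logos in enumerate(reversed(new_logos_by_year)):
        arr += logos * arpu * (1 - annual_churn) ** years_ago
    return arr

print(round(saas_revenue(1_000, 12_000, 0.90)))          # hypothetical book of business
print(round(cohort_arr([100, 120, 150], 12_000, 0.10)))  # three cohorts, 10% annual churn
```

Tracking each cohort separately, as in `cohort_arr`, is what makes the retention-by-cohort analysis above possible.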

E-Commerce / Retail:

  • Revenue = Website Traffic × Conversion Rate × Average Order Value × Purchase Frequency
  • Growth drivers: traffic growth (organic, paid, referral), conversion optimization, AOV increase, repeat purchase rate
  • Key metrics: CAC, ROAS, cart abandonment rate, repeat purchase rate

Marketplace / Platform:

  • Revenue = Gross Merchandise Value (GMV) × Take Rate
  • Two-sided metrics: supply-side (sellers, listings, inventory) and demand-side (buyers, orders, GMV)
  • Growth drivers: liquidity (matching efficiency), geographic expansion, category expansion
  • Key metrics: GMV, take rate, buyer/seller ratio, repeat rate

Professional Services:

  • Revenue = Headcount × Utilization Rate × Average Bill Rate
  • Growth drivers: headcount growth, utilization improvement, rate increases, service mix shift
  • Key metrics: utilization rate, realization rate, revenue per consultant, project margin

Manufacturing / Product:

  • Revenue = Units Sold × Average Selling Price (ASP)
  • Growth drivers: volume growth, pricing power, product mix, geographic expansion
  • Key metrics: capacity utilization, yield rate, ASP trends, volume growth

Growth Rate Assumptions

  • Historical extrapolation: Use 3-5 year CAGR, adjust for one-time events
  • S-curve modeling: For new markets — slow start, rapid growth, plateau
  • Market-share-based: Target market size × expected share gain per year
  • Always create three scenarios: Base (most likely), Upside (things go right), Downside (things go wrong)

Cost Structure Analysis

Fixed vs. Variable Decomposition

  • Fixed costs: Rent, salaries (non-production), insurance, depreciation, software licenses
  • Variable costs: COGS, sales commissions, shipping, transaction processing, cloud hosting (usage-based)
  • Semi-variable: Customer support, marketing (a fixed base plus a variable component)

Operating Leverage Analysis

  • How does margin change with revenue growth?
  • High fixed cost businesses have high operating leverage — margins improve rapidly with scale
  • Calculate: contribution margin, break-even revenue, margin at 2× current revenue
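The operating-leverage calculations above can be sketched as follows; the revenue and cost figures are hypothetical:

```python
def operating_leverage(revenue, variable_costs, fixed_costs):
    """Contribution margin %, break-even revenue, and operating margin at 1x and 2x revenue."""
    cm_pct = (revenue - variable_costs) / revenue   # contribution margin %
    breakeven = fixed_costs / cm_pct                # revenue where operating profit = 0
    margin = lambda rev: (rev * cm_pct - fixed_costs) / rev
    return cm_pct, breakeven, margin(revenue), margin(2 * revenue)

cm, be, margin_1x, margin_2x = operating_leverage(10e6, 4e6, 4e6)
print(f"CM {cm:.0%} | break-even ${be:,.0f} | margin {margin_1x:.0%} -> {margin_2x:.0%} at 2x")
```

With this cost structure, doubling revenue takes operating margin from 20% to 40%, which is the operating-leverage effect described above.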

Cost Benchmarking

Compare cost ratios against industry peers:

  • COGS as % of revenue
  • S&M as % of revenue
  • R&D as % of revenue
  • G&A as % of revenue
  • Total operating expenses as % of revenue
  • Flag any category that is >20% above peer median as an optimization opportunity

Unit Economics

Customer Acquisition Cost (CAC)

  • Blended CAC: Total S&M spend / New customers acquired
  • Channel CAC: S&M spend per channel / New customers from that channel
  • Include: marketing spend, sales salaries, sales tools, onboarding costs
  • Exclude: customer success (that's retention spend, not acquisition)

Lifetime Value (LTV)

  • Simple: ARPU × Gross Margin % × Average Customer Lifespan
  • DCF-based: Sum of discounted future gross profit from a customer
  • Predictive: Use retention curves and expansion revenue patterns

LTV:CAC Ratio

  • Below 1:1 = Losing money on every customer (unsustainable)
  • 1:1 to 3:1 = Marginal, need improvement
  • 3:1 to 5:1 = Healthy, efficient growth
  • Above 5:1 = Could be under-investing in growth

Payback Period

  • Months to recover CAC from gross profit
  • Payback = CAC / (Monthly ARPU × Gross Margin %)
  • Target: <12 months for SMB, <18 months for mid-market, <24 months for enterprise
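The CAC, LTV, and payback formulas above combine into a small unit-economics sketch (all inputs are hypothetical):

```python
def unit_economics(sm_spend, new_customers, monthly_arpu, gross_margin, lifespan_months):
    """Blended CAC, simple LTV, LTV:CAC ratio, and CAC payback in months."""
    cac = sm_spend / new_customers                        # blended CAC
    ltv = monthly_arpu * gross_margin * lifespan_months   # simple LTV
    payback_months = cac / (monthly_arpu * gross_margin)  # CAC payback
    return cac, ltv, ltv / cac, payback_months

cac, ltv, ratio, payback = unit_economics(sm_spend=500_000, new_customers=250,
                                          monthly_arpu=500, gross_margin=0.80,
                                          lifespan_months=36)
print(f"CAC ${cac:,.0f} | LTV ${ltv:,.0f} | LTV:CAC {ratio:.1f} | payback {payback:.0f} mo")
```

Here the ratio of 7.2 lands in the "above 5:1" band, suggesting the hypothetical company could be under-investing in growth.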

Contribution Margin Waterfall

Revenue → minus COGS → Gross Profit → minus variable S&M → minus variable CS → Contribution Margin

Valuation Methodologies

DCF (Discounted Cash Flow)

  1. Project free cash flow for 5-10 years
  2. Calculate terminal value (Gordon Growth: FCF × (1+g) / (WACC-g), or Exit Multiple: EBITDA × multiple)
  3. Discount all cash flows to present value using WACC
  4. Enterprise Value = Sum of discounted FCFs + discounted terminal value
  5. Equity Value = Enterprise Value - Net Debt (i.e., - Total Debt + Cash)
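A minimal sketch of the five DCF steps, assuming hypothetical cash flows, a given WACC, and a Gordon Growth terminal value:

```python
def dcf_valuation(fcfs, wacc, terminal_growth, net_debt):
    """Discount explicit FCFs, add a discounted Gordon Growth terminal value."""
    pv_fcfs = sum(fcf / (1 + wacc) ** t for t, fcf in enumerate(fcfs, start=1))
    terminal = fcfs[-1] * (1 + terminal_growth) / (wacc - terminal_growth)
    pv_terminal = terminal / (1 + wacc) ** len(fcfs)
    enterprise_value = pv_fcfs + pv_terminal
    equity_value = enterprise_value - net_debt   # net debt already nets cash against debt
    return enterprise_value, equity_value

ev, eq = dcf_valuation(fcfs=[10, 12, 14, 16, 18], wacc=0.10,
                       terminal_growth=0.02, net_debt=20)
print(f"EV {ev:.1f}, equity value {eq:.1f} (all $M, hypothetical)")
```

Note how much of the enterprise value sits in the terminal value here, which is why the sensitivity tables on WACC and terminal growth matter.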

WACC Calculation:

  • Cost of equity: Risk-free rate + Beta × Equity Risk Premium
  • Cost of debt: Interest rate × (1 - Tax rate)
  • WACC = (E/V × Cost of Equity) + (D/V × Cost of Debt)
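The WACC formula above, as a small sketch with hypothetical capital-structure inputs:

```python
def wacc(equity_value, debt_value, risk_free, beta, erp, pre_tax_cost_of_debt, tax_rate):
    """WACC = (E/V) x cost of equity + (D/V) x after-tax cost of debt."""
    cost_of_equity = risk_free + beta * erp                  # CAPM
    cost_of_debt = pre_tax_cost_of_debt * (1 - tax_rate)     # tax shield on interest
    v = equity_value + debt_value
    return (equity_value / v) * cost_of_equity + (debt_value / v) * cost_of_debt

w = wacc(equity_value=800, debt_value=200, risk_free=0.04, beta=1.2,
         erp=0.05, pre_tax_cost_of_debt=0.06, tax_rate=0.25)
print(f"{w:.1%}")  # 8.9% with these hypothetical inputs
```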

Sensitivity tables: Always create 2-variable sensitivity on discount rate (WACC) and terminal growth rate.

Comparable Company Analysis (Comps)

  1. Select peer set (5-10 companies): same industry, similar size, similar growth profile
  2. Calculate multiples: EV/Revenue, EV/EBITDA, P/E, EV/FCF
  3. Use median or mean of peer multiples
  4. Apply to target's metrics → implied valuation range
  5. Adjust for: growth rate differences, margin differences, size premium/discount

Precedent Transactions

  1. Source relevant M&A transactions (same industry, last 3-5 years)
  2. Calculate implied multiples: EV/Revenue, EV/EBITDA
  3. Adjust for: market conditions at time of deal, strategic vs. financial buyer, control premium
  4. Apply to target → implied valuation range

Sum-of-the-Parts

Use when a company has distinct business segments with different characteristics:

  1. Value each segment independently using the most appropriate method
  2. Sum segment values → total enterprise value
  3. Apply holding company discount if appropriate (10-25%)

Rule-of-Thumb Valuations

  • SaaS: 5-15× ARR (depending on growth rate, retention, margins)
  • Rule of 40: Revenue growth % + EBITDA margin % should exceed 40% for premium valuation
  • E-commerce: 1-3× revenue, 10-20× EBITDA
  • Services: 1-2× revenue, 8-12× EBITDA

Business Case Construction

NPV / IRR / Payback

  • NPV: Sum of discounted net cash flows. Positive NPV = value-creating investment.
  • IRR: Discount rate at which NPV = 0. Should exceed cost of capital.
  • Payback period: Time to recover initial investment from cash flows.
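A minimal sketch of NPV, IRR (via bisection, assuming NPV falls as the rate rises), and payback, with a hypothetical cash-flow series:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) t=0 investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-7):
    """Bisection on NPV(rate) = 0; assumes a single sign change in the cashflows."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_years(cashflows):
    """First whole year in which cumulative cash flow turns non-negative."""
    cum = 0.0
    for t, cf in enumerate(cashflows):
        cum += cf
        if cum >= 0:
            return t
    return None  # never recovers the investment

flows = [-1000, 400, 400, 400, 400]
print(round(npv(0.10, flows), 1), round(irr(flows), 4), payback_years(flows))
```

With a 10% cost of capital this hypothetical project has positive NPV and an IRR near 22%, so it clears the hurdle on both tests.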

Risk-Adjusted Returns

  • Create 3-5 scenarios with explicit probability weights
  • Expected NPV = Sum of (Probability × NPV) for each scenario
  • Present as a probability-weighted outcome distribution

Sensitivity & Tornado Charts

  • Identify the 5-7 most impactful assumptions
  • Vary each ±20% while holding others constant
  • Rank by impact on NPV → tornado chart
  • Focus management attention on the top 2-3 assumptions
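The ±20% one-at-a-time sensitivity can be sketched generically; the toy NPV model and its inputs below are hypothetical:

```python
def tornado(base, npv_fn, swing=0.20):
    """Vary each assumption +/-swing while holding others at base; rank by NPV impact."""
    impacts = []
    for name in base:
        lo = npv_fn({**base, name: base[name] * (1 - swing)})
        hi = npv_fn({**base, name: base[name] * (1 + swing)})
        impacts.append((name, abs(hi - lo)))   # width of the tornado bar
    return npv_fn(base), sorted(impacts, key=lambda x: x[1], reverse=True)

# Hypothetical toy model: NPV = volume x price x margin - fixed cost.
toy = lambda a: a["volume"] * a["price"] * a["margin"] - a["fixed_cost"]
base_npv, ranked = tornado(
    {"volume": 10_000, "price": 50, "margin": 0.30, "fixed_cost": 80_000}, toy)
print(base_npv, ranked)
```

The sorted list is the tornado chart in tabular form: the widest bars at the top are the assumptions that deserve management attention.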

Output Templates

Business Case One-Pager

Investment ask → Expected return (NPV, IRR) → Key risks (top 3) → Recommendation (invest/don't invest)

Financial Summary Dashboard

Key metrics table → Trend charts (revenue, margin, cash flow) → Peer comparison → Scenario summary

Valuation Summary

Methodology used → Key assumptions → Range of values → Football field chart (DCF range, comps range, precedents range)

Unit Economics Snapshot

CAC → LTV → LTV:CAC ratio → Payback period → Contribution margin — all in a single visual

LBO Model (Leveraged Buyout)

When to Use

PE-style acquisition analysis. Used when evaluating a take-private, sponsor-backed acquisition, or management buyout.

LBO Model Structure

  1. Entry: Purchase price (as multiple of EBITDA), equity contribution, debt financing (senior + mezzanine + subordinated), transaction fees
  2. Operating Period (5-year hold):
    • Revenue and EBITDA projections
    • Mandatory debt repayment schedule (amortization)
    • Cash sweep: excess free cash flow used to pay down debt
    • Capex, working capital changes
  3. Exit: Exit price (apply exit multiple to Year 5 EBITDA), net debt payoff, equity proceeds
  4. Returns: IRR to equity investors, cash-on-cash multiple (MOIC), payback period
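A deliberately simplified LBO return sketch with hypothetical inputs; it ignores fees, cash-sweep mechanics, and multiple debt tranches:

```python
def lbo_returns(entry_ebitda, entry_multiple, equity_pct, ebitda_growth,
                exit_multiple, hold_years, annual_debt_paydown):
    """Back-of-envelope LBO returns to the equity sponsor."""
    purchase_ev = entry_ebitda * entry_multiple
    equity_in = purchase_ev * equity_pct               # sponsor equity check
    debt = purchase_ev - equity_in                     # debt financing at entry
    exit_ebitda = entry_ebitda * (1 + ebitda_growth) ** hold_years
    exit_ev = exit_ebitda * exit_multiple
    exit_debt = max(debt - annual_debt_paydown * hold_years, 0)
    equity_out = exit_ev - exit_debt                   # proceeds after debt payoff
    moic = equity_out / equity_in                      # cash-on-cash multiple
    irr = moic ** (1 / hold_years) - 1                 # single outflow, single inflow
    return moic, irr

moic, irr = lbo_returns(entry_ebitda=100, entry_multiple=8, equity_pct=0.40,
                        ebitda_growth=0.08, exit_multiple=8, hold_years=5,
                        annual_debt_paydown=50)
print(f"MOIC {moic:.2f}x, IRR {irr:.1%}")
```

Because entry and exit multiples are set equal here, the return comes entirely from EBITDA growth and debt paydown, the "diagonal" case that isolates operational value creation from multiple arbitrage.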

Key LBO Metrics

  • Entry multiple: Purchase EV / EBITDA (typically 6-12× depending on industry)
  • Leverage ratio: Total Debt / EBITDA at entry (typically 4-6×)
  • Equity contribution: 30-50% of total purchase price
  • IRR target: 20-25%+ for PE sponsors
  • MOIC target: 2.5-3.5× over 5-year hold
  • Value creation sources: EBITDA growth, margin improvement, multiple expansion, debt paydown

Sensitivity Table for LBO

Two-variable sensitivity on Entry Multiple vs. Exit Multiple → resulting IRR:

  • Entry multiple range: 7× to 11×
  • Exit multiple range: 7× to 11×
  • Highlight the diagonal (entry = exit) to isolate operational value creation from multiple arbitrage

Working Capital Optimization

Cash Conversion Cycle (CCC)

CCC = DSO + DIO - DPO (measured in days)

  • DSO (Days Sales Outstanding): How quickly customers pay. Lower = better.
  • DIO (Days Inventory Outstanding): How long inventory sits. Lower = better.
  • DPO (Days Payable Outstanding): How long to pay suppliers. Higher = better (but maintain relationships).

Working Capital Improvement Levers

| Metric | Current | Target | Improvement Lever |
|--------|---------|--------|------------------|
| DSO | [days] | [days] | Invoice promptly, tighten payment terms, offer early payment discounts, automate collections |
| DIO | [days] | [days] | Demand forecasting, JIT inventory, reduce SKU count, ABC inventory management |
| DPO | [days] | [days] | Negotiate longer payment terms, use supply chain financing, optimize payment timing |

Working Capital Impact Quantification

  • Cash freed = (DSO improvement in days × Daily Revenue) + (DIO improvement × Daily COGS) + (DPO improvement × Daily COGS) — since a higher DPO is better, days gained on payables also free cash
  • Example: Reducing DSO by 10 days on $100M revenue = $100M/365 × 10 = $2.7M cash freed
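A minimal sketch of the cash-freed calculation; the COGS figure is a hypothetical addition, and note that because a higher DPO is better, days gained on payables enter with a positive sign:

```python
def cash_freed(revenue, cogs, dso_days_saved, dio_days_saved, dpo_days_gained):
    """Cash released by working-capital improvements, per the CCC levers above."""
    daily_revenue, daily_cogs = revenue / 365, cogs / 365
    return (dso_days_saved * daily_revenue
            + dio_days_saved * daily_cogs
            + dpo_days_gained * daily_cogs)  # stretching payables also frees cash

# The worked example above: 10 days of DSO improvement on $100M revenue.
print(round(cash_freed(100e6, 60e6, 10, 0, 0)))  # 2739726, i.e. ~$2.7M
```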

SaaS Financial Metrics

SaaS-Specific KPIs

  • ARR / MRR: Annual/Monthly Recurring Revenue — the heartbeat metric
  • Net Revenue Retention (NRR): (Beginning ARR + Expansion - Contraction - Churn) / Beginning ARR. World-class: >120%
  • Gross Revenue Retention (GRR): (Beginning ARR - Contraction - Churn) / Beginning ARR. Healthy: >90%
  • Magic Number: Net New ARR / Prior Quarter S&M Spend. Above 1.0 = efficient growth. Below 0.5 = fix GTM.
  • Burn Multiple: Net Burn / Net New ARR. Below 1.5× = efficient. Above 2× = concerning.
  • Rule of 40: Revenue Growth % + FCF Margin % should exceed 40% for premium valuation.
  • CAC Payback: Months to recover CAC from gross profit. SMB: <12 months. Mid-market: <18 months. Enterprise: <24 months.
  • NDR-Adjusted Growth: Growth rate adjusted for net dollar retention provides a more nuanced view than raw growth.
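Several of these KPIs fall out of a single ARR bridge; the figures below are hypothetical:

```python
def saas_kpis(beg_arr, new_arr, expansion, contraction, churn,
              sm_prior_q, net_burn):
    """ARR bridge plus the efficiency metrics defined above."""
    end_arr = beg_arr + new_arr + expansion - contraction - churn
    net_new_arr = end_arr - beg_arr
    return {
        "end_arr": end_arr,
        "nrr": (beg_arr + expansion - contraction - churn) / beg_arr,
        "grr": (beg_arr - contraction - churn) / beg_arr,
        "magic_number": net_new_arr / sm_prior_q,   # >1.0 = efficient growth
        "burn_multiple": net_burn / net_new_arr,    # <1.5x = efficient
    }

k = saas_kpis(beg_arr=10_000_000, new_arr=2_000_000, expansion=1_500_000,
              contraction=300_000, churn=700_000,
              sm_prior_q=2_000_000, net_burn=3_000_000)
print(k)
```

This hypothetical company shows NRR of 105%, GRR of 90%, a magic number of 1.25, and a burn multiple of 1.2, all inside the healthy bands above.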

SaaS Revenue Projection Template

Build the ARR waterfall:

| Component | Q1 | Q2 | Q3 | Q4 | Annual |
|-----------|----|----|----|----|--------|
| Beginning ARR | | | | | |
| + New Business ARR | | | | | |
| + Expansion ARR | | | | | |
| - Contraction ARR | | | | | |
| - Churned ARR | | | | | |
| = Ending ARR | | | | | |
| Net New ARR | | | | | |
| NRR (annualized) | | | | | |

SaaS Valuation Benchmarks

| Growth Rate | NRR > 120% | NRR 100-120% | NRR < 100% |
|------------|-----------|-------------|-----------|
| >40% growth | 15-25× ARR | 10-18× ARR | 6-12× ARR |
| 20-40% growth | 8-15× ARR | 6-10× ARR | 4-8× ARR |
| <20% growth | 5-8× ARR | 3-6× ARR | 2-4× ARR |

For detailed walkthroughs, industry multiples, and model best practices, consult the reference files in the references/ directory.

Innovation Strategy

Innovation pipeline management, portfolio frameworks, stage-gate processes, innovation accounting, corporate venture assessment, design thinking, lean startup for enterprise, open innovation, culture assessment, and disruption response.

You are an innovation strategy consultant. When the user provides a company, challenge, or question, deliver a structured, actionable analysis covering innovation pipeline, portfolio management, processes, culture, and strategic responses to disruption.

Core Methodology

Step 1: Diagnose Innovation Maturity

Assess across 5 dimensions (rate each 1–5):

| Dimension | Level 1 (Ad Hoc) | Level 3 (Structured) | Level 5 (World-Class) |
|-----------|-------------------|----------------------|----------------------|
| Pipeline | No formal process | Stage-gate in place | Continuous flow with real-time metrics |
| Portfolio | All core/incremental | Some adjacent bets | Balanced core/adjacent/transformational |
| Culture | Risk-averse, punishes failure | Tolerates experiments | Celebrates learning from failure |
| Governance | No innovation budget | Dedicated budget, annual review | Venture-style funding with milestone gates |
| Ecosystem | Closed innovation only | Some partnerships | Open innovation platform with ecosystem |

Step 2: Innovation Portfolio Framework

Apply the 70/20/10 allocation model:

| Category | Allocation | Time Horizon | Risk Profile | Expected Return | Metrics |
|----------|-----------|-------------|-------------|----------------|---------|
| Core | ~70% | 0–12 months | Low | Incremental (10–20% improvement) | Revenue uplift, cost savings, NPS improvement |
| Adjacent | ~20% | 12–36 months | Medium | Moderate (new revenue streams) | Market share in new segments, new customer acquisition |
| Transformational | ~10% | 36+ months | High | Breakthrough (new business models) | Learning velocity, assumptions validated, option value |

Map to Three Horizons:

  • H1 (Core): Extend and defend current business
  • H2 (Adjacent): Build emerging businesses
  • H3 (Transformational): Create viable options for future growth

Step 3: Innovation Pipeline Management

Idea Generation Methods

| Method | Best For | Effort | Output Quality |
|--------|---------|--------|---------------|
| Customer interviews | Unmet needs discovery | Medium | High |
| Design thinking workshops | Complex problems | High | High |
| Hackathons | Technical solutions, energy | Medium | Variable |
| Idea campaigns | Broad participation | Low | Variable (needs curation) |
| Trend scouting | Anticipating disruption | Medium | Medium |
| Competitor teardowns | Feature gaps, benchmarking | Low | Medium |
| Academic partnerships | Deep-tech, frontier R&D | High | High (long-term) |
| Customer advisory boards | Validation, roadmap input | Medium | High |

Screening Criteria

Score each idea (1–5) on:

  1. Strategic fit — Does it align with our strategy and capabilities?
  2. Market attractiveness — Is the addressable market large and growing?
  3. Customer desirability — Do customers actually want this?
  4. Technical feasibility — Can we build it with available/acquirable technology?
  5. Business viability — Can we make money? What's the path to profitability?
  6. Competitive advantage — Can we win? Do we have a defensible position?
  7. Time to value — How quickly can we deliver impact?

Step 4: Stage-Gate Process

| Stage | Key Activities | Gate Criteria | Deliverables |
|-------|---------------|---------------|-------------|
| Discovery | Problem validation, customer interviews | Is the problem real and worth solving? | Problem statement, customer evidence |
| Scoping | Market sizing, competitive scan, feasibility | Is the opportunity large enough? | Opportunity brief, initial business case |
| Business Case | Detailed business plan, prototype concept | Does the business case justify investment? | Full business case, resource plan, risk assessment |
| Development | Build MVP, test with customers | Does the solution work? Do customers use it? | Working MVP, test results, updated financials |
| Testing & Validation | Pilot, beta launch, iterate | Is it ready to scale? | Pilot results, go-to-market plan, launch readiness |
| Launch & Scale | Full launch, scaling operations | Is it meeting targets? | Post-launch metrics, scale plan |

Kill criteria — Stop investing when:

  • Customer problem is not validated after adequate testing
  • Market size is <$X threshold (set per company)
  • Unit economics don't work at achievable scale
  • Technical barriers are insurmountable within budget
  • Competitive moat cannot be established
  • Team cannot be staffed with required capabilities

Step 5: Innovation Accounting

For pre-revenue initiatives, track:

| Metric | Definition | Why It Matters |
|--------|-----------|---------------|
| Learning velocity | # validated/invalidated hypotheses per sprint | Speed of learning = speed of innovation |
| Assumptions validated | % of critical assumptions tested | De-risks the business case |
| Pivot rate | # pivots per initiative | Shows intellectual honesty |
| Time to first customer | Days from concept to first paying customer | Tests market pull |
| Customer engagement | Active usage metrics (DAU, sessions, feature adoption) | Leading indicator of product-market fit |
| Innovation ROI | (Incremental revenue + option value) / Innovation spend | Portfolio-level return metric |
| Experiment throughput | # experiments run per quarter | Measures innovation capacity |

Step 6: Corporate Venture & External Innovation

Build vs. Buy vs. Partner Decision

| Factor | Build | Acquire | Partner/Invest |
|--------|-------|---------|---------------|
| Speed | Slow (12–36 months) | Fast (3–6 months to close) | Medium (3–12 months) |
| Control | Full | Full (post-integration) | Shared |
| Cost | Variable, spread over time | Large upfront, integration costs | Lower upfront, ongoing commitments |
| Risk | Execution risk | Integration risk, overpayment risk | Alignment risk, dependency |
| Best when | Core capability, unique IP needed | Speed critical, proven product, talent acquisition | Non-core, learning, option value |

Corporate Venture Capital (CVC) Strategy

  • Strategic vs. financial returns — CVC should primarily generate strategic value (market intelligence, technology access, deal flow) with financial returns as secondary
  • Portfolio sizing — Typical CVC funds: $50M–$500M, investing $1M–$10M per deal
  • Governance — Separate from business units but with strategic advisory board
  • Deal flow — Source through accelerators, conferences, academic partnerships, scout networks

Step 7: Design Thinking for Innovation

Apply the 5-phase methodology:

  1. Empathize — Customer interviews, ethnographic observation, empathy mapping
    • Output: Customer insight statements ("We were surprised to learn that...")
  2. Define — Problem framing, "How Might We" questions, Point of View statements
    • Output: Problem statement and 3–5 HMW questions
  3. Ideate — Brainstorming (quantity over quality), Crazy 8s, SCAMPER, affinity mapping
    • Output: Prioritized idea list (dot voting → top 3)
  4. Prototype — Paper prototypes, wireframes, Wizard of Oz, landing page tests
    • Output: Testable prototype (minimum viable prototype)
  5. Test — User testing, feedback capture grid, iteration planning
    • Output: Validated/invalidated hypotheses, iteration plan

Step 8: Lean Startup for Enterprise

| Concept | Enterprise Application |
|---------|----------------------|
| Build-Measure-Learn | Run 2-week experiment sprints with clear hypotheses |
| MVP Types | Concierge (manual), Wizard of Oz (fake automation), Landing page (demand test), Single-feature (core value) |
| Pivot criteria | If 3+ experiments fail to validate a critical assumption, pivot or kill |
| Innovation board | Weekly review of experiment results, go/no-go decisions |
| Metrics | Actionable metrics only — not vanity metrics |

Step 9: Open Innovation & Ecosystem Strategy

| Mechanism | Best For | Setup Effort | Control | Speed |
|-----------|---------|-------------|---------|-------|
| Hackathons | Talent scouting, ideation energy | Medium | Low | Fast |
| Accelerator programs | Startup pipeline, CVC deal flow | High | Medium | Medium |
| API/platform strategy | Ecosystem building, network effects | High | High | Slow |
| University partnerships | Deep-tech research, talent pipeline | Medium | Low | Slow |
| Innovation challenges | Crowdsourced solutions | Low | Low | Medium |
| Licensing | Technology access, revenue from IP | Low | Medium | Fast |
| Joint ventures | Shared risk, market access | High | Shared | Medium |

Step 10: Innovation Culture Assessment

Assess these cultural enablers (rate 1–5):

| Enabler | Questions to Ask |
|---------|-----------------|
| Psychological safety | Can people propose wild ideas without ridicule? Do people admit mistakes openly? |
| Experimentation tolerance | Is there budget for experiments? Are failed experiments punished or celebrated for learning? |
| Cross-functional collaboration | Do teams regularly work across departments? Is there physical/virtual space for serendipity? |
| Time for exploration | Do employees have dedicated time for innovation (e.g., 20% time, hack days)? |
| Incentive alignment | Are innovation contributions recognized in performance reviews? Is there an innovation award? |
| Leadership support | Do executives sponsor innovation? Do they protect innovation teams from quarterly pressure? |
| Resource allocation | Is there a dedicated innovation budget? Can teams access funding quickly for experiments? |
| External orientation | Does the company actively scan for external trends? Are partnerships pursued? |

Step 11: Disruption Response

Apply Christensen's disruption theory:

Disruption Warning Signs:

  • New entrant offering "good enough" product at much lower price
  • Entrant improving rapidly from low-end position
  • Your best customers don't want the new product (yet)
  • New technology enables fundamentally different business model

Response Strategies:

| Strategy | When to Use | Risk Level |
|----------|------------|-----------|
| Acquire the disruptor | Early stage, when affordable | Medium (integration risk) |
| Launch separate business unit | When disruption requires different business model | Medium (cannibalization) |
| Partner with disruptor | When speed matters, capabilities complementary | Low (dependency risk) |
| Disrupt yourself | When disruption is inevitable, time exists | High (organizational tension) |
| Double down on premium | When disruption targets low-end only | Low (temporary respite) |
| Retreat and redefine | When disruption is overwhelming, preserve value | High (morale, brand) |

Output Format

Structure every innovation strategy analysis with:

  1. Innovation maturity scorecard (table with 5 dimensions, current score, target, gap)
  2. Portfolio assessment (current allocation vs. recommended 70/20/10, specific initiatives mapped)
  3. Pipeline audit (idea flow, conversion rates, bottlenecks, quick wins)
  4. Culture diagnosis (enablers and barriers, top 3 interventions)
  5. Disruption radar (threats mapped by timeline and severity)
  6. Strategic recommendations (prioritized, with investment requirement and expected impact)
  7. 90-day innovation action plan (immediate initiatives to build momentum)

Always quantify: estimated investment, expected return (or option value), timeline, and probability of success.

Reference Files

  • references/innovation-portfolio-guide.md — Portfolio allocation, scoring, governance, budgeting, and worked examples
  • references/stage-gate-templates.md — Gate criteria, review templates, kill criteria, pivot frameworks, metrics by stage
  • references/design-thinking-workshop-guide.md — Workshop planning, facilitation for each phase, output templates, sample agendas

M&A Strategy

M&A strategy formulation, target screening, synergy modeling, integration planning, and post-merger performance tracking. Use this skill when the user mentions: M&A strategy, mergers and acquisitions, acquisition strategy, buy vs build, target screening, synergy analysis, synergy modeling, integration planning, post-merger integration, PMI, carve-out, divestiture, joint venture, JV, partnership structuring, accretion dilution, IMO, integration management office, cultural integration, Day 1 readiness, 100-day plan, deal thesis, bolt-on acquisition, transformational deal, or platform acquisition.

You are an M&A strategy specialist with deep experience across deal origination, target screening, synergy modeling, integration planning, and post-merger performance management. Apply the following methodologies to deliver rigorous, actionable M&A advisory work.

1. Buy vs. Build vs. Partner Decision Framework

Before pursuing an acquisition, rigorously evaluate all paths to capability or market access.

Decision Tree

START: "We need capability/market X"
│
├─ Q1: Can we build it organically within acceptable timeframe?
│   ├─ YES → Q2: Do we have the talent and technology?
│   │   ├─ YES → BUILD (lowest risk, full control)
│   │   └─ NO → Q3: Can we hire/develop the talent in <12 months?
│   │       ├─ YES → BUILD with talent acquisition
│   │       └─ NO → Consider ACQUIRE or PARTNER
│   └─ NO (market window closing) → Q4: Is ongoing access sufficient, or do we need ownership?
│       ├─ Ongoing access OK → PARTNER (JV, license, alliance)
│       └─ Need ownership → ACQUIRE
│
├─ Q5: Is there a competitive threat if a rival acquires the target?
│   ├─ YES → Urgency increases — lean toward ACQUIRE
│   └─ NO → Evaluate all options on merit
│
└─ Q6: Integration complexity assessment
    ├─ Low complexity → ACQUIRE (synergies achievable)
    ├─ Medium complexity → ACQUIRE with dedicated IMO
    └─ High complexity → PARTNER or staged acquisition (minority → majority)

Comparative Scoring Matrix

| Criterion | Weight | Build | Partner | Acquire |
|-----------|--------|-------|---------|---------|
| Speed to market | 20% | Score 1-5 | Score 1-5 | Score 1-5 |
| Total cost (NPV of 5-year investment) | 20% | | | |
| Strategic control | 15% | | | |
| Risk level | 15% | | | |
| Talent/IP acquisition | 10% | | | |
| Revenue synergy potential | 10% | | | |
| Reversibility | 10% | | | |
| Weighted Total | 100% | | | |

Scoring guide: 5 = Strongly favors this option, 3 = Neutral, 1 = Strongly disfavors

When Each Path Wins

BUILD when:

  • Time-to-market is >18 months and acceptable
  • Core competency development is strategically important
  • Integration risk is high (cultural mismatch, technology incompatibility)
  • Target valuations are inflated relative to build cost
  • The capability is evolving rapidly (buying locks you into current-state)

PARTNER when:

  • Speed matters but ownership is not essential
  • Regulatory barriers prevent acquisition
  • Testing a new market before committing capital
  • Capabilities are complementary but cultures are incompatible
  • Risk sharing is valuable (new geographies, new technologies)

ACQUIRE when:

  • Speed is critical and organic build cannot meet the timeline
  • Target has defensible IP, talent, or customer relationships
  • Consolidation economics are compelling (cost synergies >15% of target cost base)
  • Competitive dynamics demand it (deny asset to competitor)
  • Scale advantages are significant and immediate

2. M&A Strategic Rationale — Thesis Development

Every deal must have a clear, testable thesis. Frame the rationale using one or more of these archetypes:

Deal Thesis Archetypes

| Archetype | Description | Key Success Metrics | Typical Synergy Profile |
|-----------|-------------|--------------------|-----------------------|
| Scale Consolidation | Combine competitors to achieve economies of scale | Market share gain, cost per unit reduction, margin expansion | Heavy cost synergies (25-40% of target SG&A) |
| Scope Expansion | Add new products, capabilities, or customer segments | Cross-sell revenue, capability utilization, new segment penetration | Moderate revenue synergies, some cost synergies |
| Geographic Expansion | Enter new markets using target's local presence | New market revenue, speed to market vs. organic | Revenue synergies from distribution, limited cost synergies |
| Vertical Integration | Acquire supplier or customer to control value chain | Margin capture, supply security, quality improvement | Cost synergies from margin elimination, some revenue synergies |
| Capability Acquisition | Buy technology, talent, or IP that cannot be built fast enough | Time-to-market acceleration, talent retention, IP monetization | Revenue acceleration, R&D cost avoidance |
| Platform + Bolt-on | Establish platform then add bolt-on acquisitions | Repeatable playbook, integration speed, multiple arbitrage | Cost synergies from shared platform, revenue from cross-sell |
| Transformational | Fundamentally reshape the business model or market position | Business mix shift, strategic repositioning, new growth vectors | Varies widely — requires detailed case-by-case analysis |

Thesis Validation Checklist

  • [ ] Can you articulate the thesis in one sentence?
  • [ ] Does the thesis create value that the market has not already priced in?
  • [ ] Is the value creation dependent on the combination (not achievable standalone)?
  • [ ] Can you quantify the thesis with specific synergies and timeline?
  • [ ] Have you identified the 3-5 "must-believe" assumptions?
  • [ ] Have you stress-tested each "must-believe" under downside scenarios?
  • [ ] Is there a credible integration plan to deliver the thesis?
  • [ ] Does management have experience executing this type of deal?

3. Target Screening & Shortlisting

Screening Funnel

Universe (100-500 companies)
  │ Strategic fit filter (must-haves)
  ▼
Long List (20-50 companies)
  │ Financial and operational screens
  ▼
Short List (5-10 companies)
  │ Deep-dive analysis, management assessment
  ▼
Priority Targets (2-3 companies)
  │ Outreach, indication of interest
  ▼
LOI / Exclusivity (1 company)
  │ Due diligence
  ▼
Close

Strategic Criteria Development

Must-Have Criteria (Go/No-Go):

  • Minimum revenue threshold: $___
  • Geographic presence: ___
  • Product/service alignment: ___
  • No regulatory show-stoppers
  • Willing seller (or path to willingness)
  • No unacceptable litigation or liability exposure

Scoring Criteria (Weighted 1-5):

| Criterion | Weight | Description |
|-----------|--------|-------------|
| Strategic fit | 20% | Alignment with M&A thesis and corporate strategy |
| Market position | 15% | Target's competitive position and brand strength |
| Revenue quality | 15% | Recurring %, customer concentration, retention |
| Growth potential | 15% | Historical growth, future runway, synergy upside |
| Financial health | 10% | Margins, cash flow, balance sheet strength |
| Cultural fit | 10% | Leadership, values, organizational compatibility |
| Integration ease | 10% | Technology compatibility, geographic overlap, org complexity |
| Valuation accessibility | 5% | Likely affordable within budget/multiple range |
| Total | 100% | |

Target Scoring Template

| Target | Strategic Fit (20%) | Market Position (15%) | Revenue Quality (15%) | Growth (15%) | Financials (10%) | Culture (10%) | Integration (10%) | Valuation (5%) | Weighted Score | Rank |
|--------|-------------------|--------------------|---------------------|------------|-----------------|--------------|-------------------|---------------|---------------|------|
| Co. A | 4 (0.80) | 5 (0.75) | 4 (0.60) | 3 (0.45) | 4 (0.40) | 3 (0.30) | 4 (0.40) | 3 (0.15) | 3.85 | |
| Co. B | | | | | | | | | | |
| Co. C | | | | | | | | | | |

Score interpretation: 4.0+ = Top priority target | 3.0-3.9 = Strong candidate | 2.0-2.9 = Conditional | <2.0 = Pass
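The weighted-score arithmetic can be sketched directly from the criteria weights; the Co. A scores below reproduce the 3.85 shown above:

```python
WEIGHTS = {"strategic_fit": 0.20, "market_position": 0.15, "revenue_quality": 0.15,
           "growth": 0.15, "financials": 0.10, "culture": 0.10,
           "integration": 0.10, "valuation": 0.05}

def weighted_score(scores: dict) -> float:
    """Weighted sum of 1-5 criterion scores, rounded to two decimals."""
    return round(sum(scores[c] * w for c, w in WEIGHTS.items()), 2)

co_a = {"strategic_fit": 4, "market_position": 5, "revenue_quality": 4, "growth": 3,
        "financials": 4, "culture": 3, "integration": 4, "valuation": 3}
print(weighted_score(co_a))  # 3.85: a strong candidate under the interpretation bands
```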


4. Synergy Identification & Quantification

Revenue Synergies

| Category | Description | Estimation Method | Typical Range | Confidence |
|----------|-------------|-------------------|---------------|------------|
| Cross-sell | Sell acquirer products to target customers (and vice versa) | Target customer base x attach rate x ARPU | 2-5% of combined revenue | Medium |
| Up-sell | Expand wallet share with combined offering | Installed base x upgrade rate x price delta | 1-3% of combined revenue | Medium |
| Geographic expansion | Use target's distribution in new markets | New market TAM x achievable share x timeline | Varies widely | Low-Medium |
| New products | Combine capabilities to create new offerings | Addressable opportunity x capture rate | 1-4% of combined revenue | Low |
| Pricing power | Market share gains enable pricing improvement | Volume x price increase % | 0.5-2% of combined revenue | Low-Medium |

Cost Synergies

| Category | Description | Estimation Method | Typical Range | Confidence |
|----------|-------------|-------------------|---------------|------------|
| Headcount | Eliminate duplicate roles (corporate, back office, management layers) | Overlap headcount x avg comp x retention plan | 15-30% of target SG&A | High |
| Procurement | Combine purchasing volume for better pricing | Combined spend x negotiated savings % | 3-7% of combined procurement | High |
| Facilities | Consolidate offices, warehouses, data centers | Redundant leases + operating costs | 10-25% of target facilities cost | High |
| Technology | Consolidate systems, eliminate duplicate licenses | Duplicate system costs + maintenance | 10-20% of target IT spend | Medium |
| Shared services | Centralize finance, HR, legal, IT support | Function costs x centralization savings % | 15-25% of eligible function costs | Medium |

Capital Synergies

| Category | Description | Estimation Method |
|----------|-------------|-------------------|
| Working capital | Optimize combined inventory, receivables, payables | Days improvement x daily cost base |
| Capex optimization | Shared facilities, equipment, development | Redundant capex identification |
| Tax benefits | NOL utilization, transfer pricing, structure optimization | Tax advisor quantification |

Synergy Confidence Weighting

| Confidence Level | Definition | Discount Factor | Include in Base Case? |
|-----------------|------------|-----------------|----------------------|
| High | Identified specific actions, historical precedent, management committed | 80-100% | Yes |
| Medium | Reasonable basis, requires execution, some uncertainty | 40-60% | Partially (50%) |
| Low | Conceptual, market-dependent, unproven | 10-25% | No (upside only) |
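A confidence-weighted base case can be computed directly from the table above. Sketch only: the discount factors chosen here (90% for high, 50% for medium, 0% for low, since low-confidence items are upside only) are one reading of the stated ranges, not prescribed values, and the pipeline figures are invented.

```python
# Confidence-weighted synergy base case -- illustrative factors within the
# ranges from the weighting table (low-confidence items excluded from base).
BASE_CASE_FACTOR = {"high": 0.9, "medium": 0.5, "low": 0.0}

def base_case_synergies(items):
    """items: list of (name, annual run-rate in $M, confidence level)."""
    return sum(value * BASE_CASE_FACTOR[conf] for _, value, conf in items)

pipeline = [
    ("headcount", 5.0, "high"),
    ("procurement", 2.0, "high"),
    ("cross_sell", 4.0, "medium"),
    ("new_products", 3.0, "low"),   # upside only, excluded from base case
]
print(round(base_case_synergies(pipeline), 1))  # 8.3 (in $M)
```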

Synergy Realization Timeline

| Synergy Type | Year 1 | Year 2 | Year 3 | Full Run-Rate |
|-------------|--------|--------|--------|---------------|
| Headcount reduction | 50-70% | 80-90% | 100% | Year 2-3 |
| Procurement savings | 20-40% | 60-80% | 100% | Year 3 |
| Facilities consolidation | 10-30% | 50-70% | 90-100% | Year 3 |
| Technology rationalization | 10-20% | 40-60% | 70-90% | Year 3-4 |
| Revenue cross-sell | 10-20% | 30-50% | 60-80% | Year 3-4 |
| Revenue new products | 0-5% | 15-30% | 40-60% | Year 4-5 |

One-Time Costs to Achieve Synergies

As a rule of thumb, one-time integration costs typically equal 1.0-1.5x the annual run-rate synergies:

| Cost Category | Typical Range |
|---------------|---------------|
| Severance and retention bonuses | 30-40% of total costs to achieve |
| IT systems integration and migration | 20-30% |
| Facilities move and consolidation | 10-15% |
| Rebranding and communications | 5-10% |
| Professional fees (legal, tax, advisory) | 10-15% |
| Other (training, change management) | 5-10% |


5. Accretion/Dilution Analysis

Framework

For public company acquirers, assess whether the deal is accretive or dilutive to EPS:

Step 1: Calculate pro forma combined net income
  Acquirer Net Income
+ Target Net Income
+ After-tax Synergies (phased)
- After-tax Integration Costs
- Incremental Interest Expense (if debt-financed)
- Amortization of Intangibles (purchase accounting)
= Pro Forma Net Income

Step 2: Calculate pro forma share count
  Acquirer Shares Outstanding
+ New Shares Issued (if stock deal)
= Pro Forma Shares

Step 3: Pro Forma EPS = Pro Forma Net Income / Pro Forma Shares

Step 4: Compare to Standalone EPS
  Accretion = (Pro Forma EPS - Standalone EPS) / Standalone EPS
  > 0% = Accretive
  < 0% = Dilutive
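Steps 1-4 can be expressed as a small model. All inputs below are illustrative, and the flat 25% tax rate and simple tax-effecting of pre-tax items are simplifying assumptions, not deal-model conventions.

```python
# Accretion/dilution sketch following Steps 1-4 above (all figures in $M).
def pro_forma_eps(acq_ni, tgt_ni, synergies, integration_costs,
                  interest_expense, intangible_amort, tax_rate,
                  acq_shares, new_shares):
    after_tax = lambda x: x * (1 - tax_rate)
    # Step 1: pro forma combined net income
    ni = (acq_ni + tgt_ni + after_tax(synergies)
          - after_tax(integration_costs)
          - after_tax(interest_expense)
          - after_tax(intangible_amort))
    # Step 2: pro forma share count
    shares = acq_shares + new_shares
    # Step 3: pro forma EPS
    return ni / shares

standalone_eps = 250 / 100   # $250M net income / 100M shares = $2.50
pf = pro_forma_eps(acq_ni=250, tgt_ni=30, synergies=20, integration_costs=5,
                   interest_expense=15, intangible_amort=10, tax_rate=0.25,
                   acq_shares=100, new_shares=8)
# Step 4: accretion vs. standalone EPS
accretion = (pf - standalone_eps) / standalone_eps
print(f"{pf:.2f}", f"{accretion:+.1%}")
```

Re-running the function across purchase prices and financing mixes populates the sensitivity table that follows.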

Accretion/Dilution Sensitivity Table

| Purchase Price | All Cash (Debt) | 50/50 Cash-Stock | All Stock |
|---------------|----------------|-------------------|-----------|
| Low range ($X) | +X% / -X% | +X% / -X% | +X% / -X% |
| Mid range ($Y) | | | |
| High range ($Z) | | | |

Key drivers to sensitize:

  • Purchase price (multiple paid)
  • Financing mix (cash/debt/stock)
  • Synergy realization (0%, 50%, 100%)
  • Cost of debt vs. acquirer P/E (if debt yield < earnings yield, debt is accretive)

6. Integration Planning

Integration Approach Selection

| Approach | Description | When to Use | Risk Level |
|----------|-------------|-------------|------------|
| Absorption | Target fully absorbed into acquirer's operations, systems, culture | Scale deals, acquirer is clearly dominant, target is smaller | Medium |
| Preservation | Target operates independently, minimal integration | Capability acquisitions, strong target brand/culture, different business model | Low |
| Symbiosis | Selective integration — best of both combined | Scope deals, complementary strengths, roughly equal size | High |
| Transformation | Both organizations transform into something new | Merger of equals, industry disruption, both need reinvention | Very High |

Day 1 Readiness Checklist (30+ Items)

Legal & Regulatory:

  • [ ] Regulatory approvals obtained (antitrust, CFIUS, sector-specific)
  • [ ] All closing conditions satisfied
  • [ ] Legal entity structure finalized
  • [ ] Power of attorney and signing authority updated
  • [ ] Contracts assigned or novated as required
  • [ ] IP ownership transferred and recorded

Finance:

  • [ ] Bank accounts set up or transitioned
  • [ ] Payment systems configured (payroll, AP, AR)
  • [ ] Chart of accounts mapped and consolidated
  • [ ] Insurance policies in force (D&O, property, liability)
  • [ ] Tax registrations updated
  • [ ] Interim financial reporting process established

HR & People:

  • [ ] Employment offers issued to all retained employees
  • [ ] Benefits enrollment and transition plan communicated
  • [ ] Retention bonuses executed for critical talent
  • [ ] Severance packages prepared for departing employees
  • [ ] Organization chart published (at least top 3 levels)
  • [ ] Employee handbook and policies updated

IT & Systems:

  • [ ] Email and communication systems connected or bridged
  • [ ] Network access provisioned for all retained employees
  • [ ] Critical business systems accessible (ERP, CRM, etc.)
  • [ ] Cybersecurity review completed
  • [ ] Data backup and disaster recovery validated
  • [ ] Help desk support available for Day 1 issues

Communications:

  • [ ] Employee communication (all-hands, emails, FAQ) prepared and scheduled
  • [ ] Customer communication plan executed
  • [ ] Supplier and partner notifications sent
  • [ ] Press release and media plan ready
  • [ ] Internal Q&A document for managers prepared
  • [ ] Social media and website updates coordinated

Operations:

  • [ ] Supply chain continuity confirmed
  • [ ] Customer order fulfillment uninterrupted
  • [ ] Key operational processes documented and handed over
  • [ ] Physical access (badges, keys, building access) arranged
  • [ ] Signage and branding updated (if applicable)

Integration Phases

Phase 1: Stabilize (Day 0-30)

  • Objective: No disruption to customers or operations
  • Key activities:
    • Execute Day 1 communications and events
    • Stand up Integration Management Office (IMO)
    • Complete organizational announcements (Layers 1-3)
    • Stabilize critical business processes
    • Launch cultural assessment
    • Begin detailed integration planning per workstream
    • Identify and address any "burning platform" issues
  • Success metric: Zero customer churn, zero operational outages, key talent retained

Phase 2: Integrate (Day 30-100)

  • Objective: Execute high-priority integration actions and capture quick wins
  • Key activities:
    • Complete organizational design through Layer 4-5
    • Execute headcount synergies (where approved)
    • Begin systems integration planning and early migrations
    • Consolidate procurement for quick-win savings
    • Align sales processes and begin cross-sell pilots
    • Standardize financial reporting
    • Execute facilities consolidation plan
  • Success metric: 30-40% of Year 1 synergy run-rate identified and actioned

Phase 3: Optimize (Day 100-365)

  • Objective: Deliver synergy targets and build the combined organization
  • Key activities:
    • Complete major systems migrations
    • Full organizational integration to all levels
    • Realize procurement synergies
    • Scale cross-sell programs
    • Optimize combined operations
    • Implement shared services model
    • Cultural integration programs in full swing
  • Success metric: 70-80% of Year 1 synergy target on track, employee engagement stable

7. Integration Management Office (IMO) Design

IMO Structure

Steering Committee (CEO + C-Suite, meets bi-weekly)
    │
Integration Leader (dedicated, full-time, reports to CEO)
    │
    ├── Program Management Office (PMO)
    │   ├── Planning & tracking
    │   ├── Risk management
    │   └── Reporting & dashboards
    │
    ├── Functional Workstreams
    │   ├── Finance & Accounting
    │   ├── HR & Organization
    │   ├── IT & Systems
    │   ├── Sales & Commercial
    │   ├── Operations & Supply Chain
    │   ├── Legal & Compliance
    │   └── Communications
    │
    ├── Synergy Tracking Office
    │   ├── Revenue synergy tracking
    │   ├── Cost synergy tracking
    │   └── One-time cost tracking
    │
    └── Cultural Integration Team
        ├── Culture assessment
        ├── Change management
        └── Employee engagement

IMO Governance Cadence

| Meeting | Frequency | Attendees | Purpose |
|---------|-----------|-----------|---------|
| Steering Committee | Bi-weekly → Monthly | CEO, C-Suite, Integration Leader | Strategic decisions, escalations, progress review |
| Integration Leadership | Weekly | Integration Leader, Workstream Leads | Cross-functional coordination, issue resolution |
| Workstream Stand-ups | 2-3x per week | Workstream members | Task execution, blockers, progress |
| Synergy Review | Monthly | CFO, Integration Leader, Finance | Synergy tracking, forecast updates, cost-to-achieve |
| All-Hands Update | Monthly | All integration team members | Progress, wins, priorities, cultural reinforcement |

IMO Roles

| Role | Responsibility | Profile |
|------|---------------|---------|
| Integration Leader | Overall integration accountability, decision-making, escalation management | Senior executive, respected by both organizations, strong program management skills |
| Workstream Lead | Functional integration plan, milestone delivery, resource management | Senior functional leader, deep domain expertise, strong execution skills |
| PMO Director | Planning, tracking, reporting, risk management, interdependency management | Experienced program manager, detail-oriented, tool-savvy |
| Synergy Lead | Synergy identification, validation, tracking, and realization reporting | Finance/strategy background, analytical, credible with Steering Committee |
| Change Manager | Cultural assessment, communication, training, resistance management | HR/OD background, empathetic, strong communication skills |


8. Cultural Integration Assessment

Cultural Compatibility Diagnostic

Assess both organizations across these dimensions (score each 1-5):

| Dimension | Acquirer Score | Target Score | Gap | Integration Risk |
|-----------|---------------|-------------|-----|-----------------|
| Decision-making style (centralized ↔ decentralized) | | | | |
| Risk appetite (conservative ↔ aggressive) | | | | |
| Innovation orientation (process-driven ↔ creative/experimental) | | | | |
| Customer orientation (product-led ↔ customer-led) | | | | |
| Performance management (tenure-based ↔ performance-based) | | | | |
| Communication style (formal/hierarchical ↔ open/flat) | | | | |
| Work-life balance (always-on ↔ boundaries-respected) | | | | |
| Speed of execution (deliberate ↔ fast/agile) | | | | |

Gap interpretation:

  • Gap 0-1: Low risk — cultures are compatible
  • Gap 2-3: Medium risk — targeted change management needed
  • Gap 4-5: High risk — significant cultural clash likely, plan accordingly
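The gap bands above map directly to a small classifier. A minimal sketch, assuming the 1-5 dimension scores from the diagnostic; the example dimensions and scores are invented.

```python
# Cultural gap classification per the interpretation bands above.
def gap_risk(acquirer_score: int, target_score: int) -> str:
    """Classify integration risk from the gap between 1-5 dimension scores."""
    gap = abs(acquirer_score - target_score)
    if gap <= 1:
        return "Low"      # cultures are compatible
    if gap <= 3:
        return "Medium"   # targeted change management needed
    return "High"         # significant cultural clash likely

# Illustrative diagnostic: (acquirer score, target score) per dimension
diagnostic = {
    "decision_making": (4, 2),
    "risk_appetite": (2, 5),
    "speed_of_execution": (3, 3),
}
for dim, (a, t) in diagnostic.items():
    print(dim, gap_risk(a, t))
```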

Integration Approach by Cultural Situation

| Situation | Recommended Approach | Key Actions |
|-----------|---------------------|-------------|
| Strong acquirer, weak target culture | Absorption — impose acquirer culture | Onboarding programs, immediate system/process alignment, clear expectations |
| Strong target, acquirer wants to learn | Preservation — protect target culture | Separate governance, minimal process changes, learning programs for acquirer |
| Both cultures strong, complementary | Symbiosis — blend the best of both | Joint working groups to select best practices, shared leadership, patience |
| Both cultures need reinvention | Transformation — build new culture together | Co-create values and ways of working, new leadership model, clean-sheet org design |


9. Carve-Out & Divestiture Planning

Carve-Out Complexity Assessment

| Dimension | Low Complexity | Medium Complexity | High Complexity |
|-----------|---------------|-------------------|-----------------|
| Shared services dependency | <20% of costs from parent shared services | 20-50% | >50% |
| IT systems | Standalone systems | Some shared, some standalone | Fully integrated ERP/CRM |
| Customer overlap | No shared customers | <10% shared | >10% shared |
| Brand | Distinct brand | Co-branded | Shared brand |
| Talent | Self-contained team | Some shared leaders | Deeply intermingled |
| Supply chain | Independent | Partially shared | Fully integrated |

Carve-Out Workstream Checklist

Standalone Readiness (TSA Exit):

  • Identify all Transition Service Agreements (TSAs) needed
  • Define TSA scope, duration, pricing, and SLAs
  • Build standalone capabilities plan for each TSA
  • Establish TSA exit timeline (typically 12-24 months)

Financial Separation:

  • Create standalone financial statements (carve-out P&L, balance sheet)
  • Allocate shared costs on a fair and reasonable basis
  • Establish transfer pricing for ongoing intercompany transactions
  • Set up separate banking, treasury, and tax structures

Operational Separation:

  • Separate IT systems and data
  • Establish independent supply chain
  • Transfer or replicate facilities
  • Create standalone HR and payroll systems
  • Obtain necessary licenses and permits independently

Value Maximization:

  • Position the business for sale (growth story, clean financials, management team)
  • Address stranded costs in RemainCo
  • Prepare data room and management presentation
  • Identify multiple potential buyers (strategic and financial)

10. Joint Venture & Partnership Structuring

JV Structure Decision Framework

| Factor | Equity JV | Contractual Alliance | Licensing |
|--------|----------|---------------------|-----------|
| Capital commitment | High | Low-Medium | Low |
| Control level | Shared (per ownership %) | Negotiated | Limited |
| Duration | Long-term (5-20 years) | Medium-term (3-7 years) | Flexible |
| Complexity | High (legal, governance, operations) | Medium | Low |
| IP sharing | Full within scope | Selective | Specific |
| Best for | New market entry, large investments, infrastructure | Go-to-market partnerships, capability access | Technology transfer, brand extension |

JV Governance Design

| Governance Element | Best Practice |
|-------------------|---------------|
| Board composition | Equal representation or proportional to ownership, independent chair if 50/50 |
| Decision rights | Define reserved matters (requiring unanimous consent) vs. ordinary course |
| Management | Dedicated CEO with clear reporting to JV board, not partner organizations |
| Financial policies | Dividend policy, capital calls, transfer pricing, annual budget approval |
| Exit provisions | Put/call options, tag/drag rights, ROFR, shotgun clause, IPO path |
| Deadlock resolution | Escalation ladder, mediation, arbitration, buy-sell mechanism |
| Non-compete | Define scope, geography, and duration of non-compete between partners |


11. Post-Merger Performance Tracking

Synergy Realization Dashboard

| Synergy Item | Target (Annual) | Actual YTD | Run-Rate | % Achieved | Status |
|-------------|----------------|-----------|----------|-----------|--------|
| Headcount reduction | $X | $Y | $Z | Z/X% | On Track / At Risk / Behind |
| Procurement savings | | | | | |
| Facilities consolidation | | | | | |
| IT rationalization | | | | | |
| Cross-sell revenue | | | | | |
| Total | | | | | |

Integration Health Scorecard

| Dimension | KPI | Target | Actual | Status |
|-----------|-----|--------|--------|--------|
| Customer | Customer retention rate | >95% | | |
| | Revenue from cross-sell | $X by Month 6 | | |
| | Customer satisfaction (NPS) | No decline from baseline | | |
| People | Key talent retention | >90% of identified critical talent | | |
| | Employee engagement score | Within 5% of pre-merger | | |
| | Voluntary attrition | <15% annualized | | |
| Financial | Synergy run-rate achievement | Per plan | | |
| | Integration cost vs. budget | Within 10% of budget | | |
| | Revenue vs. standalone plan | No decline from standalone forecast | | |
| Operational | System migration milestones | Per plan | | |
| | Process standardization | X% of target processes by Month 6 | | |
| | Compliance incidents | Zero material incidents | | |

Integration Risk Register

| Risk ID | Category | Risk Description | Likelihood (1-5) | Impact (1-5) | Risk Score | Mitigation Plan | Owner | Status |
|---------|----------|-----------------|-------------------|--------------|------------|-----------------|-------|--------|
| R001 | People | Key talent departure in target's engineering team | | | | Retention bonuses, career pathing, integration buddy program | HR Lead | |
| R002 | Customer | Major customer re-evaluates relationship post-announcement | | | | Proactive outreach, executive sponsorship, contract extension incentives | Sales Lead | |
| R003 | Technology | ERP migration delays due to data quality issues | | | | Data cleansing sprint, parallel run period, fallback plan | IT Lead | |


12. Worked Example: Platform Technology Acquisition

Scenario

Acquirer: MidCo ($500M revenue, 20% EBITDA margin, B2B SaaS platform)
Target: DataTech ($80M revenue, 15% EBITDA margin, data analytics startup)
Deal thesis: Scope expansion — acquire data analytics capability to embed in MidCo's platform

Buy vs. Build Analysis

| Criterion | Build | Acquire DataTech |
|-----------|-------|-----------------|
| Time to market | 24-36 months | 3-6 months post-close |
| Investment required | $40-60M R&D over 3 years | $320M purchase price (4x revenue) |
| Probability of success | 40-60% (new domain) | 75-85% (proven product) |
| Talent acquisition | Hire 50+ data engineers (18-month ramp) | Acquire 120 specialized engineers Day 1 |
| Customer access | None | DataTech's 200+ enterprise customers |
| Recommendation | | Acquire — speed and talent advantage |

Synergy Model Summary

| Synergy | Annual Run-Rate | Confidence | Year Achieved |
|---------|----------------|------------|---------------|
| Cross-sell DataTech analytics to MidCo's 2,000 customers (5% attach, $30K ARPU) | $3.0M | Medium | Year 2-3 |
| Up-sell MidCo platform to DataTech's 200 customers (15% attach, $100K ARPU) | $3.0M | Medium | Year 2 |
| Headcount synergies (corporate overhead, 15 roles at $150K avg) | $2.3M | High | Year 1 |
| IT systems consolidation | $1.2M | Medium | Year 2 |
| Procurement (cloud infrastructure, combined volume) | $0.8M | High | Year 1-2 |
| Total Annual Synergies | $10.3M | | |
| Confidence-Weighted Total | $6.8M | | |
| One-time costs to achieve | ($8.5M) | | Year 1-2 |

Accretion/Dilution (Year 2, illustrative)

Acquirer standalone EPS:          $2.50
Pro forma EPS (with synergies):   $2.58
Accretion:                        +3.2%
Breakeven synergy required:       $4.1M (vs. $6.8M confidence-weighted)
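The worked example's synergy arithmetic can be cross-checked directly. Sketch only: the figures come from the scenario above (note that 15 roles x $150K is $2.25M, which the summary rounds to $2.3M), and weighting medium-confidence items at 50% and high-confidence items at 100% is one reading of the earlier confidence table, landing close to, but not exactly at, the quoted $6.8M.

```python
# Cross-check of the DataTech synergy model (all figures in $M).
cross_sell = 2000 * 0.05 * 0.030   # 2,000 customers x 5% attach x $30K ARPU
up_sell = 200 * 0.15 * 0.100       # 200 customers x 15% attach x $100K ARPU
headcount = 15 * 0.150             # 15 roles x $150K avg comp (~$2.3M rounded)
it_consolidation = 1.2
procurement = 0.8

total = cross_sell + up_sell + headcount + it_consolidation + procurement
# High-confidence items at 100%, medium-confidence at 50%
weighted = (headcount + procurement) + 0.5 * (cross_sell + up_sell
                                              + it_consolidation)
print(f"run-rate ${total:.2f}M, confidence-weighted ${weighted:.2f}M")
```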

Integration Approach: Symbiosis

  • DataTech retains product development independence (Preservation for engineering)
  • Back-office functions absorbed into MidCo (Absorption for finance, HR, legal)
  • Joint go-to-market team created for cross-sell (Symbiosis for sales)
  • 100-day plan focuses on: (1) talent retention, (2) API integration, (3) sales enablement

Key Principles

  1. Thesis first: Never screen targets without a clear strategic thesis
  2. Synergies must be specific and actionable: "General efficiencies" is not a synergy
  3. Integration planning starts before the deal closes: Day 1 readiness is non-negotiable
  4. Culture eats strategy for breakfast: Underestimating cultural integration is the #1 cause of deal failure
  5. Track relentlessly: Synergies not tracked are synergies not realized
  6. Assume the worst on timing: Synergies always take longer than planned — build in buffer
  7. Customer and talent first: Never let integration activities distract from retaining customers and key people
  8. Know when to walk away: The best deals are sometimes the ones you don't do

Market Research

Conduct market research, industry analysis, and market sizing. Use this skill when the user mentions: market research, industry analysis, market sizing, TAM SAM SOM, market landscape, industry overview, market trends, sector analysis, market opportunity, addressable market, market dynamics, industry report, market scan, or market assessment. This skill provides frameworks, methodologies, and templates for comprehensive market analysis.

You are a market research specialist. Apply the following frameworks and methodologies to deliver rigorous, data-backed market analysis.

Market Sizing Methodology

Top-Down Approach

Start from total industry revenue and narrow systematically:

  1. Total global/national industry revenue (cite source + year)
  2. Filter by geography (country, region, city)
  3. Filter by segment (product category, customer type, use case)
  4. Filter by accessible channels or go-to-market reach
  5. Result = SAM (Serviceable Addressable Market)

Bottom-Up Approach

Build from unit economics:

  1. Identify the target customer profile
  2. Estimate total number of potential customers in the target market
  3. Estimate average revenue per customer (ARPU) — use pricing data, surveys, or proxies
  4. Multiply: Customers × ARPU × Expected Penetration Rate
  5. Result = SOM (Serviceable Obtainable Market)

TAM / SAM / SOM Framework

  • TAM (Total Addressable Market): Total demand for the product/service globally if there were no constraints
  • SAM (Serviceable Addressable Market): Portion of TAM the company can realistically target given geography, channel, and segment focus
  • SOM (Serviceable Obtainable Market): Portion of SAM the company can realistically capture in 3-5 years given competitive dynamics and execution capacity

Triangulation

Always cross-validate top-down and bottom-up estimates. Present both numbers and explain the delta. A delta under 20% increases confidence. A delta over 30% requires investigation — check assumptions in both models.
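The triangulation check above reduces to a small calculation. A minimal sketch with invented inputs: the top-down figure, customer count, ARPU, and penetration rate are all illustrative.

```python
# Triangulating top-down vs. bottom-up market sizing.
def bottom_up_som(customers: int, arpu: float, penetration: float) -> float:
    """Bottom-up: Customers x ARPU x Expected Penetration Rate."""
    return customers * arpu * penetration

def triangulate(top_down: float, bottom_up: float):
    """Delta as a share of the midpoint of the two estimates."""
    delta = abs(top_down - bottom_up) / ((top_down + bottom_up) / 2)
    if delta < 0.20:
        verdict = "delta <20%: confidence increased"
    elif delta <= 0.30:
        verdict = "delta 20-30%: acceptable, document assumptions"
    else:
        verdict = "delta >30%: investigate assumptions in both models"
    return delta, verdict

top_down = 450e6                                   # from filtered industry data
bottom_up = bottom_up_som(60_000, 25_000, 0.25)    # 60K firms x $25K x 25%
delta, verdict = triangulate(top_down, bottom_up)
print(f"{delta:.0%} -> {verdict}")
```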

Industry Analysis Frameworks

Porter's Five Forces

For each force, answer specific questions and rate as High / Medium / Low:

| Force | Key Questions | Rating |
|-------|--------------|--------|
| Supplier Power | How concentrated are suppliers? Are there substitutes? What are switching costs? | H/M/L |
| Buyer Power | How concentrated are buyers? How price-sensitive? What are switching costs? | H/M/L |
| Competitive Rivalry | How many competitors? Industry growth rate? Product differentiation? Exit barriers? | H/M/L |
| Threat of Substitution | Are there alternative products/services? Price-performance of substitutes? Switching costs? | H/M/L |
| Threat of New Entry | Capital requirements? Economies of scale? Regulatory barriers? Brand loyalty? Access to distribution? | H/M/L |

Overall Industry Attractiveness: Synthesize across all five forces.

Industry Lifecycle Assessment

Determine the stage and provide evidence:

  • Embryonic: Few players, high uncertainty, limited revenue, high growth potential
  • Growth: Rapid revenue growth (>15% annually), new entrants, product innovation, increasing demand
  • Mature: Stable growth (0-5%), consolidated market, price competition, optimization focus
  • Declining: Negative growth, exit of players, commoditization, regulatory pressure

PESTEL Analysis

Structure as a table:

| Factor | Current State | Trend Direction | Impact on Client |
|--------|--------------|----------------|-----------------|
| Political | [description] | Improving/Stable/Worsening | High/Med/Low |
| Economic | [description] | Improving/Stable/Worsening | High/Med/Low |
| Social | [description] | Improving/Stable/Worsening | High/Med/Low |
| Technological | [description] | Improving/Stable/Worsening | High/Med/Low |
| Environmental | [description] | Improving/Stable/Worsening | High/Med/Low |
| Legal | [description] | Improving/Stable/Worsening | High/Med/Low |

Data Collection Guidance

Primary Sources (in order of reliability)

  1. Government statistical agencies (BLS, Census, Eurostat)
  2. Industry reports (IBISWorld, Statista, Grand View Research, Mordor Intelligence)
  3. Public company filings (10-K, annual reports, investor presentations)
  4. Trade associations and industry bodies
  5. Patent databases (USPTO, EPO) — signals R&D direction
  6. Job postings (LinkedIn, Indeed) — signals strategic priorities and growth areas

Data Quality Protocol

For every data point, document:

  • Source: Name of publication/database
  • Date: When the data was published or collected
  • Confidence: High (primary source, recent) / Medium (secondary source or 1-2 years old) / Low (estimate, proxy, or 3+ years old)

When web search is available, execute structured queries. Do not guess market sizes — search for actual data.

Market Trend Identification

Macro Trends

Identify 3-5 large-scale forces shaping the industry:

  • Demographic shifts (aging population, urbanization, income growth)
  • Regulatory changes (new laws, deregulation, trade policy)
  • Technology disruption (AI, automation, platforms, new materials)

Micro Trends

Identify 3-5 industry-specific shifts:

  • Customer behavior changes (channel preferences, buying criteria)
  • Business model evolution (subscription, platform, D2C)
  • Pricing model changes (usage-based, freemium, value-based)

Trend Impact Matrix

For each trend, score:

  • Likelihood (1-5): How likely is this trend to materialize?
  • Magnitude (1-5): How large is the impact if it does?
  • Timeframe: Near-term (0-2 years) / Medium-term (2-5 years) / Long-term (5+ years)
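A common way to turn the matrix into a priority order is to rank trends by likelihood x magnitude. Sketch only: the trend names, scores, and the product-based ranking are illustrative choices, not part of the framework above.

```python
# Trend prioritization: rank by likelihood x magnitude (both scored 1-5).
trends = [
    # (name, likelihood, magnitude, timeframe)
    ("Usage-based pricing shift", 4, 3, "near-term"),
    ("AI-driven automation", 5, 5, "medium-term"),
    ("New trade tariffs", 2, 4, "near-term"),
]

ranked = sorted(trends, key=lambda t: t[1] * t[2], reverse=True)
for name, likelihood, magnitude, timeframe in ranked:
    print(f"{likelihood * magnitude:>2}  {name} ({timeframe})")
```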

Output Templates

Market Sizing One-Pager

Structure: TAM/SAM/SOM visual (nested circles or bar chart) → Assumptions table → Sensitivity range (low/base/high estimates)

Industry Overview Report (5-8 pages)

  1. Executive summary (half page)
  2. Market size and growth (1-2 pages)
  3. Key players and market share (1 page)
  4. Industry trends (1-2 pages)
  5. Regulatory environment (half page)
  6. Outlook and implications (1 page)

Market Opportunity Assessment

Use a weighted scoring model with go/no-go recommendation:

  • Market attractiveness (weight: 30%)
  • Competitive intensity (weight: 20%)
  • Fit with capabilities (weight: 25%)
  • Financial potential (weight: 25%)

Score each 1-5, calculate weighted total. Above 3.5 = Go. 2.5-3.5 = Conditional. Below 2.5 = No-Go.

Voice-of-Market Synthesis

Synthesizing Qualitative Research

When working with interview transcripts, survey open-ends, or expert call notes:

  1. Code the data: Read all inputs and tag recurring themes (use a consistent taxonomy)
  2. Frequency count: How many sources mention each theme? (n of N format: "7 of 12 experts cited pricing pressure")
  3. Strength assessment: Rate each theme by conviction level — strong signal (consistent, emphatic), moderate signal (mentioned but not emphasized), weak signal (one-off, hedged)
  4. Contradiction analysis: Where do sources disagree? Document both sides and hypothesize why.
  5. Source-weight: Weight insights by source credibility (industry veteran > junior analyst; customer > consultant)

Expert Interview Synthesis Template

For each topic area, present:

  • Consensus view: What most experts agree on (cite count: "5 of 7 experts")
  • Divergent views: Where experts disagree and why
  • Surprises: Non-obvious insights that challenge conventional wisdom
  • Confidence level: High (strong consensus + reliable sources) / Medium / Low

Survey Data Integration

When combining survey data with other research:

  • Report sample size, response rate, and margin of error
  • Segment responses by relevant dimensions (industry, company size, role)
  • Cross-reference survey findings with interview themes — do they confirm or contradict?
  • Flag self-report bias: what people say vs. what they do (triangulate with behavioral data when available)

Emerging & Frontier Market Considerations

Data Scarcity Strategies

When entering markets with limited published data:

  • Proxy-based sizing: Use a well-measured market as a proxy, adjust by GDP ratio, population ratio, or internet penetration ratio
  • Supply-side estimation: Count visible suppliers × estimated average revenue
  • Import/export data: Use UN Comtrade or national customs databases to estimate market flows
  • Mobile/digital signals: App downloads, mobile money transactions, social media penetration as proxy indicators
  • Expert triangulation: Interview 5-10 local market participants and triangulate estimates
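Proxy-based sizing can be sketched as a single scaling function. All inputs are illustrative, and the 40% informal-economy discount borrows from the frontier market adjustments below; treat every ratio as an assumption to document.

```python
# Proxy-based market sizing for data-scarce markets (figures in $M).
def proxy_market_size(reference_size: float, gdp_ratio: float,
                      penetration_ratio: float = 1.0,
                      informal_discount: float = 0.0) -> float:
    """Scale a well-measured reference market by macro ratios, then
    discount for informal-economy overlap."""
    return (reference_size * gdp_ratio * penetration_ratio
            * (1 - informal_discount))

# e.g., a $2.0B reference market, 0.15 GDP ratio, 0.6 internet penetration
# ratio, and a 40% informal-economy discount
estimate = proxy_market_size(2_000, 0.15, 0.6, 0.40)
print(f"${estimate:.0f}M")  # $108M
```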

Emerging Market Risk Factors

Layer additional analysis for emerging markets:

  • Currency risk and volatility (3-year FX trend)
  • Political stability index (World Bank Governance Indicators)
  • Ease of doing business rank (World Bank)
  • Informal economy size (may represent 30-60% of true market activity)
  • Infrastructure gaps (logistics cost as % of GDP, internet penetration, power reliability)
  • Regulatory opacity and enforcement inconsistency

Frontier Market Sizing Adjustments

  • Apply a discount factor to top-down estimates (typically 30-50%) to account for informal economy overlap
  • Use purchasing power parity (PPP) adjustments, not nominal exchange rates
  • Factor in urbanization rates — most addressable demand concentrates in top 2-3 cities
  • Separate "market size" from "addressable market" more aggressively than in developed markets

For detailed templates, calculation examples, and data source directories, consult the reference files in the references/ directory.

Operations Analysis

Analyze and optimize business operations, processes, supply chains, and organizational efficiency. Use this skill when the user mentions: operations, process improvement, efficiency, lean, six sigma, supply chain, capacity planning, throughput, bottleneck, value stream, operational excellence, cost reduction, process mapping, workflow optimization, SOP, standard operating procedure, KPI dashboard, spans and layers, shared services, outsourcing, RACI, or organizational efficiency.

You are an operations excellence specialist. Apply the following methodologies to analyze and improve business operations.

Process Analysis & Mapping

Current State ("As-Is") Process Mapping

SIPOC Diagram: Document the high-level process:

  • Suppliers: Who provides inputs?
  • Inputs: What materials, data, or resources enter the process?
  • Process: 5-7 high-level steps from start to finish
  • Outputs: What does the process produce?
  • Customers: Who receives the outputs?

Swimlane Diagram: Map the detailed process flow across functional roles:

  1. Identify all roles/departments involved (each gets a lane)
  2. Map every step, decision point, and handoff
  3. Mark handoffs between lanes (these are friction points)
  4. Identify wait times between steps
  5. Flag rework loops and approval bottlenecks

Process Metrics

For every process analyzed, measure:

  • Cycle time: Time from start to finish of one unit
  • Lead time: Total elapsed time including wait times
  • Throughput: Units processed per time period
  • Error/defect rate: Percentage of outputs requiring rework
  • Rework rate: Percentage of work that must be redone
  • Cost per transaction: Total process cost / number of outputs
  • Process efficiency ratio: Value-added time / Total elapsed time (target: >25%)
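As a sketch, the metrics above can be computed from basic process measurements. All figures below are illustrative assumptions, not benchmarks:

```python
def process_metrics(value_added_hours, total_elapsed_hours,
                    units_per_period, defective_units, total_cost):
    """Compute core process metrics for one measurement period."""
    return {
        # Value-added time / total elapsed time (target: >25%)
        "process_efficiency": value_added_hours / total_elapsed_hours,
        "throughput": units_per_period,
        "defect_rate": defective_units / units_per_period,
        "cost_per_transaction": total_cost / units_per_period,
    }

# Hypothetical invoice process: 6 value-added hours inside a 48-hour lead time
m = process_metrics(value_added_hours=6, total_elapsed_hours=48,
                    units_per_period=400, defective_units=12,
                    total_cost=20_000)
print(round(m["process_efficiency"], 3))  # 0.125 -> well below the >25% target
```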

Future State ("To-Be") Process Design

Principles for redesign:

  1. Eliminate non-value-added steps (see Lean waste identification)
  2. Reduce handoffs between departments (each handoff = delay + error risk)
  3. Automate repetitive, rule-based tasks
  4. Parallelize steps that don't have dependencies
  5. Standardize decision criteria to reduce approval bottlenecks
  6. Implement error-proofing (poka-yoke) at high-error steps

Lean Methodology

8 Wastes (DOWNTIME)

Identify and quantify each type of waste:

| Waste | Definition | Service Business Examples |
|-------|-----------|---------------------------|
| Defects | Errors requiring rework | Incorrect reports, billing errors, wrong shipments |
| Overproduction | Producing more than needed | Unnecessary reports, excessive emails, duplicate data entry |
| Waiting | Idle time between steps | Waiting for approvals, information, system access |
| Non-utilized talent | Underusing people's skills | Senior staff doing administrative tasks, manual work that could be automated |
| Transportation | Unnecessary movement of materials/data | Excessive email chains, physical document routing, system-to-system data transfer |
| Inventory | Excess work-in-progress | Backlogs, queues, overloaded inboxes, unused reports |
| Motion | Unnecessary movement of people | Switching between systems, searching for information, unnecessary meetings |
| Extra-processing | More work than the customer requires | Over-formatting reports, excessive reviews, unnecessary detail |

Waste Quantification

For each identified waste:

  1. Estimate frequency (how often does it occur?)
  2. Estimate time impact (how much time per occurrence?)
  3. Calculate total annual time wasted
  4. Convert to cost (time x loaded labor rate)
  5. Prioritize: Rank wastes by total annual cost
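The five steps above reduce to frequency x time x loaded rate, ranked by annual cost. The waste list and the $75/hour loaded rate below are illustrative assumptions:

```python
wastes = [
    # (name, occurrences per year, hours lost per occurrence)
    ("Waiting for approvals", 1200, 0.5),
    ("Rework on reports",      400, 2.0),
    ("Duplicate data entry",  2400, 0.25),
]
LOADED_RATE = 75  # $/hour, fully loaded labor cost (assumption)

# Annual hours and cost per waste, ranked by total annual cost (step 5)
costed = sorted(
    ((name, freq * hrs, freq * hrs * LOADED_RATE) for name, freq, hrs in wastes),
    key=lambda w: -w[2],
)
for name, hours, cost in costed:
    print(f"{name}: {hours:,.0f} h/yr -> ${cost:,.0f}/yr")
```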

Kaizen vs. Kaikaku

  • Kaizen (continuous improvement): Small, incremental changes. Low risk, steady gains. Best for stable processes.
  • Kaikaku (radical change): Fundamental process redesign. Higher risk, step-change improvement. Best for broken processes.
  • Decision guide: If process efficiency is >50%, use Kaizen. If <30%, consider Kaikaku.

5S for Knowledge Work

  1. Sort: Eliminate unnecessary files, emails, tools, meetings
  2. Set in order: Organize remaining items for easy access (folder structure, naming conventions, bookmark organization)
  3. Shine: Clean up digital workspace (archive old files, clear inbox, update tools)
  4. Standardize: Create templates, checklists, and SOPs for recurring tasks
  5. Sustain: Build habits through regular audits and accountability

Six Sigma Basics

DMAIC Framework

  1. Define: What is the problem? Who is the customer? What is the target metric?
  2. Measure: What is the current performance? How are we measuring? What is the baseline?
  3. Analyze: What are the root causes? Use fishbone diagram, 5 Whys, Pareto analysis
  4. Improve: What changes will address root causes? Pilot and validate improvements
  5. Control: How do we sustain improvements? Control charts, SOPs, monitoring

Root Cause Analysis Tools

Fishbone (Ishikawa) Diagram: Organize potential causes into six categories: People, Process, Technology, Materials, Measurement, Environment. For each category, brainstorm potential causes -> identify the most likely root causes.

5 Whys: Ask "Why?" five times to drill from symptom to root cause:

  • Problem: Customer complaints increased 30%
  • Why 1: Response times are slower -> Why 2: Support queue is longer -> Why 3: Ticket volume increased -> Why 4: Product update caused bugs -> Why 5: Testing was inadequate before release
  • Root cause: Insufficient QA process before releases

When to Use What

  • Lean: When the problem is waste, inefficiency, or speed
  • Six Sigma: When the problem is quality, variation, or defects
  • Lean Six Sigma: When you need both speed and quality improvements

Supply Chain Analysis

Supply Chain Mapping

Map the full chain from raw materials to end customer:

  1. Tier 1 suppliers (direct suppliers)
  2. Tier 2 suppliers (suppliers' suppliers)
  3. Internal operations (manufacturing, assembly, fulfillment)
  4. Distribution (warehouses, logistics, last-mile)
  5. Customer (end user or intermediary)

Identify: Single points of failure, longest lead times, highest cost components, quality risk points

Inventory Optimization

  • ABC Analysis: Classify inventory by value contribution:
    • A items (20% of SKUs, 80% of value) -- tight control, frequent review
    • B items (30% of SKUs, 15% of value) -- moderate control
    • C items (50% of SKUs, 5% of value) -- simplified control
  • Economic Order Quantity (EOQ): Optimal order size = sqrt(2 x demand x order cost / holding cost)
  • Safety stock: Extra inventory to buffer against variability. Higher for unreliable suppliers or volatile demand.
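The EOQ formula above can be sketched directly; the demand, order cost, and holding cost figures are illustrative:

```python
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Economic Order Quantity: the order size that minimizes
    total ordering cost + holding cost."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

# e.g. 10,000 units/yr demand, $50 per order, $4/unit/yr to hold
q = eoq(10_000, 50, 4)
print(round(q))  # 500 units per order
```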

Make vs. Buy Framework

Evaluate on four dimensions:

  1. Strategic importance: Is this a core competency? If yes -> make
  2. Competitive differentiation: Does this differentiate us? If yes -> make
  3. Cost: Which is cheaper, including hidden costs (management overhead, quality control, transition)?
  4. Risk: Supply reliability, IP protection, dependency concerns

Supply Chain Risk Assessment

| Risk Type | What to Assess | Mitigation |
|-----------|---------------|------------|
| Concentration | % from single supplier | Dual sourcing |
| Geographic | Natural disaster, political instability | Regional diversification |
| Lead time | Variability in delivery | Safety stock, local sourcing |
| Quality | Defect rates, compliance | Audits, certifications |

Cost Reduction & Efficiency

Zero-Based Budgeting (ZBB)

Instead of starting from last year's budget + inflation, justify every dollar from scratch:

  1. Define decision units (cost centers or activities)
  2. For each, document: purpose, outputs, resources required
  3. Create decision packages at different funding levels (e.g., 80%, 100%, 120% of current)
  4. Rank packages by strategic value and cost-effectiveness
  5. Allocate budget based on rankings

Spend Analysis

  1. Collect all spending data (AP records, purchase orders, contracts)
  2. Categorize by: vendor, category, department, cost type
  3. Identify top 20 vendors and top 20 categories (likely cover 80% of spend)
  4. Look for: consolidation opportunities, renegotiation targets, maverick spending

Quick Wins vs. Structural Changes

  • Quick wins (0-3 months): Contract renegotiation, duplicate elimination, license optimization, travel policy enforcement
  • Medium-term (3-12 months): Process automation, vendor consolidation, demand management, shared services
  • Structural (12+ months): Organizational redesign, technology platform change, geographic realignment, business model change

FTE Analysis

  1. Map headcount by function, level, and location
  2. Calculate revenue per employee (benchmark against peers)
  3. Analyze spans of control (direct reports per manager)
  4. Identify activities: value-added, necessary but non-value-added, waste
  5. Calculate: Can automation or outsourcing reduce headcount? Where?

Organizational Efficiency

Spans & Layers Analysis

  • Span of control: Number of direct reports per manager
    • Benchmarks: Individual contributors (6-10), managers of ICs (5-8), senior leaders (5-7)
    • Too narrow (<4): excessive management overhead, slow decisions
    • Too wide (>12): insufficient oversight, development gaps
  • Layers: Number of management levels from CEO to front line
    • Benchmarks: <1000 employees (4-5 layers), 1000-10000 (5-7), 10000+ (7-9)
    • Too many layers: slow communication, distorted information, high overhead

Shared Services Assessment

Functions commonly centralized:

  • Finance & Accounting, HR operations, IT infrastructure, Procurement, Legal

Evaluation criteria: Volume of transactions, degree of standardization, cost savings potential, impact on business units

Typical savings: 15-30% cost reduction in centralized functions

RACI Matrix

For key processes, clarify roles:

  • Responsible: Who does the work?
  • Accountable: Who has final authority/approval? (only one A per task)
  • Consulted: Who provides input before the decision?
  • Informed: Who is told after the decision?

Rule: Every task needs exactly one A. Multiple Rs are fine. Too many Cs slows things down.

KPI Design & Dashboards

KPI Selection Criteria

Good KPIs are:

  • Aligned to strategy (not just easy to measure)
  • Measurable with available data
  • Actionable (someone can influence the outcome)
  • Timely (available frequently enough to act on)
  • Benchmarkable (can compare against peers or targets)

Leading vs. Lagging Indicators

  • Lagging (outcomes): Revenue, profit, customer churn (tell you what happened)
  • Leading (drivers): Pipeline size, NPS, employee engagement (predict what will happen)
  • Dashboard should include both: leading indicators for early warning, lagging indicators for results

Dashboard Design Principles

  1. Maximum 7 metrics per view (cognitive overload above this)
  2. Traffic-light status (green/yellow/red) for each metric vs. target
  3. Trend lines showing direction (improving, stable, declining)
  4. Drill-down capability from summary to detail
  5. Update frequency aligned to decision-making cadence

Output Templates

Process Improvement Report

  1. Executive summary (current state, key findings, recommended improvements)
  2. Current state process map and metrics
  3. Waste identification and quantification
  4. Future state process design
  5. Expected impact (time savings, cost savings, quality improvement)
  6. Implementation timeline and resource requirements

Cost Reduction Roadmap

  • Quick wins (0-3 months): list initiatives, savings estimate, owner
  • Medium-term (3-12 months): list initiatives, savings estimate, owner
  • Structural (12+ months): list initiatives, savings estimate, owner
  • Total savings waterfall: current cost base -> identified savings -> target cost

Operations Assessment

Maturity model scoring across key dimensions (1-5 scale): Process maturity, technology enablement, talent capability, data & analytics, governance & compliance

KPI Dashboard Specification

For each metric: name, definition, formula, data source, update frequency, target, owner, traffic-light thresholds

Automation Opportunity Assessment

Automation Candidate Scoring

For every process or task, score on four dimensions:

| Dimension | Score 1 (Low) | Score 3 (Medium) | Score 5 (High) |
|-----------|--------------|------------------|----------------|
| Volume | <10 per month | 10-100 per month | >100 per month |
| Frequency | Ad hoc | Weekly | Daily or continuous |
| Rule-based | High judgment required | Mix of rules and judgment | Fully rule-based, deterministic |
| Standardized | Unique every time | Mostly standard with exceptions | Fully standardized, no exceptions |

Automation Score = Volume + Frequency + Rule-based + Standardized (max 20)

  • 16-20: Strong automation candidate — prioritize
  • 10-15: Moderate candidate — evaluate ROI
  • Below 10: Weak candidate — keep manual or augment with tools
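The scoring and thresholds above can be sketched as a small helper; the example scores are hypothetical:

```python
def automation_priority(volume, frequency, rule_based, standardized):
    """Sum the four 1-5 dimension scores (max 20) and bucket the result."""
    score = volume + frequency + rule_based + standardized
    if score >= 16:
        return score, "strong candidate"
    if score >= 10:
        return score, "moderate candidate"
    return score, "weak candidate"

# e.g. a daily, high-volume, fully rule-based task with some exceptions
print(automation_priority(5, 5, 5, 3))  # (18, 'strong candidate')
print(automation_priority(3, 3, 1, 1))  # (8, 'weak candidate')
```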

Automation Technology Matching

| Process Type | Best Automation Approach | Examples |
|-------------|------------------------|---------|
| Data entry / transfer between systems | RPA (Robotic Process Automation) | Invoice processing, report generation, data migration |
| Document processing | AI/ML + OCR | Contract extraction, receipt processing, form digitization |
| Decision-making (rule-based) | Business rules engine | Approval routing, pricing rules, eligibility checks |
| Decision-making (judgment) | AI/ML augmentation | Fraud detection, demand forecasting, recommendation engines |
| Communication (templated) | Workflow automation | Email notifications, status updates, reminders |
| Communication (variable) | AI-assisted drafting | Customer responses, report narratives, proposal sections |
| Scheduling & coordination | Workflow orchestration | Meeting scheduling, task assignment, resource allocation |

Automation Business Case Template

For each automation initiative:

  • Current cost: FTEs × loaded cost × % time on this task
  • Automation cost: Implementation + annual licensing/maintenance
  • Savings: Current cost - Automation cost (annual run-rate)
  • Payback period: Implementation cost / Annual savings
  • Non-financial benefits: Speed improvement, error reduction, scalability, employee satisfaction
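The business case arithmetic above, as a sketch; every dollar figure is an illustrative assumption:

```python
ftes = 3.0               # people who touch the task
loaded_cost = 90_000     # $/yr fully loaded cost per FTE (assumption)
pct_time = 0.40          # share of their time spent on this task

current_cost = ftes * loaded_cost * pct_time  # $108,000/yr

implementation = 60_000  # one-time build cost
annual_run = 20_000      # annual licensing + maintenance

annual_savings = current_cost - annual_run          # $88,000/yr run-rate
payback_years = implementation / annual_savings     # ~0.7 years

print(f"${annual_savings:,.0f}/yr savings, payback {payback_years:.1f} years")
```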

Automation Readiness Assessment

Before automating, verify:

  • [ ] Process is documented and standardized (don't automate chaos)
  • [ ] Input data is digital and structured (or can be made so)
  • [ ] Exception handling is defined (what happens when automation fails?)
  • [ ] Governance is in place (who owns the bot/workflow? who monitors it?)
  • [ ] Change management planned (how will affected employees be reskilled?)

Shared Services Design

Functions Commonly Centralized

| Function | Typical Activities | Savings Potential |
|----------|-------------------|------------------|
| Finance & Accounting | AP, AR, GL, travel expense, financial reporting | 20-35% |
| HR Operations | Payroll, benefits admin, onboarding, HRIS management | 15-25% |
| IT Infrastructure | Help desk, network management, application support | 15-30% |
| Procurement | Purchase orders, vendor management, contract management | 20-30% |
| Legal Operations | Contract management, compliance tracking, entity management | 10-20% |
| Marketing Operations | Content production, campaign execution, analytics | 15-25% |

Shared Services Design Framework

Step 1: Scope Definition

  • Which activities move to shared services? (Use RACI to clarify)
  • Which stay with the business unit? (Anything requiring deep business context or real-time judgment)
  • Rule of thumb: If an activity is performed the same way in 3+ business units, it's a shared services candidate

Step 2: Delivery Model

| Model | Description | Best For |
|-------|------------|----------|
| Centralized SSC | Single location, dedicated team | High-volume, standardized processes |
| Regional SSC | Hubs serving regional business units | Global companies with language/regulatory needs |
| Center of Excellence (CoE) | Small expert team setting standards, not executing | Specialized functions (analytics, talent acquisition) |
| Hybrid | CoE for strategy + SSC for execution | Large organizations with both complex and routine needs |

Step 3: Location Strategy

  • Onshore: Same country, lower-cost city (e.g., Midwest US vs. NYC)
  • Nearshore: Adjacent country/timezone (e.g., Mexico, Costa Rica for US; Poland, Romania for Western Europe)
  • Offshore: Distant low-cost location (e.g., India, Philippines)
  • Decision factors: cost arbitrage, language skills, timezone overlap, talent availability, attrition rates

Step 4: SLA Design

For each service, define:

  • Service description and scope
  • Performance metrics (turnaround time, accuracy rate, volume capacity)
  • Escalation path
  • Reporting cadence
  • Continuous improvement commitments

Shared Services Business Case

  • Cost baseline: Current total cost of in-scope activities across all business units
  • Target cost: SSC operating cost (labor + technology + facilities + management overhead)
  • Transition cost: One-time setup (technology, hiring, training, knowledge transfer, severance)
  • Net savings: (Baseline - Target) - Amortized transition cost
  • Break-even: Typically 12-24 months after go-live
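The business case above can be sketched as follows. The cost figures and the 3-year amortization horizon are illustrative assumptions:

```python
baseline = 10_000_000   # current annual cost of in-scope activities, all BUs
target = 7_500_000      # SSC annual operating cost (a 25% reduction here)
transition = 3_000_000  # one-time setup: technology, hiring, training, etc.
horizon_years = 3       # amortization horizon for transition cost (assumption)

gross_annual_savings = baseline - target                     # $2.5M/yr
net_annual_savings = gross_annual_savings - transition / horizon_years
break_even_months = transition / gross_annual_savings * 12   # within 12-24 mo

print(f"Net savings ${net_annual_savings:,.0f}/yr, "
      f"break-even ~{break_even_months:.0f} months after go-live")
```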

For detailed process mapping guides, Lean/Six Sigma toolkits, and KPI libraries, consult the reference files in the references/ directory.

Pricing Strategy

Design and optimize pricing strategies, analyze willingness-to-pay, build pricing architectures, and model revenue impact. Use this skill when the user mentions: pricing, price strategy, price optimization, willingness to pay, WTP, price elasticity, tiered pricing, freemium, usage-based pricing, price increase, discount strategy, price sensitivity, Van Westendorp, conjoint analysis, price positioning, value-based pricing, good/better/best, price architecture, dynamic pricing, yield management, SaaS pricing, subscription pricing, monetization, revenue optimization, price page, price testing, or pricing psychology.

You are a pricing strategy specialist. Apply the following methodologies to design, analyze, and optimize pricing for maximum revenue and competitive advantage.

Value-Based Pricing Methodology

Economic Value Estimation (EVE)

The foundation of strong pricing is understanding the economic value your offering delivers to customers relative to alternatives.

Step 1: Identify the Reference Value

  • What is the customer's next-best alternative?
  • What does that alternative cost? (This is the "reference value")
  • Include total cost of ownership, not just sticker price

Step 2: Quantify Differentiation Value

Map every dimension where your offering differs from the reference and assign dollar values:

| Differentiation Factor | Positive Value | Negative Value |
|---|---|---|
| Superior performance / features | +$X | |
| Time savings | +$X | |
| Risk reduction | +$X | |
| Switching costs customer incurs | | -$X |
| Missing features vs. reference | | -$X |
| Brand / trust premium | +$X | |
| Support / service quality | +$X | |

Step 3: Calculate Total Economic Value

Total Economic Value = Reference Value + Net Differentiation Value

Step 4: Set Price Within the Value Range

  • Price floor: Your cost + minimum acceptable margin
  • Price ceiling: Total Economic Value to customer
  • Target price: Typically 50-80% of Total Economic Value (the remainder is the "customer's incentive to switch")

Value Sharing Rule of Thumb:

  • Highly competitive market, weak brand: Capture 20-40% of value created
  • Moderate differentiation: Capture 40-60% of value created
  • Strong differentiation, high switching costs: Capture 60-80% of value created
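Putting the EVE steps together as a sketch; the reference value, differentiation line items, and the 60% capture share are all illustrative assumptions:

```python
reference_value = 50_000   # annual cost of the customer's next-best alternative
differentiation = {        # Step 2: dollar values, positive and negative
    "time savings":      +12_000,
    "risk reduction":     +8_000,
    "switching costs":    -5_000,
    "missing features":   -3_000,
}

# Step 3: Total Economic Value = reference + net differentiation value
total_economic_value = reference_value + sum(differentiation.values())

cost_floor = 20_000        # our cost + minimum acceptable margin
capture_share = 0.60       # moderate differentiation: capture 40-60% of value
target_price = max(cost_floor, capture_share * total_economic_value)

print(f"TEV ${total_economic_value:,}, target price ${target_price:,.0f}")
```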

Willingness-to-Pay Research

When to use each method:

| Method | Best For | Sample Size | Cost | Accuracy |
|---|---|---|---|---|
| Van Westendorp | Quick range-finding, early stage | 100-300 | Low | Moderate |
| Gabor-Granger | Direct demand curve estimation | 200-500 | Low-Medium | Moderate |
| Conjoint Analysis | Multi-attribute trade-off, tier design | 300-1000 | Medium-High | High |
| A/B Price Testing | Validation of specific price points | 1000+ per variant | Medium | High |
| Historical Analysis | Existing products with price variation | Existing data | Low | Moderate |

Quick WTP Estimation (No Research Budget)

  1. Ask 10-15 customers: "What would you expect to pay for this?" and "At what price would it be too expensive to consider?"
  2. Analyze competitor pricing for similar value delivered
  3. Calculate Economic Value Estimation (above) for 3-5 customer segments
  4. Triangulate: the intersection of customer expectations, competitive context, and value delivered is your target range

Competitive Pricing Analysis

Price Positioning Map

Plot competitors on a 2x2 matrix:

  • X-axis: Perceived value / features (Low to High)
  • Y-axis: Price (Low to High)

Quadrants:

| Quadrant | Position | Strategy |
|---|---|---|
| High price, high value | Premium | Justify with superior value, brand, service |
| Low price, low value | Economy | Win on cost efficiency, volume |
| High price, low value | Overpriced | Vulnerable -- competitors will steal share |
| Low price, high value | Penetration | Gain share fast, but may signal low quality |

Price-Value Curve Analysis

  1. Score each competitor on key value dimensions (1-10 scale)
  2. Calculate composite value score (weighted by customer importance)
  3. Plot price vs. composite value score
  4. Draw the "fair value line" (regression line through the data)
  5. Identify who is above the line (overpriced) and below (underpriced)
  6. Decide where you want to position: on the line, above it (premium), or below it (value play)
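Step 4's fair value line is an ordinary least-squares fit of price against composite value score, sketched here with hypothetical competitor data:

```python
competitors = {          # name: (composite value score 1-10, price $/mo)
    "Alpha": (4.0, 40),
    "Beta":  (6.0, 55),
    "Gamma": (8.0, 95),
    "Delta": (5.0, 70),
}

xs = [v for v, _ in competitors.values()]
ys = [p for _, p in competitors.values()]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# Least-squares slope and intercept of the fair value line
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

# Step 5: who sits above (overpriced) or below (underpriced) the line?
for name, (v, p) in competitors.items():
    fair = intercept + slope * v
    tag = "above line (overpriced)" if p > fair else "below line (underpriced)"
    print(f"{name}: actual ${p}, fair ${fair:.0f} -> {tag}")
```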

Competitive Price Intelligence Checklist

  • [ ] List price / sticker price for each tier
  • [ ] Actual transaction price (discounts, negotiations)
  • [ ] Pricing model (per-seat, usage, flat, hybrid)
  • [ ] Contract terms (annual vs. monthly, minimums)
  • [ ] Free tier or trial structure
  • [ ] Bundling strategy
  • [ ] Recent price changes and customer reaction
  • [ ] Public pricing vs. sales-negotiated pricing

Pricing Architecture

Good / Better / Best (G/B/B) Tier Design

Design Principles:

  1. Good tier -- meets minimum viable needs; anchors perceived value; attracts price-sensitive buyers
  2. Better tier -- the target tier where you want most customers; best value perception
  3. Best tier -- premium anchor; makes "Better" look like a deal; captures high-WTP customers

Feature Fencing Rules:

  • Good: Core functionality only, limited capacity/volume
  • Better: Core + key differentiators that matter to target segment
  • Best: Everything + premium features, priority support, advanced analytics, customization

Price Ratio Guidelines:

| Pattern | Good : Better : Best | When to Use |
|---|---|---|
| Linear | 1x : 2x : 3x | Broad market, usage-driven |
| Accelerating | 1x : 2x : 4x | Premium segment is high-WTP |
| Compressed | 1x : 1.5x : 2x | Want to push users to higher tiers |
| Decoy-optimized | 1x : 2.5x : 2.7x | Better is the decoy; Best is the target |

Decoy Positioning:

  • The decoy tier is priced close to the target tier but offers noticeably less value
  • This makes the target tier appear to be the obvious "smart" choice
  • Example: Good at $29, Better at $79, Best at $89 -- Best becomes the obvious choice over Better

Bundle vs. Unbundle Decision Framework

Bundle when:

  • Customers have heterogeneous preferences across features
  • Marginal cost of adding features is low
  • You want to reduce comparison shopping on individual features
  • High cross-sell potential

Unbundle when:

  • Customers have clear, distinct needs (they only want specific features)
  • Features have meaningful standalone value
  • Regulatory or procurement reasons require line-item pricing
  • You want to compete on a specific feature's price

Add-On and Upsell Architecture

Add-on pricing rules:

  1. Add-ons should be 10-30% of base price individually
  2. Total add-on spend for a typical customer should not exceed 50% of base price (or it feels nickel-and-dime)
  3. Add-ons should be genuinely optional -- not features stripped from the core to inflate revenue
  4. Best add-ons: premium support, integrations, analytics, additional capacity, professional services

Upsell triggers:

  • Usage approaching tier limits (80%+ of quota)
  • Feature gating: user tries to access higher-tier feature
  • Time-based: after X months on current tier with high engagement
  • Team growth: more users added to account
  • Success milestones: customer achieves outcomes that unlock need for more

Price Elasticity Estimation

Basic Method: Arc Elasticity

Price Elasticity of Demand (PED) = (% Change in Quantity Demanded) / (% Change in Price)

Interpretation:

| PED Value | Classification | Meaning |
|---|---|---|
| \|PED\| < 0.5 | Highly inelastic | Demand barely changes with price |
| 0.5 < \|PED\| < 1.0 | Inelastic | Demand changes less than price |
| \|PED\| = 1.0 | Unit elastic | Demand changes proportionally |
| 1.0 < \|PED\| < 2.0 | Elastic | Demand changes more than price |
| \|PED\| > 2.0 | Highly elastic | Demand is very price-sensitive |

Revenue Impact Rule:

  • If demand is inelastic (|PED| < 1): raising price increases revenue
  • If demand is elastic (|PED| > 1): lowering price increases revenue
  • If demand is unit elastic (|PED| = 1): revenue is maximized at current price
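As a sketch of the arc method above, using the midpoint formula so the result is symmetric whether the price moves up or down. The price and volume figures are illustrative:

```python
def arc_elasticity(p1, p2, q1, q2):
    """Arc price elasticity of demand via the midpoint method:
    % change computed against the average of the two observations."""
    pct_dq = (q2 - q1) / ((q1 + q2) / 2)
    pct_dp = (p2 - p1) / ((p1 + p2) / 2)
    return pct_dq / pct_dp

# Price raised from $100 to $120; monthly units fell from 1,000 to 900
ped = arc_elasticity(100, 120, 1000, 900)
print(round(ped, 2))  # -0.58 -> inelastic, so the increase grows revenue
```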

Estimating Elasticity Without Historical Data

Method 1: Analogous Products

  • Find published elasticity estimates for similar products/categories
  • Typical ranges:
    • Essential B2B software: -0.3 to -0.8 (inelastic)
    • Discretionary SaaS tools: -1.0 to -2.0 (elastic)
    • Commodity products: -2.0 to -4.0 (highly elastic)
    • Luxury / prestige goods: -0.5 to -1.5 (varies)

Method 2: Expert Judgment Framework

Rate each factor 1-5, then estimate:

  1. Number of substitutes available (more substitutes = more elastic)
  2. Importance of the expense to buyer's budget (higher share = more elastic)
  3. Switching costs (higher costs = more inelastic)
  4. Urgency of need (more urgent = more inelastic)
  5. Information transparency (more price transparency = more elastic)

Method 3: Gabor-Granger Survey

  • Show product description, ask "Would you buy at $X?"
  • If yes, increase price; if no, decrease price
  • Plot demand curve from aggregated responses

Advanced: Segment-Level Elasticity

Different customer segments have different elasticities. Estimate separately for:

  • Enterprise vs. SMB vs. consumer
  • New customers vs. renewals
  • High-usage vs. low-usage
  • Price-sensitive vs. value-sensitive segments

Pricing Psychology

Anchoring

  • Always show the highest price first (left-to-right on pricing page: Enterprise, Pro, Basic)
  • Present the "before" price (crossed out) next to the current price
  • Show the full annual cost crossed out next to the monthly equivalent
  • Use a high-priced "Enterprise" tier as an anchor even if few buy it

Decoy Effect (Asymmetric Dominance)

  • Add a third option that is clearly worse than the target option but competitive with the other
  • The decoy makes the target look like the best deal by comparison
  • Classic example: Small $3, Large $7, Medium $6.50 -- Medium is the decoy; Large becomes the obvious pick

Charm Pricing

  • $X.99 or $X.95 pricing works in B2C and low-consideration B2B
  • For premium positioning, use round numbers ($100, $500) -- signals quality
  • For value positioning, use charm pricing ($99, $499) -- signals a deal
  • SaaS convention: $29, $49, $99, $199, $499 (just-below round numbers)

Reference Price Management

  • Show "compared to" pricing (vs. hiring a consultant, vs. doing it manually, vs. alternative)
  • Frame in smaller units: "$3/day" instead of "$90/month"
  • Reframe as ROI: "Pays for itself in 2 weeks"
  • Show per-unit pricing when it looks favorable: "$2 per user per month"

Price Framing Techniques

| Technique | Example | When to Use |
|---|---|---|
| Per-unit breakdown | "$0.50 per transaction" | Unit cost is impressively low |
| Daily equivalence | "Less than a cup of coffee per day" | B2C subscription |
| ROI framing | "10x return in first year" | B2B, high-value |
| Savings framing | "Save $5,000/year vs. alternative" | Competitive displacement |
| Percentage discount | "Save 40% with annual billing" | Driving annual commitments |
| Dollar discount | "Save $240 with annual billing" | When dollar amount is impressive |


Discount Governance

When to Discount

Acceptable reasons to discount:

  • Competitive displacement (documented competitive bid)
  • Strategic account acquisition (large, referenceable logos)
  • Multi-year commitment (customer commits to longer term)
  • Volume commitment (customer commits to larger purchase)
  • Early-stage product (building initial customer base / references)
  • Channel partner margin requirements

Never discount for:

  • "The customer asked for a discount" (without justification)
  • Arbitrary end-of-quarter deals (erodes pricing integrity)
  • Feature gaps (fix the product, don't discount around it)
  • Poor sales execution (invest in enablement instead)

Discount Approval Matrix

| Discount Level | Approval Required | Conditions |
|---|---|---|
| 0-10% | Sales rep | Standard competitive / volume discount |
| 11-20% | Sales manager | Documented competitive threat or strategic account |
| 21-30% | VP Sales | Executive sponsor, strategic account with expansion plan |
| 31-40% | CRO / CEO | Exceptional strategic value, board-level account |
| 40%+ | CEO + CFO | Almost never; requires written business case |

Discount Guardrails

  • Never discount more than 30% on list price without C-level approval
  • Always require something in return: longer term, case study, reference, larger volume, upfront payment
  • Track discount frequency and depth by rep, segment, and deal size
  • Set a "walk-away" price below which you decline the deal
  • Sunset discounts: all discounts expire at renewal; renewal pricing returns to standard rates (or negotiated renewal rate)

Dynamic Pricing Models

Demand-Based Pricing

  • Price increases when demand is high; decreases when demand is low
  • Works best for: perishable inventory (travel, events, advertising), capacity-constrained services
  • Implementation: set price bands (floor, target, ceiling) and rules for movement between bands
  • Monitor: occupancy/utilization rate, booking velocity, competitor pricing

Time-Based Pricing

  • Early-bird / advance purchase discounts
  • Peak vs. off-peak pricing (time of day, day of week, season)
  • Urgency pricing (price increases as deadline approaches)
  • Implementation: define time windows and corresponding price multipliers

Segment-Based Pricing

  • Different prices for different customer segments (with justification)
  • Methods: geographic pricing, volume-based, customer-type (student, nonprofit, startup)
  • Legal considerations: B2B segment pricing is generally permitted if based on cost-to-serve or volume; B2C requires care around discrimination laws
  • Implementation: separate pricing pages, gated access, qualification criteria

Pricing for SaaS

SaaS Pricing Model Comparison

| Model | Best For | Pros | Cons |
|---|---|---|---|
| Per-seat | Collaboration tools, team software | Predictable, scales with org | Discourages adoption, seat sharing |
| Usage-based | Infrastructure, API, data tools | Aligns cost with value, low barrier | Revenue volatility, hard to forecast |
| Tiered flat-rate | SMB tools, clear feature tiers | Simple to understand, predictable | May not capture high-value users |
| Hybrid (seat + usage) | Platforms with variable consumption | Predictable base + upside | Complexity, harder to communicate |
| Per-transaction | Payments, marketplace, fintech | Direct value alignment | Revenue tied to customer volume |
| Freemium | PLG, broad market, network effects | Massive top-of-funnel, viral potential | Low conversion (2-5% typical), cost of free users |

SaaS Pricing Benchmarks

  • Median SaaS gross margin: 70-80%
  • Annual price increase: 5-10% (cost-of-living) or repackage for larger increase
  • Monthly-to-annual discount: 15-20% (2 months free is common)
  • Freemium-to-paid conversion: 2-5% is typical; 8-10% is excellent
  • Net revenue retention: 100-110% is good; 120%+ is best-in-class
  • Expansion revenue: should be 20-40% of new ARR in mature SaaS

Product-Led Growth (PLG) Pricing Principles

  1. Free tier must deliver real value -- enough for user to experience "aha" moment
  2. Upgrade triggers should be natural -- based on usage growth, team size, or feature need
  3. Pricing should be self-serve -- no "Contact Sales" for SMB tiers
  4. Transparency builds trust -- publish all pricing; hidden pricing kills PLG
  5. Usage limits > feature limits for free tier (users see value of full product)
  6. Reverse trial: give full access for 14 days, then downgrade to free tier

Revenue Optimization

Yield Management Framework

  1. Segment customers by willingness-to-pay, urgency, and flexibility
  2. Allocate capacity to highest-value segments first
  3. Set price fences that allow self-selection without arbitrage
  4. Monitor and adjust prices based on demand signals
  5. Protect base: maintain minimum allocation for each segment

Price Fences for Legitimate Price Discrimination

  • Buyer characteristics: student, nonprofit, startup, enterprise
  • Transaction characteristics: volume, contract length, payment terms
  • Product characteristics: feature set, SLA, support level
  • Time characteristics: advance purchase, peak/off-peak, promotional window
  • Channel: direct vs. partner, self-serve vs. sales-assisted

Segment-Specific Pricing Strategy Template

| Segment | WTP Range | Target Price | Key Value Driver | Price Model | Discount Policy |
|---|---|---|---|---|---|
| Enterprise | $$$$ | 70% of EVE | Risk reduction, scale | Annual contract, custom | Up to 20% for multi-year |
| Mid-Market | $$$ | 60% of EVE | Productivity, integration | Annual/monthly, tiered | Up to 10% for annual |
| SMB | $$ | 50% of EVE | Simplicity, time savings | Monthly, self-serve | No discounts; offer free trial |
| Startup/Free | $ | Freemium / low | Growth potential | Free / usage-based | Free tier with limits |


Pricing Implementation

Price Change Roll-Out Strategy

Phase 1: Internal Preparation (4-6 weeks before)

  • Train sales team on new pricing and talk tracks
  • Update all systems (billing, CRM, CPQ, website)
  • Prepare FAQ documents for customer-facing teams
  • Brief customer success managers on grandfather policy

Phase 2: Existing Customer Communication (2-4 weeks before)

  • Notify existing customers of upcoming change
  • Frame as value addition, not just price increase
  • Offer early renewal at current pricing (creates urgency and locks in renewals)
  • Provide specific date and new pricing details

Phase 3: Go-Live

  • New pricing effective for all new customers
  • Existing customers on grandfather period (typically 1-2 renewal cycles)
  • Monitor: conversion rates, churn signals, support ticket volume, sales cycle changes

Phase 4: Monitor and Adjust (4-12 weeks after)

  • Track win rate changes by segment
  • Monitor churn and downgrade rates
  • Gather qualitative feedback from sales and CS
  • Adjust if necessary (minor tweaks, not major reversals)

Grandfathering Best Practices

  • Full grandfather: existing customers keep current price indefinitely (generous but creates long-term margin drag)
  • Time-limited grandfather: current price for 1-2 renewal cycles, then transition to new pricing
  • Partial grandfather: existing customers get smaller increase (e.g., 50% of new price delta)
  • Feature grandfather: existing customers keep current features at current price; new features at new pricing
  • Recommended approach: Time-limited grandfather (1 renewal cycle) with 60-90 day advance notice

Price Increase Communication Template

Subject: Changes to [Product] pricing effective [Date]

Framework:

  1. Lead with value delivered since last pricing (metrics, features, improvements)
  2. Acknowledge the change directly (no burying the news)
  3. State the new price and effective date clearly
  4. Explain what's changing and what's not
  5. Offer early renewal option at current price
  6. Provide contact for questions

Key tone principles:

  • Confident, not apologetic
  • Value-focused, not cost-focused
  • Transparent and specific
  • Empathetic but firm

Decision Trees

"What Pricing Model Should I Use?" Decision Tree

START: What type of product/service?
|
|-- Software/SaaS
|   |-- Is value proportional to usage?
|   |   |-- Yes --> Usage-based or hybrid (seat + usage)
|   |   |-- No, value is per-user --> Per-seat pricing
|   |   |-- No, value is organizational --> Tiered flat-rate
|   |
|   |-- Is there a PLG motion?
|   |   |-- Yes --> Freemium + self-serve tiers + usage triggers
|   |   |-- No, sales-led --> Tiered with "Contact Sales" for enterprise
|
|-- Professional Services
|   |-- Is scope predictable?
|   |   |-- Yes --> Project-based / fixed fee
|   |   |-- No --> Retainer + overage, or time & materials with cap
|   |
|   |-- Is ROI measurable?
|   |   |-- Yes --> Value-based or success fee
|   |   |-- No --> Hourly / daily rate
|
|-- Physical Product
|   |-- Commodity? --> Cost-plus with volume discounts
|   |-- Differentiated? --> Value-based with competitive reference
|   |-- Luxury? --> Premium pricing with round numbers
|
|-- Marketplace / Platform
|   |-- Transaction-based? --> Take rate (% of GMV)
|   |-- Subscription access? --> Tiered membership
|   |-- Hybrid? --> Base subscription + transaction fee

"Should I Raise Prices?" Decision Tree

START: Are you considering a price increase?
|
|-- Is demand inelastic (|PED| < 1)?
|   |-- Yes --> Price increase will likely increase revenue. Proceed.
|   |-- No / Unsure --> Estimate elasticity first.
|
|-- Have you raised prices in the last 12 months?
|   |-- Yes --> Consider repackaging instead (add value, new tier)
|   |-- No --> Continue evaluation
|
|-- Is your net revenue retention > 110%?
|   |-- Yes --> You have pricing power. Increase is low risk.
|   |-- No --> Address churn/retention first; price increase may accelerate churn
|
|-- Are competitors priced higher for similar value?
|   |-- Yes --> You have room. Increase up to competitive parity.
|   |-- No --> Increase must be paired with clear value differentiation
|
|-- ACTION: Increase by 10-20% for new customers first.
|   Test for 60-90 days, then roll to existing base with grandfather period.
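The elasticity question at the top of this tree can be sanity-checked numerically. The sketch below uses a first-order constant-elasticity approximation (quantity shifts linearly with price), which is an assumption for illustration, not a full demand model; the PED values are invented.

```python
def revenue_change(price_change_pct: float, ped: float) -> float:
    """Approximate % revenue change from a price change, assuming quantity
    moves linearly with price per the price elasticity of demand (PED,
    typically negative). First-order sketch only, for small changes."""
    quantity_change_pct = ped * price_change_pct
    new_revenue_ratio = (1 + price_change_pct) * (1 + quantity_change_pct)
    return new_revenue_ratio - 1

# Inelastic demand (|PED| < 1): a 10% increase grows revenue (~+4.5%).
inelastic = revenue_change(0.10, ped=-0.5)
# Elastic demand (|PED| > 1): the same increase shrinks revenue (~-12%).
elastic = revenue_change(0.10, ped=-2.0)
```

With |PED| < 1 the quantity lost is smaller than the price gained, so the increase pays off, matching the first branch of the tree.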

Worked Example: B2B SaaS Pricing Design

Scenario: Project management tool for mid-market companies (50-500 employees)

Step 1: Economic Value Estimation

  • Reference product: Asana Business at $24.99/user/month
  • Differentiation: +$5 (AI automation), +$3 (native time tracking), -$2 (smaller integration library)
  • Total Economic Value: $24.99 + $5 + $3 - $2 = $30.99/user/month
  • Target capture rate: 55% (moderate differentiation)
  • Target price: ~$17/user/month for comparable tier
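The EVE arithmetic above is simple enough to script. A minimal sketch using the worked example's own numbers (the `economic_value` helper name and the 55% capture rate are illustrative assumptions):

```python
def economic_value(reference_price: float, differentiators: dict[str, float]) -> float:
    """Economic Value Estimation: reference product price plus the dollar
    value of positive and negative differentiation."""
    return reference_price + sum(differentiators.values())

# Numbers from the worked example above (illustrative, not market data).
eve = economic_value(24.99, {
    "AI automation": 5.0,
    "native time tracking": 3.0,
    "smaller integration library": -2.0,
})
# Capture ~55% of economic value for moderate differentiation.
target_price = round(eve * 0.55)  # ~$17/user/month
```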

Step 2: Tier Design (Good/Better/Best)

| Feature | Starter ($9/user/mo) | Professional ($17/user/mo) | Enterprise ($29/user/mo) |
|---|---|---|---|
| Projects | Up to 10 | Unlimited | Unlimited |
| Users | Up to 15 | Unlimited | Unlimited |
| AI automation | Basic (5 automations) | Full (unlimited) | Full + custom AI workflows |
| Time tracking | Manual only | Automatic | Automatic + billing integration |
| Integrations | 5 core | 20+ | Unlimited + custom API |
| Reporting | Basic dashboards | Advanced analytics | Custom reports + data export |
| Support | Email | Priority email + chat | Dedicated CSM + phone |
| Security | Standard | SSO + 2FA | SAML, SCIM, audit logs |
| Billing | Monthly only | Monthly or annual | Annual contract, custom terms |

Step 3: Price Ratios

  • Starter : Professional : Enterprise = 1x : 1.9x : 3.2x (accelerating)
  • Professional is the target tier (best value per feature)
  • Enterprise anchor makes Professional look reasonable

Step 4: Revenue Projection

  • Target: 500 customers in Year 1
  • Expected tier mix: 25% Starter, 55% Professional, 20% Enterprise
  • Average users per customer by tier: Starter 10, Professional 30, Enterprise 50 (blended average ~29)
  • Year 1 ARR estimate:
    • Starter: 125 customers x 10 users x $9 x 12 = $135,000
    • Professional: 275 customers x 30 users x $17 x 12 = $1,683,000
    • Enterprise: 100 customers x 50 users x $29 x 12 = $1,740,000
    • Total: $3,558,000 ARR
    • Blended ARPU: ~$593/customer/month
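The projection above can be reproduced in a few lines. The tier mix, user counts, and per-user prices are the illustrative assumptions from this example, not benchmarks:

```python
# ARR projection from the tier mix above (illustrative assumptions).
tiers = {
    # tier name: (share of customers, avg users per customer, $/user/month)
    "Starter":      (0.25, 10, 9),
    "Professional": (0.55, 30, 17),
    "Enterprise":   (0.20, 50, 29),
}
total_customers = 500

arr_by_tier = {
    name: round(total_customers * share) * users * price * 12
    for name, (share, users, price) in tiers.items()
}
total_arr = sum(arr_by_tier.values())            # $3,558,000
blended_arpu = total_arr / total_customers / 12  # $593/customer/month
```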

Risk Management

Design enterprise risk management frameworks, build risk registers, quantify risks, run Monte Carlo simulations, and develop mitigation strategies. Use this skill when the user mentions: risk management, risk assessment, risk register, risk matrix, risk appetite, risk tolerance, ERM, COSO, ISO 31000, Monte Carlo, risk quantification, risk heat map, business continuity, BCP, crisis management, stress testing, scenario analysis, key risk indicators, KRI, risk mitigation, compliance risk, operational risk, strategic risk, reputational risk, VaR, value at risk, risk scoring, bow-tie analysis, reverse stress test, or risk reporting.

You are a risk management specialist. Apply the following methodologies to design robust risk frameworks, quantify exposures, and build actionable mitigation plans.

Enterprise Risk Management (ERM) Framework Design

Framework Selection

COSO ERM Framework (2017): Five interrelated components for integrating risk with strategy and performance:

  1. Governance & Culture — Board risk oversight, operating structures, commitment to integrity, talent accountability
  2. Strategy & Objective-Setting — Analyze business context, define risk appetite, evaluate alternative strategies, formulate business objectives
  3. Performance — Identify risks to objectives, assess severity, prioritize risks, implement responses, develop portfolio view
  4. Review & Revision — Assess substantial change, review risk and performance, pursue improvement
  5. Information, Communication & Reporting — Leverage information systems, communicate risk information, report on risk/culture/performance

ISO 31000:2018 Framework: Principles-based approach applicable to any organization:

  • Principles: Integrated, structured, customized, inclusive, dynamic, best available information, human/cultural factors, continual improvement
  • Framework: Leadership commitment, integration, design, implementation, evaluation, improvement
  • Process: Scope/context/criteria, risk assessment (identify, analyze, evaluate), risk treatment, monitoring/review, recording/reporting, communication/consultation

Framework Selection Decision Tree

Is the organization publicly traded or heavily regulated?
├── YES → COSO ERM (aligns with SEC/SOX expectations, board governance)
│   └── Is the organization a financial institution?
│       ├── YES → COSO ERM + Basel III/IV operational risk overlays
│       └── NO → COSO ERM standard implementation
└── NO → ISO 31000 (more flexible, principle-based)
    └── Is the organization operating internationally?
        ├── YES → ISO 31000 (internationally recognized standard)
        └── NO → ISO 31000 or simplified ERM tailored to size

ERM Maturity Assessment

Rate the organization on each dimension (1 = Ad Hoc, 5 = Optimized):

| Dimension | 1 - Ad Hoc | 2 - Initial | 3 - Defined | 4 - Managed | 5 - Optimized |
|---|---|---|---|---|---|
| Governance | No formal oversight | Risk discussed informally | Risk committee exists | Board reviews quarterly | Risk integrated into strategy |
| Risk Identification | Reactive only | Annual brainstorming | Structured process | Continuous scanning | Predictive analytics |
| Risk Assessment | Qualitative only | Basic scoring | Calibrated scales | Quantitative modeling | Monte Carlo / VaR |
| Risk Response | Fire-fighting | Basic controls | Defined strategies | Optimized portfolio | Dynamic hedging |
| Monitoring | None | Periodic reviews | KRIs defined | Real-time dashboards | Automated alerts |
| Culture | Risk-unaware | Risk-averse/siloed | Risk-aware | Risk-informed decisions | Risk-intelligent |
| Reporting | None | Ad hoc reports | Standardized reports | Integrated dashboards | Predictive reporting |

Maturity Scoring:

  • 7-14: Initial — Foundational work needed, start with governance and basic identification
  • 15-21: Developing — Build structured processes and calibrated assessment
  • 22-28: Established — Advance to quantitative methods and integrated reporting
  • 29-35: Leading — Optimize with predictive analytics and dynamic risk management
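The scoring above maps directly to a small helper. A sketch (the function name `erm_maturity_band` is hypothetical; the band cutoffs are the ones listed):

```python
def erm_maturity_band(dimension_scores: list[int]) -> tuple[int, str]:
    """Sum the seven 1-5 dimension ratings and map to a maturity band."""
    if len(dimension_scores) != 7 or not all(1 <= s <= 5 for s in dimension_scores):
        raise ValueError("expected seven scores, each between 1 and 5")
    total = sum(dimension_scores)
    if total <= 14:
        band = "Initial"
    elif total <= 21:
        band = "Developing"
    elif total <= 28:
        band = "Established"
    else:
        band = "Leading"
    return total, band

score, band = erm_maturity_band([3, 2, 3, 2, 3, 3, 2])  # total 18 -> Developing
```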

Risk Identification and Categorization

Risk Category Taxonomy

1. Strategic Risks — Threats to achieving long-term objectives

  • Market disruption and technology shifts
  • Competitive dynamics (new entrants, substitutes, consolidation)
  • M&A execution and integration risk
  • Geographic/market expansion risk
  • Business model obsolescence
  • Strategic misalignment between units

2. Operational Risks — Failures in people, processes, systems, or external events

  • Supply chain disruption (single-source dependency, logistics failure)
  • Quality failures and product defects
  • IT system outages and infrastructure failure
  • Process breakdowns and human error
  • Talent/key person dependency
  • Health and safety incidents
  • Fraud and internal misconduct

3. Financial Risks — Exposure to financial loss

  • Credit risk (customer default, counterparty failure)
  • Liquidity risk (cash flow timing, access to capital)
  • Market risk (interest rates, currency, commodity prices)
  • Revenue concentration (customer, product, geography)
  • Capital structure and leverage risk
  • Financial reporting and accounting errors

4. Compliance Risks — Violations of laws, regulations, or internal policies

  • Regulatory change and new legislation
  • Data privacy (GDPR, CCPA, sector-specific)
  • Anti-corruption / anti-bribery (FCPA, UK Bribery Act)
  • Environmental regulations and ESG mandates
  • Industry-specific compliance (healthcare, finance, energy)
  • Contractual and licensing obligations

5. Reputational Risks — Damage to brand, stakeholder trust, or social license

  • Product safety incidents and recalls
  • Data breaches and customer data exposure
  • Social media crises and viral negative coverage
  • Executive misconduct or ethical failures
  • Environmental or social responsibility failures
  • Customer experience failures at scale

6. Technology Risks — Cyber, digital, and emerging technology threats

  • Cybersecurity breaches (ransomware, data exfiltration, DDoS)
  • Legacy system failure and technical debt
  • AI/ML model risk and algorithmic bias
  • Cloud provider outages and vendor lock-in
  • Intellectual property theft
  • Digital transformation execution failure

7. External/Macro Risks — Forces beyond organizational control

  • Geopolitical instability and trade restrictions
  • Pandemic and public health emergencies
  • Natural disasters and climate-related events
  • Economic recession and market downturns
  • Social unrest and political instability
  • Infrastructure failure (power grid, telecom, transportation)

Risk Identification Methods

Use multiple techniques to ensure comprehensive coverage:

  1. Structured brainstorming workshops — Cross-functional teams, PESTLE prompts (Political, Economic, Social, Technological, Legal, Environmental)
  2. Process mapping and failure mode analysis — Walk through key processes and identify failure points
  3. Historical loss analysis — Review past incidents, near-misses, insurance claims, audit findings
  4. Industry benchmarking — Study peer company 10-K risk factors, industry loss databases
  5. Scenario analysis — "What if" exercises for extreme but plausible events
  6. Key stakeholder interviews — Board members, executives, front-line managers, customers, suppliers
  7. Emerging risk scanning — Horizon scanning for new/evolving threats (technology, regulation, geopolitics)

Risk Quantification

Probability x Impact Scoring

Probability Scale (calibrated):

| Level | Label | Probability Range | Calibration Guidance |
|---|---|---|---|
| 1 | Rare | <5% in next 12 months | Has never occurred; would be unprecedented |
| 2 | Unlikely | 5-20% | Has occurred once in past 10 years in industry |
| 3 | Possible | 20-50% | Has occurred multiple times in industry; could happen |
| 4 | Likely | 50-80% | Has occurred at this organization or frequently in industry |
| 5 | Almost Certain | >80% | Expected to occur; has occurred multiple times recently |

Impact Scale (multi-dimensional):

| Level | Financial Impact | Operational Impact | Reputational Impact | Safety Impact |
|---|---|---|---|---|
| 1 - Insignificant | <$100K or <0.1% revenue | Minor process disruption, <4 hours | Internal awareness only | First aid only |
| 2 - Minor | $100K-$1M or 0.1-1% revenue | Operational disruption, <1 day | Local media coverage | Medical treatment |
| 3 - Moderate | $1M-$10M or 1-5% revenue | Significant disruption, 1-7 days | National media, social media attention | Serious injury |
| 4 - Major | $10M-$50M or 5-15% revenue | Major disruption, 1-4 weeks | Sustained negative coverage, customer loss | Life-changing injury |
| 5 - Catastrophic | >$50M or >15% revenue | Extended shutdown, >1 month | Existential brand damage, regulatory action | Fatality |

Note: Calibrate financial thresholds to the organization's revenue and margin profile. The ranges above suit a mid-market company ($100M-$1B revenue).

Expected Loss Modeling

Expected Loss = Probability × Financial Impact (midpoint)

Example:
- Risk: Key supplier failure
- Probability: 30% (Level 3)
- Financial Impact: $5M (Level 3 midpoint)
- Expected Loss: 0.30 × $5,000,000 = $1,500,000

Annualized Loss Expectancy (ALE):
ALE = Annual Rate of Occurrence (ARO) × Single Loss Expectancy (SLE)
- ARO: 0.3 events/year
- SLE: $5,000,000
- ALE: $1,500,000
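The ALE calculation above is a one-line formula; a minimal sketch with the same supplier-failure numbers:

```python
def annualized_loss_expectancy(annual_rate: float, single_loss: float) -> float:
    """ALE = Annual Rate of Occurrence (ARO) x Single Loss Expectancy (SLE)."""
    return annual_rate * single_loss

# Key supplier failure example from above: 0.3 events/year at $5M per event.
ale = annualized_loss_expectancy(annual_rate=0.3, single_loss=5_000_000)  # ~$1.5M/year
```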

Value at Risk (VaR) Concepts — Simplified

VaR answers: "What is the maximum loss we would expect over a given time period at a specified confidence level?"

  • 95% VaR of $10M over 1 year means: "We are 95% confident our losses will not exceed $10M in the next year."
  • In other words, there is a 5% chance losses could be worse than $10M.
  • Limitations: VaR does not tell you how bad things could get beyond the threshold (use Conditional VaR / Expected Shortfall for that).

For non-financial contexts, express VaR conceptually:

  • "In 19 out of 20 years, our total risk losses should be below $X."
  • "In the worst 1-in-20 year, we could lose more than $X."
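In practice VaR is usually estimated by simulation rather than a closed-form formula. The sketch below is a minimal Monte Carlo illustration of the 95% VaR idea; the three-risk portfolio (annual probability, loss range) is entirely invented for illustration.

```python
import random

def simulate_annual_loss(risks, rng):
    """One simulated year: each risk occurs independently with its annual
    probability; if it occurs, its loss is drawn uniformly from [low, high]."""
    total = 0.0
    for prob, low, high in risks:
        if rng.random() < prob:
            total += rng.uniform(low, high)
    return total

# Hypothetical portfolio: (annual probability, min loss $, max loss $).
portfolio = [(0.30, 2e6, 8e6), (0.10, 1e6, 4e6), (0.05, 5e6, 20e6)]

rng = random.Random(42)  # fixed seed so the estimate is reproducible
losses = sorted(simulate_annual_loss(portfolio, rng) for _ in range(10_000))
var_95 = losses[int(0.95 * len(losses))]  # loss ceiling for 19 out of 20 years
```

Reading `var_95` directly gives the "in 19 out of 20 years, losses stay below $X" statement; averaging the losses beyond that index would give the Expected Shortfall mentioned above.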

Risk Appetite and Tolerance

Definitions

  • Risk Appetite: The broad level of risk an organization is willing to accept in pursuit of its strategic objectives. Set by the Board. Expressed qualitatively and quantitatively.
  • Risk Tolerance: The specific, measurable boundaries around individual risk categories or metrics. Operational limits within the appetite.
  • Risk Capacity: The maximum level of risk the organization can absorb before viability is threatened.

Risk Appetite Statement Template

[Organization Name] Risk Appetite Statement

Overall Appetite: [Organization] accepts [moderate/conservative/aggressive] levels of
risk in pursuit of [strategic objectives]. We will not accept risks that could
[threaten solvency / cause regulatory sanctions / endanger safety / damage brand
beyond recovery].

By Category:
- Strategic Risk: [Appetite level] — We [will/will not] pursue [types of strategic bets]
- Operational Risk: [Appetite level] — We accept up to [X days] of disruption
  with financial impact not exceeding [$Y]
- Financial Risk: [Appetite level] — Maximum acceptable earnings volatility of
  [X%]; minimum liquidity ratio of [Y]
- Compliance Risk: ZERO TOLERANCE for willful regulatory violations
- Reputational Risk: [Appetite level] — We will not accept risks that could
  result in [specific reputational thresholds]
- Technology/Cyber Risk: [Appetite level] — Maximum acceptable downtime of
  [X hours]; zero tolerance for customer data breaches

Approved by: [Board of Directors]
Date: [Date]
Review Frequency: [Annual / Semi-annual]

Tolerance Threshold Examples

| Risk Category | Green (Within Appetite) | Amber (Approaching Limit) | Red (Exceeds Tolerance) |
|---|---|---|---|
| Financial Loss (single event) | <$1M | $1M-$5M | >$5M |
| Revenue Concentration | Top customer <15% | Top customer 15-25% | Top customer >25% |
| System Downtime | <4 hours/quarter | 4-24 hours/quarter | >24 hours/quarter |
| Safety Incidents | 0 lost-time injuries | 1-2 lost-time injuries/year | >2 or any fatality |
| Compliance Violations | 0 material findings | 1-2 minor findings | Any material finding |
| Employee Turnover | <15% annual | 15-25% annual | >25% annual |
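Thresholds like these are easy to operationalize as a traffic-light check on each KRI. A sketch for "higher is worse" metrics (the turnover thresholds come from the table; `rag_status` is a hypothetical helper name):

```python
def rag_status(value: float, amber_threshold: float, red_threshold: float) -> str:
    """Map a 'higher is worse' KRI reading to Green/Amber/Red.
    Green below amber_threshold; Red strictly above red_threshold."""
    if value > red_threshold:
        return "Red"
    if value >= amber_threshold:
        return "Amber"
    return "Green"

# Employee turnover tolerance: Green <15%, Amber 15-25%, Red >25%.
turnover_status = rag_status(0.20, amber_threshold=0.15, red_threshold=0.25)
```

"Lower is worse" metrics (e.g., minimum liquidity ratios) would need the comparisons inverted.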

Risk Register Construction

Required Fields

Every risk register entry should contain:

| Field | Description |
|---|---|
| Risk ID | Unique identifier (e.g., STR-001, OPS-015) |
| Category | Strategic / Operational / Financial / Compliance / Reputational / Technology / External |
| Risk Description | Clear, specific statement: "Risk that [event] occurs, causing [consequence]" |
| Risk Owner | Named individual accountable for managing the risk |
| Inherent Probability | Score before controls (1-5) |
| Inherent Impact | Score before controls (1-5) |
| Inherent Risk Score | Probability x Impact (1-25) |
| Existing Controls | Current mitigation measures in place |
| Control Effectiveness | Effective / Partially Effective / Ineffective |
| Residual Probability | Score after controls (1-5) |
| Residual Impact | Score after controls (1-5) |
| Residual Risk Score | Probability x Impact (1-25) |
| Risk Response | Accept / Avoid / Transfer / Mitigate |
| Action Plan | Specific actions to further reduce risk |
| Target Risk Score | Desired residual risk level |
| Status | Open / In Progress / Monitoring / Closed |
| Last Review Date | Date of most recent review |
| Next Review Date | Scheduled review date |

Risk Heat Map (5x5 Matrix)

Impact →        1-Insignif.  2-Minor    3-Moderate   4-Major    5-Catastrophic
Probability ↓
5-Almost Certain   [5-MED]    [10-HIGH]  [15-CRIT]   [20-CRIT]   [25-CRIT]
4-Likely           [4-MED]    [8-HIGH]   [12-HIGH]   [16-CRIT]   [20-CRIT]
3-Possible         [3-LOW]    [6-MED]    [9-HIGH]    [12-HIGH]   [15-CRIT]
2-Unlikely         [2-LOW]    [4-MED]    [6-MED]     [8-HIGH]    [10-HIGH]
1-Rare             [1-LOW]    [2-LOW]    [3-LOW]     [4-MED]     [5-MED]

Zone Definitions:
- CRITICAL (15-25): Immediate executive attention, mandatory mitigation plan within 30 days
- HIGH (8-14): Senior management attention, mitigation plan within 60 days
- MEDIUM (4-7): Management monitoring, review quarterly
- LOW (1-3): Accept and monitor, review annually
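The matrix and zone cutoffs can be encoded in one function, e.g. inside a risk register tool. A sketch using the zone boundaries above:

```python
def risk_zone(probability: int, impact: int) -> tuple[int, str]:
    """Score a risk on the 5x5 heat map and return (score, zone).
    Zones: CRITICAL 15-25, HIGH 8-14, MEDIUM 4-7, LOW 1-3."""
    if not (1 <= probability <= 5 and 1 <= impact <= 5):
        raise ValueError("probability and impact must each be 1-5")
    score = probability * impact
    if score >= 15:
        zone = "CRITICAL"
    elif score >= 8:
        zone = "HIGH"
    elif score >= 4:
        zone = "MEDIUM"
    else:
        zone = "LOW"
    return score, zone
```

Checking a few cells against the matrix: Likely (4) x Catastrophic (5) gives 20-CRITICAL, and Possible (3) x Major (4) gives 12-HIGH, matching the grid.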

Risk Mitigation Strategy — Decision Framework

The 4T Framework

Is the risk within our risk appetite?
├── YES → ACCEPT (Tolerate)
│   ├── Document acceptance rationale
│   ├── Monitor for changes in probability/impact
│   └── Set trigger points for reassessment
│
└── NO → Can we eliminate the risk source entirely?
    ├── YES → AVOID (Terminate)
    │   ├── Exit the activity, market, or product line
    │   └── Cost-benefit: Is avoidance worth the opportunity cost?
    │
    └── NO → Can a third party bear the risk more efficiently?
        ├── YES → TRANSFER
        │   ├── Insurance (property, liability, cyber, D&O)
        │   ├── Contractual transfer (indemnification, limitation of liability)
        │   ├── Outsourcing (but retain oversight and residual risk)
        │   └── Hedging (financial instruments for market risk)
        │
        └── NO → MITIGATE (Treat)
            ├── Preventive controls (reduce probability)
            │   └── Training, process redesign, automation, redundancy
            ├── Detective controls (identify occurrence quickly)
            │   └── Monitoring, alerts, audits, inspections
            └── Corrective controls (reduce impact when it occurs)
                └── Incident response plans, backup systems, crisis comms

Mitigation Cost-Benefit Analysis

Should we invest in this mitigation?

Mitigation Value = (Expected Loss Before - Expected Loss After) - Cost of Mitigation

Example:
- Current Expected Loss: $1,500,000/year (30% × $5M)
- After Mitigation: $300,000/year (10% × $3M)
- Annual Risk Reduction: $1,200,000
- Cost of Mitigation: $400,000/year
- Net Value: $800,000/year → INVEST

Rule of thumb: Invest if mitigation cost < 50% of expected loss reduction
(accounts for uncertainty in estimates)
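The cost-benefit test above, as a sketch (numbers from the example; `mitigation_value` is a hypothetical helper name):

```python
def mitigation_value(p_before: float, loss_before: float,
                     p_after: float, loss_after: float,
                     annual_cost: float) -> float:
    """Net annual value of a mitigation: expected-loss reduction minus cost."""
    reduction = p_before * loss_before - p_after * loss_after
    return reduction - annual_cost

# Example from above: 30% x $5M down to 10% x $3M, at $400K/year.
net = mitigation_value(0.30, 5_000_000, 0.10, 3_000_000, 400_000)  # ~$800K -> invest

# Rule of thumb: invest only if cost < 50% of the expected-loss reduction.
rule_of_thumb_ok = 400_000 < 0.5 * (0.30 * 5_000_000 - 0.10 * 3_000_000)
```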

Scenario-Based Risk Assessment

Stress Testing Process

  1. Select scenarios — Choose 3-5 extreme but plausible scenarios relevant to the organization
  2. Define parameters — Specify severity, duration, and scope for each scenario
  3. Model impact — Quantify financial, operational, and strategic impact
  4. Test resilience — Assess whether the organization can survive (capital, liquidity, operations)
  5. Identify gaps — Document where current controls and resources are insufficient
  6. Develop responses — Create contingency plans for each scenario

Standard Stress Scenarios:

  • Major customer loss (top 3 customers leave within 6 months)
  • Key supplier failure (primary supplier ceases operations)
  • Cybersecurity breach (customer data exfiltrated, 30-day remediation)
  • Economic recession (revenue drops 20-30%, credit tightens)
  • Regulatory change (major new compliance requirement, 12-month implementation)
  • Key person departure (CEO or critical technical leader leaves suddenly)
  • Natural disaster (primary facility destroyed, 90-day recovery)
  • Pandemic (50% workforce unavailable for 3 months)

Reverse Stress Testing

Work backward from failure: "What combination of events would cause the organization to fail?"

Steps:

  1. Define "failure" (insolvency, regulatory shutdown, permanent brand destruction)
  2. Brainstorm event combinations that could cause failure
  3. Assess plausibility of each combination
  4. Identify the most plausible paths to failure
  5. Build early warning indicators for those paths
  6. Design preventive actions to block the most plausible failure paths

Regulatory Compliance Assessment

Compliance Risk Assessment Process

  1. Regulatory inventory — List all applicable laws, regulations, standards, and contractual obligations
  2. Obligation mapping — Map each regulation to specific organizational processes and functions
  3. Gap analysis — Assess current compliance status against each obligation
  4. Risk scoring — Rate likelihood and impact of non-compliance for each obligation
  5. Remediation planning — Prioritize and plan actions to close gaps
  6. Monitoring design — Establish ongoing compliance monitoring and testing

Common Regulatory Domains

| Domain | Key Regulations | Risk if Non-Compliant |
|---|---|---|
| Data Privacy | GDPR, CCPA, HIPAA | Fines up to 4% global revenue, lawsuits, reputation |
| Financial | SOX, Basel III, Dodd-Frank | Fines, restatements, license revocation |
| Anti-Corruption | FCPA, UK Bribery Act | Criminal liability, massive fines, debarment |
| Environmental | EPA, EU Green Deal, ESG | Fines, remediation costs, operating restrictions |
| Employment | FLSA, EEOC, OSHA | Lawsuits, fines, operational disruption |
| Industry-Specific | FDA, FCC, FINRA, PCI-DSS | License revocation, product recalls, fines |

Business Continuity Planning (BCP)

BCP Development Process

Phase 1: Business Impact Analysis (BIA)

  • Identify critical business functions and processes
  • Determine Maximum Tolerable Downtime (MTD) for each function
  • Establish Recovery Time Objectives (RTO) and Recovery Point Objectives (RPO)
  • Quantify financial and operational impact of disruption over time

Phase 2: Strategy Development

  • Identify recovery strategies for each critical function
  • Resource requirements (people, technology, facilities, data, suppliers)
  • Alternative operating procedures (manual workarounds, alternate sites)
  • Third-party dependencies and backup arrangements

Phase 3: Plan Documentation

  • Emergency response procedures (first 0-4 hours)
  • Crisis management protocols (4-72 hours)
  • Business recovery procedures (72 hours to full recovery)
  • Communication plans (internal, external, regulatory, media)
  • Roles and responsibilities (incident commander, crisis team, functional leads)

Phase 4: Testing and Maintenance

  • Tabletop exercises (quarterly)
  • Functional tests of specific recovery procedures (semi-annually)
  • Full-scale simulation (annually)
  • Plan updates after every test and after significant organizational changes

Crisis Management Communication Template

INITIAL CRISIS COMMUNICATION (within 2 hours of incident)

To: [Stakeholder group]
From: [Authorized spokesperson]
Date/Time: [Timestamp]

What happened: [Brief factual description — confirmed facts only]
What we are doing: [Immediate actions taken]
What we know: [Confirmed information]
What we don't know yet: [Acknowledge gaps honestly]
Next update: [Specific time for next communication]
Contact: [Point of contact for questions]

Key principles:
- Be first (own the narrative)
- Be factual (no speculation)
- Be empathetic (acknowledge impact on affected parties)
- Be actionable (tell people what to do)

Worked Example: Mid-Market Manufacturing Company Risk Assessment

Company Profile: $250M revenue manufacturer, 800 employees, 3 facilities, sells to automotive and industrial customers.

Top 5 Risks Identified

| Risk ID | Risk | Prob | Impact | Score | Response |
|---|---|---|---|---|---|
| STR-001 | EV transition reduces demand for legacy auto components | 4 | 5 | 20-CRIT | Mitigate: Invest in EV component R&D, diversify customer base |
| OPS-003 | Single-source supplier for critical raw material fails | 3 | 4 | 12-HIGH | Mitigate: Qualify second supplier, build 90-day safety stock |
| FIN-002 | Top 3 customers represent 55% of revenue | 4 | 4 | 16-CRIT | Mitigate: Accelerate new customer acquisition, cap single customer at 20% |
| TEC-001 | Ransomware attack on OT systems shuts production | 3 | 5 | 15-CRIT | Mitigate: OT/IT segmentation, offline backups, incident response plan |
| COM-004 | New EPA emissions standards require $15M facility upgrade | 4 | 3 | 12-HIGH | Mitigate: Phase investment over 3 years, apply for green financing |

Risk Response Plan for STR-001 (EV Transition)

Risk: EV transition reduces demand for legacy automotive components
Current State: 40% of revenue from internal combustion engine (ICE) components
Risk Score: 20 (Critical)
Risk Owner: Chief Strategy Officer

Mitigation Actions:
1. Invest $10M over 3 years in EV component R&D (battery housings, power electronics)
2. Hire EV engineering team (5 engineers by Q2)
3. Target 3 EV OEM qualification programs by year-end
4. Reduce ICE revenue dependency to <25% within 5 years
5. Monitor ICE-to-EV transition pace quarterly (leading indicators: EV sales %, OEM announcements)

Key Risk Indicators:
- ICE component order backlog (trigger: <6 months vs. target >12 months)
- EV revenue as % of total (target: >15% by Year 3)
- OEM customer EV transition announcements (track quarterly)

Residual Risk Score (after mitigation): 3 × 4 = 12 (High, but managed)
Target Risk Score (Year 3): 2 × 3 = 6 (Medium)

Reference Materials

For detailed guidance, refer to:

  • references/risk-quantification-guide.md — Probability estimation, impact scales, VaR, KRI design
  • references/risk-register-templates.md — Complete register templates, taxonomy, reporting formats, worked examples
  • references/monte-carlo-guide.md — Simulation setup, Python code, interpretation, executive communication

Strategy Frameworks

Apply classic and modern strategy frameworks to structure thinking and generate strategic options. Use this skill when the user mentions: strategy framework, SWOT, BCG matrix, growth-share matrix, Ansoff matrix, McKinsey 7S, value chain, business model canvas, blue ocean, core competency, strategic planning, mission vision, OKRs, balanced scorecard, GE-McKinsey matrix, strategic options, strategy formulation, Porter's generic strategies, three horizons, or strategic decision-making.

You are a strategy framework specialist. Apply the right framework to the right problem, generate strategic options, and structure ambiguous problems into actionable analysis.

Situation Assessment Frameworks

SWOT Analysis

Organize findings into four quadrants with evidence for each item:

Strengths (Internal, Positive): What does the organization do well? What unique resources or capabilities does it have? What do customers cite as advantages?

Weaknesses (Internal, Negative): Where does the organization underperform? What resources are lacking? What do customers complain about?

Opportunities (External, Positive): What market trends favor the organization? What unmet needs exist? What regulatory changes create openings?

Threats (External, Negative): What competitive moves threaten position? What regulatory changes create risk? What market shifts could harm the business?

TOWS Matrix (Strategic Options from SWOT)

Generate strategies from SWOT intersections:

  • SO Strategies (Strengths × Opportunities): Use strengths to capitalize on opportunities — aggressive growth plays
  • WO Strategies (Weaknesses × Opportunities): Address weaknesses to exploit opportunities — improvement-driven plays
  • ST Strategies (Strengths × Threats): Use strengths to mitigate threats — defensive plays
  • WT Strategies (Weaknesses × Threats): Minimize weaknesses and avoid threats — survival plays

Current State Assessment Template

Answer "Where are we now?" across four dimensions:

  1. Financial performance: revenue trend, profitability trend, cash position
  2. Market position: market share, brand strength, customer satisfaction
  3. Capabilities: core competencies, talent, technology, processes
  4. Culture: values alignment, employee engagement, adaptability

Growth & Portfolio Strategy

Ansoff Matrix

Four growth strategies with increasing risk:

| | Existing Products | New Products |
|---|---|---|
| Existing Markets | Market Penetration (lowest risk) | Product Development (medium risk) |
| New Markets | Market Development (medium risk) | Diversification (highest risk) |

  • Market Penetration: Increase share in current market — pricing, promotion, distribution, customer retention
  • Market Development: Take current products to new markets — new geographies, new segments, new channels
  • Product Development: Create new products for current customers — R&D, acquisitions, partnerships
  • Diversification: New products for new markets — related (synergies) vs. unrelated (conglomerate)

Success rate benchmarks: Market Penetration ~70%, Market Development ~50%, Product Development ~40%, Diversification ~25%

BCG Growth-Share Matrix

Classify business units or products into four quadrants:

| | High Market Share | Low Market Share |
|---|---|---|
| High Market Growth | Stars (invest heavily) | Question Marks (selective investment or divest) |
| Low Market Growth | Cash Cows (harvest) | Dogs (divest or reposition) |

Resource allocation: Use Cash Cow profits to fund Stars and select Question Marks. Divest Dogs unless they serve a strategic purpose.

GE-McKinsey Nine-Box Matrix

Plot business units on two axes (each scored 1-5):

  • X-axis: Competitive Strength — market share, brand strength, profit margins, technological capability, management quality
  • Y-axis: Industry Attractiveness — market size, growth rate, profitability, competitive intensity, technological requirements, environmental impact

Scoring: Weight each factor, score 1-5, calculate weighted average for each axis. Place in one of nine boxes: Invest/Grow (top-left), Hold/Selective (middle), Harvest/Divest (bottom-right).
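The scoring step can be sketched in Python. The factor names, weights, and the thirds-based mapping from two axis scores to a strategic posture are illustrative assumptions, not a prescribed calibration:

```python
# Weighted-average each axis (1-5), then bucket the two scores into thirds
# and map to one of the three strategic postures. Weights are illustrative.

def axis_score(factors):
    """factors: list of (weight, score 1-5) pairs; weights should sum to 1.0."""
    return sum(w * s for w, s in factors)

def nine_box(strength, attractiveness):
    """Conventional triangle mapping: three invest boxes, three hold, three harvest."""
    def band(x):  # split the 1-5 scale into thirds: 0=low, 1=medium, 2=high
        return 0 if x < 2.33 else 1 if x < 3.67 else 2
    total = band(strength) + band(attractiveness)
    return "Invest/Grow" if total >= 3 else "Hold/Selective" if total == 2 else "Harvest/Divest"

strength = axis_score([(0.4, 4), (0.3, 3), (0.3, 4)])   # e.g. share, brand, margins
attract = axis_score([(0.5, 4), (0.3, 3), (0.2, 2)])    # e.g. size, growth, intensity
print(round(strength, 2), round(attract, 2), nine_box(strength, attract))
```

The same `axis_score` helper works for any weighted-factor axis in this reference.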

Three Horizons of Growth

  • Horizon 1 (Core): Optimize and defend the current core business. Timeframe: 0-2 years. ~70% of resources.
  • Horizon 2 (Adjacent): Extend into adjacent markets, segments, or capabilities. Timeframe: 2-5 years. ~20% of resources.
  • Horizon 3 (Transformational): Create entirely new businesses or capabilities. Timeframe: 5-10 years. ~10% of resources.

Competitive Strategy Frameworks

Porter's Generic Strategies

Three strategic positions (avoid "stuck in the middle"):

  • Cost Leadership: Lowest cost producer in the industry → compete on price or earn higher margins
  • Differentiation: Unique product/service attributes that customers value → command price premium
  • Focus (Niche): Concentrate on a narrow segment — either cost focus or differentiation focus

Trade-offs: Cost leadership requires scale and efficiency, potentially at the expense of customization. Differentiation requires investment in quality, innovation, or brand. Trying to do both often results in mediocrity.

Value Chain Analysis (Porter)

Primary Activities:

  1. Inbound Logistics — receiving, warehousing, inventory management
  2. Operations — production, assembly, quality control
  3. Outbound Logistics — distribution, delivery, order fulfillment
  4. Marketing & Sales — advertising, pricing, channel management, sales force
  5. Service — customer support, maintenance, warranty, training

Support Activities:

  1. Firm Infrastructure — management, finance, legal, planning
  2. Human Resource Management — recruiting, training, compensation
  3. Technology Development — R&D, IT, process automation
  4. Procurement — sourcing, supplier management, purchasing

For each activity: Assess cost, efficiency, and contribution to differentiation. Identify activities where the firm excels (sources of advantage) and where it lags (improvement opportunities).

Blue Ocean Strategy Canvas

Plot competitors on a value curve (features on X-axis, offering level on Y-axis). Apply the Four Actions Framework:

  • Eliminate: Which factors that the industry takes for granted should be eliminated?
  • Reduce: Which factors should be reduced well below the industry standard?
  • Raise: Which factors should be raised well above the industry standard?
  • Create: Which factors should be created that the industry has never offered?

The new value curve should be distinct from competitors — that is the blue ocean.

Core Competency Analysis (Prahalad & Hamel)

A core competency must pass three tests:

  1. Customer Value: Does it contribute significantly to the perceived customer benefit?
  2. Competitive Differentiation: Is it difficult for competitors to replicate?
  3. Extendability: Can it be leveraged across multiple products, markets, or businesses?

If all three = Yes → true core competency. Invest and protect it.

Business Model & Innovation

Business Model Canvas (Osterwalder)

Nine building blocks:

  1. Customer Segments: Who are we creating value for?
  2. Value Propositions: What value do we deliver to each segment?
  3. Channels: How do we reach and deliver to customers?
  4. Customer Relationships: What type of relationship does each segment expect?
  5. Revenue Streams: How does each segment pay and how much?
  6. Key Resources: What assets are required to deliver the value proposition?
  7. Key Activities: What activities are critical to the business model?
  8. Key Partnerships: Who are key partners and suppliers?
  9. Cost Structure: What are the major cost drivers?

Business Model Patterns

Common patterns to consider:

  • Freemium: Free basic tier, paid premium features
  • Razor-and-blade: Low-cost base product, high-margin consumables
  • Platform / Marketplace: Connect supply and demand, take a percentage
  • Subscription / SaaS: Recurring revenue for ongoing access
  • Direct-to-Consumer: Bypass traditional distribution

Organizational Strategy

McKinsey 7S Framework

Seven elements that must be aligned for organizational effectiveness:

Hard elements (easier to change):

  • Strategy: The plan for achieving competitive advantage
  • Structure: How the organization is organized (reporting lines, divisions)
  • Systems: Processes, workflows, IT systems, performance management

Soft elements (harder to change):

  • Shared Values: Core beliefs and attitudes (at center)
  • Style: Leadership approach and organizational culture
  • Staff: Employee capabilities, demographics, and development
  • Skills: Organizational competencies and capabilities

Assessment: Rate alignment between each pair of elements. Misalignment = source of organizational friction.

Balanced Scorecard

Four perspectives, each with objectives, measures, targets, and initiatives:

  1. Financial: Revenue growth, profitability, ROIC, cash flow
  2. Customer: Satisfaction, retention, market share, acquisition
  3. Internal Process: Quality, cycle time, productivity, innovation
  4. Learning & Growth: Employee skills, technology, culture, knowledge management

OKR Framework

  • Objective: Qualitative goal — ambitious, inspiring, time-bound (quarterly or annual)
  • Key Results: 2-5 measurable outcomes that indicate the objective is achieved
  • Key Results should be quantitative, specific, and verifiable
  • Cascade: Company OKRs → Department OKRs → Team OKRs (aligned but not duplicated)

Decision & Prioritization Frameworks

Impact vs. Effort Matrix (2×2)

Plot initiatives on two axes:

  • Y-axis: Impact (High to Low)
  • X-axis: Effort (Low to High)

Quadrants: Quick Wins (high impact, low effort — do first), Major Projects (high impact, high effort — plan carefully), Fill-Ins (low impact, low effort — do if time permits), Thankless Tasks (low impact, high effort — avoid)

Weighted Scoring Model

  1. Define evaluation criteria (e.g., strategic fit, financial impact, feasibility, risk)
  2. Assign weights to each criterion (must sum to 100%)
  3. Score each option on each criterion (1-5 scale)
  4. Calculate weighted total for each option
  5. Rank options by weighted score
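The five steps above can be sketched in Python. The criteria, weights, and scores are placeholders:

```python
# Weighted scoring model: weights sum to 100%, options scored 1-5 per
# criterion, then ranked by weighted total. All inputs are illustrative.

weights = {"strategic_fit": 0.35, "financial_impact": 0.30,
           "feasibility": 0.20, "risk": 0.15}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 100%

options = {
    "Option A": {"strategic_fit": 4, "financial_impact": 5, "feasibility": 3, "risk": 2},
    "Option B": {"strategic_fit": 3, "financial_impact": 3, "feasibility": 5, "risk": 4},
}

def weighted_total(scores):
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(options, key=lambda o: weighted_total(options[o]), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_total(options[name]):.2f}")
```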

Decision Tree Analysis

For decisions under uncertainty:

  1. Map decision points (squares) and chance events (circles)
  2. Assign probabilities to each chance event (the branches of each chance node must sum to 100%)
  3. Assign payoff/value to each outcome
  4. Calculate expected value by working backward from outcomes
  5. Choose the branch with the highest expected value
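A minimal Python sketch of the rollback calculation; the probabilities and payoffs are invented for illustration:

```python
# Rollback sketch: chance nodes take probability-weighted expected values,
# decision nodes take the best branch.

def chance(branches):
    """branches: list of (probability, value); probabilities must sum to 1."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * v for p, v in branches)

def decide(choices):
    """choices: dict of option -> expected value; returns (best option, value)."""
    best = max(choices, key=choices.get)
    return best, choices[best]

# Launch: 60% success (+$5M), 40% failure (-$2M); alternative: do nothing ($0).
launch_ev = chance([(0.6, 5_000_000), (0.4, -2_000_000)])
best, ev = decide({"launch": launch_ev, "do nothing": 0})
print(best, ev)  # prints: launch 2200000.0
```

For multi-stage trees, apply `chance` and `decide` from the leaves backward.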

How to Choose the Right Framework

| Situation | Recommended Framework | Reason |
|-----------|----------------------|--------|
| Entering a new market | Ansoff Matrix + Porter's Five Forces | Assesses growth direction and market attractiveness |
| Improving profitability | Value Chain Analysis + Cost Benchmarking | Identifies cost reduction and efficiency opportunities |
| Assessing business portfolio | BCG Matrix or GE-McKinsey Nine-Box | Enables resource allocation decisions across units |
| Organizational restructuring | McKinsey 7S | Ensures alignment between strategy and organization |
| Responding to disruption | Blue Ocean Strategy + Three Horizons | Identifies new value creation opportunities |
| Evaluating M&A target | VRIO + DCF + Synergy Analysis | Assesses strategic fit and financial value |
| Launching new product | Business Model Canvas + Ansoff Matrix | Structures the business model and assesses risk |
| Strategic planning | SWOT/TOWS + Balanced Scorecard + OKRs | Structures analysis, measurement, and execution |

Modern Strategy Frameworks

Jobs-to-Be-Done (JTBD)

Core Concept: Customers don't buy products — they "hire" them to get a job done. Focus on the job, not the product.

JTBD Statement Format: "When [situation], I want to [motivation/job], so I can [expected outcome]."

JTBD Analysis Process:

  1. Identify the job: What functional, emotional, and social jobs are customers trying to accomplish?
  2. Map the job steps: Awareness → Search → Evaluate → Acquire → Use → Monitor → Modify → Complete
  3. Find unmet needs: Where are customers over-served (opportunity for cost disruption) or under-served (opportunity for innovation)?
  4. Rate importance vs. satisfaction: For each need, score importance (1-10) and current satisfaction (1-10). High importance + Low satisfaction = biggest opportunity.
  5. Define job-based segments: Group customers by the job they're hiring for, not by demographics.

Opportunity Score = Importance + (Importance - Satisfaction). Scores above 12 are strong innovation targets.
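A short Python sketch of the scoring step. The needs and ratings are invented; note that a common variant floors the satisfaction gap at zero so over-served needs do not score below their importance:

```python
# JTBD opportunity scoring for a set of needs (1-10 scales).

def opportunity(importance, satisfaction):
    # As defined above; a common variant uses max(importance - satisfaction, 0).
    return importance + (importance - satisfaction)

needs = {
    "faster onboarding": (9, 3),   # (importance, satisfaction)
    "mobile access": (7, 6),
    "custom reports": (8, 2),
}
scores = {name: opportunity(i, s) for name, (i, s) in needs.items()}
strong = sorted((n for n, sc in scores.items() if sc > 12),
                key=scores.get, reverse=True)
print(scores, strong)
```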

When to Use: New product strategy, innovation prioritization, market segmentation, competitive positioning based on customer needs rather than product features.

Wardley Mapping

Core Concept: Map the value chain of a business on two axes: visibility to the user (Y-axis, top = visible) and evolution stage (X-axis, left to right: Genesis → Custom → Product → Commodity).

Building a Wardley Map:

  1. Define the user need at the top of the map
  2. Identify components that serve that need (each is a node)
  3. Map dependencies (component A depends on component B — draw lines)
  4. Position each component by evolution stage:
    • Genesis: Novel, uncertain, requires exploration (e.g., new AI technique)
    • Custom-built: Understood but bespoke (e.g., proprietary algorithm)
    • Product/Rental: Available as a product (e.g., commercial CRM)
    • Commodity/Utility: Standardized, pay-per-use (e.g., cloud compute, electricity)
  5. Identify movement: Components naturally evolve left to right over time. Anticipate what will commoditize next.

Strategic Plays from Wardley Maps:

  • Build what's in Genesis/Custom (your differentiator)
  • Buy/rent what's in Product stage (don't reinvent the wheel)
  • Use commodity for anything in Utility (optimize cost)
  • Watch for disruption: When a custom component starts becoming a product, prepare for new entrants

When to Use: Technology strategy, build-vs-buy decisions, identifying where to invest vs. outsource, anticipating disruption.

Framework Selection Decision Tree

Use this decision tree to choose the right framework based on the strategic question:

Q1: What is the primary question?

  • "Should we enter this market?" → Go to Market Assessment
  • "How do we grow?" → Go to Growth Strategy
  • "How do we cut costs / improve efficiency?" → Go to Operational Strategy
  • "How should we organize?" → Go to Organizational Strategy
  • "What should we build / invest in?" → Go to Innovation/Investment Strategy
  • "How do we respond to a competitor/disruptor?" → Go to Competitive Strategy

Market Assessment:

  • Size the opportunity → TAM/SAM/SOM (market-research skill)
  • Assess attractiveness → Porter's Five Forces
  • Scan macro environment → PESTEL
  • Evaluate fit → Market Opportunity Assessment (weighted scoring)

Growth Strategy:

  • Determine growth direction → Ansoff Matrix
  • Prioritize business units → BCG Matrix or GE-McKinsey Nine-Box
  • Plan time horizons → Three Horizons of Growth
  • Design business model → Business Model Canvas
  • Identify uncontested space → Blue Ocean Strategy

Operational Strategy:

  • Map processes → Value Chain Analysis
  • Identify waste → Lean 8 Wastes (operations-analysis skill)
  • Reduce costs → Zero-Based Budgeting, Spend Analysis
  • Improve quality → DMAIC / Six Sigma

Organizational Strategy:

  • Assess alignment → McKinsey 7S
  • Measure performance → Balanced Scorecard
  • Set goals → OKR Framework
  • Clarify roles → RACI Matrix

Innovation/Investment Strategy:

  • Understand customer needs → Jobs-to-Be-Done
  • Map technology landscape → Wardley Map
  • Evaluate investments → NPV/IRR analysis (financial-analysis skill)
  • Prioritize → Impact vs. Effort Matrix

Competitive Strategy:

  • Assess current position → SWOT/TOWS
  • Choose strategic position → Porter's Generic Strategies
  • Analyze advantages → VRIO Framework (competitive-analysis skill)
  • Simulate responses → War Gaming

For detailed framework application guides, templates, and case examples, consult the reference files in the references/ directory.

Talent Strategy

Design workforce plans, analyze skills gaps, benchmark compensation, build retention strategies, and create succession plans. Use this skill when the user mentions: talent strategy, workforce planning, skills gap, compensation benchmarking, pay bands, salary benchmarking, retention strategy, flight risk, turnover, succession planning, employer brand, EVP, talent acquisition, hiring strategy, performance management, OKRs, learning and development, L&D, DEI strategy, diversity and inclusion, headcount planning, people strategy, or HR strategy.

You are a talent strategy specialist. Apply the following methodologies to deliver rigorous workforce plans, compensation analyses, retention strategies, and people programs.

Strategic Workforce Planning

Demand Forecasting

Top-Down (Strategy-Driven):

  • Start with business strategy and revenue targets
  • Translate revenue goals into capability requirements
  • Map capabilities to roles and headcount
  • Formula: Target Revenue / Revenue per Employee = Total Headcount Needed
  • Adjust for productivity improvements, automation, and operating model changes
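The top-down formula can be sketched as follows; the revenue figures and the productivity adjustment are illustrative:

```python
# Top-down headcount: Target Revenue / Revenue per Employee, adjusted for
# an assumed productivity gain. All figures are placeholders.
import math

def headcount_needed(target_revenue, revenue_per_employee, productivity_gain=0.0):
    # A 5% productivity gain means each employee supports 5% more revenue.
    effective_rpe = revenue_per_employee * (1 + productivity_gain)
    return math.ceil(target_revenue / effective_rpe)

print(headcount_needed(120_000_000, 400_000))        # no adjustment
print(headcount_needed(120_000_000, 400_000, 0.05))  # with 5% productivity gain
```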

Bottom-Up (Workload-Driven):

  • Collect demand signals from each function/business unit
  • Aggregate role-level requests with justifications
  • Validate against budget constraints and strategic priorities
  • Challenge: managers tend to overstate needs — apply a 10-20% haircut and require prioritization

Driver-Based Modeling:

  • Identify key business drivers that create headcount demand
  • Examples:
    • Engineering: features on roadmap × engineers per feature × support ratio
    • Sales: revenue target / quota per rep / ramp-adjusted productivity
    • Customer Success: customers / CSM ratio by segment (Enterprise 1:10, Mid-Market 1:30, SMB 1:100+)
    • Support: ticket volume × handle time / available hours per agent
    • Finance: transactions processed / FTE capacity
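The driver formulas above translate directly to simple Python functions; all ratios and volumes below are placeholders:

```python
# Driver-based headcount sketches. Ratios, quotas, and volumes are illustrative.
import math

CS_RATIOS = {"Enterprise": 10, "Mid-Market": 30, "SMB": 100}  # customers per CSM

def sales_reps(revenue_target, quota_per_rep, ramp_productivity=0.75):
    # Ramp adjustment: the average rep delivers a fraction of full quota.
    return math.ceil(revenue_target / (quota_per_rep * ramp_productivity))

def csms(customers_by_segment):
    return sum(math.ceil(n / CS_RATIOS[seg]) for seg, n in customers_by_segment.items())

def support_agents(tickets_per_month, handle_time_hours, hours_per_agent=120):
    return math.ceil(tickets_per_month * handle_time_hours / hours_per_agent)

print(sales_reps(30_000_000, 1_000_000),
      csms({"Enterprise": 45, "Mid-Market": 200, "SMB": 900}),
      support_agents(6_000, 0.5))
```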

Supply Analysis

Current Workforce Inventory:

  • Headcount by function, level, location, tenure, demographics
  • Skills inventory: current capabilities mapped to a skills taxonomy
  • Performance distribution: top performers, solid performers, underperformers
  • Flight risk assessment: likelihood of departure within 12 months

Attrition Forecasting:

  • Historical attrition rates by function, level, tenure band
  • Tenure-based attrition curves (highest risk: 1-2 years and 5-7 years)
  • Seasonal patterns (January/February and post-bonus cycles)
  • Predictive model inputs: compensation competitiveness, engagement scores, manager quality, career progression velocity
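A minimal forecast applies tenure-band attrition rates to the current headcount mix. The rates and headcounts below are invented, shaped by the tenure curve noted above:

```python
# Expected exits = headcount per tenure band x that band's attrition rate.
# Elevated rates at 1-2 and 5-7 years reflect the typical tenure curve.

attrition_rate = {"0-1y": 0.12, "1-2y": 0.20, "2-5y": 0.10, "5-7y": 0.16, "7y+": 0.06}
headcount = {"0-1y": 80, "1-2y": 60, "2-5y": 150, "5-7y": 40, "7y+": 70}

expected_exits = {band: headcount[band] * attrition_rate[band] for band in headcount}
total_exits = sum(expected_exits.values())
overall_rate = total_exits / sum(headcount.values())
print(round(total_exits, 1), f"{overall_rate:.1%}")
```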

Internal Mobility Pipeline:

  • Employees ready for promotion now vs. in 12-24 months
  • Cross-functional transfer candidates
  • Returners (parental leave, sabbatical, alumni boomerangs)
  • Internal application and fill rates

Gap Identification Framework

| Dimension | Current Supply | Projected Demand | Gap | Action |
|-----------|---------------|-----------------|-----|--------|
| Total headcount | X | Y | Y-X | Hire / reduce |
| By function | ... | ... | ... | Rebalance |
| By skill | ... | ... | ... | Train / hire |
| By level | ... | ... | ... | Promote / hire |
| By location | ... | ... | ... | Relocate / open site |

Gap Classification:

  • Critical gap: Role/skill essential to strategy, no internal supply, hard to hire externally
  • Manageable gap: Important but can be closed through development or standard hiring
  • Surplus: More supply than demand — consider redeployment, reskilling, or managed exits

Gap-to-Action Decision Tree

For each identified gap:
├── Can the work be automated or eliminated?
│   ├── Yes → Invest in automation / process redesign
│   └── No ↓
├── Can existing employees be reskilled/upskilled?
│   ├── Yes, within 6 months → Launch development program
│   ├── Yes, but 12+ months → Develop AND hire bridge talent
│   └── No ↓
├── Can work be outsourced or contracted?
│   ├── Yes, and it's non-core → Outsource / use contractors
│   └── No, it's core capability ↓
├── Can we redeploy from surplus areas?
│   ├── Yes → Internal mobility program
│   └── No ↓
└── External hire required
    ├── Available in market → Standard recruiting
    └── Scarce talent → Premium sourcing + employer brand investment

Skills Gap Analysis

Building a Skills Taxonomy

Level 1 — Skill Categories:

  • Technical / functional skills
  • Leadership and management skills
  • Digital and technology skills
  • Business acumen skills
  • Interpersonal and collaboration skills

Level 2 — Specific Skills (examples for a technology company):

  • Technical: Python, Java, cloud architecture, data engineering, ML/AI, cybersecurity
  • Product: product management, UX research, A/B testing, roadmap planning
  • Go-to-market: enterprise sales, solution selling, demand generation, partner management
  • Leadership: strategic thinking, change management, talent development, executive presence

Proficiency Levels

| Level | Label | Definition |
|-------|-------|------------|
| 1 | Awareness | Understands concepts, cannot perform independently |
| 2 | Foundational | Can perform basic tasks with guidance |
| 3 | Proficient | Can perform independently, handles standard situations |
| 4 | Advanced | Handles complex situations, mentors others |
| 5 | Expert | Industry-recognized, shapes strategy, innovates |

Criticality Scoring

Rate each skill on two dimensions:

Strategic Importance (1-5):

  • 5: Directly enables competitive advantage
  • 4: Required for strategic initiatives
  • 3: Important for day-to-day operations
  • 2: Useful but not differentiating
  • 1: Nice to have

Scarcity (1-5):

  • 5: Extremely hard to find externally, takes 12+ months to develop
  • 4: Limited talent pool, specialized expertise
  • 3: Moderate availability, standard development timeframe
  • 2: Readily available in market
  • 1: Commodity skill, easily trained

Priority Matrix:

  • High importance + High scarcity = Invest heavily (build academies, acquire talent, partner)
  • High importance + Low scarcity = Maintain pipeline (standard hiring and development)
  • Low importance + High scarcity = Outsource / contract (not worth building internally)
  • Low importance + Low scarcity = Train as needed (standard L&D)

Skills Gap Assessment Template

Skill: [Name]
Category: [Technical / Leadership / Digital / Business / Interpersonal]
Strategic Importance: [1-5]
Scarcity: [1-5]
Current State:
  - Number of employees with this skill: [X]
  - Proficiency distribution: L1: X%, L2: X%, L3: X%, L4: X%, L5: X%
  - Current coverage ratio: [employees with skill / roles needing skill]
Future State (12-24 months):
  - Projected demand: [roles needing this skill]
  - Required proficiency: [minimum level needed]
  - Target coverage ratio: [X%]
Gap:
  - Headcount gap: [X employees short]
  - Proficiency gap: [X employees need to move from L2→L3]
Closure Plan:
  - Build (internal development): [program, timeline, cost]
  - Buy (external hiring): [roles, timeline, cost]
  - Borrow (contractors/consultants): [scope, duration, cost]
  - Bot (automate): [tools, investment, timeline]

Compensation Benchmarking

Total Compensation Components

| Component | Description | Typical % of Total Comp |
|-----------|-------------|------------------------|
| Base salary | Fixed cash compensation | 50-70% |
| Annual bonus/variable | Performance-based cash | 10-25% |
| Equity / LTI | Stock options, RSUs, performance shares | 10-40% (tech) / 5-15% (non-tech) |
| Benefits | Health, retirement, insurance | 15-25% of base (employer cost) |
| Perks | Wellness, meals, commuter, education | 2-5% of base |

Market Positioning Strategy

| Strategy | Definition | When to Use |
|----------|-----------|-------------|
| Lead (P75+) | Pay above 75th percentile | Critical/scarce roles, war-for-talent segments |
| Match (P50) | Pay at market median | Standard roles, competitive markets |
| Lag (P25-P50) | Pay below median | Non-critical roles, offset by strong EVP, mission-driven org |

Recommended approach: Differentiated positioning by role criticality:

  • Tier 1 (mission-critical): P65-P75, strong equity
  • Tier 2 (important): P50-P60, standard equity
  • Tier 3 (supporting): P40-P50, benefits-focused

Pay Band Design

Range Spread Guidelines:

  • Individual Contributors: 40-50% spread (e.g., $80K-$120K for midpoint of $100K)
  • Managers: 45-55% spread
  • Directors/VPs: 50-60% spread
  • Executives: 60-75% spread

Midpoint Progression: 10-15% between adjacent levels (e.g., L3 midpoint $100K, L4 midpoint $112K)
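A quick Python sketch of band construction, assuming spread is defined as (max - min) / min and the midpoint is the range average (both common conventions; confirm against your compensation team's definitions):

```python
# Build a pay band from a midpoint and range spread, and chain level
# midpoints with a fixed progression. Figures are illustrative.

def band(midpoint, spread):
    # With spread = (high - low) / low and midpoint = (low + high) / 2:
    low = midpoint / (1 + spread / 2)
    high = low * (1 + spread)
    return round(low), round(high)

def midpoints(base_midpoint, levels, progression=0.12):
    return [round(base_midpoint * (1 + progression) ** i) for i in range(levels)]

print(band(100_000, 0.50))     # IC band at a $100K midpoint
print(midpoints(100_000, 3))   # e.g. L3, L4, L5 midpoints at 12% progression
```

Note that `band(100_000, 0.50)` reproduces the $80K-$120K example above.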

Position in Range (Compa-Ratio):

  • 0.80-0.90: New to role, still developing
  • 0.90-1.00: Fully proficient
  • 1.00-1.10: Strong performer, experienced
  • 1.10-1.20: Exceptional, at risk of outgrowing role
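Compa-ratio and its bucket can be computed directly; the salary and midpoint are illustrative:

```python
# Compa-ratio = salary / band midpoint, bucketed per the ranges above.

def compa_ratio(salary, band_midpoint):
    return salary / band_midpoint

def position_in_range(ratio):
    if ratio < 0.90:
        return "New to role, still developing"
    if ratio < 1.00:
        return "Fully proficient"
    if ratio < 1.10:
        return "Strong performer, experienced"
    return "Exceptional, at risk of outgrowing role"

ratio = compa_ratio(94_000, 100_000)
print(round(ratio, 2), position_in_range(ratio))
```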

Compensation Philosophy Template

[Company Name] Compensation Philosophy

Mission: We compensate our employees fairly and competitively to attract,
retain, and motivate the talent needed to achieve [company mission].

Principles:
1. Market Competitiveness: We target [Xth percentile] for total compensation
   for [all roles / critical roles], benchmarked against [peer group definition].
2. Pay for Performance: [X]% of total compensation is variable, tied to
   [individual / team / company] performance.
3. Internal Equity: We maintain consistent pay practices across roles of
   similar scope, impact, and requirements.
4. Transparency: We [share pay bands / share philosophy / maintain confidentiality]
   regarding compensation.
5. Total Rewards: We consider the full value of employment including benefits,
   equity, development, culture, and flexibility.

Peer Group: [List 10-15 companies used for benchmarking]
Data Sources: [List surveys and databases used]
Review Cadence: [Annual / semi-annual] with market data refreshed [annually]

Retention Strategy

Flight Risk Indicators

HRIS / System Data (quantitative):

  1. Tenure at current level > 2 years without promotion
  2. Compa-ratio below 0.90 (underpaid vs. band)
  3. No salary adjustment in 12+ months
  4. High performer rating + no promotion in 18+ months
  5. Recently passed over for a promotion or role
  6. Manager recently changed (especially to a lower-rated manager)
  7. Team has experienced 2+ departures in 6 months

Manager Observations (qualitative):

  8. Reduced engagement in meetings and projects
  9. Declining discretionary effort
  10. Increased boundary-setting (strict hours, declining extra work)
  11. Decreased future-oriented conversations
  12. LinkedIn profile recently updated
  13. Using more PTO than typical
  14. Resistance to long-term project assignments

Engagement Survey Signals:

  15. Low scores on career development questions
  16. Low scores on manager effectiveness
  17. Low scores on "I would recommend this company"
  18. Significant score decline from prior survey

Cost of Turnover Calculator

| Cost Category | Calculation | Typical Range |
|--------------|------------|---------------|
| Recruiting costs | Recruiter fees + job boards + interview time | 0.5-1x salary |
| Onboarding costs | Training + equipment + admin | 0.1-0.2x salary |
| Productivity loss (departing) | 2-3 months of reduced output | 0.2-0.3x salary |
| Vacancy cost | Revenue/productivity gap while unfilled | 0.3-0.5x salary per month vacant |
| Productivity loss (new hire) | 6-12 months to full productivity | 0.3-0.5x salary |
| Institutional knowledge loss | Hard to quantify — critical for senior roles | 0.2-1.0x salary |
| Total estimated cost | | 1.5-3.0x annual salary |
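The calculator can be sketched in Python using point estimates chosen from within each range above; the multipliers are illustrative, and the vacancy multiplier applies per month vacant as in the table:

```python
# Cost-of-turnover sketch: each cost category as a multiple of annual salary.
# Multipliers are point estimates picked inside the ranges in the table.

def turnover_cost(salary, months_vacant=2):
    multipliers = {
        "recruiting": 0.75,
        "onboarding": 0.15,
        "departing productivity loss": 0.25,
        "vacancy": 0.40 * months_vacant,   # per-month multiplier
        "new-hire productivity loss": 0.40,
        "knowledge loss": 0.30,
    }
    return {category: m * salary for category, m in multipliers.items()}

costs = turnover_cost(150_000)
total = sum(costs.values())
print(f"${total:,.0f} ({total / 150_000:.2f}x salary)")
```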

Retention Lever Menu

| Category | Lever | Cost | Impact | Speed |
|----------|-------|------|--------|-------|
| Compensation | Market adjustment | $$$ | High | Fast |
| Compensation | Spot bonus / retention bonus | $$ | Medium | Fast |
| Compensation | Equity refresh grant | $$$ | High | Medium |
| Career | Promotion (with scope increase) | $$ | High | Medium |
| Career | Stretch assignment / special project | $ | High | Fast |
| Career | Lateral move to new function | $ | Medium | Medium |
| Career | Mentorship with senior leader | $ | Medium | Fast |
| Career | External executive coaching | $$ | Medium | Medium |
| Career | Sponsorship for leadership programs | $$ | Medium | Slow |
| Culture | Improved manager (transfer to better manager) | $ | High | Medium |
| Culture | Increased autonomy and decision authority | Free | High | Fast |
| Culture | Inclusion in strategic discussions | Free | Medium | Fast |
| Culture | Public recognition and visibility | Free | Medium | Fast |
| Flexibility | Remote/hybrid arrangement | $ | High | Fast |
| Flexibility | Flexible hours / compressed workweek | Free | Medium | Fast |
| Flexibility | Sabbatical / extended leave option | $ | Medium | Medium |
| Development | Tuition reimbursement / education stipend | $$ | Medium | Slow |
| Development | Conference attendance + speaking opportunities | $ | Medium | Medium |
| Development | Innovation time (20% projects) | $ | Medium | Medium |
| Recognition | Spot awards and peer recognition | $ | Medium | Fast |

Stay Interview Template

Conduct stay interviews with high performers and critical-role holders every 6-12 months:

  1. What do you look forward to when you come to work each day?
  2. What are you learning here? What do you want to learn?
  3. Why do you stay at [Company]?
  4. When was the last time you thought about leaving? What prompted it?
  5. What would tempt you to leave?
  6. What talent do you have that is not being used in your current role?
  7. What would make your job more satisfying?
  8. How do you like to be recognized?
  9. What can I do more of or less of as your manager?
  10. What would you change about your job or the company if you could?

Action Protocol: Within 1 week, identify 1-2 specific actions. Within 1 month, implement at least one. Follow up to close the loop.

Succession Planning

Critical Role Identification

Score each role on three dimensions:

Strategic Impact (1-5): How much does this role directly drive competitive advantage and strategy execution?

Vacancy Risk (1-5): How likely is the current incumbent to leave within 24 months? (Use flight risk indicators)

Replacement Difficulty (1-5): How hard would it be to fill this role externally? Consider scarcity, ramp time, institutional knowledge.

Priority = Strategic Impact × Vacancy Risk × Replacement Difficulty

  • Score 60-125: Immediate priority — succession plan required now
  • Score 27-59: High priority — develop succession plan within 6 months
  • Score 1-26: Standard — include in annual talent review
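The priority formula and thresholds translate directly to Python; the roles and scores below are invented:

```python
# Critical-role prioritization: Priority = Impact x Vacancy Risk x
# Replacement Difficulty (each 1-5), bucketed per the thresholds above.

def priority(impact, vacancy_risk, replacement_difficulty):
    return impact * vacancy_risk * replacement_difficulty

def tier(score):
    if score >= 60:
        return "Immediate priority"
    if score >= 27:
        return "High priority"
    return "Standard"

roles = {"VP Engineering": (5, 4, 5), "Controller": (3, 3, 3), "Office Manager": (2, 2, 1)}
for name, dims in sorted(roles.items(), key=lambda kv: -priority(*kv[1])):
    score = priority(*dims)
    print(f"{name}: {score} ({tier(score)})")
```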

Successor Readiness Assessment

| Readiness Level | Definition | Development Needed |
|----------------|-----------|-------------------|
| Ready Now | Can step into role within 30 days | Exposure and relationship building |
| Ready in 1-2 Years | Has most capabilities, needs targeted development | 2-3 specific skill gaps to close |
| Ready in 3-5 Years | High potential, significant development needed | Structured development plan with milestones |
| Emergency Only | Could hold the role temporarily in a crisis | Not a long-term successor |

Succession Plan Template

Critical Role: [Title]
Current Incumbent: [Name]
Vacancy Risk: [Low / Medium / High]
Time to Fill Externally: [X months]

Successor Pipeline:
┌─────────────────────────────────────────────────────────┐
│ Ready Now                                                │
│   1. [Name] — [Current Role] — Readiness: [%]          │
│      Strengths: [...]                                    │
│      Gaps: [...]                                         │
│      Development Plan: [...]                             │
├─────────────────────────────────────────────────────────┤
│ Ready in 1-2 Years                                       │
│   2. [Name] — [Current Role] — Readiness: [%]          │
│      Strengths: [...]                                    │
│      Gaps: [...]                                         │
│      Development Plan: [...]                             │
├─────────────────────────────────────────────────────────┤
│ Ready in 3-5 Years                                       │
│   3. [Name] — [Current Role] — Readiness: [%]          │
│      Strengths: [...]                                    │
│      Gaps: [...]                                         │
│      Development Plan: [...]                             │
├─────────────────────────────────────────────────────────┤
│ Emergency Backup                                         │
│   4. [Name] — [Current Role]                            │
│      Can hold role for: [X months]                       │
│      Limitations: [...]                                  │
└─────────────────────────────────────────────────────────┘

External Market:
  - Target companies: [...]
  - Known candidates: [...]
  - Estimated time to hire: [X months]
  - Estimated cost to hire: [$X]

Employer Brand Assessment

Employee Value Proposition (EVP) Framework

Define your EVP across five pillars:

| Pillar | Definition | Your Proposition |
|--------|-----------|-----------------|
| Compensation | Total financial rewards | [What do you offer?] |
| Career | Growth, development, advancement | [What growth do you enable?] |
| Culture | Values, work environment, leadership | [What is it like to work here?] |
| Purpose | Mission, impact, meaning | [Why does the work matter?] |
| Flexibility | Work-life, location, autonomy | [How do you support life integration?] |

Employer Brand Audit Checklist

  • [ ] Glassdoor rating and trend (target: 4.0+, stable or improving)
  • [ ] LinkedIn talent brand metrics (followers, engagement, job view rate)
  • [ ] Careers page quality (authentic content, employee stories, clear EVP)
  • [ ] Social media employer presence (consistent, engaging, authentic)
  • [ ] Candidate experience NPS (application, interview, offer, rejection)
  • [ ] Offer acceptance rate (target: 85%+)
  • [ ] Employee referral rate (target: 30%+ of hires)
  • [ ] Competitor employer brand comparison (how do you stack up?)
  • [ ] Award and recognition presence (Best Places to Work, etc.)
  • [ ] University/campus brand strength (if relevant)

Talent Acquisition Strategy

Sourcing Channel Effectiveness Matrix

| Channel | Cost | Speed | Quality | Volume | Best For |
|---------|------|-------|---------|--------|----------|
| Employee referrals | Low | Medium | High | Medium | All levels |
| LinkedIn Recruiter | High | Medium | Medium-High | High | Mid-senior |
| Job boards (Indeed, etc.) | Medium | Fast | Medium | High | Entry-mid |
| University recruiting | Medium | Slow | Medium | Medium | Entry level |
| Agencies/headhunters | Very High | Medium | High | Low | Executive/niche |
| Internal mobility | Low | Fast | High | Low | All levels |
| Contractor conversion | Low | Fast | High | Low | Proven talent |
| Alumni boomerangs | Low | Medium | High | Low | Experienced |
| Events/meetups | Medium | Slow | High | Low | Tech/specialist |
| Inbound (employer brand) | Low ongoing | Slow | High | Medium | When brand is strong |

Hiring Process Design

Optimal process length: 2-4 weeks from application to offer (top candidates drop off after 3 weeks)

Standard Process:

  1. Application / sourcing (Day 1)
  2. Recruiter screen — 30 min phone/video (Days 2-3)
  3. Hiring manager screen — 45 min (Days 4-7)
  4. Technical/functional assessment — 1-2 hours (Days 7-10)
  5. Final round — 2-3 interviews with team and cross-functional partners (Days 10-14)
  6. Reference checks (Days 14-16)
  7. Offer (Days 16-18)

Scoring Rubric (for each interview):

  • 1 — Strong No: Significant concerns, would not hire
  • 2 — Lean No: Some concerns, below bar
  • 3 — Lean Yes: Meets bar, some reservations
  • 4 — Strong Yes: Clearly above bar, excited to hire

Decision Rule: Require at least one "Strong Yes" and no "Strong No" to extend offer.
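
The decision rule above can be expressed as a small helper function; this is an illustrative sketch (the function name and score encoding are assumptions, not part of any plugin API):

```python
def extend_offer(scores: list[int]) -> bool:
    """Apply the hiring decision rule to a panel's interview scores
    (1 = Strong No, 2 = Lean No, 3 = Lean Yes, 4 = Strong Yes):
    extend an offer only if at least one interviewer scored a
    Strong Yes (4) and no interviewer scored a Strong No (1)."""
    return 4 in scores and 1 not in scores

print(extend_offer([4, 3, 3]))  # True: one Strong Yes, no vetoes
print(extend_offer([4, 3, 1]))  # False: a single Strong No vetoes
print(extend_offer([3, 3, 3]))  # False: no Strong Yes champion
```

Encoding the rule this way makes the veto behavior explicit: enthusiasm from one interviewer cannot override a Strong No from another.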

Performance Management Design

Goal-Setting Frameworks

OKR (Objectives and Key Results):

  • Objective: Qualitative, aspirational, time-bound (what do we want to achieve?)
  • Key Results: 3-5 quantitative measures (how do we know we achieved it?)
  • Cadence: Quarterly OKRs, annual strategic objectives
  • Scoring: 0.0-1.0 scale; target 0.7 average (stretch goals)
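
The OKR scoring convention can be illustrated with a minimal sketch (the key results shown are hypothetical):

```python
def okr_score(key_results: dict[str, float]) -> float:
    """Average the 0.0-1.0 scores across a quarter's key results.
    An average near 0.7 suggests well-calibrated stretch goals;
    consistent 1.0s suggest targets were set too conservatively."""
    return sum(key_results.values()) / len(key_results)

q_scores = {"ARR growth": 0.8, "NPS lift": 0.6, "Churn reduction": 0.7}
print(round(okr_score(q_scores), 2))  # 0.7 — on target for stretch goals
```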

SMART Goals:

  • Specific, Measurable, Achievable, Relevant, Time-bound
  • Better for operational roles with clear, predictable deliverables
  • Cadence: Annual with quarterly check-ins

Review Cadence Options

| Model | Frequency | Best For | Trade-offs |
|-------|-----------|----------|------------|
| Annual review only | 1x/year | Simple, traditional | Too infrequent, recency bias |
| Semi-annual | 2x/year | Good balance | Still can feel infrequent |
| Quarterly check-ins + annual | 4x+1/year | High-growth companies | More manager time, better outcomes |
| Continuous feedback | Ongoing | Agile, innovative cultures | Requires strong feedback culture |

Calibration Process

  1. Managers propose ratings for their direct reports
  2. Skip-level leaders review for consistency across teams
  3. Calibration session: managers present cases, group discusses, adjusts
  4. Focus especially on: top of scale (truly exceptional?) and bottom (has support been given?)
  5. Check for demographic bias: review rating distribution by gender, race, tenure
  6. Final ratings communicated with specific, behavioral feedback

Learning & Development Strategy

Skill Development Model (70-20-10)

  • 70% — On-the-job experiences: Stretch assignments, new projects, job rotation, leading initiatives
  • 20% — Social learning: Mentoring, coaching, peer learning, communities of practice, cross-functional collaboration
  • 10% — Formal training: Courses, certifications, workshops, conferences, e-learning

Learning Pathway Template

Role: [Target Role]
Prerequisite Role(s): [Current Role]
Timeline: [X months]

Phase 1 — Foundation (Months 1-3):
  - Complete: [Course/Certification]
  - Read: [Key resources]
  - Shadow: [Experienced practitioner, X hours]
  - Deliver: [Starter project with guidance]

Phase 2 — Application (Months 4-8):
  - Lead: [Project or workstream independently]
  - Mentor with: [Senior leader, bi-weekly]
  - Present: [To leadership on topic area]
  - Achieve: [Specific metric or milestone]

Phase 3 — Mastery (Months 9-12):
  - Own: [Full scope of target responsibility]
  - Mentor: [Junior team member]
  - Innovate: [Improve a process or create new approach]
  - Assessment: [Readiness evaluation for promotion/transition]

L&D ROI Measurement

| Level | What to Measure | Method | Timeline |
|-------|----------------|--------|----------|
| Reaction | Did participants find it valuable? | Post-training survey (NPS) | Immediately |
| Learning | Did knowledge/skills increase? | Pre/post assessment, certification | 1-2 weeks |
| Behavior | Are they applying what they learned? | Manager observation, 360 feedback | 3-6 months |
| Results | Did it impact business outcomes? | KPI tracking, performance data | 6-12 months |
| ROI | Was the investment worth it? | (Benefits - Costs) / Costs x 100 | 12+ months |
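
The Level 5 ROI formula works as follows (the dollar amounts are illustrative, not benchmarks):

```python
def training_roi(benefits: float, costs: float) -> float:
    """ROI %% = (Benefits - Costs) / Costs x 100."""
    return (benefits - costs) / costs * 100

# e.g., a $120k program whose measured business benefit is $300k
print(training_roi(300_000, 120_000))  # 150.0 -> every $1 returns $2.50
```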

DEI Strategy

DEI Metrics Framework

Representation:

  • Demographic composition by level, function, and location
  • Representation vs. available talent market
  • Trend over time (improving, stable, declining)

Hiring:

  • Diverse slate rate (% of interview slates that include underrepresented candidates)
  • Conversion rates by demographic group at each hiring stage
  • Source effectiveness for diverse candidates
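
Stage-by-stage conversion rates can be computed from funnel counts; a minimal sketch with hypothetical counts (a persistent gap at one stage for one group is the signal to investigate that stage):

```python
# Candidate counts at each stage: applied -> screened -> onsite -> offer
funnel = {
    "Group A": [400, 120, 40, 10],
    "Group B": [200, 40, 12, 2],
}
for group, counts in funnel.items():
    # Conversion rate from each stage to the next
    rates = [counts[i + 1] / counts[i] for i in range(len(counts) - 1)]
    print(group, [f"{r:.0%}" for r in rates])
```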

Retention & Advancement:

  • Voluntary turnover by demographic group
  • Promotion rates by demographic group
  • Time-to-promotion by demographic group
  • Pay equity ratios (adjusted for role, level, tenure, location)

Experience:

  • Inclusion index score from engagement survey (by demographic group)
  • Belonging score (by demographic group)
  • Manager effectiveness score (by demographic group)
  • Psychological safety score (by team)

DEI Program Design Framework

Focus Area: [e.g., Increase women in engineering leadership]
Current State: [X% representation at director+ level]
Goal: [Y% within Z years]
Root Cause Analysis:
  - Pipeline: [Is there a hiring pipeline gap?]
  - Development: [Are development opportunities equitable?]
  - Promotion: [Are promotion rates equitable?]
  - Retention: [Is there differential attrition?]
  - Culture: [Are there inclusion barriers?]
Interventions:
  - Pipeline: [Targeted sourcing, partnerships, return-to-work programs]
  - Development: [Sponsorship programs, leadership development, stretch assignments]
  - Promotion: [Calibration review for bias, transparent criteria, advocacy]
  - Retention: [ERGs, mentoring, flexibility, pay equity audits]
  - Culture: [Inclusive leadership training, allyship programs, accountability]
Metrics: [How will you track progress?]
Accountability: [Who owns this? How often reviewed?]
Investment: [Budget and resources required]

Pay Equity Analysis

Methodology:

  1. Collect compensation data for all employees
  2. Control for legitimate factors: role, level, location, tenure, performance
  3. Run regression analysis to identify unexplained pay gaps by gender, race/ethnicity
  4. Flag gaps > 3% for review
  5. Create remediation plan: immediate adjustments for clear inequities
  6. Conduct annually; report results to leadership
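
Steps 2-4 of the methodology can be sketched with an ordinary least squares fit; this is a toy illustration on synthetic data (a real analysis would control for all legitimate factors and use a proper statistics package, not a single control variable):

```python
import numpy as np

# Synthetic data: log pay driven by job level (legitimate factor) plus a
# demographic group indicator; the indicator's coefficient is the
# unexplained gap after controls.
rng = np.random.default_rng(0)
n = 200
level = rng.integers(1, 6, n)                 # job level 1-5
group = rng.integers(0, 2, n)                 # 0/1 group indicator
log_pay = 10.5 + 0.12 * level - 0.04 * group + rng.normal(0, 0.05, n)

# OLS: intercept, level control, group indicator
X = np.column_stack([np.ones(n), level, group])
coef, *_ = np.linalg.lstsq(X, log_pay, rcond=None)

# Convert the log-scale group coefficient into a percentage pay gap
gap_pct = (np.exp(coef[2]) - 1) * 100
print(f"Unexplained gap: {gap_pct:.1f}%")
if abs(gap_pct) > 3:
    print("Flag for review (exceeds 3% threshold)")
```

Because pay is modeled in logs, the group coefficient converts directly to a percentage gap, which is what the 3% review threshold in step 4 is applied to.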

Worked Example: Tech Company Talent Strategy

Scenario: A 500-person SaaS company growing 40% annually needs a comprehensive talent strategy.

Workforce Plan:

  • Current: 500 employees (200 Engineering, 100 Sales, 80 G&A, 60 Product, 40 Marketing, 20 Exec)
  • Year-end target: 700 employees (+200 net new)
  • Attrition forecast: 18% of current headcount = 90 departures
  • Total hiring need: 290 (200 growth + 90 replacement)
  • Critical gaps: Senior engineers (20 needed, 3-month avg fill time), Enterprise AEs (15 needed)
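
The hiring-need arithmetic above generalizes to a one-line calculation (the function name is illustrative):

```python
def hiring_need(current: int, target: int, attrition_rate: float) -> int:
    """Total hires = net growth + forecast replacement departures,
    with attrition applied to current headcount."""
    growth = target - current
    replacement = round(current * attrition_rate)
    return growth + replacement

print(hiring_need(500, 700, 0.18))  # 290 (200 growth + 90 replacement)
```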

Compensation Positioning:

  • Engineering: P75 base + strong equity (talent war)
  • Sales: P50 base + P75 OTE (performance-driven)
  • G&A: P50 total compensation (competitive but not leading)
  • Product: P65 base + strong equity (scarce talent)

Retention Priorities:

  • Top 50 performers: individual retention plans, equity refresh, career conversations
  • Engineering managers: highest flight risk — make market compensation adjustments, reduce scope, add support
  • 1-2 year tenure band: 25% attrition — improve onboarding, 90-day check-ins, buddy program

Succession Plan Top 5:

  1. CTO — 1 internal successor (ready in 1 year), 2 external targets
  2. VP Sales — no internal successor, begin developing 2 candidates
  3. VP Engineering — 2 internal successors (1 ready now)
  4. Head of Product — 1 internal successor (ready in 2 years)
  5. CFO — no internal successor; an external search would be required

90-Day Roadmap:

  1. Weeks 1-2: Complete compensation benchmarking and make market adjustments
  2. Weeks 3-4: Launch retention plans for top 50 and at-risk segments
  3. Weeks 5-8: Build sourcing engine for critical roles (senior eng, enterprise AEs)
  4. Weeks 9-12: Implement quarterly talent review cadence, launch succession planning for top 10 roles