Engineering QBR Template: Quarterly Business Review for Engineering

A complete engineering QBR template with the 4 pillars of quarterly reporting. Present velocity without gaming, build board credibility, and drive data-informed decisions.

15 min read · Updated February 1, 2026 · By CodePulse Team

The Quarterly Business Review is where engineering earns—or loses—its seat at the strategic table. Most engineering QBRs fail because they present metrics the board doesn't care about, delivered in a language executives don't speak. This guide gives you a complete framework for QBRs that build credibility and secure resources.

"A QBR isn't a status update—it's your quarterly case for why engineering deserves the resources it has and the investment it needs."

Engineering leaders who master QBRs get more headcount approved, face fewer questions about productivity, and build the trust that lets them operate autonomously. Those who don't become the department that executives watch nervously, looking for signs that their largest cost center is underperforming.

Why Engineering Needs QBRs (Hint: Board Credibility)

Every other function presents business reviews. Sales shows pipeline and quota attainment. Marketing shows CAC and pipeline contribution. Finance shows burn rate and runway. When engineering can't present with the same rigor, executives fill the gap with assumptions—usually unfavorable ones.

The Credibility Gap

Engineering typically consumes 30-50% of a tech company's operating budget. Yet when boards ask "Is engineering efficient?", most VPs struggle to answer with the same confidence their peers in sales or marketing can. This credibility gap has real consequences:

  • Headcount requests get scrutinized more heavily
  • Technical initiatives face skepticism ("Is this really necessary?")
  • Executives start asking for time-tracking and activity metrics
  • Strategic input from engineering gets deprioritized

"Boards don't distrust engineering because they don't understand it. They distrust engineering because engineering hasn't learned to speak their language."

What QBRs Actually Accomplish

A well-executed QBR does more than report metrics. It accomplishes four strategic objectives:

Objective             What It Looks Like                    Business Impact
────────────────────────────────────────────────────────────────────────────
Demonstrate Value     Show output relative to investment    Justifies budget and headcount
Build Predictability  Show consistent delivery patterns     Enables confident roadmap commitments
Surface Risks Early   Flag technical debt, capacity gaps    Prevents surprise escalations
Align Investment      Show where engineering time goes      Enables strategic resource allocation

Our Take

Most engineering QBRs are backwards. They start with what engineering did (activity metrics, PRs merged, deploys shipped) instead of what the business got (features delivered, risk reduced, capacity increased). The first question every slide should answer is "so what?"—and if the answer requires technical knowledge to understand, you've already lost your audience. QBRs are business documents, not engineering ones.

See your engineering metrics in 5 minutes with CodePulse

The QBR Framework: 4 Pillars

The QBR scorecard framework: Track all four pillars for comprehensive reporting

Every effective engineering QBR rests on four pillars. Miss one, and your review leaves questions unanswered. Include all four, and you've covered what boards and executives actually want to know.

Pillar 1: Delivery

The question it answers: "Is engineering shipping what the business needs?"

Delivery metrics demonstrate that engineering converts resources into outcomes. This isn't about raw velocity—it's about predictability and alignment with business goals.

DELIVERY PILLAR METRICS
═══════════════════════════════════════════════════════════════

Primary Metrics:
├── Roadmap Delivery Rate: % of committed items shipped
├── Feature Velocity: Features shipped per engineer per quarter
├── Cycle Time: Idea → Production (median days)
└── Deployment Frequency: Releases per week

Supporting Context:
├── Roadmap items completed vs. carried over
├── Unplanned work percentage (interrupts, escalations)
├── Scope changes after commitment
└── Time-to-market for strategic initiatives

Red Flags to Address Proactively:
├── Delivery rate below 75%
├── Cycle time trending upward quarter-over-quarter
├── More than 30% unplanned work
└── Consistent scope changes post-commitment

Pillar 2: Quality

The question it answers: "Is what we ship reliable and maintainable?"

Quality metrics show that speed isn't coming at the expense of reliability. This is where you demonstrate engineering maturity and reduce executive anxiety about outages.

QUALITY PILLAR METRICS
═══════════════════════════════════════════════════════════════

Primary Metrics:
├── Change Failure Rate: % of deployments causing incidents
├── MTTR: Mean time to recovery from production issues
├── Incident Frequency: P0/P1 incidents per month
└── Customer-Reported Bugs: Bugs found in production vs. testing

Supporting Context:
├── Test coverage trend (not absolute number)
├── Post-incident action items completion rate
├── Security vulnerability remediation time
└── Technical debt ratio (maintenance vs. feature work)

Red Flags to Address Proactively:
├── Change failure rate above 5%
├── MTTR exceeding 1 hour
├── Recurring incidents in same system
└── Security vulnerabilities older than 30 days
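Change failure rate and MTTR are simple ratios once you have deploy and incident logs. A minimal sketch, with input shapes assumed for illustration:

```python
# Sketch: quality pillar primary metrics from deploy and incident records.
from statistics import mean

def change_failure_rate(deploys, failed_deploys):
    """Fraction of deployments that caused an incident."""
    return failed_deploys / deploys if deploys else 0.0

def mttr_minutes(recovery_minutes):
    """Mean time to recovery across the quarter's incidents, in minutes."""
    return mean(recovery_minutes) if recovery_minutes else 0.0

# Illustrative quarter: 180 deploys, 7 caused incidents; three recoveries logged.
cfr = change_failure_rate(deploys=180, failed_deploys=7)
mttr = mttr_minutes([22, 95, 41])
print(f"CFR {cfr:.1%} (red flag above 5%), MTTR {mttr:.0f} min (red flag above 60)")
```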

Pillar 3: Capacity

The question it answers: "Can engineering support our growth plans?"

Capacity metrics show whether your team can handle what's coming. This is where you build the case for headcount—or explain why you don't need more.

CAPACITY PILLAR METRICS
═══════════════════════════════════════════════════════════════

Primary Metrics:
├── Output per Engineer: Features shipped / FTE
├── Attrition Rate: Voluntary departures (trailing 12mo)
├── Open Positions: Headcount gap vs. plan
└── Ramp Time: Months until new hire at full productivity

Supporting Context:
├── Team distribution across initiatives
├── Key person dependencies (bus factor)
├── On-call burden distribution
└── Interview pipeline health

Red Flags to Address Proactively:
├── Output per engineer declining quarter-over-quarter
├── Attrition above industry average (13%)
├── Critical systems with bus factor of 1
└── Average time-to-fill exceeding 60 days
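Trailing-12-month attrition is worth computing consistently, since the 13% threshold only means something if the denominator is stable. One common convention (an assumption here, not the only valid one) is voluntary departures over average headcount for the window:

```python
# Sketch: trailing-12-month voluntary attrition rate.
# Uses average headcount over the window as the denominator; other
# conventions exist, so state yours on the slide.

def attrition_rate(voluntary_departures_12mo, avg_headcount_12mo):
    return voluntary_departures_12mo / avg_headcount_12mo

rate = attrition_rate(voluntary_departures_12mo=6, avg_headcount_12mo=40)
print(f"{rate:.0%}")  # 15% — above the 13% red-flag threshold
```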

Pillar 4: Investment

The question it answers: "Where is engineering time actually going?"

Investment metrics translate engineering activity into capital allocation language that boards understand. This is often the most eye-opening section for executives who wonder why features seem slow despite headcount growth. For deeper guidance on investment categorization, see our KTLO vs Innovation Guide.

INVESTMENT PILLAR METRICS
═══════════════════════════════════════════════════════════════

Primary Metrics:
├── Feature Work: % time on new capabilities (target: 50-60%)
├── Enhancements: % time improving existing features (15-25%)
├── KTLO/Maintenance: % time keeping lights on (15-20%)
└── Tech Debt/Platform: % time on infrastructure (10-15%)

Investment Breakdown Example:

                      Q1      Q2      Q3      Target
─────────────────────────────────────────────────────
New Features         42%     45%     48%      50%
Enhancements         23%     22%     24%      20%
Maintenance          25%     23%     20%      20%
Tech Debt            10%     10%      8%      10%
─────────────────────────────────────────────────────

Red Flags to Address Proactively:
├── KTLO/Maintenance exceeding 25%
├── Tech debt investment declining while systems are aging
├── Feature work below 40%
└── Investment profile not matching strategic priorities
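The investment mix is just categorized hours normalized to percentages, with the thresholds above applied on top. A minimal sketch; the category names and hour counts are illustrative assumptions:

```python
# Sketch: turning categorized engineering time into the investment mix
# and applying this section's red-flag thresholds.

def investment_mix(hours_by_category):
    """Normalize categorized hours to fractions of total time."""
    total = sum(hours_by_category.values())
    return {cat: hours / total for cat, hours in hours_by_category.items()}

mix = investment_mix({"features": 480, "enhancements": 220,
                      "ktlo": 200, "tech_debt": 100})

flags = []
if mix["ktlo"] > 0.25:
    flags.append("KTLO/Maintenance exceeding 25%")
if mix["features"] < 0.40:
    flags.append("Feature work below 40%")

print({k: f"{v:.0%}" for k, v in mix.items()}, flags)
```

This example quarter lands at 48/22/20/10, matching a healthy profile, so no flags fire.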

For guidance on presenting these metrics to board members specifically, see our Board-Ready Engineering Metrics Guide.

Metrics That Belong in Every QBR

Not every metric belongs in a QBR. Include too many, and you overwhelm the audience. Include too few, and you leave critical questions unanswered. Here's the essential set.

The Core Eight

These eight metrics should appear in every QBR, regardless of company stage or engineering team size:

Metric                   Pillar      Why It Matters           Watch For
────────────────────────────────────────────────────────────────────────
Roadmap Delivery Rate    Delivery    Measures predictability  Below 75%
Cycle Time               Delivery    Speed of value delivery  Upward trend
Change Failure Rate      Quality     Release quality          Above 5%
P0/P1 Incidents          Quality     System reliability       Increasing trend
Features/Engineer        Capacity    Team efficiency          Declining trend
Attrition Rate           Capacity    Team stability           Above 13%
Feature vs. KTLO Split   Investment  Resource allocation      KTLO above 25%
Tech Debt Trend          Investment  Platform health          Continuous growth

Contextual Additions

Beyond the core eight, add metrics based on current company priorities:

  • Growth phase: Time-to-market for new features, scalability metrics
  • Cost optimization phase: Infrastructure spend trend, efficiency ratios
  • Post-incident: MTTR, incident frequency, remediation completion
  • Pre-acquisition/IPO: Security posture, compliance status, code quality

For a comprehensive view of what metrics to track, see our Engineering Health Scorecard Guide.


Presenting Velocity Without Getting Gamed

Velocity metrics are dangerous. Present them wrong, and you create incentives that destroy engineering culture. Present them right, and you demonstrate team effectiveness without encouraging gaming.

Our Take

The moment you put velocity on a slide, someone will try to increase it. If that velocity is measured in story points, they'll inflate estimates. If it's PRs merged, they'll split work into smaller pieces. If it's lines of code (please no), they'll write verbose code. The only safe velocity metrics are outcome-based: features shipped, customer problems solved, revenue enabled. These are hard to game because gaming them actually creates value.

Safe Velocity Metrics

These metrics resist gaming because improving them requires actually improving:

SAFE VELOCITY METRICS
═══════════════════════════════════════════════════════════════

Outcome-Based (Recommended):
├── Features shipped to production (defined deliverables)
├── Customer problems resolved (support ticket to fix)
├── Revenue-enabling releases (unblocked deals)
└── Roadmap items delivered on time

Process-Based (Use With Context):
├── Cycle time (idea to production) - encourages small batches
├── Deployment frequency - encourages automation
├── PR review turnaround - encourages collaboration
└── Lead time for changes - encourages efficiency

AVOID (Easy to Game):
├── Story points completed
├── Lines of code
├── Commits
└── Hours worked

Presenting Velocity Without Triggering Gaming

Follow these principles when showing velocity in QBRs:

  1. Never show individual velocity: Team and org-level only. Individual metrics invite comparison and gaming.
  2. Always show quality alongside: Velocity without quality context is meaningless. Pair throughput with change failure rate.
  3. Focus on trends, not absolutes: "Cycle time improved 20%" is safer than "Cycle time is 2 days" because it doesn't set a target to beat.
  4. Connect to business outcomes: "Faster cycle time meant we shipped the enterprise feature before the competitor."
  5. Acknowledge diminishing returns: "Our cycle time is healthy—further optimization would come at the cost of code quality."
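Principle 3 above, trends over absolutes, can even be generated mechanically so the wording stays consistent from quarter to quarter. A sketch, with illustrative numbers (note the `lower_is_better` flag for metrics like cycle time):

```python
# Sketch: phrasing velocity as a quarter-over-quarter trend instead of
# an absolute number, per the presentation principles above.

def trend_statement(metric, previous, current, lower_is_better=False):
    """Render a QoQ change as 'improved'/'regressed' language."""
    change = (current - previous) / previous
    improved = (change < 0) == lower_is_better
    direction = "improved" if improved else "regressed"
    return f"{metric} {direction} {abs(change):.0%} quarter-over-quarter"

# Cycle time dropped from 2.5 to 2.0 days: lower is better, so it improved.
print(trend_statement("Cycle time", previous=2.5, current=2.0,
                      lower_is_better=True))
```

The output ("Cycle time improved 20% quarter-over-quarter") states a trend without setting an absolute target to beat.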

"If a metric becomes a target, it ceases to be a good metric. Show velocity to demonstrate health, not to set expectations for continuous improvement."

The Anti-Gaming Narrative

Before showing velocity metrics, frame them correctly:

VELOCITY FRAMING SCRIPT
═══════════════════════════════════════════════════════════════

WRONG WAY:
"Our deployment frequency increased from 10 to 15 per week."

Why it's wrong:
- Implies 15 is better than 10
- Creates pressure to reach 20 next quarter
- Doesn't connect to business value

RIGHT WAY:
"Our deployment frequency is stable at 15 per week, which gives
us the ability to ship customer-requested changes same-day when
needed. This quarter, that meant we could respond to the Acme
security requirement in 48 hours instead of waiting for a release
window."

Why it works:
- Frames the number as enabling capability, not a target
- Connects to specific business outcome
- Doesn't imply higher is better

QBR Template with CodePulse Data

Here's a complete QBR template you can adapt. Each section maps to the four pillars and includes space for the specific metrics that matter.

QBR Agenda (90 Minutes)

ENGINEERING QBR AGENDA
═══════════════════════════════════════════════════════════════

TIME        SECTION                              OWNER
────────────────────────────────────────────────────────────────
0:00-0:05   Executive Summary                    VP Eng
0:05-0:20   Delivery Performance                 VP Eng
0:20-0:35   Quality & Reliability                VP Eng / SRE
0:35-0:50   Capacity & Team Health               VP Eng
0:50-1:05   Investment Allocation                VP Eng
1:05-1:15   Risks & Asks                         VP Eng
1:15-1:30   Q&A / Discussion                     All
────────────────────────────────────────────────────────────────

PREPARATION CHECKLIST:
[ ] Pull latest metrics from CodePulse (24 hours before)
[ ] Review previous QBR commitments (what did we promise?)
[ ] Draft risk section (what might go wrong?)
[ ] Prepare appendix slides for likely questions
[ ] Rehearse the 5-minute version (in case the agenda runs long)

Slide-by-Slide Template

SLIDE 1: EXECUTIVE SUMMARY
═══════════════════════════════════════════════════════════════

ENGINEERING HEALTH: [A/B/C/D/F] ([+/-] from last quarter)

Key Wins:
• [Major delivery achievement with business impact]
• [Quality or reliability improvement with business impact]
• [Team or capacity milestone]

Key Challenges:
• [Top risk or blocker with mitigation status]
• [Resource constraint or capacity issue]

Next Quarter Focus:
• [Top 3 priorities aligned with business goals]


SLIDE 2: DELIVERY PERFORMANCE
═══════════════════════════════════════════════════════════════

┌─────────────────────────────────────────────────────────────┐
│  DELIVERY SCORECARD                                          │
├─────────────────────────────────────────────────────────────┤
│  Roadmap Delivery:    [X]%     (target: 80%+)     [STATUS]  │
│  Cycle Time:          [X] days (target: <3 days)  [STATUS]  │
│  Deploy Frequency:    [X]/week (target: 10+)      [STATUS]  │
│  Unplanned Work:      [X]%     (target: <20%)     [STATUS]  │
└─────────────────────────────────────────────────────────────┘

What Shipped (Top 5):
1. [Feature] - [Business impact]
2. [Feature] - [Business impact]
3. [Feature] - [Business impact]
4. [Feature] - [Business impact]
5. [Feature] - [Business impact]

Delivery Trend (4 quarters):
Q1: [X]% → Q2: [X]% → Q3: [X]% → Q4: [X]%


SLIDE 3: QUALITY & RELIABILITY
═══════════════════════════════════════════════════════════════

┌─────────────────────────────────────────────────────────────┐
│  QUALITY SCORECARD                                           │
├─────────────────────────────────────────────────────────────┤
│  Change Failure Rate: [X]%     (target: <5%)      [STATUS]  │
│  MTTR:               [X] min   (target: <60 min)  [STATUS]  │
│  P0/P1 Incidents:    [X]       (target: <3)       [STATUS]  │
│  Uptime:             [X]%      (target: 99.9%+)   [STATUS]  │
└─────────────────────────────────────────────────────────────┘

Incident Summary:
• P0 Incidents: [Count] ([Root cause summary])
• P1 Incidents: [Count] ([Pattern if any])
• All post-incident actions completed: [Yes/No]

Quality Trend (4 quarters):
[Mini chart showing CFR and incident trends]


SLIDE 4: CAPACITY & TEAM HEALTH
═══════════════════════════════════════════════════════════════

┌─────────────────────────────────────────────────────────────┐
│  CAPACITY SCORECARD                                          │
├─────────────────────────────────────────────────────────────┤
│  Headcount:          [X] FTE   (plan: [X])        [STATUS]  │
│  Attrition (12mo):   [X]%      (target: <13%)     [STATUS]  │
│  Open Positions:     [X]       (avg time: [X] days)         │
│  Output/Engineer:    [X]/qtr   (trend: [+/-X]%)   [STATUS]  │
└─────────────────────────────────────────────────────────────┘

Team Distribution:
• [Team A]: [X] engineers - [Primary focus]
• [Team B]: [X] engineers - [Primary focus]
• [Team C]: [X] engineers - [Primary focus]

Key Person Risk:
• [System/Area]: Bus factor = [X] - [Mitigation status]

Hiring Pipeline:
• [X] interviews scheduled
• [X] offers extended
• [X] projected joins next quarter


SLIDE 5: INVESTMENT ALLOCATION
═══════════════════════════════════════════════════════════════

ENGINEERING TIME ALLOCATION:

New Features:    ████████████████████████████████░░░░░░░░  48%
Enhancements:    █████████████████░░░░░░░░░░░░░░░░░░░░░░░  22%
KTLO:            ████████████████░░░░░░░░░░░░░░░░░░░░░░░░  20%
Tech Debt:       ██████████░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  10%

Investment Trend:
             Q1      Q2      Q3      Target
────────────────────────────────────────────
Features     42%     45%     48%      50%
KTLO         28%     24%     20%      20%
Trend        ✓ Moving toward target

Notable Investment Decisions:
• [Initiative]: [X] FTE-months invested - [Outcome/status]
• [Initiative]: [X] FTE-months invested - [Outcome/status]


SLIDE 6: RISKS & ASKS
═══════════════════════════════════════════════════════════════

ACTIVE RISKS:

Risk                Impact      Likelihood   Mitigation
─────────────────────────────────────────────────────────
[Risk 1]            HIGH        MEDIUM       [Status]
[Risk 2]            MEDIUM      HIGH         [Status]
[Risk 3]            LOW         HIGH         [Status]

RESOURCE ASKS:

Ask                 Justification              Decision Needed
─────────────────────────────────────────────────────────────
[Headcount/Budget]  [Business case]            [Date]
[Tool/Investment]   [Business case]            [Date]

COMMITMENTS FOR NEXT QUARTER:

1. [Specific, measurable commitment]
2. [Specific, measurable commitment]
3. [Specific, measurable commitment]

📊 How to See This in CodePulse

CodePulse automates QBR data collection and visualization:

  • Executive Summary provides the health grade and key metrics for Slide 1
  • Year in Review generates quarterly trend data for all four pillars
  • Investment allocation is calculated automatically from PR categorization
  • Export to CSV or generate reports for board deck inclusion

Frequently Asked Questions

How long should an engineering QBR be?

90 minutes maximum. If you can't cover everything in 90 minutes, you're including too much detail. The goal is strategic discussion, not comprehensive reporting. Keep slides to 6-8 maximum, with appendix slides ready for deep dives.

Who should attend the engineering QBR?

Core attendees: CEO, CFO, VP Engineering, and key engineering directors. Optional: CPO/Head of Product (for roadmap alignment), CTO (if separate from VP Eng). Avoid inviting individual contributors—the QBR is a leadership forum.

How do I handle a QBR when metrics are bad?

Lead with transparency. Show the metrics clearly, explain the root cause, and present your remediation plan. Boards respect leaders who surface problems early with solutions in progress. Never hide or spin bad metrics—it destroys credibility when the truth eventually surfaces.

Should I share the QBR deck in advance?

Yes, send it 24-48 hours before the meeting. This gives attendees time to absorb the data and come prepared with questions. It also shifts meeting time from "presenting" to "discussing"—which is far more valuable.

How do I handle requests for individual contributor metrics?

Redirect firmly but diplomatically. Explain that individual metrics create gaming incentives and don't correlate with business outcomes. Offer team-level metrics instead. If pushed, share that industry research shows individual velocity metrics are counterproductive. For more on this topic, see our philosophy on engineer performance measurement.

What if executives ask for metrics I don't have?

Be honest: "We don't currently track that, but I can look into adding it." Then evaluate whether the metric is actually valuable or just something that sounds important. Not every requested metric deserves investment in tracking.

How do I compare our metrics to industry benchmarks?

Use published research from DORA, Accelerate, and engineering analytics vendors. Always contextualize: "Elite" DORA performance might not be appropriate for a regulated fintech with heavy compliance requirements. Benchmarks provide context, not targets.



See these insights for your team

CodePulse connects to your GitHub and shows you actionable engineering metrics in minutes. No complex setup required.

Free tier available. No credit card required.