The SPACE Framework: Measuring Developer Productivity Beyond DORA

Learn how to implement the SPACE framework from Microsoft and GitHub research to measure developer productivity across Satisfaction, Performance, Activity, Communication, and Efficiency.

16 min read · Updated December 25, 2025 · By CodePulse Team

The SPACE framework, developed by researchers at Microsoft, GitHub, and the University of Victoria, represents the most comprehensive approach to measuring developer productivity. Unlike single-metric approaches, SPACE acknowledges that productivity is multidimensional and requires balanced measurement across five distinct dimensions.

This guide explains each SPACE dimension, how to implement SPACE metrics using GitHub data, and how CodePulse maps to the framework—giving you a complete, research-backed approach to engineering productivity measurement.

What Is the SPACE Framework?

SPACE emerged from research published in 2021 by Nicole Forsgren (co-author of the DORA research), Margaret-Anne Storey, Chandra Maddila, Thomas Zimmermann, Brian Houck, and Jenna Butler. The framework was created to address a fundamental problem: single metrics like lines of code or commits fail to capture the full picture of developer productivity.

The name SPACE is an acronym for its five dimensions:

  • Satisfaction and well-being
  • Performance
  • Activity
  • Communication and collaboration
  • Efficiency and flow

The key insight of SPACE is that you should measure across multiple dimensions, at multiple levels (individual, team, organization), and balance quantitative metrics with qualitative signals.

💡 Why SPACE Matters

The DORA metrics focus on delivery outcomes. SPACE complements DORA by measuring the human factors that drive sustainable performance. Together, they provide a complete picture of engineering health.

See your engineering metrics in 5 minutes with CodePulse

The Five SPACE Dimensions

1. Satisfaction and Well-being

Definition: How fulfilled and healthy developers feel about their work, team, tools, and organization.

Why it matters: Satisfied developers are more productive, more likely to stay, and produce higher-quality work. Burnout is expensive—replacing a developer costs 50-200% of their annual salary.

How to Measure

| Metric | Data Source | Level |
| --- | --- | --- |
| Developer satisfaction score | Surveys (quarterly) | Individual, Team |
| After-hours work patterns | Git commit timestamps | Team |
| Weekend commit frequency | Git commit timestamps | Team |
| Tool satisfaction (NPS) | Surveys | Organization |

CodePulse implementation: CodePulse tracks commit timestamps and can surface after-hours and weekend work patterns through the Developer metrics views. For more on detecting burnout signals, see our Burnout Signals from Git Data guide.
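
If you want to prototype this signal yourself, the sketch below approximates after-hours and weekend commit share from a local clone's git history. It is only a sketch: the 09:00-18:00 weekday window and the 90-day lookback are assumptions you should adjust to your team's actual working hours.

```python
# Sketch: estimate after-hours and weekend commit share from local git history.
# Assumes you run it inside a cloned repository; the 09:00-18:00 Mon-Fri
# "work hours" window and 90-day lookback are assumptions to adjust.
import subprocess
from datetime import datetime

def commit_timestamps(since="90 days ago"):
    """Return author timestamps (with timezone) for recent commits."""
    out = subprocess.run(
        ["git", "log", f"--since={since}", "--pretty=format:%aI"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [datetime.fromisoformat(line) for line in out.splitlines() if line]

def after_hours_share(timestamps, start_hour=9, end_hour=18):
    """Fraction of commits made outside working hours or on weekends."""
    if not timestamps:
        return 0.0
    off_hours = sum(
        1 for ts in timestamps
        if ts.weekday() >= 5 or not (start_hour <= ts.hour < end_hour)
    )
    return off_hours / len(timestamps)

if __name__ == "__main__":
    stamps = commit_timestamps()
    print(f"After-hours/weekend commit share: {after_hours_share(stamps):.0%}")
```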

2. Performance

Definition: The outcomes of work—quality, impact, and whether code meets requirements.

Why it matters: Activity without outcomes is waste. Performance metrics ensure that what developers are doing actually delivers value.

How to Measure

| Metric | Data Source | Level |
| --- | --- | --- |
| Code quality (test failure rate) | CI/CD, GitHub status checks | Team, Repository |
| Review coverage % | GitHub PR data | Team, Repository |
| Merge without approval rate | GitHub PR data | Team, Repository |
| Customer satisfaction impact | Product analytics, NPS | Organization |

CodePulse implementation: The Dashboard tracks test failure rate, review coverage, and merge-without-approval rate directly from GitHub data. See our GitHub Code Quality Metrics guide for details.
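
For teams that want to reproduce review coverage and merge-without-approval rate themselves, here is a minimal sketch against the GitHub REST API. It assumes a personal access token in the GITHUB_TOKEN environment variable and uses placeholder owner/repo names; pagination and rate-limit handling are left out for brevity.

```python
# Sketch: review coverage and merge-without-approval rate for one repository
# via the GitHub REST API. "your-org"/"your-repo" and GITHUB_TOKEN are
# placeholders; pagination and rate limiting are omitted.
import os
import requests

API = "https://api.github.com"
HEADERS = {"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"}

def merged_prs(owner, repo, per_page=50):
    """Most recently updated closed PRs that were actually merged."""
    resp = requests.get(
        f"{API}/repos/{owner}/{repo}/pulls",
        headers=HEADERS,
        params={"state": "closed", "per_page": per_page},
    )
    resp.raise_for_status()
    return [pr for pr in resp.json() if pr.get("merged_at")]

def was_approved(owner, repo, number):
    """True if the PR received at least one APPROVED review."""
    resp = requests.get(
        f"{API}/repos/{owner}/{repo}/pulls/{number}/reviews", headers=HEADERS
    )
    resp.raise_for_status()
    return any(r["state"] == "APPROVED" for r in resp.json())

def review_metrics(owner, repo):
    prs = merged_prs(owner, repo)
    approved = sum(was_approved(owner, repo, pr["number"]) for pr in prs)
    total = len(prs) or 1
    return {
        "review_coverage": approved / total,
        "merge_without_approval_rate": 1 - approved / total,
    }

print(review_metrics("your-org", "your-repo"))  # placeholder names
```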

3. Activity

Definition: The count of actions or outputs completed—commits, PRs, reviews, etc.

Why it matters: Activity metrics are easy to collect but dangerous if used alone. They provide a baseline of "work being done" but must be balanced with quality and outcome metrics.

⚠️ Warning: Activity metrics can be gamed. Never use them as the sole measure of productivity. Always pair with Performance and Satisfaction metrics.

How to Measure

| Metric | Data Source | Level |
| --- | --- | --- |
| PRs merged per day/week | GitHub PR data | Team, Repository |
| Commits per day | Git commit data | Individual, Team |
| Reviews given/received | GitHub PR reviews | Individual, Team |
| LOC per day (with caveats) | Git commit data | Team |

CodePulse implementation: Activity metrics are available in the Developer leaderboard and Repository metrics. CodePulse excludes bot accounts and auto-generated code by default to provide accurate human activity data.
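
A rough way to compute these activity counts yourself is sketched below: it groups merged PRs by ISO week and tallies reviews given per person, filtering bot accounts with a simple name heuristic. The endpoints are standard GitHub REST calls; the owner/repo names, token variable, and bot filter are assumptions to adapt.

```python
# Sketch: weekly merged-PR counts and reviews given per person, excluding bots.
# Uses the same GitHub REST endpoints as the previous sketch; owner/repo,
# GITHUB_TOKEN, and the bot-name heuristic are assumptions to adapt.
import os
from collections import Counter
from datetime import datetime

import requests

API = "https://api.github.com"
HEADERS = {"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"}

def is_bot(login):
    # Heuristic: GitHub Apps end in "[bot]"; add your own service accounts.
    return login.endswith("[bot]") or login in {"dependabot"}

def activity(owner, repo, per_page=50):
    prs = requests.get(
        f"{API}/repos/{owner}/{repo}/pulls",
        headers=HEADERS,
        params={"state": "closed", "per_page": per_page},
    ).json()
    merged_per_week, reviews_given = Counter(), Counter()
    for pr in prs:
        if not pr.get("merged_at") or is_bot(pr["user"]["login"]):
            continue
        merged = datetime.fromisoformat(pr["merged_at"].replace("Z", "+00:00"))
        iso_year, iso_week, _ = merged.isocalendar()
        merged_per_week[f"{iso_year}-W{iso_week:02d}"] += 1
        reviews = requests.get(
            f"{API}/repos/{owner}/{repo}/pulls/{pr['number']}/reviews",
            headers=HEADERS,
        ).json()
        for review in reviews:
            if review.get("user") and not is_bot(review["user"]["login"]):
                reviews_given[review["user"]["login"]] += 1
    return merged_per_week, reviews_given

print(activity("your-org", "your-repo"))  # placeholder names
```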

4. Communication and Collaboration

Definition: How well developers work together—through code review, knowledge sharing, mentoring, and cross-team coordination.

Why it matters: Software development is a team sport. Individual productivity means nothing if teams can't collaborate effectively. Strong collaboration also prevents knowledge silos.

How to Measure

| Metric | Data Source | Level |
| --- | --- | --- |
| Review network density | GitHub PR reviews | Team |
| Cross-team review frequency | GitHub PR reviews | Organization |
| Knowledge silo risk | Code ownership analysis | Repository |
| Review sentiment (constructive vs. negative) | PR comment analysis | Team |

CodePulse implementation: The Review Network visualizes collaboration patterns, while Knowledge Silos analysis identifies files owned by a single contributor. See our Code Hotspots and Knowledge Silos guide.
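
Review network density can also be approximated without special tooling once you have (reviewer, author) pairs, for example collected with the PR review calls shown earlier. The sketch below uses hypothetical logins purely for illustration; density near 1 means everyone reviews everyone, while a low value with a few heavy edges suggests siloed review flow.

```python
# Sketch: review network density from (reviewer, author) pairs, e.g. gathered
# with the PR-review API calls shown earlier in this guide.

def review_network_density(review_pairs):
    """review_pairs: iterable of (reviewer_login, author_login) tuples."""
    edges = {(r, a) for r, a in review_pairs if r != a}  # ignore self-reviews
    people = {person for edge in edges for person in edge}
    possible = len(people) * (len(people) - 1)  # all possible directed pairs
    return len(edges) / possible if possible else 0.0

# Hypothetical logins, purely for illustration:
pairs = [("alice", "bob"), ("bob", "alice"), ("alice", "carol"), ("dana", "carol")]
print(f"Review network density: {review_network_density(pairs):.2f}")
```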

Detect code hotspots and knowledge silos with CodePulse

5. Efficiency and Flow

Definition: The ability to complete work with minimal interruptions and waste—how smoothly code flows from idea to production.

Why it matters: Flow state is when developers do their best work. Interruptions, context switching, and waiting for reviews all destroy flow.

How to Measure

| Metric | Data Source | Level |
| --- | --- | --- |
| PR cycle time | GitHub PR timestamps | Team, Repository |
| Wait time for review | GitHub PR timestamps | Team |
| Work in progress (WIP) | Open PR count | Individual, Team |
| Context switching (PRs across repos) | PR and commit data | Individual |

CodePulse implementation: Efficiency metrics are front and center in the Dashboard, with full cycle time breakdowns showing where time is spent (coding, waiting for review, in review, waiting for merge). See our Cycle Time Breakdown guide.
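
To see where a single PR's time goes, the sketch below derives a simplified breakdown (opened, first review, merged) from GitHub timestamps. It assumes a merged PR, a token in GITHUB_TOKEN, and placeholder owner/repo/number values; a fuller breakdown would also pull first-commit time to separate out the coding stage.

```python
# Sketch: simplified cycle-time breakdown for one merged PR from GitHub
# timestamps. Stages are an approximation (opened -> first review -> merged);
# owner/repo/number and GITHUB_TOKEN are placeholders. Assumes the PR is merged.
import os
from datetime import datetime
import requests

API = "https://api.github.com"
HEADERS = {"Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}"}

def _parse(ts):
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def cycle_time_breakdown(owner, repo, number):
    pr = requests.get(
        f"{API}/repos/{owner}/{repo}/pulls/{number}", headers=HEADERS
    ).json()
    reviews = requests.get(
        f"{API}/repos/{owner}/{repo}/pulls/{number}/reviews", headers=HEADERS
    ).json()
    opened, merged = _parse(pr["created_at"]), _parse(pr["merged_at"])
    first_review = min(
        (_parse(r["submitted_at"]) for r in reviews if r.get("submitted_at")),
        default=merged,  # no reviews: treat merge as the first review event
    )
    to_hours = lambda delta: delta.total_seconds() / 3600
    return {
        "wait_for_first_review_hours": to_hours(first_review - opened),
        "review_to_merge_hours": to_hours(merged - first_review),
        "total_cycle_hours": to_hours(merged - opened),
    }

print(cycle_time_breakdown("your-org", "your-repo", 123))  # placeholder PR
```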

Implementing SPACE in Your Organization

Step 1: Choose Metrics Across Multiple Dimensions

The SPACE researchers recommend measuring at least three of the five dimensions. Here's a practical starter set that balances coverage with measurability:

| Dimension | Recommended Metrics | Data Source |
| --- | --- | --- |
| Satisfaction | Quarterly developer survey + after-hours patterns | Survey tool + CodePulse |
| Performance | Test failure rate + review coverage | CodePulse Dashboard |
| Activity | PRs merged + reviews given | CodePulse Developer metrics |
| Communication | Review network health + knowledge silo count | CodePulse Review Network |
| Efficiency | PR cycle time + wait for review time | CodePulse Dashboard |
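
One way to keep the starter set actionable is to encode it as data your reporting scripts or dashboards iterate over. The structure below is only an illustration; the metric names mirror the table above, and the targets are placeholders to be set from your own baseline.

```python
# Sketch: the starter SPACE metric set as a simple config. Metric names and
# targets are illustrative placeholders; set real targets from baseline data.
SPACE_STARTER_SET = {
    "satisfaction": {"metrics": ["quarterly_survey_score", "after_hours_commit_share"]},
    "performance": {"metrics": ["test_failure_rate", "review_coverage"]},
    "activity": {"metrics": ["prs_merged_per_week", "reviews_given"]},
    "communication": {"metrics": ["review_network_density", "knowledge_silo_count"]},
    "efficiency": {"metrics": ["pr_cycle_time_hours", "wait_for_review_hours"]},
}

for dimension, spec in SPACE_STARTER_SET.items():
    print(f"{dimension}: {', '.join(spec['metrics'])}")
```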

Step 2: Measure at Multiple Levels

SPACE works best when applied at three levels:

  • Individual: For personal growth and development (never for ranking or performance reviews based solely on metrics)
  • Team: For identifying team-level bottlenecks and collaboration issues
  • Organization: For executive visibility and resource allocation

Step 3: Combine Quantitative and Qualitative

Git data tells you what is happening. Surveys and conversations tell you why. Both are essential for actionable insights.

📊 SPACE in CodePulse

CodePulse provides automated measurement for four of the five SPACE dimensions directly from GitHub data, plus a proxy signal for the fifth (Satisfaction):

  • Satisfaction: After-hours commit patterns via Developer metrics
  • Performance: Test failure rate, review coverage via Dashboard
  • Activity: PRs, commits, reviews via Developer leaderboard
  • Communication: Review Network and Knowledge Silos analysis
  • Efficiency: Full cycle time breakdown via Dashboard

SPACE vs. DORA: Complementary Frameworks

SPACE and DORA are not competing frameworks—they're complementary. Here's how they differ and how to use them together:

| Aspect | DORA | SPACE |
| --- | --- | --- |
| Focus | Delivery outcomes | Developer experience and productivity |
| Metrics | 4 specific metrics | 5 dimensions, flexible metrics |
| Level | Team/Organization | Individual/Team/Organization |
| Data sources | Mostly automated (CI/CD) | Mixed (automated + surveys) |
| Best for | Delivery performance | Developer productivity and well-being |

Recommendation: Use DORA for delivery metrics (deployment frequency, lead time, CFR, MTTR) and SPACE for broader productivity measurement. See our DORA Metrics Guide for detailed implementation.

Common Pitfalls to Avoid

1. Using Activity Metrics for Performance Reviews

Commits, PRs, and LOC are easily gamed. Using them in performance reviews incentivizes volume over value. Use SPACE metrics for team-level insights, not individual rankings.

2. Measuring Only What's Easy

Git data is easy to collect, but don't ignore Satisfaction. Quarterly surveys take effort but provide critical context that numbers alone can't show.

3. Optimizing a Single Dimension

Pushing for Activity (more PRs!) while ignoring Efficiency (long wait times) leads to burnout and frustration. Balance all dimensions.

4. Not Acting on Insights

Metrics without action are just vanity metrics. When SPACE reveals issues (high context-switching, review bottlenecks, after-hours work), commit to addressing them.

Getting Started with SPACE Metrics

  1. Connect your repositories: Add your GitHub organization to CodePulse to start collecting Activity, Performance, Communication, and Efficiency data automatically.
  2. Review the Dashboard: Your Dashboard shows key SPACE metrics including cycle time (Efficiency), review coverage (Performance), and PR activity (Activity).
  3. Explore Review Network: The Review Network visualizes Communication patterns and highlights collaboration gaps.
  4. Add a quarterly survey: Use a simple developer satisfaction survey to capture Satisfaction data that complements your automated metrics.
  5. Set dimension targets: Based on baseline data, set improvement targets for at least three SPACE dimensions.

Measure what matters. Build a better experience. Ship with confidence.

See these insights for your team

CodePulse connects to your GitHub and shows you actionable engineering metrics in minutes. No complex setup required.

Free tier available. No credit card required.