
The Enterprise Benchmark

Global GitHub stats are misleading. Here's what matters for teams.

3h: median cycle time for reviewed PRs

Based on 117,413 reviewed PRs | GitHub Archive / BigQuery | October 2025

Why Global Stats Don't Apply

When you see "GitHub average merge time" statistics, you're not seeing enterprise reality. You're seeing a mix of solo projects, hobby repos, automated bots, and instant self-merges.

The 85% Problem

85.4% of GitHub PRs have no code review at all. When you include them in "averages," you're not measuring code review—you're measuring the absence of it.

To find meaningful benchmarks for teams that actually practice code review, we filtered to PRs that received at least one review event. That's only 14.6% of all PRs—but it's the slice that matters for enterprise engineering teams.
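If you want to reproduce the filter on your own data, the idea is simple: keep only merged PRs that saw at least one PullRequestReviewEvent before the merge. Here's a minimal Python sketch; the event dictionaries and field names (type, created_at, merged_at) are stand-ins for however you've exported GitHub Archive events, not our exact pipeline.

```python
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    """Parse an ISO-8601 timestamp like '2025-10-07T14:03:22Z'."""
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def is_reviewed(pr_events: list[dict], merged_at: str) -> bool:
    """A PR counts as 'reviewed' if at least one PullRequestReviewEvent
    occurred before the merge timestamp (the filter described above)."""
    merged = parse_ts(merged_at)
    return any(
        e["type"] == "PullRequestReviewEvent" and parse_ts(e["created_at"]) < merged
        for e in pr_events
    )

# Hypothetical usage: keep only the ~14.6% of merged PRs with a pre-merge review.
# prs = {pr_id: {"events": [...], "merged_at": "..."}}
# reviewed = {k: v for k, v in prs.items() if is_reviewed(v["events"], v["merged_at"])}
```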

[Funnel: 803K GitHub PRs → Has code review? → 85.4% filtered out, 14.6% pass through → 117K reviewed PRs]

"85% of GitHub PRs have no review. Using global averages is like measuring traffic speed by including parked cars."

The 117,413 Reviewed PRs Sample

By filtering to PRs with actual code review, we get metrics that reflect team-based development workflows—the kind most enterprises care about.

117,413 Reviewed PRs
49,808 Unique Repos
50,274 Unique Authors
14.6% of All PRs

Global vs Enterprise: The Numbers

All GitHub PRs

Includes solo projects, hobby repos, self-merged code

Median Cycle Time: 0h (instant)
Self-Merge Rate: 71.48%
Weekend Merges: 21.49%
PRs with Review: 14.6%

Reviewed PRs Only (Recommended)

PRs with code review - more representative of team workflows

Median Cycle Time: 3h
Self-Merge Rate: 52.11%
Weekend Merges: 17.18%
PRs with Comments: 32.2%

Cycle Time Breakdown

The 3-hour median cycle time breaks down into three phases. Waiting for review dominates—but the story is more nuanced than the averages suggest.

Average Cycle Time by Phase (105.0 hours total)


Waiting (PR opened to first review): median 0.6h
Review (first review to approval): median 0h
Merge (approval to merged): median 0.1h

92% of time is waiting

Waiting for review (96.9h avg) dominates the cycle. Actual review time (4.8h) and merge delay (6.9h) are relatively quick once attention arrives.
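To make the phase definitions concrete, here's a minimal Python sketch of how the three durations can be computed from per-PR timestamps. The field names (opened_at, first_review_at, approved_at, merged_at) are assumed placeholders, not our exact schema.

```python
from datetime import datetime
from statistics import mean, median

def parse_ts(ts: str) -> datetime:
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def hours(start: str, end: str) -> float:
    """Elapsed hours between two ISO-8601 timestamps."""
    return (parse_ts(end) - parse_ts(start)).total_seconds() / 3600

def phase_breakdown(prs: list[dict]) -> dict:
    """Waiting = open -> first review, Review = first review -> approval,
    Merge = approval -> merged (the three phases above)."""
    waiting = [hours(p["opened_at"], p["first_review_at"]) for p in prs]
    review = [hours(p["first_review_at"], p["approved_at"]) for p in prs]
    merge = [hours(p["approved_at"], p["merged_at"]) for p in prs]
    return {
        "waiting": {"median_h": median(waiting), "mean_h": mean(waiting)},
        "review": {"median_h": median(review), "mean_h": mean(review)},
        "merge": {"median_h": median(merge), "mean_h": mean(merge)},
    }
```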

"The 3-hour median cycle time is the real enterprise benchmark—not GitHub's 0h (which reflects instant self-merges)."

Self-Merge Reality in Reviewed PRs

Even among PRs that receive code review, a significant portion are still self-merged. But the gap tells an important story.

Global Self-Merge (All GitHub PRs): 71.48%
Reviewed PRs Self-Merge: 52.11% (19pp lower than global)

The 19 percentage point gap shows that review culture makes a difference. Repos that practice code review have meaningfully lower self-merge rates—but 52% is still high. Even "reviewed" PRs often get approved and then merged by the author.
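For clarity, "self-merge" here means the account that merged the PR is the same account that opened it. A small sketch, assuming hypothetical author and merged_by fields on each PR record:

```python
def self_merge_rate(prs: list[dict]) -> float:
    """Share of merged PRs where the author merged their own PR.
    Assumes each record carries 'author' and 'merged_by' login fields."""
    merged = [p for p in prs if p.get("merged_by")]
    if not merged:
        return 0.0
    self_merged = sum(1 for p in merged if p["author"] == p["merged_by"])
    return 100 * self_merged / len(merged)

# 71.48% across all PRs vs 52.11% for reviewed PRs:
# 71.48 - 52.11 ≈ 19.4 percentage points, the "19pp" gap cited above.
```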

Our Take

Stop comparing your team to GitHub averages. They're meaningless for enterprise engineering.

The 3-hour median cycle time from reviewed PRs is your real benchmark. If your team is slower, focus on the waiting phase—that's 92% of the problem. If you're faster, you're likely operating at elite levels. Global stats include too much noise from solo projects and instant self-merges to be useful for team workflows.

What This Means for Your Team

Use 3 hours as your baseline

If your median PR cycle time exceeds 3 hours, you have room to improve. Elite teams ship faster—but this is the benchmark for teams with healthy review practices.

Focus on reducing wait time

92% of cycle time is waiting for that first review. Dedicated review times, smaller PRs, and better reviewer assignment can cut this dramatically.

Target 50% or lower self-merge

The 52% self-merge rate among reviewed PRs is a realistic baseline. If your rate is higher, your branch protection rules may not be effective; a sketch of enforcing required reviews via the GitHub API follows this list.

Ignore global "0h" cycle times

Reports showing instant merge times are measuring self-merges, not code review. They're not relevant to your engineering culture goals.
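On the branch-protection point above: one way to enforce at least one approving review is GitHub's branch protection REST API (PUT /repos/{owner}/{repo}/branches/{branch}/protection). The sketch below uses the requests library with placeholder owner, repo, and token values; treat it as an illustration, since your org may manage this through rulesets or the web UI instead.

```python
import os
import requests

# Hypothetical placeholders; adjust to your org.
OWNER, REPO, BRANCH = "acme", "payments-api", "main"
TOKEN = os.environ["GITHUB_TOKEN"]  # needs admin access to the repo

url = f"https://api.github.com/repos/{OWNER}/{REPO}/branches/{BRANCH}/protection"
payload = {
    # Require at least one approving review before merge.
    "required_pull_request_reviews": {
        "required_approving_review_count": 1,
        "dismiss_stale_reviews": True,
    },
    # The endpoint expects these keys; null leaves them disabled.
    "required_status_checks": None,
    "enforce_admins": True,
    "restrictions": None,
}
resp = requests.put(
    url,
    json=payload,
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    timeout=30,
)
resp.raise_for_status()
```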


Methodology

This analysis uses the "reviewed PRs" subset from our 2025 Engineering Benchmarks study. A PR is classified as "reviewed" if it received at least one PullRequestReviewEvent before merge. Data source: GitHub Archive / BigQuery for October 2025. Sample: 117,413 reviewed PRs from 49,808 repositories. Cycle time phases calculated from timestamp differences between PR events.
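For readers who want to pull a similar subset themselves, here's a rough Python sketch using the google-cloud-bigquery client against the public githubarchive dataset. The table name, JSON paths, and grouping logic are assumptions about the dataset's layout, not our exact query.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application default credentials

# Assumed layout: githubarchive.month.YYYYMM with a JSON 'payload' column.
QUERY = """
SELECT
  JSON_EXTRACT_SCALAR(payload, '$.pull_request.id') AS pr_id,
  repo.name                                         AS repo_name,
  type,
  created_at
FROM `githubarchive.month.202510`
WHERE type IN ('PullRequestEvent', 'PullRequestReviewEvent')
"""

# One row per event; downstream code groups by pr_id, keeps merged PRs with
# at least one PullRequestReviewEvent before merge, and diffs timestamps
# to get the waiting, review, and merge phases.
events = client.query(QUERY).to_dataframe()
print(len(events), "events fetched")
```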

Measure your team against the enterprise benchmark

CodePulse shows you exactly where your cycle time goes—waiting, review, or merge.