Cycle Time Analysis

Cycle Time Decoded

Where Your PRs Spend Their Time

89%

Waiting Time

of PR cycle time is idle

Based on 117,413 reviewed PRs | GitHub Archive / BigQuery | October 2025

When engineering leaders talk about "speeding up delivery," they often focus on the wrong thing. The data reveals a surprising truth: actual review time is a tiny fraction of your PR lifecycle. The real bottleneck? Waiting.

The 3 Phases of PR Lifecycle

Every PR that goes through code review passes through three distinct phases. Understanding where time actually goes is the first step to optimization.

[Timeline diagram: PR Created → Merged. Waiting for Review 89% (~96.9h avg), Review 4% (~4.8h avg), Merge 6% (~6.9h avg)]

PR lifecycle phases shown to scale based on average time spent in each phase

Waiting for Review

From PR opened to first review comment. The developer is blocked, waiting for someone to look at their code.

96.9h avg

In Review

From first review to approval. Active collaboration between author and reviewers.

4.8h avg

Merge Delay

From approval to merge. The PR is approved but not yet merged—often waiting for CI, deployment windows, or final checks.

6.9h avg
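The three phases above can be computed directly from a PR's event timestamps. Here is a minimal sketch; the field names and timestamps are illustrative, not the actual GitHub Archive schema:

```python
from datetime import datetime

def phase_durations(opened, first_review, approved, merged):
    """Split a PR's lifecycle into the three phases: waiting, review, merge.

    Arguments are ISO-8601 timestamp strings (illustrative shape, not a real API).
    """
    t = [datetime.fromisoformat(s) for s in (opened, first_review, approved, merged)]
    hours = lambda a, b: (b - a).total_seconds() / 3600
    return {
        "waiting": hours(t[0], t[1]),  # PR opened -> first review
        "review":  hours(t[1], t[2]),  # first review -> approval
        "merge":   hours(t[2], t[3]),  # approval -> merge
    }

d = phase_durations("2025-10-01T09:00", "2025-10-01T15:00",
                    "2025-10-01T16:30", "2025-10-01T17:00")
print(d)  # waiting=6.0h, review=1.5h, merge=0.5h
```

Summing these three durations per PR, then averaging across all 117,413 PRs, yields the phase averages shown above.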

"89% of PR cycle time is spent waiting for the first review. The actual review takes less than 10% of total time."

Where Time Actually Goes

The breakdown is stark: 89% of your PR lifecycle is pure idle time. Developers aren't slow—they're blocked.

Cycle Time Breakdown (Average Hours)

  • Waiting: 96.9h (89%)
  • Review: 4.8h (4%)
  • Merge: 6.9h (6%)

Average vs P90 Cycle Time by Phase

[Bar chart comparing average vs P90 hours for the Waiting, Review, and Merge phases]

P90 marks the threshold the slowest 10% of PRs exceed. At P90, total cycle time reaches 148.9 hours.
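The gap between average and P90 comes from the long tail of wait times. A quick sketch with synthetic data (an exponential distribution chosen only to mimic the skew; the parameters are assumptions, not the study's data) shows how the P90 can far exceed the average:

```python
import random
import statistics

# Synthetic wait times (hours) with a long tail: most PRs are picked up
# quickly, a few wait for days. Mean of 20h is an arbitrary illustration.
random.seed(1)
waits = [random.expovariate(1 / 20) for _ in range(10_000)]

avg = statistics.mean(waits)
p90 = sorted(waits)[int(0.9 * len(waits))]  # value 90% of PRs stay under
print(f"avg={avg:.1f}h  p90={p90:.1f}h")    # p90 is roughly 2x the average
```

This is why reporting only the average hides the worst-case experience: the tail, not the typical PR, is where teams feel the pain.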

How PR Size Affects Each Phase

Larger PRs don't just take longer to review—they wait longer to be picked up. The data shows a clear correlation between PR size and total cycle time.

Median Cycle Time by PR Size

[Line chart: median Total Cycle Time and Wait Time by PR size, from Tiny to Massive]
PR Size          | PRs    | Median Wait | Median Total
Tiny (<10)       | 31,491 | 0.2h        | 1.1h
Small (10-50)    | 25,754 | 0.5h        | 2.6h
Medium (50-200)  | 23,192 | 0.9h        | 4.7h
Large (200-500)  | 14,465 | 0.9h        | 5.8h
XL (500-1000)    | 8,356  | 1.1h        | 8.7h
Massive (1000+)  | 14,155 | 0.8h        | 7.3h

Note: Massive PRs (1000+) show lower median wait than XL PRs, suggesting these are often automated or pre-approved changes that bypass normal review queues.
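The size bucketing above is straightforward to reproduce. A minimal sketch, assuming each PR is just a (lines changed, wait hours) pair; the thresholds match the table:

```python
import statistics

SIZE_BUCKETS = [  # (label, exclusive upper bound on lines changed)
    ("Tiny", 10), ("Small", 50), ("Medium", 200),
    ("Large", 500), ("XL", 1000), ("Massive", float("inf")),
]

def bucket(lines_changed):
    """Return the size label for a given lines-changed count."""
    for label, upper in SIZE_BUCKETS:
        if lines_changed < upper:
            return label

def median_wait_by_size(prs):
    """prs: iterable of (lines_changed, wait_hours) pairs (illustrative shape)."""
    groups = {}
    for lines, wait in prs:
        groups.setdefault(bucket(lines), []).append(wait)
    return {label: statistics.median(w) for label, w in groups.items()}

sample = [(5, 0.2), (8, 0.3), (120, 0.9), (700, 1.2), (2500, 0.7)]
print(median_wait_by_size(sample))
# {'Tiny': 0.25, 'Medium': 0.9, 'XL': 1.2, 'Massive': 0.7}
```

Medians are used rather than averages for the same reason as above: the long tail would otherwise dominate each bucket.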

DORA Mapping: Lead Time for Changes

DORA's "Lead Time for Changes" measures the time from code commit to production. PR cycle time is a key component of this metric. Here's how the data maps to DORA performance tiers.

Elite

< 1 hour

Deploy on demand, multiple times per day

High

< 1 day

Deploy between once per day and once per week

Medium

< 1 week

Deploy between once per week and once per month

Low

> 1 week

Deploy less frequently than once per month

Where do reviewed PRs fall?

With a median of 3 hours, reviewed PRs on GitHub land solidly in the High tier. But P90 at 149 hours shows the tail extends into Medium territory—there's room to improve.
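The tier cutoffs above reduce to a few threshold comparisons. A sketch of the mapping, using the hour boundaries listed in this section:

```python
def dora_tier(lead_time_hours):
    """Map a lead time (hours) to a DORA performance tier per the cutoffs above."""
    if lead_time_hours < 1:
        return "Elite"
    if lead_time_hours < 24:          # under one day
        return "High"
    if lead_time_hours < 24 * 7:      # under one week
        return "Medium"
    return "Low"

print(dora_tier(3))    # the median reviewed PR -> "High"
print(dora_tier(149))  # the P90 tail -> "Medium"
```

Note that PR cycle time is only one component of Lead Time for Changes, so a PR-level tier is a lower bound on where the full commit-to-production metric would land.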

Benchmarks by Performance Tier

Based on our analysis of 117,413 reviewed PRs, here are the cycle time benchmarks for each performance tier.

Elite Teams (Top 10%)

Target: <1 hour total cycle time. Achieved through near-instant review pickup (median 0.2h for tiny PRs), automated testing, and streamlined merge processes.

High-Performing Teams (Top 50%)

Target: <4 hours total cycle time. The median for reviewed PRs is 3 hours—if you're here, you're doing well. Focus on reducing wait time for your larger PRs.

Average Teams (25th-75th percentile)

Target: <24 hours total cycle time. Same-day merge is the goal. If PRs are regularly carrying over to the next day, investigate your review queue.

Teams Needing Improvement (Bottom 25%)

Warning: >24 hours cycle time. At P90, PRs take 149+ hours. This usually indicates understaffed review capacity, unclear ownership, or process bottlenecks.

🔥 Our Take

Stop optimizing review time. Start optimizing wait time.

The data is clear: 89% of cycle time is waiting for someone to start reviewing. The actual review? Just 4%. Teams obsess over "faster reviews" when they should focus on "faster pickup." Assign reviewers automatically, set SLAs for first response, and make review queue visibility a team priority. The review itself isn't the bottleneck—getting started is.
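An SLA for first response is easy to enforce mechanically. A minimal sketch of a queue check that flags PRs breaching a first-review SLA; the 4-hour threshold and the PR tuple shape are assumptions for illustration, not a real API:

```python
from datetime import datetime, timedelta, timezone

REVIEW_SLA = timedelta(hours=4)  # illustrative first-response target

def overdue_prs(open_prs, now=None):
    """Return IDs of PRs with no first review that have breached the SLA.

    open_prs: iterable of (pr_id, opened_at, first_review_at_or_None).
    """
    now = now or datetime.now(timezone.utc)
    return [pr_id for pr_id, opened, first_review in open_prs
            if first_review is None and now - opened > REVIEW_SLA]

now = datetime(2025, 10, 15, 12, 0, tzinfo=timezone.utc)
prs = [
    ("PR-1", now - timedelta(hours=6), None),                      # breached
    ("PR-2", now - timedelta(hours=2), None),                      # within SLA
    ("PR-3", now - timedelta(hours=9), now - timedelta(hours=8)),  # reviewed
]
print(overdue_prs(prs, now))  # ['PR-1']
```

Run on a schedule and posted to a team channel, a check like this makes the review queue visible—which, per the numbers above, is where most of the cycle time hides.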


Methodology

This analysis is based on 117,413 merged pull requests that received at least one code review event, from GitHub Archive / BigQuery during October 2025. Cycle time phases are calculated as: Waiting = PR opened to first review; Review = first review to approval; Merge = approval to merge. PRs without review events are excluded from this analysis. For full methodology and all findings, see the 2025 Engineering Benchmarks.

See your team's cycle time breakdown

CodePulse shows exactly where your PRs spend time—and where to optimize.