What GitHub's largest code review study reveals about how software really ships
- **83%** of 1,000+ line PRs merged with no review
- **68%** of all PRs self-merged
- **18x** less scrutiny per line on large PRs
- **53%** longer wait for new contributors
Based on analysis of 3,387,250 merged PRs from GitHub Archive / BigQuery | December 2024
We analyzed roughly 3.4 million merged pull requests from GitHub's public event archive to understand how code review actually works in practice. What we found challenges the assumption that "code review" is a universal practice.
The uncomfortable truth: The bigger the code change, the less anyone looks at it. 83% of pull requests over 1,000 lines ship without any formal code review.
We measured the percentage of PRs that receive zero formal code review (no approvals, no change requests, no review comments). The results are striking: across all PR sizes, the majority ship without any documented review process.
Based on 3,387,250 merged PRs. "No formal review" = zero approvals, zero change requests, zero review comments.
"83% of pull requests over 1,000 lines ship to production with zero formal code review."
We compared the PR author to the person who clicked "merge." In 68% of cases, they're the same person. While self-merge can be appropriate (small fixes, sole maintainers), at this scale it suggests "code review culture" may be more aspiration than reality.
- Author = merger: 2,346,381 PRs self-merged (68.03%)
- Merged by a different person: 1,102,410 PRs (31.97%)
"Two-thirds of all code on GitHub is merged by the same person who wrote it."
Intuition suggests larger changes need more review. The data says the opposite: review comments per 100 lines of code drop from 0.91 for tiny PRs to just 0.05 for massive ones, an 18x reduction in scrutiny per line.
As PR size grows, reviewers leave dramatically fewer comments per line of code, a pattern consistent with cognitive overload and rubber-stamping.
"The bigger the change, the less anyone looks at it. Large PRs receive 18x fewer review comments per line of code."
First-time contributors to a repository wait significantly longer for their PRs to be merged. This "onboarding tax" has real implications for open source sustainability and team onboarding.
| Cohort | Median time to merge | P90 | PRs analyzed |
|---|---|---|---|
| First-time contributors | 26h | 305h (12.7 days) | 133,315 |
| Repeat contributors | 17h | 199h (8.3 days) | 913,530 |
First-time contributors wait 53% longer (9 extra hours)
"First-time contributors wait 53% longer for their code to be reviewed. The onboarding tax is real."
Despite all the jokes about Friday deploys, Monday is the biggest merge day: 19% of all merges happen at the start of the week, which suggests teams clear their review backlogs after the weekend.
Monday has the highest merge rate (19.08%), followed by Tuesday (18.05%). Weekend days see the lowest activity.
"Monday isn't just the start of the work week—it's when 1 in 5 PRs finally get merged."
Bot PRs (Dependabot, Renovate, CI automation) peaked at 62% of all PRs in 2022 and have since declined to 34% in 2024 (the December 2024 snapshot below shows 37.9%). Teams are becoming more selective about what they automate.
Bot activity peaked at 62% in 2022; by 2024 the bot share had fallen to 34%, roughly half the peak level.
- Bot PRs (Dec 2024): 1,822,367 (37.9%)
- Human PRs (Dec 2024): 2,989,928 (62.1%)
"Bot PRs peaked at 62% in 2022, then declined to 34% in 2024."
Over a quarter of all code pushes happen on Saturday and Sunday. And 64% of PRs are opened outside traditional 9-5 UTC business hours. The "always-on" engineering culture is reflected in the data.
- **27.6%** of pushes happen on weekends (more than 1 in 4)
- **64%** of PRs are opened after hours (outside 09:00-17:00 UTC)
- **16:00 UTC** (4 PM) is the peak hour for opening PRs
"27% of code pushes happen on weekends. The always-on culture is real."
We analyzed cycle times across notable open source projects, and the variance is staggering: VS Code and Homebrew merge PRs same-day, while Zephyr RTOS has a median of more than four days. Context matters: embedded systems demand rigor, package managers value speed.
Fastest:

| Repository | Median Merge Time | PRs Analyzed |
|---|---|---|
| Homebrew/homebrew-cask | Same-day | 8,060 |
| microsoft/vscode | Same-day | 1,651 |
| Homebrew/homebrew-core | 1h | 8,459 |
| elastic/elasticsearch | 2h | 3,030 |
| elastic/kibana | 3h | 5,000 |
| grafana/grafana | 3h | 2,019 |
| home-assistant/core | 4h | 2,843 |
Slowest:

| Repository | Median Merge Time | PRs Analyzed |
|---|---|---|
| zephyrproject-rtos/zephyr | 102h (4.3 days) | 1,813 |
| llvm/llvm-project | 23h (1.0 days) | 5,044 |
| rust-lang/rust | 22h (0.9 days) | 1,570 |
| NixOS/nixpkgs | 20h (0.8 days) | 13,990 |
| cockroachdb/cockroach | 19h (0.8 days) | 2,430 |
"Review time varies 12x across elite projects—from 2 hours to 25 hours."
Different language ecosystems move at different speeds. PowerShell leads with a 6h median (likely due to heavily automated workflows), while C is slowest at 24h (C is not shown in the table below).
| Language | Merged PRs | Median Hours | Avg PR Size |
|---|---|---|---|
| PowerShell | 9,026 | 6h | 307 lines |
| Dockerfile | 5,704 | 11h | 85 lines |
| C# | 36,364 | 14h | 377 lines |
| Shell | 29,077 | 15h | 111 lines |
| TypeScript | 204,152 | 16h | 377 lines |
| JavaScript | 89,678 | 16h | 353 lines |
| Ruby | 22,001 | 16h | 145 lines |
| Rust | 45,942 | 18h | 299 lines |
| Java | 77,086 | 20h | 282 lines |
| Go | 72,837 | 20h | 260 lines |
| Python | 129,702 | 21h | 271 lines |
| Kotlin | 24,435 | 21h | 230 lines |
All data comes from GitHub Archive, a public dataset that records all public GitHub events. We queried it through the public `githubarchive` dataset on Google BigQuery.
| Metric | Sample Size |
|---|---|
| PR size and review analysis | 3,387,250 merged PRs |
| Self-merge analysis | 3,448,791 merged PRs |
| Contributor analysis | 1,046,845 merged PRs |
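As one illustration of the approach (not the exact queries behind these numbers), the base sample of merged PRs can be pulled from the `githubarchive` BigQuery dataset like this, following the GH Archive convention of one `PullRequestEvent` row per PR action with a JSON `payload`:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Count PRs merged in December 2024. Table naming (githubarchive.month.YYYYMM)
# and payload fields follow the GH Archive convention; the report's own
# queries may differ in detail.
SQL = """
SELECT COUNT(*) AS merged_prs
FROM `githubarchive.month.202412`
WHERE type = 'PullRequestEvent'
  AND JSON_EXTRACT_SCALAR(payload, '$.action') = 'closed'
  AND JSON_EXTRACT_SCALAR(payload, '$.pull_request.merged') = 'true'
"""

client = bigquery.Client()
for row in client.query(SQL).result():
    print(row.merged_prs)
```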
These benchmarks come from public open source projects. How do your private repositories stack up? CodePulse tracks review coverage, self-merge rates, and contributor wait times for your team.
No credit card required. 5-minute setup. Read-only GitHub permissions.