Team Benchmarks

What Separates Elite Engineering Teams

From 0 hours to 102 hours: why cycle time varies 102x across top projects.

102x

Cycle Time Variance

homebrew-cask (0h) vs zephyr (102h)

Based on 3,387,250 merged PRs | GitHub Archive / BigQuery | December 2024

The Speed Spectrum

When we analyzed cycle time (PR open to merge) across GitHub's most active projects, we found a staggering 102x difference between the fastest and slowest teams.
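
If you want to reproduce this kind of measurement, the sketch below shows roughly how a per-repo median cycle time can be pulled from the public GitHub Archive dataset on BigQuery. The table name, JSON paths, and the 1,000-PR cutoff are illustrative assumptions, not the exact query behind our numbers.

```python
# Sketch: median PR cycle time (open -> merge) per repo, from GitHub Archive on BigQuery.
# Assumptions: the public `githubarchive.month.202412` table, its JSON payload layout,
# and the volume cutoff are illustrative, not the exact query behind this study.
from google.cloud import bigquery

QUERY = """
SELECT
  repo.name AS repo_name,
  COUNT(*) AS merged_prs,
  APPROX_QUANTILES(
    TIMESTAMP_DIFF(
      TIMESTAMP(JSON_EXTRACT_SCALAR(payload, '$.pull_request.merged_at')),
      TIMESTAMP(JSON_EXTRACT_SCALAR(payload, '$.pull_request.created_at')),
      HOUR),
    100)[OFFSET(50)] AS median_cycle_time_hours
FROM `githubarchive.month.202412`
WHERE type = 'PullRequestEvent'
  AND JSON_EXTRACT_SCALAR(payload, '$.action') = 'closed'
  AND JSON_EXTRACT_SCALAR(payload, '$.pull_request.merged') = 'true'
GROUP BY repo_name
HAVING COUNT(*) >= 1000  -- keep only high-volume projects
ORDER BY median_cycle_time_hours
"""

client = bigquery.Client()  # needs Google Cloud credentials configured
for row in client.query(QUERY).result():
    print(f"{row.repo_name}: {row.median_cycle_time_hours}h over {row.merged_prs} PRs")
```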

Fastest Projects (Cycle Time)

Project | Median Cycle Time | PRs Analyzed
Homebrew/homebrew-cask | 0h | 8,060
microsoft/vscode | 0h | 1,651
Homebrew/homebrew-core | 1h | 8,459
elastic/elasticsearch | 2h | 3,030
elastic/kibana | 3h | 5,000
grafana/grafana | 3h | 2,019
home-assistant/core | 4h | 2,843

Slowest Projects (Cycle Time)

Project | Median Cycle Time | PRs Analyzed
zephyrproject-rtos/zephyr | 102h | 1,813
llvm/llvm-project | 23h | 5,044
rust-lang/rust | 22h | 1,570
NixOS/nixpkgs | 20h | 13,990
cockroachdb/cockroach | 19h | 2,430

"Homebrew merges PRs in 0 hours median. Zephyr takes 102 hours. Same platform, same tools—102x difference in speed."

Review Time Benchmarks

Time to first review varies 12x across projects—from 2 hours to 25 hours median.
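
As a reminder, review time here means the gap between PR creation and the first review comment or approval. The sketch below shows one way to compute the per-repo median, assuming you already have PR and review events exported into dataframes; the column names and sample rows are hypothetical.

```python
# Sketch: median time to first review per repo. The dataframes, column names, and
# sample rows are hypothetical stand-ins for exported PR and review events.
import pandas as pd

prs = pd.DataFrame({  # one row per merged PR
    "repo": ["Homebrew/homebrew-core", "elastic/kibana"],
    "pr_number": [1, 2],
    "created_at": pd.to_datetime(["2024-12-01 09:00", "2024-12-01 09:00"], utc=True),
})
reviews = pd.DataFrame({  # one row per review comment or approval
    "repo": ["Homebrew/homebrew-core", "elastic/kibana", "elastic/kibana"],
    "pr_number": [1, 2, 2],
    "submitted_at": pd.to_datetime(
        ["2024-12-01 11:00", "2024-12-02 03:00", "2024-12-02 08:00"], utc=True),
})

# Earliest review per PR, joined back to the PR's creation time.
first_review = reviews.groupby(["repo", "pr_number"], as_index=False)["submitted_at"].min()
joined = prs.merge(first_review, on=["repo", "pr_number"], how="inner")
joined["hours_to_first_review"] = (
    joined["submitted_at"] - joined["created_at"]
).dt.total_seconds() / 3600

print(joined.groupby("repo")["hours_to_first_review"].median())
```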

Fastest Review

homebrew-core | 2h
homebrew-cask | 2h
core | 7h
rust | 8h
llvm-project | 10h
next.js | 11h
nixpkgs | 13h

Slowest Review

App | 25h
odoo | 24h
dagster | 24h
kibana | 18h
pytorch | 16h

Case Study: Homebrew (The Speed Machine)

Homebrew consistently tops our benchmarks, with a 0-1 hour median cycle time across homebrew-cask and homebrew-core. How do they do it?

Automated Everything

Extensive CI automation validates package updates. If the tests pass, the PR often merges automatically. Human review is the exception, not the rule.
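
To make the pattern concrete, here is a minimal "merge when CI is green" sketch against the GitHub REST API. This is not Homebrew's actual tooling, and the repository name, token handling, and squash policy are placeholders.

```python
# Minimal "merge when CI is green" sketch against the GitHub REST API.
# NOT Homebrew's actual automation; the repo name, token handling, and squash
# policy below are placeholders for illustration only.
import os
import requests

API = "https://api.github.com"
REPO = "example-org/example-packages"  # placeholder repository
HEADERS = {
    "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
    "Accept": "application/vnd.github+json",
}

def ci_is_green(head_sha: str) -> bool:
    """True if every check run on the PR's head commit finished successfully."""
    resp = requests.get(
        f"{API}/repos/{REPO}/commits/{head_sha}/check-runs", headers=HEADERS
    )
    runs = resp.json()["check_runs"]
    return bool(runs) and all(run["conclusion"] == "success" for run in runs)

def merge_green_prs() -> None:
    """Merge every open PR whose CI checks have all passed."""
    prs = requests.get(f"{API}/repos/{REPO}/pulls?state=open", headers=HEADERS).json()
    for pr in prs:
        if ci_is_green(pr["head"]["sha"]):
            requests.put(
                f"{API}/repos/{REPO}/pulls/{pr['number']}/merge",
                headers=HEADERS,
                json={"merge_method": "squash"},
            )

if __name__ == "__main__":
    merge_green_prs()
```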

Narrow Scope

Most PRs are version bumps or new package additions—low risk, templated changes. The review burden is minimal because the blast radius is small.

Key Insight: Homebrew optimized for throughput by standardizing contributions.

8,060 PRs merged in homebrew-cask with a 0-hour median cycle time.

"VS Code merges PRs in 0 hours median. LLVM takes 23 hours. Elite teams aren't just fast—they've designed their process to be fast."

Case Study: Zephyr (Slow by Design)

The Zephyr RTOS project has a 102-hour median cycle time—the slowest in our dataset. But this isn't a problem to fix; it's a feature.

Safety-Critical Domain

Zephyr powers embedded systems, medical devices, and IoT infrastructure. A bug here can brick hardware or create safety hazards. Thorough review is mandatory.

Multi-Stage Review

Changes go through architecture review, code review, and testing gates. Multiple maintainers must sign off. Speed is sacrificed for correctness.

Key Insight: Zephyr's slow cycle time is appropriate for its domain.

1,813 PRs merged with rigorous review—because the cost of bugs is measured in recalls and safety incidents, not just Slack messages.

🔥 Our Take

"Elite" doesn't mean "fastest." It means "appropriate for context."

Homebrew's 0-hour median isn't better than Zephyr's 102-hour median—they're solving different problems. The real question isn't "how do we get faster?" but "what's the right speed for our risk profile?" A package manager can afford to ship fast and revert. An RTOS kernel can't. Know your domain before setting benchmarks.

"Review times vary 12x across elite projects—from 2 hours to 25 hours."

What Your Team Can Learn

Benchmark Against Similar Projects

Don't compare your enterprise backend to Homebrew. Find projects with similar risk profiles and complexity. Elastic and Grafana (3h median) are better comparisons for most teams.

Automate the Obvious

Homebrew's speed comes from automation, not heroics. Identify your version-bump equivalents—changes that can be validated by CI without human review.
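
One low-tech way to find those candidates is a classifier that flags a change as low-risk only when every touched file matches a known-safe pattern. The sketch below is a starting point; the patterns and example paths are placeholders you would tune to your own repo.

```python
# Sketch: treat a change as "low-risk" only when every touched file matches a
# known-safe pattern (your equivalent of Homebrew's version bumps).
# The patterns and example paths below are placeholders.
import fnmatch

LOW_RISK_PATTERNS = ["deps/*.lock", "charts/*/Chart.yaml", "config/versions/*.toml"]

def is_low_risk(changed_files: list[str]) -> bool:
    """True if every changed file matches at least one low-risk pattern."""
    return all(
        any(fnmatch.fnmatch(path, pattern) for pattern in LOW_RISK_PATTERNS)
        for path in changed_files
    )

# A pure dependency bump qualifies; anything touching app code does not.
print(is_low_risk(["deps/openssl.lock"]))                   # True
print(is_low_risk(["deps/openssl.lock", "src/server.py"]))  # False
```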

Scope Down, Speed Up

Smaller, focused PRs get reviewed faster everywhere. VS Code's 0-hour median comes partly from a culture of small, atomic changes.

Know When Slow is Right

If you're building infrastructure, databases, or safety-critical systems, Zephyr and LLVM are your peers. A 20+ hour cycle time might be exactly right.

Methodology

This analysis is based on 3,387,250 merged pull requests from GitHub Archive / BigQuery during December 2024. Cycle time is measured from PR creation to merge event. Review time is measured from PR creation to the first review comment or approval. Only projects with sufficient PR volume are shown, to keep the medians statistically reliable. For full methodology, see the complete study.

Benchmark your team against elite projects

CodePulse shows you exactly where your cycle time and review time stand—and what's slowing you down.