We have spent six months tracking GitHub engineering activity across thousands of startup organizations. The goal: figure out which public engineering patterns reliably precede private-market fundraising events.
Five patterns kept showing up. None of them is a guarantee. All of them appear with enough regularity to be useful as leading indicators — typically 6 to 12 weeks before a Series A announcement, sometimes earlier for later-stage rounds.
Here is the field guide.
Pattern 1: The Contributor Step Function
The most reliable single predictor is a sudden, sustained increase in unique contributors. Not a gradual climb — a step function. The team goes from 5 contributors to 12 in a two-week window.
Why it works. Most startups hire in bursts immediately after closing a round. New hires start committing code within days of joining. If you see the contributor count jump on the company’s public org, the round likely closed 2 to 4 weeks ago, and the announcement is 4 to 8 weeks out.
What to look for. Contributor count increases 50% or more in a 14-day rolling window, sustained for at least 4 weeks afterward.
Where it fails. Contractor bursts inflate this metric without implying a hire. Filter to contributors with 3 or more merged PRs in the window. Open-source community contributors around a public release also inflate it — distinguish employed contributors (matching email domain or prior-commit history with the company) from community contributors.
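A minimal sketch of the step-function check, assuming the windowed counts have already been computed and filtered upstream to contributors with 3 or more merged PRs (the function name and parameters are illustrative):

```python
def contributor_step(counts, jump=0.5, sustain=2):
    """counts: unique qualified contributors per consecutive 14-day
    window, oldest first. Returns the index of the first window where
    the count rose `jump` (50%) or more over the prior window AND held
    that level for `sustain` windows (~4 weeks), else None."""
    for i in range(1, len(counts) - sustain + 1):
        base = counts[i - 1]
        if base == 0:
            continue
        threshold = base * (1 + jump)
        # the jump must persist, not spike and fall back
        if all(c >= threshold for c in counts[i : i + sustain]):
            return i
    return None
```

A 5-to-12 jump held across two windows fires; a one-window spike does not.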
Pattern 2: The Infrastructure Explosion
A startup that suddenly creates 3 to 5 new public repositories in a single month is building platform infrastructure. This pattern typically appears at the Seed-to-Series-A transition: the core product works, and now the team is building the supporting ecosystem around it.
Why it works. Infrastructure buildout requires capital. Companies do not invest in platform engineering unless they have runway and a hiring plan to staff the platform team. The timing of new infra repos almost always coincides with a recent or imminent fundraise.

What to look for. 3 or more new repositories created in 30 days, where the new repos are infrastructure-related — SDKs, public APIs, internal tools made public, deployment configs, monitoring stacks — rather than experimental forks or documentation repos.
Where it fails. Some teams use monorepos, and this signal disappears entirely. Supplement with commit-message analysis for keywords like terraform, observability, migration, benchmark, and deploy.
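One way to express the repo-burst check, using the keyword list from this section (the name-matching heuristic and thresholds are illustrative, not a definitive classifier):

```python
INFRA_KEYWORDS = ("sdk", "api", "terraform", "deploy", "observability",
                  "monitoring", "migration", "benchmark", "infra")

def infra_repo_burst(new_repos, window_days=30, min_repos=3):
    """new_repos: [(repo_name, days_since_creation)].
    Flags 3+ infrastructure-looking repos created inside the window."""
    matched = [
        name for name, age in new_repos
        if age <= window_days
        and any(kw in name.lower() for kw in INFRA_KEYWORDS)
    ]
    return len(matched) >= min_repos, matched
```

For monorepo teams, the same keyword list can be run against commit messages instead of repo names, as suggested above.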
Pattern 3: The Weekend Surge
When a startup’s commit pattern shifts from weekday-only to seven-days-a-week, something has changed. This is especially meaningful when the weekend activity comes from multiple contributors, not a solo founder.
Why it works. Teams work weekends when they are racing toward a deadline. The most common deadlines are a product launch, a fundraise-related demo, a competitive response, or due-diligence preparation. Each of those is a signal that something significant is happening on the calendar.
What to look for. Sustained weekend commit activity across 2 or more contributors for 3 or more consecutive weekends.
Where it fails. Solo founders cosplay productivity by committing on weekends. The signal is in team-wide weekend activity, not in any individual's.
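A rough sketch of the weekend-surge check over (author, date) pairs. ISO week numbering is used for "consecutive weekends," with year rollover ignored for brevity; the function name is illustrative:

```python
from collections import defaultdict
from datetime import date

def weekend_surge(commits, min_authors=2, min_weekends=3):
    """commits: iterable of (author, date) pairs. True when 2+
    distinct authors commit on a weekend for 3+ consecutive weeks."""
    weekend_authors = defaultdict(set)
    for author, day in commits:
        if day.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
            weekend_authors[day.isocalendar()[:2]].add(author)
    # keep only (year, week) keys with enough distinct authors
    active = sorted(wk for wk, who in weekend_authors.items()
                    if len(who) >= min_authors)
    run = 1
    for prev, cur in zip(active, active[1:]):
        run = run + 1 if (cur[0] == prev[0] and cur[1] == prev[1] + 1) else 1
        if run >= min_weekends:
            return True
    return False
```

A single founder committing every Saturday never trips the check, because no week accumulates two distinct authors.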
Pattern 4: The Documentation Sprint
A sudden burst of documentation commits — README updates, API docs, architecture diagrams, contributing guides, onboarding materials — often precedes a fundraise or a launch. This is the team preparing for scrutiny.
Why it works. Documentation is the last thing engineering teams do voluntarily. When they document proactively, they are either preparing for due diligence (fundraise), opening up to community contributions (launch), or onboarding new hires (post-fundraise close). All three are interesting to investors.
What to look for. A week or more of documentation-heavy commits after a period of feature development. The sequence matters: a code-first, docs-second progression suggests intentional preparation for an external audience.
Where it fails. Some teams treat docs as feature work and ship them concurrently. In that case, the signal blurs into normal development. Look for a step change in the docs-to-code commit ratio rather than absolute volume.
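The step change in the docs-to-code ratio can be sketched as follows, assuming each commit has already been classified as docs-heavy or not (e.g. it touches only .md files or docs/ paths); the window size and factor are illustrative:

```python
def docs_ratio_step(flags, window=20, factor=3.0):
    """flags: chronological booleans, True when a commit is docs-heavy.
    True when the docs share of the last `window` commits is at least
    `factor`x the share in the preceding `window` commits."""
    if len(flags) < 2 * window:
        return False
    baseline = sum(flags[-2 * window : -window]) / window
    recent = sum(flags[-window:]) / window
    baseline = baseline or (1 / window)  # guard an all-code baseline
    return recent / baseline >= factor
```

Because it compares shares rather than absolute counts, a team that always ships docs alongside features stays below the threshold.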
Pattern 5: The Velocity Regime Change
The strongest signal in the dataset is not high velocity — it is a change in regime. A startup that averaged 30 commits per 14-day window for six months, then suddenly jumps to 90 commits for three consecutive 14-day windows, has undergone a fundamental shift.
Why it works. Velocity regime changes reflect organizational changes. The most common causes are new funding (more engineers, faster iteration), product-market fit (real users forcing the team to ship faster), or a strategic pivot (rebuilding the core). Regime changes that sustain for 6 or more weeks are particularly meaningful because they outlive a single launch sprint.
What to look for. Commit velocity exceeding the trailing 6-month average by 100% or more, sustained for 3 or more consecutive 14-day windows.
Where it fails. AI-only startups commit constantly regardless of stage. Their baseline is too noisy for this signal to work cleanly. Exclude AI-only orgs from the strongest classification tier and weight other features more heavily for them. Bot-generated commits (Dependabot, Renovate, GitHub Actions, semantic-release) also inflate velocity — filter on author email patterns and commit-message prefixes like chore(deps):.
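A sketch of the regime-change test with the bot filtering described above applied first. The hint lists come from this section; the exact matching rules are a simplification:

```python
BOT_AUTHOR_HINTS = ("dependabot", "renovate", "github-actions",
                    "semantic-release")
BOT_MESSAGE_PREFIXES = ("chore(deps):",)

def human_commits(commits):
    """commits: [(author_email, message)] — drop known bot traffic
    before computing velocity."""
    return [
        (author, msg) for author, msg in commits
        if not any(h in author.lower() for h in BOT_AUTHOR_HINTS)
        and not msg.lower().startswith(BOT_MESSAGE_PREFIXES)
    ]

def regime_change(window_sums, trailing=13, factor=2.0, sustain=3):
    """window_sums: commit counts per consecutive 14-day window, oldest
    first; 13 trailing windows ~= a 6-month baseline. True when each of
    the last `sustain` windows runs 100%+ above that baseline."""
    if len(window_sums) < trailing + sustain:
        return False
    baseline = sum(window_sums[:trailing]) / trailing
    return baseline > 0 and all(
        w >= factor * baseline for w in window_sums[-sustain:]
    )
```

The 30-to-90 example in this section fires; a single hot window followed by a lull does not.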
How to Combine the Patterns
The patterns are most powerful in combination. A startup showing Pattern 1 (contributor step function) and Pattern 5 (velocity regime change) at the same time is almost certainly in the middle of a fundraise process or has just closed one.
A practical scoring framework: weight each pattern at 1 point. A score of 3 or more out of 5 in any given month is your shortlist. A score of 4 or 5 is a “call this week” signal. A score of 2 or fewer is noise — likely a normal product-development sprint.
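The scoring framework is small enough to write down directly (the function name and labels are illustrative):

```python
def monthly_score(patterns):
    """patterns: dict mapping pattern name -> bool for the month.
    One point per firing pattern; thresholds as described above."""
    points = sum(bool(v) for v in patterns.values())
    if points >= 4:
        return points, "call this week"
    if points == 3:
        return points, "shortlist"
    return points, "noise"
```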
Cross-check the shortlist with traditional sources. If the company has a Series A close in Crunchbase from the last 60 days, the patterns are explained — the deal is done. If there is no public information and you are seeing 4 of 5 patterns, you are looking at the future.
What This Is Not
The signal is not investment advice. It tells you who to talk to. It does not tell you who to wire money to. The decision still requires founder conversations, product evaluation, market analysis, and competitive teardown. Engineering velocity is a sourcing signal, not a thesis.
The signal is also useless for stealth startups that do not push public code. Stealth founders are choosing OPSEC over discoverability — that trade-off blinds this method completely. There is no workaround.
And the signal is not real-time. The patterns require 14 to 30 days of post-event data to confirm. By the time the regime change has sustained across three windows, six weeks have passed. The lead time is real, but it is not instantaneous.
A Starter Workflow
For VCs, scouts, or angels who want to test this without building a full pipeline:
Pick 50 startups in the sectors you cover. For each company, hit the GitHub REST API for /orgs/{org}/repos?sort=pushed, grab the top 3 most recently pushed repos, and pull 90 days of commit activity. Compute a 14-day rolling sum. Compare the most recent 14-day sum against the trailing 90-day median. A ratio above 2.0× is your shortlist for Pattern 5.
Cross-check against contributor count today versus 30 days ago for Pattern 1. Cross-check against new-repo creation in the last 30 days for Pattern 2. The other two patterns require richer data, but Patterns 1, 2, and 5 give you a working shortlist with a single Python script and a 5,000-call/hour authenticated GitHub token.
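A starter sketch of the Pattern 5 leg using only the standard library. One adaptation worth noting: GitHub's commit-activity stats endpoint (GET /repos/{owner}/{repo}/stats/commit_activity) returns weekly totals, so the 14-day windows here are built from adjacent week pairs over the last 13 weeks (~90 days). The org name and token in the usage comment are placeholders, and the endpoint can return 202 while GitHub computes the stats, so a retry may be needed:

```python
import json
import statistics
import urllib.request

API = "https://api.github.com"

def fetch_json(path, token):
    """Minimal authenticated GET against the GitHub REST API."""
    req = urllib.request.Request(API + path, headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def recent_weekly_totals(org, repo, token, weeks=13):
    """The stats endpoint returns 52 weekly commit totals; the last
    13 weeks approximate the 90-day lookback."""
    activity = fetch_json(f"/repos/{org}/{repo}/stats/commit_activity", token)
    return [week["total"] for week in activity][-weeks:]

def pattern5_ratio(weekly_totals):
    """Pair adjacent weeks into 14-day sums, then compare the most
    recent sum against the median of the trailing sums."""
    sums = [a + b for a, b in zip(weekly_totals, weekly_totals[1:])]
    baseline = statistics.median(sums[:-1]) or 1  # guard a zero median
    return sums[-1] / baseline

# Usage (org and token are placeholders):
#   totals = recent_weekly_totals("some-startup", "core", token="ghp_...")
#   if pattern5_ratio(totals) > 2.0:
#       print("Pattern 5 candidate")
```

Contributor counts (Pattern 1) and new-repo creation dates (Pattern 2) come from /repos/{owner}/{repo}/contributors and the created_at field on /orgs/{org}/repos, fed into the same kind of comparison.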
Run it every Monday morning. The discipline of the cadence is more important than the sophistication of the model. The first few weeks, the signal will be noisy. By the eighth week, you will start to see the regime changes clearly.
Methodology and the live ranked watchlist across 20 sectors are at signals.gitdealflow.com/blog/5-github-patterns-that-predict-fundraises.