Crawl Budget

The number of pages a search engine will crawl on a site in a given time window, constrained by demand and capacity.

Definition
Slug
crawl-budget
Category
Technical
Also known as
crawl rate, crawl allocation

Crawl budget is the practical limit on how many of a site's URLs a search engine will fetch in a given time window. Google describes it as a function of two factors: crawl capacity (how fast the site can serve requests without degrading) and crawl demand (how much Google wants to crawl based on freshness signals and historical URL importance).

For most small and medium sites, crawl budget is not a constraint. Google can crawl tens of thousands of URLs without strain, and new content is typically discovered within hours. Crawl budget becomes a real concern at scale: large e-commerce catalogues, news archives, and faceted-navigation sites where the URL space can balloon into millions of variants.
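The facet explosion is easy to underestimate because it is multiplicative, not additive. A minimal sketch, using hypothetical facet names and counts (five filters with 20 values each, plus an "unset" state per filter):

```python
from math import prod

# Hypothetical product-listing page with five filter facets.
# Each facet has 20 values; every facet can also be left unset,
# so each contributes (20 + 1) possible states to a URL.
facet_values = {"brand": 20, "colour": 20, "size": 20, "price": 20, "rating": 20}

url_variants = prod(v + 1 for v in facet_values.values())
print(url_variants)  # 21**5 = 4,084,101 distinct crawlable URL variants
```

Five modest filters already mint over four million URLs from a single listing page, which is why faceted navigation dominates crawl-budget discussions.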

When crawl budget matters, the symptoms are visible in Search Console's Crawl Stats report: high crawl volume on low-value URLs (filter combinations, pagination, internal search results), slow discovery of new content, and stale indexes on pages that have been updated. The fix is rarely "ask Google to crawl more." The fix is to spend the existing budget more wisely.

Crawl-budget optimisation tactics: block low-value parameter URLs in robots.txt (Search Console's URL Parameters tool has been retired, so robots.txt is the main lever); consolidate near-duplicates with Canonical Tag; remove or noindex thin pages that absorb crawl bandwidth without ranking; improve server response time so each crawl request consumes less of the capacity envelope; and submit a clean XML Sitemap that prioritises the URLs you want crawled.
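The robots.txt tactic can be sketched as follows. This is an illustrative fragment, not a drop-in file: the parameter names (sort, colour, q) and the sitemap URL are placeholders for whatever your own faceted navigation and search pages use.

```
# Hypothetical robots.txt for a faceted e-commerce site.
# Block filter combinations and internal search results that
# absorb crawl budget without adding index value.
User-agent: *
Disallow: /*?*sort=
Disallow: /*?*colour=
Disallow: /search?

Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt blocks crawling, not indexing: URLs that are already indexed, or that attract external links, need a noindex or canonical solution instead.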

Crawl rate is also affected by site performance — specifically Time to First Byte (TTFB) and overall response latency. When the crawler detects that a site is slowing under load, Googlebot backs off to avoid degrading the site for human users. Improving back-end performance directly expands the crawl envelope.
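The relationship between latency and crawl volume can be approximated with a simple throughput model. This is a deliberate simplification, not Googlebot's actual scheduling algorithm: it just shows why, at a fixed number of parallel connections, halving response time roughly doubles the URLs fetched inside the same capacity envelope.

```python
def crawl_throughput(concurrency: int, avg_response_s: float) -> float:
    """Rough model: fetches per second ≈ parallel connections / latency.

    A sketch under the assumption that the crawler keeps a fixed
    connection budget and each fetch occupies a connection for the
    full response time.
    """
    return concurrency / avg_response_s

# At a hypothetical 10 parallel connections:
slow = crawl_throughput(10, 0.8)  # 800 ms average response
fast = crawl_throughput(10, 0.2)  # 200 ms average response
print(slow, fast)  # 12.5 vs 50.0 fetches per second
```

The same logic runs in reverse: when responses slow under load, the crawler reduces concurrency as well, so the crawl rate drops faster than linearly.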

For sites under a million URLs, treat crawl budget as a secondary concern. For sites above that, monitor Crawl Stats monthly. Look at the ratio of useful crawl (new URLs, recently updated URLs, important commercial URLs) to wasted crawl (filter variants, soft 404s, redirects). The healthier the ratio, the faster the site responds to content changes in search results.
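The useful-to-wasted ratio can be estimated from server logs or a Crawl Stats export. A minimal sketch, assuming a hypothetical list of crawled URL paths and a hand-maintained set of known filter/search parameters; a production version would also count soft 404s and redirect chains as waste:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical set of parameters that mark a URL as low-value crawl.
WASTE_PARAMS = {"sort", "colour", "page", "q"}

def classify(url: str) -> str:
    """Label a crawled URL as 'useful' or 'wasted' crawl.

    Simplified heuristic: any URL carrying a known filter or
    internal-search parameter counts as wasted crawl.
    """
    params = set(parse_qs(urlparse(url).query))
    return "wasted" if params & WASTE_PARAMS else "useful"

def crawl_ratio(urls):
    """Share of crawled URLs that were useful (0.0 to 1.0)."""
    labels = [classify(u) for u in urls]
    return labels.count("useful") / len(labels) if labels else 0.0

sample = [
    "/products/blue-widget",
    "/products?colour=blue&sort=price",
    "/search?q=widget",
    "/category/widgets",
]
print(crawl_ratio(sample))  # 0.5 — half the crawl hit useful URLs
```

Tracking this number month over month makes the effect of robots.txt changes and thin-page cleanups measurable rather than anecdotal.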
