Crawl Budget
Sitecheck Team
The number of pages a search engine will crawl on your site within a given timeframe.
Crawl budget refers to the number of URLs a search engine crawler (such as Googlebot) will fetch and process on your site within a given period. It is influenced by site size, server response speed, and the perceived quality of your pages.
Why it matters: On large sites, wasting crawl budget on low-value or duplicate pages means important content gets crawled less frequently — or not at all — delaying indexing.
Quick tips:
- Block low-value URLs (pagination, filter combinations, internal search results) with robots.txt. Note that noindex alone does not save crawl budget: the crawler must still fetch the page to see the tag, so it keeps pages out of the index rather than out of the crawl.
- Improve site speed — faster responses allow more pages to be crawled in the same window.
- Reduce redirect chains, since each hop in a chain consumes an additional crawl request.
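The robots.txt approach from the first tip can be sketched as follows. This is a minimal illustrative example: the `/search` path and `filter` query parameter are hypothetical placeholders, and the right patterns depend entirely on your site's URL structure.

```
User-agent: *
# Block internal search result pages (hypothetical path)
Disallow: /search
# Block faceted filter combinations matched by query parameter (hypothetical pattern)
Disallow: /*?filter=
# Point crawlers at the canonical URL list
Sitemap: https://www.example.com/sitemap.xml
```

Before deploying rules like these, verify them with a robots.txt testing tool, since an overly broad `Disallow` pattern can block pages you want crawled.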
See also: robots.txt, noindex, sitemap.xml.