Add a robots.txt

robots.txt is the first file most crawlers fetch. A missing or misconfigured one either gives crawlers no guidance or, worse, blocks them outright: a stray Disallow: / tells crawlers to skip the entire site, which can cause pages to drop out of search results. robots.txt is also where you control access for AI crawlers such as GPTBot and ClaudeBot.

How to fix

Serve a robots.txt that allows crawling of public content, disallows only private paths, declares the sitemap, and makes an explicit decision about whether AI bots are allowed.

User-agent: *
Allow: /
Disallow: /admin

Sitemap: https://example.com/sitemap.xml
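
If you decide not to allow AI training crawlers, you can extend the same file with per-bot rules. The user-agent tokens below (GPTBot for OpenAI, ClaudeBot for Anthropic) are the ones named above; blocking them is a policy choice, not a requirement:

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
```

Per-bot groups like these take precedence over the `User-agent: *` group for the named crawler, so regular search engine crawling is unaffected.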

This issue is detected automatically by the SEOlvl SEO Health audit (the robots check). Run a free audit or browse the full issue library.
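
You can also sanity-check your rules locally before deploying. A minimal sketch using Python's standard-library urllib.robotparser (note that Python's parser applies rules in file order, while Google uses longest-match precedence, so keep specific Disallow rules before a blanket Allow when testing both ways):

```python
from urllib.robotparser import RobotFileParser

# Rules matching the example above; in production you would
# fetch the live file with rp.set_url(...) and rp.read() instead.
rules = """\
User-agent: *
Disallow: /admin
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# /admin is blocked, everything else is allowed by default.
print(rp.can_fetch("*", "https://example.com/admin"))  # False
print(rp.can_fetch("*", "https://example.com/about"))  # True
```

Running this against the deployed file is a cheap way to catch an accidental Disallow: / before crawlers do.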