Free SEO Tool

robots.txt Checker

Test if your robots.txt is accessible, properly formatted, and not accidentally blocking search engine crawlers.

Understanding robots.txt

What is robots.txt?

A plain text file served from the root of your domain (for example, https://example.com/robots.txt) that tells search engine crawlers which pages they may and may not access. It uses the Robots Exclusion Standard (RFC 9309) to communicate crawling rules.
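For illustration, a minimal robots.txt might look like the following (the paths and sitemap URL here are placeholders, not recommendations):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/

User-agent: Googlebot
Disallow: /staging/

Sitemap: https://example.com/sitemap.xml
```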

Why does 5xx matter?

If robots.txt returns a 5xx error, Google treats the whole site as disallowed and stops crawling it. If the errors persist for more than 30 days, Google falls back to its last cached copy of the file, or crawls without restrictions if no copy is available. This is very different from a 404, which Google treats as no restrictions from the start.
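The status-code rules above can be sketched in Python. This is a simplified illustration of the mapping, not our production checker; the user-agent string and timeout are assumptions:

```python
import urllib.error
import urllib.request

def interpret_status(status: int) -> str:
    """Map a robots.txt HTTP status to its effect on crawling."""
    if 200 <= status < 300:
        return "ok: rules apply as written"
    if status in (404, 410):
        return "not found: treated as no restrictions"
    if status >= 500:
        return "server error: treated as a full block"
    return "other: check manually"

def check_robots(domain: str) -> str:
    """Fetch https://<domain>/robots.txt and interpret the response status."""
    req = urllib.request.Request(
        f"https://{domain}/robots.txt",
        headers={"User-Agent": "robots-check/1.0"},  # hypothetical tool UA
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return interpret_status(resp.status)
    except urllib.error.HTTPError as e:
        return interpret_status(e.code)
    except urllib.error.URLError:
        # Unreachable server: treat it like a server error
        return interpret_status(503)
```

Note how a 404 and a 503 lead to opposite outcomes: the former opens the whole site to crawling, the latter closes it.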

Browser vs Bot differences

CDNs and bot-protection layers like Cloudflare can serve different responses to browsers and bots. Our checker fetches your robots.txt with both a Chrome user-agent and a Googlebot user-agent to detect this differential treatment.
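The comparison step can be sketched as follows. The user-agent strings are real, commonly used values, but the fetch-and-compare logic is a simplified assumption about how such a check works:

```python
import urllib.request

CHROME_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
)
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_robots(domain: str, user_agent: str) -> tuple[int, bytes]:
    """Fetch robots.txt with a given user-agent; return (status, body)."""
    req = urllib.request.Request(
        f"https://{domain}/robots.txt", headers={"User-Agent": user_agent}
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status, resp.read()

def compare(browser: tuple[int, bytes], bot: tuple[int, bytes]) -> list[str]:
    """List any differences between the browser and bot responses."""
    issues = []
    if browser[0] != bot[0]:
        issues.append(f"status differs: browser {browser[0]} vs bot {bot[0]}")
    if browser[1] != bot[1]:
        issues.append("body differs between browser and bot")
    return issues
```

An empty list from compare() means both perspectives saw the same file; anything else is a sign of differential serving worth investigating.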

Sitemap directives

Including Sitemap: directives in your robots.txt helps search engines discover your sitemap without relying on Search Console. Multiple sitemaps can be listed.
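As a sketch of how Sitemap: lines can be pulled out of a robots.txt file, here is a small Python helper. The parsing is deliberately simplified (split on the first colon, case-insensitive keyword) rather than a full RFC 9309 parser:

```python
def extract_sitemaps(robots_txt: str) -> list[str]:
    """Collect the URLs from every Sitemap: directive in a robots.txt body."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # Sitemap directives are case-insensitive and sit outside
        # any User-agent group, so every line is a candidate.
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            sitemaps.append(value.strip())
    return sitemaps
```

Because partition() splits only at the first colon, the "https://" in each URL survives intact, and multiple Sitemap: lines simply yield multiple entries.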

Frequently Asked Questions

Does this tool actually fetch robots.txt from my server?

Yes. We make real HTTP requests to your domain from both a Chrome browser user-agent and a Googlebot user-agent, then compare the responses.

Can I monitor robots.txt continuously?

Yes! Pulse Stack™ SEO Monitoring checks your robots.txt every 5 minutes, alerting you immediately if it starts returning errors or if the content changes unexpectedly.

What if my robots.txt returns different content to Googlebot?

This is a serious issue called differential serving. Our tool checks both perspectives. For continuous monitoring, use the SEO Monitoring dashboard which compares browser and bot responses every cycle.

Is there a limit to how many domains I can check?

The free tool has no per-user limits. For automated, scheduled monitoring across multiple domains, sign up for SEO Monitoring.

Monitor your robots.txt continuously

Get alerts within minutes if your robots.txt starts returning errors, changes unexpectedly, or starts blocking crawlers.

Start monitoring free