robots.txt Validator
Paste your robots.txt content and validate its syntax and formatting.
3 errors
5 warnings
2 notices
- Warning (line 1): Directive casing "user-agent" is non-standard; use "User-Agent".
      user-agent: *
- Notice (line 2): An empty Disallow means "allow everything"; valid, but confirm it is intentional.
      Disallow:
- Warning (line 5): Directive casing "disallow" is non-standard; use "Disallow".
      disallow: /secret
- Warning (line 6): Directive casing "User-agent" is non-standard; use "User-Agent".
      User-agent: Googlebot
- Error (line 7): Missing colon; every directive must use the "Field: value" format.
      Disallow /no-slash-colon
- Notice (line 8): An empty Allow has no effect and can be removed.
      Allow:
- Warning (line 9): Directive casing "Crawl-delay" is non-standard; use "Crawl-Delay".
      Crawl-delay: fast
- Error (line 9): Crawl-delay value "fast" is not a number; expected a delay in seconds (e.g. 10).
      Crawl-delay: fast
- Error (line 10): Sitemap URL "not-a-url" must be an absolute URL starting with http:// or https://.
      Sitemap: not-a-url
- Warning (line 11): Directive casing "User-agent" is non-standard; use "User-Agent".
      User-agent: Bingbot
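The checks behind these findings can be sketched as a small line-by-line validator. This is a minimal illustration of the rule set implied by the report above, not the tool's actual implementation; the canonical casings and message wording are taken from the findings, and the function and table names are hypothetical.

```python
# Sketch of a robots.txt line validator; rules mirror the findings above
# (casing, missing colon, empty values, numeric Crawl-Delay, absolute Sitemap URL).
from urllib.parse import urlparse

# Assumed canonical casings, taken from the warnings in the report above.
CANONICAL = {
    "user-agent": "User-Agent",
    "disallow": "Disallow",
    "allow": "Allow",
    "crawl-delay": "Crawl-Delay",
    "sitemap": "Sitemap",
}

def validate(text):
    """Return a list of (severity, line_number, message) findings."""
    findings = []
    for n, raw in enumerate(text.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue  # blank lines are fine
        if ":" not in line:
            findings.append(("Error", n, 'Missing colon; every directive must use the "Field: value" format.'))
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        canonical = CANONICAL.get(field.lower())
        if canonical is None:
            findings.append(("Warning", n, f'Unknown directive "{field}".'))
            continue
        if field != canonical:
            findings.append(("Warning", n, f'Directive casing "{field}" is non-standard; use "{canonical}".'))
        if canonical == "Disallow" and not value:
            findings.append(("Notice", n, 'An empty Disallow means "allow everything"; confirm it is intentional.'))
        if canonical == "Allow" and not value:
            findings.append(("Notice", n, "An empty Allow has no effect and can be removed."))
        if canonical == "Crawl-Delay":
            try:
                float(value)
            except ValueError:
                findings.append(("Error", n, f'Crawl-delay value "{value}" is not a number.'))
        if canonical == "Sitemap" and urlparse(value).scheme not in ("http", "https"):
            findings.append(("Error", n, f'Sitemap URL "{value}" must be an absolute URL starting with http:// or https://.'))
    return findings
```

Feeding in a fragment like the one reported above reproduces the same severities in order, and a clean file yields an empty findings list.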