Free Robots.txt Validator



The Robots Exclusion Standard, implemented through robots.txt files, is attributed to Martijn Koster, a software engineer credited with creating the Internet's first search engine. Webmasters use these files to keep participating web spiders from accessing parts of a website, or the entire site. They are often used in conjunction with sitemaps, which serve the opposite purpose: directing spiders to the parts of a site that should be crawled.

Because a robots.txt file contains instructions for web spiders, its directives must follow a specific format so that any crawler encountering the file can understand them. The Robots.txt Validator tool lets you verify that your instructions will be understandable to the spiders that read them; any errors and warnings are reported along with your results.
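
For reference, the expected format is plain text: each record names one or more user agents, followed by Disallow and Allow rules, and a Sitemap line may also be included. The paths and sitemap URL below are hypothetical, illustrative values:

```
# Hypothetical example: block all spiders from /private/, allow everything else
User-agent: *
Disallow: /private/
Allow: /

# A sitemap line points spiders toward the pages that should be crawled
Sitemap: https://www.example.com/sitemap.xml
```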

To use: enter the full URL of your robots.txt file into the field to test its validity. This tool can also be used to test development copies of your file before they go live.
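
If you want to script a quick check alongside the tool, Python's standard-library urllib.robotparser can fetch and parse a robots.txt file. A minimal sketch follows; the URLs are placeholders, and note that this module only answers can-this-page-be-fetched questions rather than reporting the detailed errors and warnings the validator provides:

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL: point this at your live or development robots.txt file
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the file

# Ask whether a given spider ("*" = any user agent) may fetch a given page
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))
print(parser.can_fetch("*", "https://www.example.com/index.html"))
```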

