Advanced Robots TXT Generator
See how to use the Advanced Robots TXT Generator
With this tool you can compare how your site currently handles search robots with how it would handle them under the new robots.txt file generated by this tool. Simply paste or type your URL into the box and click “Compare.”
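As a rough sketch of the kind of check such a comparison performs, Python’s standard-library robots.txt parser can evaluate whether a given URL would be crawlable under a proposed file. The file contents, user agent, and URLs below are illustrative assumptions, not the tool’s actual logic:

```python
# Minimal sketch: evaluate a proposed robots.txt with Python's stdlib.
from urllib.robotparser import RobotFileParser

# A hypothetical proposed robots.txt (paths are placeholders).
proposed = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(proposed.splitlines())

# Check how the proposed directives would treat specific URLs.
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
```

Running the same checks against the live robots.txt and the proposed one side by side is, in essence, what the comparison shows.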
The first thing any search engine does when crawling a site is look for a robots.txt file on the root domain. This plain-text file directs search engine crawlers toward or away from certain areas of a website. If the file exists, the crawler reads its list of directives to see which files and directories, if any, are blocked from crawling. You can create this file with this generator.
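A robots.txt file is plain text. A minimal example (the blocked paths here are illustrative) looks like:

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /
```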
To upload an existing file for editing within the tool, paste or type the URL of your root domain into the text box and click “Upload.” Use our tool to create either Disallow or Allow directives for specified site content: click the drop-down and select “Allow” or “Disallow.” When selecting user agents, “*” includes all user agents, or you can select just one. Clicking “Add Directive” adds a new directive to the list; clicking “Remove Directive” removes a directive so you can create a new one in its place.
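For example, selecting the “*” user agent with a Disallow directive for a hypothetical /private/ directory would produce:

```
User-agent: *
Disallow: /private/
```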
To exclude specific search engines from crawling content on your site, specify which crawler each directive addresses. To create alternative directives for a single search engine crawler, click the User Agent box and select the bot you are directing. Clicking “Add Directive” adds the new custom section alongside the generic directives. To let that bot crawl content the generic rules block, create an Allow directive for the specific user agent and remove the matching Disallow.
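For instance, a custom section that permits Googlebot to crawl a directory the generic rules block (the directory name is a placeholder) might look like:

```
User-agent: *
Disallow: /reports/

User-agent: Googlebot
Allow: /reports/
```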
If you have an XML sitemap, a file that helps web crawlers fully understand a website’s structure, it can be added to your robots.txt file. To add it, enter the URL of the sitemap file into the XML Sitemap box and click “Update.”
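The sitemap appears in robots.txt as a single line referencing the sitemap’s absolute URL (the URL below is a placeholder):

```
Sitemap: https://www.example.com/sitemap.xml
```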
Once you are done creating or editing your robots.txt file, you can export the completed file by clicking the “Export” button. Then upload your new robots.txt file to your root domain through an FTP client.
The tool was edited by Ann Smarty.