Google Webmaster Tools Gets Updated Robots.txt Testing Tool

Google has released an updated robots.txt testing tool in Webmaster Tools. The tool can be found in the Crawl section.

The aim of the new version is to make it easier to create and maintain a “correct” robots.txt file, and to find the directives within a large file that are or were blocking individual URLs.

The tool shows the current robots.txt file and lets you test new URLs to see whether they’re disallowed for crawling, says Google’s Asaph Amon, describing the tool. “To guide your way through complicated directives, it will highlight the specific one that led to the final decision. You can make changes in the file and test those too, you’ll just need to upload the new version of the file to your server afterward to make the changes take effect. Our developer’s site has more about robots.txt directives and how the files are processed.”
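
To get a rough local feel for what the tester checks, here is a minimal sketch using Python’s standard urllib.robotparser. The rules and URLs are made-up examples, and Python applies rules in file order rather than Google’s longest-match precedence, so treat it only as an approximation of the tool’s verdicts.

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt rules, written as a list of lines.
    rules = [
        "User-agent: *",
        "Disallow: /private/",
        "Disallow: /tmp/",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # Ask the same question the tester answers: is this URL disallowed for crawling?
    for url in ("http://example.com/private/report.html",
                "http://example.com/blog/post.html"):
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "disallowed"
        print(url, "->", verdict)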

“Additionally, you’ll be able to review older versions of your robots.txt file, and see when access issues block us from crawling,” Amon explains. “For example, if Googlebot sees a 500 server error for the robots.txt file, we’ll generally pause further crawling of the website.”
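
That pause-on-error behavior can be sketched in a few lines. The policy below is an assumption for illustration, not Googlebot’s actual logic, and the URL is a placeholder: if fetching robots.txt returns a 5xx server error, the crawler treats the whole site as temporarily off-limits instead of assuming everything is crawlable.

    import urllib.error
    import urllib.request

    def may_crawl_site(robots_url):
        # Return False when robots.txt is unavailable in a way that should pause crawling.
        try:
            with urllib.request.urlopen(robots_url, timeout=10):
                return True        # robots.txt fetched; apply its rules as usual
        except urllib.error.HTTPError as err:
            # A 5xx on robots.txt means the rules are unknown: pause and retry later.
            # A 404, by contrast, simply means there are no restrictions.
            return err.code < 500
        except urllib.error.URLError:
            return False           # server unreachable: stay conservative

    if not may_crawl_site("http://example.com/robots.txt"):
        print("Pausing crawl of example.com until robots.txt can be fetched")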

Google recommends double-checking the robots.txt files for your existing sites for errors or warnings. It also suggests combining this tool with the recently updated Fetch as Google tool to render important pages, and then using the tester to find the directive that’s blocking any URLs reported as blocked.

Google says it often sees robots.txt files that block CSS, JavaScript, or mobile content, which can keep Googlebot from rendering pages properly. You can use the tool to spot and fix that if it’s a problem with your site.
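
If you want to check for this outside the tool, a small, assumption-laden sketch like the one below can flag asset URLs that a given set of rules would block; the rules, URLs, and user-agent string here are hypothetical.

    from urllib.robotparser import RobotFileParser

    # Made-up rules of the kind Google warns about: they block script and style assets.
    rules = [
        "User-agent: *",
        "Disallow: /assets/js/",
        "Disallow: /assets/css/",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    assets = [
        "http://example.com/assets/css/site.css",
        "http://example.com/assets/js/app.js",
        "http://example.com/images/logo.png",
    ]
    for url in assets:
        if not parser.can_fetch("Googlebot", url):
            print("Blocked asset (may hurt rendering):", url)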
