How To Test Your Robots.txt File In Google Webmaster Tools?

Test your robots.txt file
A robots.txt file can be important for a sufficiently large website that wants to keep some of its pages away from search engines. Most smaller websites won't even need this file, but for others it can sometimes be difficult to create and maintain a correct robots.txt file, since finding the directives within a large robots.txt that block individual URLs can be tricky. For that, you need a debugging or testing tool to check for problems with your robots file. And as always, Google is there to help.
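For context, here is a minimal robots.txt sketch (all paths are hypothetical) showing the kind of directives that block individual URLs and that the tester helps you debug:

```
User-agent: *
Disallow: /private/               # block everything under /private/
Disallow: /drafts/post-1.html     # block a single page
Allow: /private/public-faq.html   # exception to the /private/ rule above

User-agent: Googlebot-Image
Disallow: /images/internal/       # apply only to Google's image crawler
```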
The Google Webmaster Tools dashboard now has an updated robots.txt testing tool that lets webmasters find problems with their robots.txt files!

Robots.txt testing tool

You can see that this tool is available under the Crawl Section in Google Webmaster Tools.

robots.txt tester

In this sub-section, you can see the current robots.txt file associated with your website, and you can test new URLs to see whether they're allowed or disallowed for crawling. Under your file, you can see a text field where you enter the URL in question.


Finalize your robots.txt file in the editor, then test a URL to see if you've got the directive right and that it works properly. Your edits take effect in the tester immediately, so you can add more directives on the spot and test them. Of course, you will still have to copy and paste these new directives to your production site's hosting server yourself, since Google Webmaster Tools has no way of updating that file automatically.
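If you'd like a quick sanity check outside the tool, a short script can mimic the same allowed/disallowed test against a local copy of your edited file. This is only a rough sketch using Python's standard urllib.robotparser (the file name and URLs are placeholders, and this parser does not implement Google's wildcard extensions):

```python
from urllib import robotparser

# Parse a local copy of the robots.txt you are editing
# ("robots_draft.txt" is a hypothetical file name).
parser = robotparser.RobotFileParser()
with open("robots_draft.txt") as f:
    parser.parse(f.read().splitlines())

# Check a few URLs the way the tester does: allowed or disallowed for a given user agent.
for url in ["https://example.com/private/report.html",
            "https://example.com/blog/hello-world.html"]:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "disallowed"
    print(url, "->", verdict)
```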

If the URL actually gets blocked, the tool will guide you through complicated directives, and will highlight the specific one that led to the decision to block that URL. This way, you can really check whether a certain rule you set up works or not.
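The matching logic behind that highlight is, per Google's documented rules, essentially "the most specific (longest) matching path wins". A simplified sketch of that idea, assuming a made-up rule list with no wildcards:

```python
# Simplified illustration of "longest matching rule wins".
# Real robots.txt matching also handles * and $ wildcards and user-agent groups.
rules = [
    ("Disallow", "/private/"),
    ("Allow", "/private/public-faq.html"),
    ("Disallow", "/drafts/"),
]

def matching_rule(path):
    # Keep the longest rule path that is a prefix of the requested path.
    best = None
    for kind, rule_path in rules:
        if path.startswith(rule_path):
            if best is None or len(rule_path) > len(best[1]):
                best = (kind, rule_path)
    return best  # None means no rule matched, i.e. the URL is allowed

print(matching_rule("/private/public-faq.html"))  # ('Allow', '/private/public-faq.html')
print(matching_rule("/private/report.html"))      # ('Disallow', '/private/')
```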

Additionally, you'll be able to review older versions of your robots.txt file, and see when access issues block Googlebot from crawling. For example, if Googlebot sees a 500 server error for the robots.txt file, Google will generally pause further crawling of the website.
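To catch that situation early, you can simply check what status code your robots.txt returns. A minimal sketch with Python's standard library (the domain is a placeholder):

```python
from urllib.request import urlopen
from urllib.error import HTTPError

# A 200 means the file was served fine; a 5xx response on robots.txt
# is the kind of error that can make Google pause crawling.
try:
    status = urlopen("https://example.com/robots.txt").getcode()
except HTTPError as err:
    status = err.code
print("robots.txt returned HTTP", status)
```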

You can also combine it with other parts of Webmaster Tools: for example, you might use the updated Fetch as Google tool to render important pages on your website. If any blocked URLs are reported, you can use this robots.txt tester to find the directive that's blocking them, and, of course, then improve that. A common problem comes from old robots.txt files that block CSS, JavaScript, or mobile content — fixing that is often trivial once you've seen it.
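As an illustration of that common fix, an old file might contain something like the first group below, and the correction is simply removing or overriding those directives (paths are hypothetical):

```
# Old, problematic rules that hide page resources from crawlers:
User-agent: *
Disallow: /css/
Disallow: /js/

# Fixed version: let crawlers fetch the resources needed to render pages.
User-agent: *
Allow: /css/
Allow: /js/
```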

Do give this tool a go, and tell us what you think in the comments section below. Cheers :)

