Now Use Fetch As Googlebot To Render Webpages As Well


Fetch And Render As Googlebot
Google Webmaster Tools has a great feature, Fetch as Google, which lets webmasters see the results of Googlebot attempting to fetch their webpages. The tool returns useful information such as server headers and the page's HTML, which can help diagnose technical problems. It was already a great tool - and now Google has made it even better: you can see how Googlebot renders the webpage, rather than just fetching it.
Previously, the Fetch as Googlebot feature output raw code and HTML. That served programmers well, but it wasn't very helpful to those who are new to web programming. Now they can see a visual representation of what Googlebot sees.

In addition, it shows errors for any resources Googlebot could not access, which would prevent them from being fetched and rendered on the page. Here is the new “fetch and render” button on the Fetch as Google feature:

Fetch as Googlebot

Page Rendering

To render the page, Googlebot first tries to find all the external style and script files associated with the webpage. These files - including, but not limited to, CSS, JavaScript, and image files - are fetched if they are found and not blocked by robots.txt. They are then used to render a preview image that shows Googlebot's view of the page.

After you submit a page to Fetch and Render, it takes a few minutes and then shows a completion status. The status may show “partial” if some resources are blocked from Googlebot’s crawl. Either way, clicking on it will show you the rendering results.

Blocked Resources

External files that can easily be found and accessed are used as-is. But what about those that are blocked, whether by external servers or by your own robots.txt? In such cases, Googlebot obeys the robots.txt directives for those files and lists the access errors below the preview image of the rendered page.
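You can check this same robots.txt logic locally before fetching. As a minimal sketch (the rules and URLs below are hypothetical, not taken from any real site), Python's standard `urllib.robotparser` module evaluates Allow/Disallow directives against a given user agent:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration: scripts are blocked
# for Googlebot, while stylesheets are allowed.
robots_txt = """\
User-agent: Googlebot
Disallow: /assets/js/
Allow: /assets/css/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A stylesheet Googlebot may fetch and use when rendering the page:
print(parser.can_fetch("Googlebot", "https://example.com/assets/css/style.css"))  # True

# A script Googlebot must skip, which can degrade the rendered preview:
print(parser.can_fetch("Googlebot", "https://example.com/assets/js/app.js"))  # False
```

Running a quick check like this against your own robots.txt rules can tell you in advance which embedded resources will show up as blocked in the Fetch and Render report.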

For the sake of a more complete crawl, it is recommended that you make all such embedded resources accessible to Googlebot, provided the resource adds meaning to the page. Resources such as website-analytics scripts or social-media buttons don't contribute meaningfully, and hence can be omitted.
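In robots.txt terms, that might look like the following sketch (all paths here are hypothetical): unblock the CSS and JavaScript that shape the page, while leaving a purely analytics script disallowed:

```
User-agent: Googlebot
# Let Googlebot fetch the resources that affect rendering:
Allow: /assets/css/
Allow: /assets/js/
# An analytics script adds nothing to the rendered page, so it can stay blocked
# (the longer, more specific rule wins under Google's matching):
Disallow: /assets/js/analytics.js
```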

Google recently announced that more JavaScript debugging tools were coming to Google Webmaster Tools - this is the tool Google was talking about. It now clearly shows which resources are being blocked, such as JavaScript, CSS and so forth.

