SEO | Web Analytics

New Google Recommendations for AJAX Websites

Google Webmaster Central recently announced that Google was changing its recommendations for AJAX web pages. What do you need to do to ensure that your AJAX pages are crawlable?

The good news is: nothing! Google’s bots are able to render JavaScript and understand web pages the way modern browsers do, including AJAX web pages. As long as you do not block Googlebot from crawling your CSS and JavaScript files, the crawlers will “see” the site correctly. To check this, you can use the Google Search Console “Fetch as Google” tool to “Fetch and Render” pages from your website.

Google Search Console Fetch and Render results comparison

When the rendering is complete, you can compare side-by-side images of how Googlebot saw your page versus how a visitor to your website would have seen it. Any blocked resources will be listed in a table below the rendered images.
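If the tool reports blocked CSS or JavaScript files, check your robots.txt. Here is a minimal sketch that explicitly allows every crawler to fetch stylesheets and scripts; the wildcard paths are illustrative and should be adapted to your own site’s structure:

```
# Illustrative robots.txt rules: allow all crawlers to fetch
# stylesheets and scripts anywhere on the site.
User-agent: *
Allow: /*.css$
Allow: /*.js$
```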

If your site uses the older AJAX crawling scheme (a.k.a. _escaped_fragment_ URLs), you do not need to make any changes. Your site will still be indexed correctly. The next time you redesign your site, you can eliminate the _escaped_fragment_ URLs.
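For reference, under that older scheme the crawler rewrites a hash-bang URL into a query-string request, which the server answers with a pre-rendered HTML snapshot. The example.com URLs below are purely illustrative:

```
Browser URL:    http://example.com/page#!products
Crawler fetch:  http://example.com/page?_escaped_fragment_=products
```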

Now the bad news: even though Googlebot can read your AJAX pages, Google Analytics cannot automatically track content that a page loads or changes via AJAX. This is a problem if your site uses AJAX to submit forms or to load new content in tabs or modals, for example. You will need to configure virtual pageviews or events to record that users have viewed new content or interacted with elements on the page. We recommend Google Tag Manager for this; a sketch follows.
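As a minimal sketch of the Google Tag Manager approach, the handler below pushes a custom event to the data layer whenever your AJAX code swaps in new content; a GTM trigger listening for that event can then fire a Google Analytics pageview or event tag. The names virtualPageview and virtualPagePath are our own conventions, not GTM built-ins, so mirror whatever names you choose in your GTM triggers and variables:

```typescript
// Minimal sketch, assuming the standard Google Tag Manager container
// snippet is already installed on the page (it defines window.dataLayer).
declare global {
  interface Window {
    dataLayer: Record<string, unknown>[];
  }
}

// Hypothetical helper: call this from your AJAX success handler
// whenever new content appears without a full page load.
function trackVirtualPageview(virtualPath: string): void {
  window.dataLayer = window.dataLayer || [];
  // "virtualPageview" and "virtualPagePath" are illustrative names;
  // configure a GTM Custom Event trigger on "virtualPageview" and a
  // Data Layer Variable for "virtualPagePath" to populate the GA tag.
  window.dataLayer.push({
    event: "virtualPageview",
    virtualPagePath: virtualPath,
  });
}

// Example: a visitor opens a "pricing" tab loaded via AJAX.
trackVirtualPageview("/virtual/tabs/pricing");

export {};
```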

Do you need help understanding whether your site is correctly crawled by Googlebots or correctly tracked in Google Analytics? Send me an email.