Search Engine Optimization November 30th, 2010
Google continuously works to enhance its algorithm to index content around the web. Since content delivered by Ajax doesn't generate distinct pages (URLs), it has traditionally been considered SEO-unfriendly. The problem is that more and more of the best content on the web is being generated by Ajax. Google now recognizes this and has finally announced how you can get your Ajax content indexed.
Developers use the hash symbol (#) in URLs to distinguish Ajax-loaded content from the rest of the website. This works well for users, but search engine spiders usually cannot interpret it. For Google to crawl your Ajax content, the hash (#) must be followed by an exclamation point (!), a combination known as the "hashbang" (#!).
To allow Google to index your Ajax content, you must use the hashbang in the URL. Google interprets these URLs in a unique manner: it takes everything after the hashbang and passes it to the site as a URL parameter instead. The name Google uses for that parameter is _escaped_fragment_.
Google will then rewrite the URL and request the content from that static URL. For example, a request for http://www.example.com/ajax.html#!key=value is rewritten as http://www.example.com/ajax.html?_escaped_fragment_=key=value.
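The rewrite rule can be sketched in a few lines of Python. This is only an illustration of the mapping Google applies; the example.com URL and the key=value fragment are illustrative placeholders, not addresses from our test.

```python
from urllib.parse import quote

def to_escaped_fragment(url):
    """Rewrite a hashbang (#!) URL into the _escaped_fragment_ form
    that Google's crawler requests (a sketch of the mapping)."""
    base, _, fragment = url.partition("#!")
    # Everything after #! becomes the _escaped_fragment_ query parameter,
    # percent-encoded so special characters survive the rewrite.
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="=/")

print(to_escaped_fragment("http://www.example.com/ajax.html#!key=value"))
# http://www.example.com/ajax.html?_escaped_fragment_=key=value
```

Note that the rewritten URL is only what the crawler requests behind the scenes; your visitors continue to see and share the hashbang version.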
Google will then list these indexed pages in the SERPs.
We at TechWyse are always researching and testing new tools, and we have successfully completed our own Google Ajax indexing test! The outcome below shows that our Ajax content is crawlable: a single-page Ajax website has been indexed as four separate pages in Google.
As this scheme is still in beta, there are a few things to keep in mind before you decide to build your entire website in Ajax:
1) Only Google's algorithm can understand the hashbang scheme and index the content as separate pages; other search engines currently do not support it.
2) Page load time increases because of the additional Ajax scripts.
3) Unique meta tags cannot be implemented for pages that contain Ajax content, so you will lose ranking weight in many other search engines.
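On the server side, the scheme also requires you to return a static HTML snapshot whenever a request arrives carrying the _escaped_fragment_ parameter, while ordinary visitors still get the Ajax page. Here is a minimal sketch of that dispatch logic; the SNAPSHOTS table and its contents are hypothetical stand-ins for pre-rendered pages, not part of our test site.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical pre-rendered HTML snapshots, keyed by fragment state.
# In a real deployment these would be generated by rendering each
# Ajax state of the page ahead of time.
SNAPSHOTS = {
    "key=value": "<html><body><h1>Snapshot for key=value</h1></body></html>",
}

def handle_request(url):
    """Return the HTML snapshot for an _escaped_fragment_ request,
    or None to signal that the normal Ajax page should be served."""
    query = parse_qs(urlparse(url).query)
    fragment = query.get("_escaped_fragment_")
    if fragment is None:
        return None  # ordinary visitor: serve the Ajax page as usual
    # Crawler request: serve the static snapshot for this state.
    return SNAPSHOTS.get(fragment[0], "<html><body>Not found</body></html>")
```

For example, `handle_request("http://www.example.com/ajax.html?_escaped_fragment_=key=value")` returns the snapshot markup, while the plain hashbang-free URL returns None and falls through to the normal Ajax page.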