In the “Web 2.0” world we live in, companies are inclined to create “snazzy” new AJAX experiences for users. I have been calling this the AJAX Dilemma, because organizations, companies, and website owners are creating sites that are neither accessible nor SEO friendly. By not planning SEO into the lifecycle of the product, they risk crippling their businesses in the area of search traffic. Ensuring there is a non-JavaScript experience for search engines and accessibility is not a new practice; building sites with “Progressive Enhancement” and “Unobtrusive JavaScript” has been around for quite some time. By writing web-standard code and using the methods above, you are not sacrificing SEO for user experience.
This is important to creating search-friendly websites because ensuring search engines are able to crawl and index your content is necessary to acquiring search traffic and market share. By not embracing these approaches, you handicap your SEO efforts and leave yourself at a competitive disadvantage. In fact, this should not even be decided on a project-by-project basis; it should be built into code that takes advantage of web standards and progressive enhancement.
The AJAX Dilemma for SEO
Show-and-hide interactions used in content areas throughout a site, like tabs or accordions, can have a crippling effect on the amount of content crawled by search engines if they are not built correctly. This content is typically built with poorly written JavaScript that does not take advantage of web-standard code. The hidden content areas, when clicked, append a fragment to the URL, like #somehiddencontent, and search engines traditionally ignore everything after the “#” (hash).
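To see why that matters, here is a minimal sketch of how a fragment-ignoring crawler sees such URLs. The `crawlerView` helper and the example.com URLs are made up for illustration; the point is that every hash-based tab collapses to the same crawlable address:

```javascript
// Sketch: a crawler that ignores URL fragments sees the same page for
// every tab, so content reachable only via "#..." is never indexed.
// (Hypothetical helper; example.com URLs are invented for illustration.)
function crawlerView(url) {
  // Strip the fragment, just as a fragment-ignoring crawler would.
  return url.split('#')[0];
}

console.log(crawlerView('http://example.com/recipes#reviews'));
console.log(crawlerView('http://example.com/recipes#ingredients'));
// Both print the same URL: the crawler never requests a second page.
```

In other words, no matter how many tabs of content live behind those fragments, a fragment-ignoring spider only ever fetches one URL.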
Here are some examples of these interactions that are poorly written and do not fail gracefully when JavaScript is turned off.
Here is an example of tabs used on iFoods.tv with JavaScript on:
Here is that same interaction, with JavaScript turned off:
Notice something different? Of course you do; it is blatantly obvious that the entire content within the tabs is completely missing. This is a large missed opportunity for ifoodstv.com, not only to get that content indexed, but to earn great internal links to deep content.
Now, taking a look at the way that iMedix does tabs is a little different, here it is with JavaScript on:
Here it is with JavaScript Off, on a different tab:
Do you notice a difference here at all? No?! That's right, because there isn't one.
Also, to add to that, iMedix is rewriting URLs so that there are no hash tags in the URLs:
This is a fantastic job of ensuring that there is not only crawlable content, but crawlable URLs. This is very important: search engines tend to ignore everything after the hash (#) in a URL, which matters especially with AJAX, since that is typically how its URLs are created. Jeremy Keith (JavaScript expert extraordinaire) has described some ways to get around these AJAX issues, including hash tags, in an approach he calls Hijax.
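The core Hijax idea can be sketched like this (hedged: the `/recipes/...` URL scheme, the `?partial=1` convention, and the `loadInto` helper are all made up for illustration, not Jeremy Keith's actual code). The markup starts with real, crawlable URLs; JavaScript then intercepts clicks and fetches a lightweight partial view of the same resource:

```javascript
// Pure helper: derive a partial-content URL from the real link target,
// so the AJAX request hits the same resource the crawler sees.
// (The ?partial=1 convention is an assumption for this sketch.)
function partialUrl(href) {
  return href + (href.indexOf('?') === -1 ? '?partial=1' : '&partial=1');
}

console.log(partialUrl('/recipes/123/reviews'));
// -> '/recipes/123/reviews?partial=1'

// Browser-only wiring (shown as comments; will not run under Node):
// document.querySelectorAll('.tabs a').forEach(function (link) {
//   link.addEventListener('click', function (event) {
//     event.preventDefault();               // keep the user on the page
//     loadInto('#tab-panel', partialUrl(link.getAttribute('href')));
//   });
// });
//
// Without JavaScript (or for a search engine), the links simply navigate
// to full pages at those same real URLs, so nothing is lost.
```

Because every tab is backed by a real URL, there are no hash-only destinations for search engines to ignore.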
The Solution
The solution to creating AJAX that is SEO friendly is to build your site using Progressive Enhancement and Unobtrusive JavaScript. Along with that, URL issues that could lead to canonicalization and duplicate content problems can be avoided using methods like Hijax. In the extremely competitive markets out there on the web, this is very important. It is possible to create rich user experiences with AJAX that are great for SEO.
As an example of how to do this, you can build standard interactions on your site that are accessible and search friendly. (This is how we built any standard interaction at PayPal.) Any standard show/hide interaction can be built on standardized JavaScript APIs. Doing this creates non-JavaScript versions of interactions that are accessible and search friendly, and along with that it drastically reduces the amount of code you have to write and increases developer efficiency.
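A minimal sketch of what such a standardized show/hide API might look like (this is not PayPal's actual library; all names here are invented). The state logic is kept separate from the DOM so one small API can back tabs, accordions, and similar widgets:

```javascript
// Reusable show/hide state, independent of the DOM.
// (Hypothetical API for illustration; not a real library.)
function createToggleGroup(ids) {
  var visible = ids[0]; // first panel shown by default
  return {
    show: function (id) {
      if (ids.indexOf(id) !== -1) visible = id; // ignore unknown panels
      return visible;
    },
    isVisible: function (id) {
      return id === visible;
    }
  };
}

// In the browser, a thin layer hides/shows panels based on this state.
// Without JavaScript, all panels are simply rendered in the markup, so
// search engines and screen readers see every tab's content.
var tabs = createToggleGroup(['overview', 'reviews', 'photos']);
tabs.show('reviews');
console.log(tabs.isVisible('reviews')); // true
```

Writing the interaction once as a shared API, rather than ad hoc per widget, is what produces the code-reduction and consistency benefits described above.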
Resources to help with AJAX and SEO
Google Webmaster Central: A spiders view of Web 2.0
Dom Scripting: Hijax
Progressive Enhancement with AJAX
Follow me on Twitter for more info @tonyadam or subscribe to my feed to keep up to date!
There are 8 comments
Nice… I wrote about this awhile back on my personal blog here.
[…] 27th August – AJAX and Non-JavaScript Experiences for SEO friendly websites (planetc1) AJAX can result in SEOs literally pulling their hair out, with it often being implemented so badly that page content consists of nothing more than a few words in the navigation bar. This does not mean that all AJAX is bad though, and Tony Adam shows the difference between good implementation and the bad. If PayPal can get it right, I’m sure that the rest of us can! Direct Link: Tony Adam […]
Or.. they could just include a sitemap.xml file….
@popo that is completely untrue…sitemaps.xml only provides pages on a site, not the content within them.
I know a good AJAX framework to make it SEO friendly and has a demo on http://www.ajaxoptimize.com/
Hi Tony,
I may have missed something on the imedix.com example but it works because it is not doing any Ajax stuff with the tabs. The only Ajax stuff seems to be to reload the tab images and data. When you click on a tab it reloads the entire page, which is why the URL is correct in the address bar (btw I traced all this using wireshark).
The tab example seems complicated with Hijax. Taking the progressive enhancement approach:
i. The baseline code would have different versions of the page for each tab.
ii. You could use CSS/DOM to have a single page with different content shown depending on the tab that is clicked. From an SEO perspective you would have one big page with all the content in something like individual list elements. However, this would not be possible to hijaxify.
You would have to return to i) and rather than load the entire page reload just the tab data but I’m not sure this is possible either with the hijax approach.
In short, it seems like you would have to do this on the server side: either serve a CSS/DOM version of the page if the browser doesn't support JS, or an Ajax version of the page.
I’d be interested in comments though.
[…] be indexed or crawled by a “search engine spider.” There are tips and tricks on on how to build Ajax SEO Friendly websites out […]
[…] issues by developing Progressively Enhanced code. While, I’ve written about how to address crawling and indexing with AJAX and SEO. I wanted to take a little bit of a deep dive into this again, along with the bigger issue, […]