Jakob Nielsen on sitemaps in 2000

Today, while browsing an old book called “Designing Web Usability”, written by Jakob Nielsen back in 2000 (some 7 years ago), I found a reference to the idea of distributing site content to the search engines in the form of a sitemap. On page 238 there is a small subsection titled “Integrating Sites and Search Engines”, where he discusses the idea of integrating sites more closely with search engines. The obstacle to implementing it, as he noted at the time of writing, was agreeing on a standardized method for encoding the user’s query terms.

Today we know that the search engines can agree on a common way of crawling websites, but we also know that they use plenty of their own “meta-extensions” to crawling, like Google’s “nofollow” or Yahoo’s “robots-nocontent”, for example. While the user’s query terms are still far from being interpreted by the sites themselves (at the moment of writing it seems that only Yahoo still pays attention to the keywords meta tag), the search engines have already taken the first step toward better cooperation with website administrators and toward assuring better quality of search results.
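To make those “meta-extensions” concrete: Google’s hint is the rel="nofollow" attribute on a link, and Yahoo’s is the class="robots-nocontent" attribute on a block of content. Below is a minimal sketch, purely my own illustration and not any engine’s actual code, of how a crawler could honor the nofollow hint while extracting links from a page:

    # Sketch: extract links from HTML, skipping rel="nofollow" ones.
    # An illustration of the hint, not any search engine's real code.
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            # Google's extension: rel="nofollow" asks the crawler not to
            # give this link any weight (or not to follow it at all).
            if "nofollow" in (attrs.get("rel") or "").split():
                return
            if attrs.get("href"):
                self.links.append(attrs["href"])

    page = ('<a href="/about">About</a>'
            '<a href="http://ads.example.com" rel="nofollow">ad</a>')
    extractor = LinkExtractor()
    extractor.feed(page)
    print(extractor.links)  # ['/about'] -- the nofollow link was dropped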

Interestingly, the governments of Arizona, California, Utah and Virginia recently announced that they would use Sitemaps on their web sites. Of course it is something of a publicity stunt, benefiting Google (the creator of the Sitemap protocol) first of all, but at the same time it is quite a recognition of the Sitemap protocol. First Yahoo, then Microsoft and Ask.com recognized it, and now we have some “enterprises” from the public sector coming out in its support. Way to go, Jakob, I am looking forward to your next book =O)
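P.S. For anyone who has not seen the protocol yet: a Sitemap is just a small XML file listing a site’s URLs, optionally with last-modification dates and change frequencies, placed where the crawlers can fetch it. A quick sketch (the URLs and dates below are made up) that emits one in the sitemaps.org 0.9 format:

    # Sketch: generate a minimal Sitemap in the sitemaps.org 0.9 format.
    # The URLs and dates are placeholders for illustration only.
    import xml.etree.ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    ET.register_namespace("", NS)  # serialize it as the default xmlns

    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod, changefreq in [
        ("http://www.example.com/", "2007-06-01", "daily"),
        ("http://www.example.com/about", "2007-01-15", "monthly"),
    ]:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod        # optional
        ET.SubElement(url, "{%s}changefreq" % NS).text = changefreq  # optional

    # Print the document; in practice it would be saved as /sitemap.xml
    print(ET.tostring(urlset, encoding="unicode"))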
