Web Standards And Sitemaps

Everyone was talking yesterday about the joint acceptance of Sitemaps by Google, Yahoo! and Microsoft Search. They are even jointly supporting a new Sitemaps website (http://www.sitemaps.org). It’s well worth a visit and clearly explains what Sitemaps are all about.

Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
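To make that concrete, here is a minimal sketch of what such a Sitemap file looks like, following the format described on sitemaps.org (the URL and metadata values below are made up for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element: the page's full URL -->
    <loc>http://www.example.com/</loc>
    <!-- the rest are optional hints for the crawler -->
    <lastmod>2006-11-16</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- one <url> entry per page you want crawled -->
</urlset>
```

Each page on the site gets its own `<url>` entry, and the file is usually placed at the root of the site so the search engines can fetch it directly.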

Sitemap 0.90 is offered under the terms of the Attribution-ShareAlike Creative Commons License and has wide adoption, including support from Google, Yahoo!, and Microsoft.

Once you have a sitemap file on your website, you can register it with each of the three search engines. This gives you some assurance that your website is adequately visible to them.

What is more significant is that the three have come together around the same approach. In a sense, they have created a de facto web standard for cataloguing web sites. We all benefit when some of the obvious mechanics of the Internet can be handled in a standard way. Perhaps it’s another signal of the new regime’s thinking at Microsoft. Would it have happened so quickly when Bill Gates was directing everything that Microsoft did?

It’s an awesome burden when you’re so big and have so much money that you can insist the game be played your way. Sometimes it creates enormous legacy problems when the world does not accept your view. With Internet Explorer version 7, Microsoft is now trying to work more closely with web standards, but prior versions have created a huge population of non-conforming web pages. The frustrations caused by this non-standard thinking are widespread.

Another, smaller example of Microsoft’s “Do It My Way” thinking is the favicons you may or may not see as you surf the Web. These are the small icons that appear in Favorites or Bookmark lists or in the address field of your browser. Microsoft invented these icons but has presumably by now given up on trying to make them work in a standard way. You can make favicons work in Firefox but not reliably in Internet Explorer. Again, it’s an obvious piece of Internet mechanics that suffers from a lack of standards.
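For what it’s worth, the standards-based way to get a favicon working is to declare it explicitly with a link element in the page’s head, rather than relying on the browser to request a file at a fixed location by convention. A sketch (the icon path here is hypothetical):

```html
<!-- In the <head> of the page: point browsers at the icon explicitly -->
<link rel="icon" href="/images/favicon.ico" type="image/x-icon" />
<!-- Internet Explorer historically looked for rel="shortcut icon" instead -->
<link rel="shortcut icon" href="/images/favicon.ico" type="image/x-icon" />
```

The need for two link elements with different `rel` values is itself a small illustration of the non-standard behaviour the paragraph above complains about.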

Using economic power to block basic mechanics from functioning in a common sense way is counter-productive and misguided. Hopefully this action on Sitemaps is just another important signal that we all win when everyone tries to make standards work.


4 thoughts on “Web Standards And Sitemaps”

  1. Well, is it true that a site map with too many links on it (I have heard 100
    is the limit) will get you stuck in the hell world of supplemental results? 🙁

    And you know the next step after that is banishment to the outer reaches of the internet universe ….

  2. Eddie, did you not read the above? “Web Standards And Sitemaps —
    Everyone was talking yesterday about the joint acceptance of Sitemaps”

  3. It is better to have a site map although the search spiders may well find all the web pages anyway. Now we have the advantage that the same sitemap XML file can be submitted to Google, Yahoo! and shortly to MSN/Live. Check Sitemaps, a website supported by all three, for more details.

    I do not believe the number of links on the site map will affect whether your website falls in the Google supplemental results. The quality of those links may certainly have an effect on whether this happens.

Comments are closed.