We may be mixing metaphors here, but both are illustrative of what is going on in Search Marketing Land at the moment. There is intense and heated debate about Google’s strong suggestion that paid links should carry a no-follow attribute.
In other words, if you pay another website to link to your own, that link should carry a nofollow attribute (`rel="nofollow"`). Such links would then have no effect on how Google ranks web pages for keyword queries. Jennifer Laycock has described this as idiocy. Google has even suggested that the Federal Trade Commission (FTC) should rule on how such links are designated, since to Google they are clearly advertising. Dan Thies has somewhat archly welcomed the FTC's intervention, should it happen. Given the huge economic implications of any decision, such powerful advocacy is not at all unexpected.
Standing back from the fray and taking a longer-term perspective, it is not surprising to see what is happening. When Sergey Brin and Lawrence Page were working on Backrub, which became Google, their fundamental paper, The Anatomy of a Large-Scale Hypertextual Web Search Engine, clearly presaged the present conflict. The following quote is the key:
Another big difference between the web and traditional well-controlled collections is that there is virtually no control over what people can put on the web. Couple this flexibility to publish anything with the enormous influence of search engines to route traffic and companies which are deliberately manipulating search engines for profit become a serious problem. This problem has not been addressed in traditional closed information retrieval systems. Also, it is interesting to note that metadata efforts have largely failed with web search engines, because any text on the page which is not directly represented to the user is abused to manipulate search engines. There are even numerous companies which specialize in manipulating search engines for profit.
The solution they suggested was to use the set of hyperlinks from other web pages pointing towards any given web page as a measure of that page's importance. It is similar to the way academic papers are often evaluated by the number of citations they receive in other academic papers. Their own Backrub paper has a long list of such citations. Perhaps not surprisingly, it does not include a citation of a paper by the father of the Web, Sir Tim Berners-Lee, which had appeared a few months prior to their own. (Thanks to Ruud Hein for this reference.) Entitled Links and Law, its Abstract includes the following:
Normal hypertext links do not of themselves imply that the document linked to is part of, is endorsed by, or endorses, or has related ownership or distribution terms as the document linked from.
Brin and Page were taking a diametrically opposed position.
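The link-as-citation idea that Brin and Page proposed can be sketched as a simplified PageRank iteration. The toy three-page web below is purely hypothetical, and the real algorithm adds many refinements, but the core loop shows how inbound links translate into a score:

```python
# A minimal sketch of the PageRank idea: a page's importance is the
# sum of shares of importance passed along by the pages linking to it,
# damped so that the scores converge. Toy example, not Google's code.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score (the damping term)...
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        # ...and distributes the rest of its score evenly across its links.
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical toy web: A and C both link to B; B links to C.
toy_web = {"A": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(toy_web)
# B, with two inbound links, ends up with the highest score.
```

This is precisely why paid links matter: under a scheme like this, every purchased inbound link directly inflates the target page's score, which is the manipulation the nofollow attribute is meant to neutralize.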
That was all in 1997. Now, in 2007, we see how the world has evolved. Google is enormously successful as a search engine. It puts a high value on hyperlinks to web pages, and everyone reacts accordingly. With only a minor change, their Backrub quotation still makes sense:
Also, it is interesting to note that using backlinks as a measure of authority has largely failed with web search engines, because such backlinks can be abused to manipulate search engines. There are even numerous companies which specialize in manipulating search engines for profit.
What is the appropriate solution for search engines now? Jason Calacanis, with his Mahalo, is suggesting that algorithms will not provide the answer and that human judgment must be involved. On the other hand, Google has enormous resources and collects massive amounts of data about actual keyword queries and how searchers react to the answers they receive. As a current thread in the Cre8asite Forums discusses, perhaps Google will ease back on its attempts to remove “spammy” hyperlinks and begin to mine the rich performance data it has on the actual keyword search process. It can’t happen soon enough.