An interesting discussion on backlinks and Google. The core of Google’s algorithm is well known: despite the billions of dollars of revenue and R&D money poured into fine-tuning the search engine, it is still built on the fundamental principle of valuing backlinks.
It’s based on the insight the founders and supervising professor had on how academic papers are valued. In fact, it’s remarkably similar. Academic papers are valued on two dominant criteria:
1. The quality of the peer-reviewed journal the article is published in.
2. How many other academics cite your work.
Does this look familiar? Substitute the first with “PageRank” and the second with “backlinks” and there you have it.
Of course, it’s not built on this principle alone. Over the years, Google has fine-tuned it to prevent spammers and marketers from getting the best of them. However, one fact remains undeniable: even in the year 2007, trading quality links is internet gold.
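The link-valuing principle can be sketched as a toy power-iteration PageRank. The 0.85 damping factor comes from the original Brin/Page paper; everything else here (the graph, the iteration count) is illustrative, not Google’s production algorithm.

```python
# Toy PageRank by power iteration. A page's rank is (1 - d)/n plus
# a damped share of the rank of every page that links to it -- i.e.
# backlinks from well-ranked pages are what raise your own rank.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new_rank[target] += share
        # pages with no outlinks spread their rank evenly
        dangling = damping * sum(rank[p] for p in pages if not links.get(p))
        for p in pages:
            new_rank[p] += dangling / n
        rank = new_rank
    return rank

# "c" has two backlinks, "a" has one (from the well-ranked "c"),
# "b" has none -- so the ranking comes out c > a > b.
graph = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The damping term is what keeps spammers from trivially gaming pure link-counting: rank has to flow in from somewhere, it isn’t conjured by adding outlinks.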
Why did weblogs take off after the terrorist attacks? The traditional folklore goes that the mourning following 9/11 facilitated a national dialogue. People pouring out their hearts and mourning through their blogs created a momentum that brought blogs into the limelight.
Although that story isn’t far from the truth as a catalyst, the fact is blogs do one thing well and naturally: Search Engine Optimization.
A good blog template gives you valid HTML markup free of charge and without maintenance (as long as you don’t wreck it with amateur customizations), plus a system of automatic backlinks through trackbacks.
Trackbacks were revolutionary at the time. By sending pings, you could automatically get linked on any blog with trackbacks enabled. They were originally meant to facilitate dialogue, and the specification was built on trust and etiquette (something that haunts it to this day).
Before that you basically had to exchange emails and manually add links to your site with an editor. Now it could be done automatically and managed just as easily.
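For the curious, a trackback ping is just an HTTP POST of a few form-encoded fields (`url`, `title`, `excerpt`, `blog_name`) to the target post’s trackback URL, per the Six Apart TrackBack specification. Here is a minimal sketch of building such a ping with Python’s standard library; the endpoint and blog names are hypothetical.

```python
# Sketch of a TrackBack ping: an HTTP POST of form-encoded fields
# to the receiving post's trackback URL (Six Apart TrackBack spec).
from urllib.parse import urlencode
from urllib.request import Request

def build_trackback_ping(trackback_url, post_url, title, excerpt, blog_name):
    """Build the POST request for a trackback ping (not yet sent)."""
    body = urlencode({
        "url": post_url,        # required: permalink of the post doing the pinging
        "title": title,         # the rest are optional metadata
        "excerpt": excerpt,
        "blog_name": blog_name,
    }).encode("utf-8")
    return Request(
        trackback_url,
        data=body,
        headers={"Content-Type":
                 "application/x-www-form-urlencoded; charset=utf-8"},
    )

# Hypothetical endpoint and blog, for illustration only.
req = build_trackback_ping(
    "http://example.com/trackback/42",
    "http://myblog.example/post-1",
    "My reply", "A short excerpt...", "My Blog",
)
# Sending it with urllib.request.urlopen(req) would return a small XML
# body; per the spec, <response><error>0</error></response> means success.
```

Notice there is no authentication anywhere in the exchange: anyone who can POST can claim a link, which is exactly the trust-based design that spammers later exploited.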
Of course, spammers caught on and had a field day. That’s why we have blacklists and statistical filtering like Akismet.
Google is aware of this too and is constantly experimenting with innovative ways to prevent people from gaming the search results.
However, as we can see, it still isn’t a level playing field. SEO of both shady and legitimate varieties is thriving more than ever. My guess is that Google still uses an algorithm similar to the one it licenses from Stanford, but applies a variety of statistical filters as weighting parameters to decide a given website’s standing.
Just a little initial investment in building links with sites that share similar content, or with friends, can go a long way toward helping more people find you. As SEO grows into a larger cottage industry, there will be ever smarter ways to turn Google’s patented system to a marketer’s advantage by building quality links that add no value for the average surfer.
The thing is, no amount of smart filtering or statistical weeding can really change that without changing the fundamental model and abandoning links as the fundamental currency.
I think services like StumbleUpon are cleverly positioning themselves to avoid these limitations and provide surfers with more tailored content. The future of search may lie in more individually calibrated results that can smartly avoid marketing-optimized content. Even then, though, you’ll never eradicate clever marketers: they’re the grease that keeps economies turning smoothly, even if their excesses draw occasional scorn.