The perfect algorithm – how would yours work?

Search engine ranking algorithms are a mysterious thing. Very few people on earth have access to their exact blueprints, yet for those of us who think we have cracked it, it all seems relatively simple. Put enough of the right things in place, in the right combination, and presto, you are in. Right? Simple, huh? In reality, of course, hardly.

Work at the coalface shows that the safe's combination gets harder to crack as more people try to open it for their target terms. It just doesn't do any more to think of one's documents in simple terms of structure, word counts and numbers. As document numbers increase, some keywords take on an almost esoteric level of attainment. The access parameters are ratcheted up to the point of 'hey, if you want to score here, you gotta be doing real good'. So, what's a man to do then?

Techno crackhead SEOs on observation acid

SEO minded people who think about this sort of stuff might well share some of my musings, specifically in terms of thinking like a search engine algorithm. The theory being, of course, that successfully understanding anything makes it a whole lot easier to apply what we have learned in attacking it – hardly rocket science there.

Too many people, I think, tend to approach SEO from a rigid bits-and-bytes angle. They forget that at their very core, search algos are built by ordinary, thinking human beings, subject to the same influences as the rest of us. They are people who visit the same kinds of conferences and interact with the same kinds of people via forums and blogs and pubs and restaurants. The only difference between them and us – and let's make no mistake about it, it is very much them and us – is that they hold the keys and are in a state of continual defence and counter-offence.

Observation observation observation

If you look at the sites that perform consistently well today, even amongst the more competitive SERPs, there are a number of observable constants.

It seems almost obvious to say, but I'll say it nonetheless: most good sites with good competitive rankings are relatively well balanced and have the right combination of the required signals to rank.

Really Rob? No shit, Sherlock. Well, yeah, but it doesn't hurt to say these things out loud now, does it?

Content content content

On the content side, it's pretty safe to say that a site has to have the right level of keywords, spread about in the right kind of way. In the overwhelming majority of cases, pages that rank for keywords actually have them on the page.

Trust me baby and I’m popular too

On the trust side, a site needs the right level of authority in its field, with the right kinds of people linking in, in the right kinds of way.

On the social side, it's not a bad thing to hope that the site is discussed often enough in the right web social circles.

Do people hang at your party?

From the visitor perspective, we know that search engines can deduce a hell of a lot from the actions of people who are either logged in or have a toolbar installed. Toolbar data is a great way of obtaining the vital user behaviour data that indicates the right positive or negative feedback signals.

If you can objectively measure how people behave 'on site', then over time, with sufficient data, some excellent assumptions can be made.

If questions like 'Once on a site, how long do visitors stick around?' or 'Are they off in a heartbeat, flicking back to the SERP for a better result?' can be answered, then asking 'Is this a common phenomenon?' and 'How many different people, in different parts of the planet, engage in such behaviour patterns?' really does help to make assumptions. These would be the kinds of signals that could be folded in and added to a site's overall ability to rank.
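To make that a little more concrete, the 'back in a heartbeat' signal could in principle be boiled down to something very simple. The sketch below is purely hypothetical – the function name and the ten-second threshold are my own illustrative assumptions, not anything a search engine has confirmed – but it shows the kind of aggregate a toolbar dataset would make trivial to compute.

```php
<?php
// Hypothetical sketch: given a list of dwell times (seconds spent on a
// page before the visitor returned to the SERP), compute the proportion
// of 'short clicks'. The 10-second cutoff is an assumed threshold.
function shortClickRate(array $dwellTimesInSeconds, int $threshold = 10): float
{
    if (count($dwellTimesInSeconds) === 0) {
        return 0.0;
    }
    $short = 0;
    foreach ($dwellTimesInSeconds as $seconds) {
        if ($seconds < $threshold) {
            $short++; // visitor bounced straight back to the results page
        }
    }
    return $short / count($dwellTimesInSeconds);
}
```

A high rate across many different visitors in many different places would be exactly the kind of negative feedback signal described above.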

We don’t like SEOs; we don’t want or need their sphere of influence

For the search engines, an SEO's ability to influence these latter signals is next to zero. As a result, this information could well outweigh many of the other established or accepted signals that many assume to be weightier.

For me, this should be the holy grail of a search engineer's work: creating an algo that is next to unmanipulable, at least by the direct actions of search engine marketers.

Other contributions, of course, are things like 'user personalisation', often talked about as the next big SEO challenge, with algos tailored towards surf history, age and user behaviour; almost dictating that the days of the universal SERP are on their way out.

SEO on its deathbed?

Absolutely not! Good SEOs who appreciate the ever-shifting sands already have an excellent take on all of the factors required to rank. Even with the private data mining capabilities mentioned, the search engines still require good, well-structured sites made and promoted by people with a good understanding of what creates and sustains buzz and interest in this Internet world – that demand isn't going to go away anytime soon.


It's no surprise that the big search players all make a big play on the benefits of membership to their little cookie clubs and whatnot, and maybe a day will even come where they are arrogant enough to make you play their game or go off and find something else to search with, who knows.
They can hardly be blamed, mind, cos after all, it all helps in the quest for the perfect algo, right?

Using search engine query strings to optimise your content.

First off, a word of caution: this method could get you into trouble, so watch yourself. A competitor could scream 'cheater!'. The fact is, it's not cheating and it's not cloaking either; it's using the referer string in combination with the query string to deliver content.

Many long tail searches land on pages that don't really cut the mustard for the query. This is lose-lose. You lose by outputting a page that isn't really relevant to what the user was looking for, and the user loses by having to hit the back button.

Let's say, for example's sake, that you have a high authority page that ranks for practically everything. You might have a sentence within your copy that matches what a user has entered into a search engine, yet your page isn't really about what that sentence refers to. The sentence just happens to sit amongst the context of three or four hundred other words, but doesn't really apply to what the user is after – net result: one disappointed user and a few KB of wasted bandwidth.

You could therefore offer these users an option by way of an optimised representation that caters for this deficit. I'll keep it simple here and assume that you have a site or a blog that concentrates on a particular set of core products. Let's assume you run a small niche power tool website. You sell things like drills, sanders, planes and other related items. You blog daily about various products and methods, and talk at length about all sorts of aspects of DIY or general maintenance.

The code below looks at the query string entered by the user at the referring search engine. It then checks that string against a selection of predefined words and delivers a message based upon those words:

“Welcome visitor from [referring search engine]. Your query contained the word [predefined word]. A page containing [predefined word] products from [query string] can be found here > [link to wherever]”

$queryurl = $_SERVER['HTTP_REFERER'];
$refer = parse_url($queryurl);
$refer = $refer['host'];

if (strstr($refer, 'yahoo')) {

    // Yahoo carries the search terms in the 'p' parameter of the referer
    preg_match('/[?&]p=([^&]+)/', $queryurl, $match);

    $qstring = str_replace('+', ' ', $match[1]);

    if (stristr($qstring, 'drill')) {
        $optilink = "";
        $message = "Welcome visitor from $refer, your query contained the word drill. A page containing drill related products can be found at this link: <a href=\"$optilink\">$qstring</a>";
    }

    if (stristr($qstring, 'planes')) {
        $optilink = "";
        $message = "Welcome visitor from $refer, your query contained the word planes. A page containing power plane products can be found at this link: <a href=\"$optilink\">$qstring</a>";
    }

    if (stristr($qstring, 'sanders')) {
        $optilink = "";
        $message = "Welcome visitor from $refer, your query contained the word sanders. A page containing power sander products can be found at this link: <a href=\"$optilink\">$qstring</a>";
    }
}

By adopting this approach you could deliver the message via a floating layer or a pop-up window on exit. You could even output it at the start of the content, placed within a little paragraph.

//continue with the rest of the content
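As a side note, the same referer parsing could be generalised beyond a single engine. The sketch below is hypothetical – the article only demonstrates Yahoo – and assumes the query parameter names in use at the time (Yahoo's 'p', Google's and Live's 'q'); `parse_str` also takes care of the '+' and URL decoding for us.

```php
<?php
// Hypothetical generalisation: map each engine's host to the parameter
// that carries the search terms, then pull the decoded query out of the
// referer. Returns null when the referer is not a recognised SERP.
function extractSearchQuery(string $referer): ?string
{
    $params = ['google' => 'q', 'yahoo' => 'p', 'live' => 'q'];
    $parts = parse_url($referer);
    if (empty($parts['host']) || empty($parts['query'])) {
        return null;
    }
    foreach ($params as $engine => $param) {
        if (strstr($parts['host'], $engine)) {
            parse_str($parts['query'], $query); // decodes '+' and %xx
            return isset($query[$param]) ? $query[$param] : null;
        }
    }
    return null;
}
```

With something like this in place, the keyword checks and messages only need writing once, whichever engine sent the visitor.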

The search engines would rather decide for themselves which pages to return, based upon their own calculations of relevancy.

The fact is that sometimes they do a pretty crap job of it and could do with a little help. Besides, we should be able to decide what we do with our visitors; it's not for the search engines to dictate to us. My view is that, provided it's related and adds value for the user, there is no real harm in giving them that little bit more. It's not a cut-and-dried case of smoke-and-mirrors cloaking with sneaky redirects or any of that stuff; it's just taking things one step further and deciding to help out a little.

If you consider that some websites have pages that change daily, if not hourly, then the reasons to employ such methods become even more apparent. How many times have you visited a page only to find that what you were looking for was not there? I have, and in those cases I often had to go to the search engine's cache to see what it was, or embark on a site search at my destination to find what I was after. The method proposed here would reduce instances of those scenarios.
