Breaking News: Google Continues Cannibalising Search Results

So the recent change in how Google displays ads on its search results pages has already thrown up some interesting outcomes, with agencies that manage large accounts reporting a number of standouts.

An increase in CTR of 16% across SERPs should be pretty concerning to folks in the organic space, and frankly to advertisers as well. I’m not saying these results are instantly stealing 16% of traffic from organic results, but there’s certainly been a migration as a result of this change; however significant or insignificant is yet to be seen. – Aaron at EliteSem

That’s quite a big chunk, and it’s echoed by what icrossing saw too, with big increases in CTR for the new ad slot.

  • Positive click-through-rate impact for top positions (+5%) and PLA (+10%), as competition at the top right has been eliminated.

  • Negative click-through-rate impact for positions 5–7 (-8%) as they moved from top right to bottom of the page.

  • Negative impression impact for positions 8–10 (-69%) and click impact (-50%). However since this segment accounted for a very small percentage of impressions in the “before” period, their loss doesn’t represent a significant impact.

There’s no doubt a slew of similar reports across the web. Look at any account with a large enough dataset and you’ll likely see the same patterns.

But what does this really mean for organic? It’s pretty obvious what it means for PPC. In the short term, for competitive queries the new position-four ad slot seems to be doing a sterling job of stealing organic click share. If CTRs are up across ad slots, then it follows that available click share MUST be down for organic, even if we account for the loss of the side ads, right?
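
To put a number on that “must be down” logic, here’s a back-of-the-envelope sketch in Python. Every figure in it is a made-up placeholder rather than anything from the accounts quoted above; it simply shows that if paid CTR climbs while the rest of the SERP stays put, organic’s share of clicks has to shrink.

    # Illustrative arithmetic only -- every number below is a made-up placeholder.
    impressions = 100_000                      # searches for a competitive query
    ad_ctr_before, ad_ctr_after = 0.20, 0.23   # paid CTR rises by three points (hypothetical)
    other_features = 0.10                      # maps, knowledge graph etc., held constant

    organic_before = 1 - ad_ctr_before - other_features
    organic_after = 1 - ad_ctr_after - other_features

    print(f"Organic click share: {organic_before:.0%} -> {organic_after:.0%}")
    print(f"Clicks ceded to ads per 100k searches: {impressions * (ad_ctr_after - ad_ctr_before):,.0f}")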

I was talking with a client yesterday about conversion rates on site.

We had all been a little perplexed as to why conversion rates had dropped off of late, and we had tried a variety of things to identify and reverse the decline.

We looked at the usual suspects of onsite changes, page speed, competitor activity, sector innovations and so on, and were doing a degree of head scratching trying to establish what was going on. Most channel traffic was up, organic especially. The view was that maybe rankings had decreased for competitive head terms (nope), or that direct and referral traffic had increased due to PR activity and was impacting conversion rates due to lower buyer intent (a fact, but also nope).

The client noticed that the conversion problem had occurred around the 22nd of February, which, funnily enough, was around the time that Google rolled out its new land grab. Aha! The smoking gun.

What was really interesting (and surprising) was that the inclusion of this new ad spot appears to have impacted the click-through on high-converting pages for competitive search terms. Effectively, for every competitive position attained, visibility has dropped by the equivalent of at least one position.

Is it really the case that people collectively have jumped the shark and no longer care about ads in Google as they once did? Has Google created such a neat and compelling ad product that users are now more drawn to the ad than they would be to the organic result? Are the ads more relevant today even? Is all that SERP diversity of images, videos, knowledge graph, news results and the like just a massive pain in the goolies? Are ads the quicker route for commercial intent!? Maybe!

Of course, I’m surmising and using the data witnessed from one account. It may not necessarily be the same for every commercial query and determining what is and what is not a commercial query isn’t a walk in the park either. Just because a query doesn’t have ‘buy’ or ‘book’ in the string doesn’t mean that it’s an informational intent type query.

It’s only when you begin to dig into your conversion data locally that you’ll even begin to notice, and even when you have your aha moment, you’ll be none the wiser as to how to fix it.

In short, the only fix that matters is to gain increased visibility for your commercial intent queries, and the only way you are going to do that in the world of “Google Four Ad Slots” is to buy ads.

Sure, you can up your activity in your other channels, target queries of lesser commercial intent and create more wow moments in your PR and general marketing efforts, but make no mistake: those organic opportunities are continually diminishing as Google seek to eat more of that organic pie.

For those interested, it’s also worth taking a little look at CTR generally and at a few of the tactics Google has employed over the years.

Looking at CTR historically

If you look at click-throughs by position over the years, you’ll see an interesting picture. Many of us will have read the various click-through studies detailing how position #1 gets x%, position #2 y%, position #3 z%, tailing off the further you go down the SERP.

Here’s an old graph from Internet Marketing Ninjas showing the Optify data.

This is old, of course, and came from the days when there was a maximum of two ads above the fold at the top.

However, it does show the general picture, and variations over the years show similar curves. It’s pretty safe to say that with the advances in PPC ads since (smart links, stars, better ad copy, blah blah), those numbers and their respective share have likely diminished as ad clicks and knowledge-graph-type distractions have gained click share.

Eye tracking and clicks

Heatmaps show us that generally, much of our attention is taken by the space above the fold.

A page loads, we scan it, see what we need and click it. Many of the studies produced have helped inform ad placement, nav placement, button placement and the like.

The eye tracking study below shows the Google of 2005 and the Google of 2015. The golden triangle versus the, um… red guy with no arms and legs.

2005 versus 2015

What’s really interesting is the background colour change in the ad slot in the image on the right. Note the background is a distinctive yellowish colour.

Do a search today, and that colour distinction is no longer there. The only differentiator is the word “Ad” and that’s diluted by other distractions like ad links and gold stars.

Four ads for a “hotels in london” search

Many of the features that Google used to show for its organic results, user rating stars for example, are now seen in its ads but, increasingly, not in its organic results.

It would seem that, increasingly, attention is taken away from the organic portion at every opportunity. One could be forgiven for concluding that Google sought to confuse the consumer by continually shifting such features around and blurring the lines between organic and paid. After all, we aren’t stupid, are we? We don’t need to see the ads with a clearly defined different background colour, do we?

Some might say that it would appear that if it’s commercial and you monetise it, then the Google of today wants you to pay for those clicks.

For businesses seeking visibility for commercial queries, Google is effectively a pay-for-inclusion engine today. If you want visibility, then they want you to pay for it.

It’s a risk-laden strategy. Altavista did the same in 1998 and killed itself.

Users didn’t want ads shoved in their faces and they left in droves, enticed by the new thing that was all Googley.

Google aren’t stupid and have learnt from the mistakes of their predecessors. They do lots of testing and use feature creep to change things. Revolutionaries they are not.

I’ll leave you to draw your own conclusions around the bait and switch tactics and overlaps of paid serps versus organics. There’s no reason why they’d seduce users with rich snippets, only to snatch them away and leave them hanging around in their paid results, no reason at all.

If you are seeing similar things in your campaigns, decreased conversions whilst organic traffic has increased, and it fits in with these date ranges, do let me know in the comments.


Just to be clear, I didn’t personally identify the reason for reduced conversions. A team member at the client put forward the hypothesis, and the whole four-ad-slot scenario seems to fit. I’d love to say who that is, but client confidentiality and all that stuff… Hat tip Nick!


New Year – New SEO Products – Links, Audits and Reviews

Hello – I’m excited to announce the release of some really useful SEO products for 2016.

The products are aimed at marketers, business owners and lazy SEOs who’d rather not do the work themselves.

Presently, there are four to choose from, but I’ll be developing more as time allows.

It’s a bit of a departure from the usual SEO product suite announcement in that none of these products are produced via automation or some clever backend API integration.

The reports created will of course use a suite of the best tools in the business. For starters most will use a combination of Kerboo, MajesticSEO, SEMRush, Google, Bing and Moz – we also use a few other top-secret ones too but if we told you what they were then we might have to tickle you to death.

The nature of the reports produced means that you’ll have to wait at least a few days for whatever you buy. Sometimes you’ll have to wait longer, dependent upon what you’ve ordered and the number of others waiting for the same. Look for the status update on the product pages for the latest turnaround times.

As I said, there are four new products.

A manual link report product, an SEO site review, a content marketing module and a bells-and-whistles audit and strategy report.

The products are all different and tailored to the specific client that requires them. There’s no template, no fluff, no sausage machine in action.

To go too much into the finer details of each would be to spoil the surprise and delight of your purchase.

What I can say is that I love what I do and have been doing it for quite some time now (20 years OMG).  I’ll provide you with actionable insights that will make a difference to your understanding of your business and niche. I’ll give you ideas and inspiration and will show you how to fix any general silliness you’ve managed to find yourself doing. I won’t tell you about anything you know already and I won’t kill you with charts and lists and intangibles.

You’ll find phraseology like – “This part of your site is sub optimal and my recommendation is that you change this line of code to this line of code” or “An analysis of your market shows that you have some major content opportunities at hand, my advice is that you do X Y and Z as a priority…”

I hope not to have to write stuff like “The majority of your backlinks appear to come from a suburb of Afghanistan, whereas you aspire to rank in the bustling community of NYC…” I’d prefer not to work with numpties if I can avoid it, so if that’s you then erm…sorry.

There are rare occasions when it’s clear that there’s very little to say or add.

If you are one of these fortunate people then accept my apologies in advance as I decline your request. Why not go spend your cash on a nice holiday or give it to charity instead?

That’s it! Happy 2016 to you!

PS For the referral-minded among you, there’s an affiliate program full of half-decent commission for completed sales.

PPS For a limited time, enter BoxMeUp at the checkout for an additional 20% off.

What Should An SEO Do For My Business If I Have A Problem?

It’s a fair question, and one that will get different responses from different companies.

Ultimately, your SEO will be looking to identify and unblock any bottlenecks and help restore your domain’s search engine visibility for queries that are important to your business.

In this post, we are going to look at some of the typical aspects that a reputable SEO company should be looking at if you experience a sudden stop or gradual drop-off in traffic to your website from search engines.

Where has my search engine traffic disappeared to?

Businesses that turn to SEO companies for help will often do so on the back of a crisis.

They may have seen a gradual decline in search engine traffic, or a sudden drop that has a huge impact on the sales or enquiries that matter to their bottom line. Such events are of course worrying and require investigation to see what the problem is and how best to identify and present solutions.

You’ll need to give your SEO as much information as you can. They’ll need access to your analytics package to view past traffic performance, and to your Google and Bing search consoles to see if there are additional direct clues.

You should also be candid with them and tell them about anything you know has been done, to help them identify things for you. If you bought a tranche of links from a link seller or signed up for a dubious website promotion strategy, then tell them.

A lack of transparency will not help you and will cost you more money in the long run, and the SEO will likely find out anyway through their investigations.

Using the search console to help identify problems

The search console may tell your SEO professional if there’s a specific issue relating to the domain, such as a manual penalty or an onsite performance issue.

The search console contains specific information about your domain, generated through the search crawl and the responses received. It will also show search traffic numbers and limited information around keywords, volumes, positions and click-through rates.

Manual Penalties – Maybe you have a manual search penalty

Search engines will usually (but not always) notify webmasters if a manual penalty has been applied. A manual penalty is applied for egregious abuse of search engine guidelines: link buying, for example, hidden text or other spammy activities that have been identified as unacceptable.

Where you have a manually applied penalty, you’ll need to file a reconsideration request from within the console. You’ll need to outline what you have done to correct any transgressions and politely beg for mercy, promising that you’ll never do what you’ve been penalised for again.

Generally, manual penalties are rare, and there are often other reasons why a site’s traffic has been impacted. Crawl errors are often responsible.

Let’s look at those.

Identification of Crawl Errors – Is your site generating debilitating site errors?

When a search engine visits a website, it effectively ‘crawls’ the pages using its search engine spider or robot. These spiders, or bots as they are known, are simple fetch-and-grab programs that read the content of the pages and then store and classify it in the search engine’s databases. The codes returned by your web server are recorded and the outputs are then shown to you for analysis.

The crawl aspect of the search console will provide insights into how the search engine is evaluating the domain and will provide clues to any issues. Crawl errors are very useful as they help us see what may be going wrong onsite and contributing to poor performance.

Poor Robots.txt File

An example of this might be a poorly formatted robots.txt file. The robots.txt file is a means of telling search engines what should and should not be indexed. It resides on your root domain and is accessed periodically by the search bots and spiders. Mistakes in this file can block an entire domain from being indexed, leading to very poor performance in search. A review of this file will help identify a problem.
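
As a quick illustration of how a single stray line can do it, here’s a minimal sketch using Python’s built-in robots.txt parser. The file contents and example.com URLs are hypothetical; the point is that one “Disallow: /” under a wildcard user-agent blocks every page for every compliant crawler.

    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt contents: the single stray "Disallow: /" under a
    # wildcard user-agent blocks the entire domain for compliant crawlers.
    robots_lines = [
        "User-agent: *",
        "Disallow: /",
    ]

    parser = RobotFileParser()
    parser.parse(robots_lines)

    # Both checks come back False: nothing on the site may be crawled.
    print(parser.can_fetch("*", "https://www.example.com/"))
    print(parser.can_fetch("*", "https://www.example.com/products/widgets/"))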

Server Error Status Codes

The error code section of the search console is a great means of identifying on-site performance errors.

Error status codes are returned by your web server; they are numbered and each has a different meaning. Dependent upon the error, an SEO would advise and explain what each means and how it is impacting your traffic. The worst types to have are 401 or 403, as these effectively tell the search bots “go away, you’re forbidden or not authorised”. If the bots can’t read your content, then your content cannot be indexed or ranked in search.

More common are the so-called 404 errors. These occur when a requested page cannot be found. The web server will often (subject to configuration) return a generic “page not found” page. The better ones are useful to users, giving supplemental help that enables people to find alternatives.

Server error codes are a useful means of gaining insight into poor scripting or server performance generally so should always be considered as an early part of the investigation process.
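
For those who’d rather spot these codes outside the console, a rough sketch along these lines will do it. The URLs are placeholders and it assumes the third-party requests package is installed.

    import requests

    # Hypothetical URLs; swap in pages from your own site.
    urls = [
        "https://www.example.com/",
        "https://www.example.com/old-category/",
        "https://www.example.com/members-only/",
    ]

    for url in urls:
        try:
            code = requests.get(url, timeout=10, allow_redirects=False).status_code
        except requests.RequestException as exc:
            print(f"ERROR  {url}  ({exc})")
            continue
        if code in (401, 403):
            note = "forbidden / not authorised -- bots can't read this"
        elif code == 404:
            note = "not found -- fix the link or redirect it"
        elif code >= 500:
            note = "server error -- look at scripts and hosting"
        else:
            note = "looks fine"
        print(f"{code}  {url}  ({note})")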

DNS Errors

DNS errors are often transient and can occur where the host has issues relating to configuration, routing or hardware. DNS errors will restrict access for people looking to read your content, and that includes search bots. Persistent DNS errors will prevent your site being seen in search, so it’s important to get on top of the issue should it occur.
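
A crude way to sanity-check resolution yourself, using nothing but the standard library; the hostnames are hypothetical stand-ins for your own.

    import socket

    # Hypothetical hostnames; replace with your own domain and subdomains.
    for host in ("www.example.com", "shop.example.com"):
        try:
            print(f"{host} resolves to {socket.gethostbyname(host)}")
        except socket.gaierror as exc:
            # A failure here is the same dead end a search bot would hit.
            print(f"{host} failed to resolve: {exc}")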

Server Connectivity and Performance

Sometimes, your web server will struggle to perform and might have connection issues that impact upon page speed and content delivery. Where this occurs, it’s important that you address the causes and return the site to peak performance. An SEO should look at performance factors as part of their investigation as ultimately, search engines would prefer any pages that they return to their users to be fast loading and functional. A poorly configured web server or script will drain server resources and switch users off to your site. If this happens with too much regularity, then search engines will lose confidence and trust in your site as a resource and your rankings may be impacted.
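
A very rough way to get a feel for response times is simply to time a handful of fetches. Again, the URL is a placeholder and requests is a third-party package; proper monitoring tools will tell you far more, but this is enough for a first sniff test.

    import time
    import requests

    url = "https://www.example.com/"   # hypothetical; use a key landing page

    timings = []
    for _ in range(5):
        start = time.perf_counter()
        requests.get(url, timeout=15)
        timings.append(time.perf_counter() - start)

    print(f"average {sum(timings) / len(timings):.2f}s, worst {max(timings):.2f}s over {len(timings)} fetches")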

Algorithmic filters due to Panda or Penguin

Other reasons why your site’s traffic may have been impacted relate to so-called algorithmic filters. There are many types of algorithm, and they are rolled out periodically or generated on the fly. The two we’ll look at here are called Panda and Penguin.

The search console isn’t that useful with regard to these, as the date ranges it offers are limited to 90 days. To take a good look we need historical traffic data over a longer timeframe, as this enables us to look at traffic patterns and discount things like seasonality or general growth over time.

Using Your Analytics Package to Identify Algorithmic Filters


The Panda algorithm is aimed at low quality or thin content and seeks to demote pages that are considered to trigger these signals. Panda has had a number of iterations over the years, and SEOs have identified the dates, which can then be referenced against website traffic patterns. The general theory is that if your traffic plummets coincide with the published release dates of these updates, then it’s pretty easy to conclude what the issue is by looking at your traffic within your analytics package.

It may of course also be very obvious anyway and a good SEO should be frank enough with you to say that actually, your site is appalling and you need to reevaluate your content generation model…

Sites that were built in 1999 may not necessarily meet the expectations of 2015. A good SEO company will at least discuss this with you and help you appreciate the needs of today’s web users. If you are answering a web query in 2015, then you need to be going above and beyond.


The Penguin algorithm relates to your link graph. Some websites have unnatural inbound link patterns or have too many links that are considered to be from low quality sites. Where this is the case, a good SEO will help you identify what these are and will be able to help with a plan to disavow any low quality links.

Again, the use of your analytics package will help the SEO align your traffic with known Penguin release and refresh dates so that they can confirm whether or not your traffic fall-off is Penguin related.
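
If you want to do that alignment yourself rather than eyeball charts, something like the sketch below works. The CSV name and the update dates are illustrative placeholders (check the published Panda and Penguin timelines for the real dates), and it assumes the pandas package is installed.

    import pandas as pd

    # Hypothetical export of daily organic sessions (columns: date, sessions).
    traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])

    # Illustrative dates only -- check the published update timelines for real ones.
    updates = {
        "Panda refresh (illustrative)": pd.Timestamp("2015-07-18"),
        "Penguin refresh (illustrative)": pd.Timestamp("2014-10-17"),
    }

    for name, day in updates.items():
        before = traffic.loc[traffic["date"].between(day - pd.Timedelta(days=28), day - pd.Timedelta(days=1)), "sessions"].mean()
        after = traffic.loc[traffic["date"].between(day, day + pd.Timedelta(days=27)), "sessions"].mean()
        print(f"{name}: {(after - before) / before * 100:+.1f}% change in average daily organic sessions")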

Content Issues

You may have recently undergone a site redesign, or your developer may have used a new technology or URL structure that impacted your site in a negative way. Poor metadata, duplicate page titles, non-existent page titles and poor keyword selection are just a handful of issues that may be present on site. A good SEO company will help identify what these are and show you the way forward.
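
Here’s a sketch of the sort of quick check an SEO might run for duplicate or missing page titles. The URL list is hypothetical, and it assumes the third-party requests and beautifulsoup4 packages; a proper crawler would obviously go much further.

    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup  # third-party: beautifulsoup4

    # Hypothetical URLs; in practice feed this from your sitemap or a crawl.
    urls = [
        "https://www.example.com/",
        "https://www.example.com/widgets/",
        "https://www.example.com/widgets/blue/",
    ]

    pages_by_title = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        pages_by_title[title].append(url)

    for title, pages in pages_by_title.items():
        if not title:
            print("MISSING TITLE:", ", ".join(pages))
        elif len(pages) > 1:
            print(f"DUPLICATE TITLE ({title}):", ", ".join(pages))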

Wrapping things up

As we can see, there are many things that can contribute to poor performance of websites in search engines: manual penalties, algorithmic filters, poor content, poor site structure and architecture, poor hardware. Each of these can pull your site down for the queries you aspire to. A thorough examination of these issues will help you take the steps that will eventually return your site to where you’d like it to be. It’s a good idea to have an SEO site audit before issues occur, as this can save many thousands of pounds in fixing subsequent issues.

PS For the marketing DIY enthusiasts, we have a range of products that can help you drive your business forward – maybe you need a manual link report to identify potential problems with your link profile, or an SEO site review to unify your thinking and know you’re on the right track, or a content marketing module to give your creativity a kickstart, and finally there’s a full audit and strategy report to give you that full-on perspective.

Playing With Attribution Modelling and Getting Aha moments

One of the great things about working for yourself is that, subject to resources, you can do virtually what you like.

I spend far too many hours messing around with what I’ve learnt over the years and applying aspects that will offer limited return. I guess I do it because it’s fun and it sates a curiosity and if I’m really lucky it sometimes causes me to stumble on something of real value.

We all read mountains of stuff about conversions and attribution and the challenges faced in matching up the various channels to their respective ROI pots. People will naturally gravitate to positions that effectively back up the department they’re responsible for, so it’s no surprise to read all manner of conflicting viewpoints that make the case for the relative efficacy of channel A or tactic B.

The best way to understand things is of course to pull them all apart and put them back together again, often in the wrong places just to see what happens. Record the results and draw a few conclusions. Rinse repeat until you’re bored or until you’re happy with what you have.

Most of today’s analytics suites are built around cookies and a bit of embedded script on a page somewhere. For those who don’t know (and I suspect a few of you reading this will, so apols to you guys), when we view a web page on a device, the web server has access to a number of environment variables. Not every web page utilises all of these, as they’re too much hassle (for most) to code into their projects, and for most, analytics packages like GA or Omniture are as good if not better for what they need.
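
For the curious, here’s a minimal sketch of the kind of per-request variables a server can see, using Flask purely as a convenient example framework; the route and field names are invented for illustration.

    from flask import Flask, request  # third-party: flask

    app = Flask(__name__)

    @app.route("/product/<slug>")
    def product(slug):
        # A few of the per-request variables the server can see for every visit.
        visit = {
            "page": slug,
            "ip": request.headers.get("X-Forwarded-For", request.remote_addr),
            "user_agent": request.headers.get("User-Agent", ""),
            "referer": request.headers.get("Referer", ""),
            "language": request.headers.get("Accept-Language", ""),
        }
        app.logger.info("visit: %s", visit)   # in reality you'd store this somewhere
        return f"Product page for {slug}"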

Attribution modelling is pretty much covered in most analytics packages, but as referenced above, it’s all about the set-up of the funnel and the interpretation of results: what message you need and who you need to tailor it to. SEO is an amazing channel, and it’s no surprise that Google, for example, systematically seeks to disassemble the ease of measurement whilst introducing new features at the same time. It’s pretty easy to lose people in technical theory, especially if we don’t all speak with the same understandings. HSTS super cookies, super cookies, cross-domain tracking and cross-device tracking cookies are just a few examples that most folk will struggle with conceptually.

Anyways, I’ve gone off track a wee bit, so apologies…

So, what have I been playing with and how is it of use potentially?

If we have a big domain with lots of users who come to our site, buy or use it, then go away and come back again, we can pretty much begin to measure what they are doing: frequency, visit length, page views and all the standard stuff that analytics packages will tell us.

Thousands of domains don’t have user accounts, and for ecommerce sites especially this is a huge lost opportunity. Checkout systems are rightly cautious in enabling folk to purchase without the need for an account (it’s easier to convert folk from the purchase email anyway; incentives etc.).

If we have users who are account holders and who return frequently, then we can begin to model behaviour and do a whole lot more useful stuff with tracking.

If we record (locally) specific details about the devices used, along with environment variables such as screen size, colour depth, resolution, IP addresses used, referers, mouse behaviours, geo data and all those things that are unique to them, can we not begin to model the behaviours of those who aren’t logged in but display similar behaviours, and begin to assign them to user-type pots perhaps? Yes we can.
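
A toy sketch of what “assigning to pots” might look like: hash a few stable device signals into a rough key so that visits from different IP addresses can still land in the same pot. The signal names are placeholders and this is deliberately simplistic.

    import hashlib

    def device_fingerprint(signals):
        """Hash a handful of client signals into a rough device key (toy example)."""
        keys = ("screen", "colour_depth", "timezone", "user_agent", "language")
        raw = "|".join(str(signals.get(k, "")) for k in keys)
        return hashlib.sha256(raw.encode()).hexdigest()[:16]

    # Same phone on the office wifi and later on the train: different IPs,
    # but the device signals (and therefore the pot) stay the same.
    office = {"screen": "375x667", "colour_depth": 32, "timezone": "Europe/London",
              "user_agent": "iPhone Safari", "language": "en-GB", "ip": "203.0.113.10"}
    train = {**office, "ip": "198.51.100.77"}

    print(device_fingerprint(office) == device_fingerprint(train))  # True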

We might, for example, know that user A (let’s call him John) originally turned up from Google and landed on a page that sold Triumph Rocket touring backrests.

A very specific page with words relevant to backrest, Triumph, Rocket and Touring. All of the meta and page data, URLs etc. were pretty tight in terms of keyword accuracy, so despite Google hogging all of the query data for themselves, we could pretty much determine that John searched Google for a Triumph touring backrest or at least a subtle variation.

We can assume that John either went straight to Google himself or that someone suggested he search on Google. Whatever way it’s diced, we know that he came from Google and he used his iPhone to do so.

He didn’t purchase though, and we didn’t know who he was. He was at work on the office wifi and he wasn’t ready to commit to the purchase as he was in research mode. He looked again on the way home, this time on the train from an EDGE or 3G connection as he hurtled through the burbs.

Later that day, when he got home, he opened his iPad and searched Google again, or maybe he used the link that he’d emailed from his phone earlier and went straight to the page. His wife, meanwhile, was sat on her Mac, or PC even. John talked to her about how his back hurt and how he wanted a backrest for his bike. John’s wife’s a bit of a bossy boots, so she asked him to ping her the link via iMessage. The page looked amazing on her retina-screen super expensive Mac and, after much interrogation, she agreed that it was a good purchase decision. Great, said John, and he proceeded to make the purchase on the Mac.

Some days later, the vendor is looking at the purchases and tracking who came from where and what. He sees this isolated purchase that came from a Mac: one page view of the product and a purchase within seconds, no dilly dallying at all. He sees that the credit card info was from Mrs P Whatsherface (the details stored in John’s wife’s digital wallet).

On the face of things, the vendor has no real way of determining who to attribute the sale to. His ill-configured analytics package attributes it to the direct visitor pot, and the vendor concludes that it was either from WOM or that amazeballs local motorcycle magazine campaign he paid extortionate money for just days prior. After all, he sees quite a few of these, so they must be from his offline marketing efforts.

In any case, he’s kind of happy; he’s made a sale. He’s even going to renew his motorcycle magazine advert, as maybe it’s working well after all. 50 sales of this type already this month…

Meanwhile, the day after, John is on the train to work. He’s on his iPhone again, fiddling around, going through emails, and he reads the follow-up email about his backrest purchase. He clicks the link excitedly and logs in to his account on the motorcycle vendor’s website. He has a little browse and he’s off again.

So, what can we deduce from this little story? What lessons are there for the vendor?

At John’s first visit from his iPhone, the vendor’s server or analytics package should have segmented John’s visit into a pot or database and recorded the various aspects such as IP address, device type, referer and length of stay.

It would have dropped a little cookie too.

When John then returned whilst on the train, it could have begun to match some of this data; it could have seen the cookie and said aha!

It might have noticed the different IP addresses and said aha again!

It might even have noted the different ISPs and geo-locational stuff and said aha again, and then it could have seen those Mac purchase variables and concluded something different entirely.

It could have learnt that there was a whole pre-purchase journey that did indeed start with Google, and that when it ran a similar back-reference model across a multitude of similar purchases, there were similar behaviours.
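
And here’s a toy version of that back-reference model: score how likely an earlier anonymous visit and a later identified session (a login or a purchase) belong to the same journey. The weights, threshold and field names are invented for illustration; a real system would learn these from data rather than hand-tune them.

    # Toy back-reference model: score whether an earlier anonymous visit and a
    # later identified session look like the same journey. Weights are invented.
    def match_score(visit, session):
        score = 0.0
        if visit.get("cookie_id") and visit["cookie_id"] == session.get("cookie_id"):
            score += 0.6
        if visit.get("device_fingerprint") == session.get("device_fingerprint"):
            score += 0.25
        if visit.get("geo_city") == session.get("geo_city"):
            score += 0.1
        if visit.get("landing_page") == session.get("product_page"):
            score += 0.05
        return score

    anon_visit = {"cookie_id": "c-123", "device_fingerprint": "ab12cd34", "geo_city": "London",
                  "landing_page": "/triumph-rocket-touring-backrest"}
    logged_in = {"cookie_id": "c-123", "device_fingerprint": "ab12cd34", "geo_city": "London",
                 "product_page": "/triumph-rocket-touring-backrest"}

    if match_score(anon_visit, logged_in) >= 0.5:
        print("Stitch these sessions together and credit the original organic landing")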

He’d have saved a small fortune on that crappy motorcycle mag ad also.

So, this is what I’m doing at the moment: playing with these kinds of factors and seeking to create pots or tables that record specific user and device behaviour and the various aspects of what they get up to. I’m in danger of making this a TL;DR post so I’ll shut up for now, but if you’re interested in some of the specifics of how it might work, or indeed if you have any ideas yourself, then I’m all ears.

Facebook has enormous power in this regard, but that’s a post for another day perhaps.

Moral of the story? Create accounts, convert your visitors and track everything and analyse retrospectively too.


How easy is it to determine a good or a bad link?

Is that a Good Link or a Bad Link?

I played with a new tool this morning. It was some kind of link evaluation tool.

It purported to tell you whether a link from a URL was good or bad or somewhere in the middle.

Cool, I thought.

So I gave it a go and popped in six URLs. All came up with wildly wacky results, all were deemed to be spam, and all suggested I should do something funny with them and run away screaming.



The Magic Box of Glut – Wizards, Sampmerians, Elves and Pink Marshmallow Castles

The Magic Box of Gluttony in the Land of Glut

A world without search

Once upon a time there was a magical world called Glut – Everything happened in the world of Glut, the people within it did all manner of things. They built pink castles from weather resistant marshmallows and cool lakes made out of lemonade and beer where hamburger flavoured fish swam. Some folks knew how to make really fast cars that ran on magic beans made by their friends in the forest of emeralds.

It all seemed ideal in the world of Glut but progress was slow. Few knew how to fish for the hamburger flavoured fish and the magic beans that grew in the forest of Emeralds were known but to the people of Fark. The bottom line was that news traveled slowly in Glut, information was often controlled by the powerful and where it wasn’t, it was difficult for merchants to gain wide reach or appeal for their ideas and products.

Building a search engine to conquer all

Up on a hill next to a mountain in a place called Gooleg there lived two special wizards.

They were clever wizards backed by the powers of InvestorLand who knew that everywhere they went in the world, people had ideas that they wanted to share. So they went to a little-known place called Altavistaland and took on the mighty wizard Inktomi, where they learnt the secrets of Retrivicus Informanicus. They went to Microland, bought the ingredients required and built a big magic box.

Paying to Play in SEO

The Bad SEO  Rep Thing

SEO as an industry has for a long time now suffered from a terrible rep. The web is littered with case after case of burnt individuals recounting stories of being misled at best and defrauded at worst. An examination of a lot of these tales will often reveal a well-trodden path: a company promised one thing whilst delivering another, usually in the form of not very much at all, or in extreme cases a nice page 6 ranking penalty from the Google monster.

Top 10 is it then Len?

I think it’s interesting that this happens despite the wealth of info out there. Google even publishes a guide to SEO which, for the DIY brigade, is a good little reference point. Yet the reality is that whichever way you dice it, there are only ever really 10 organic spots to be had, and unless you’re above the fold, you might as well not be there.

Sure, there’s Local, Universal and Social and all that blah blah blah, but let’s face it, if you aren’t ranking at positions 1 to 5 in a clean, non-obfuscated SERP then… need I state the obvious?

