It’s a fair question, and one that will get different responses from different companies.
Ultimately, your SEO will be looking to identify and remove any bottlenecks and restore your domain's search engine visibility for the queries that matter to your business.
In this post, we are going to look at some of the typical aspects that a reputable SEO company should be looking at if you experience a sudden stop or gradual drop off in traffic to your website from search engines.
Where has my search engine traffic disappeared to?
Businesses that turn to SEO companies for help will often do so on the back of a crisis.
They may have seen a gradual decline in search engine traffic, or a sudden drop that has a huge impact on the sales or enquiries that matter to their bottom line. Such events are of course worrying and require investigation to establish what the problem is and how best to fix it.
You'll need to give your SEO as much information as you can. They'll need access to your analytics package to view past traffic performance, and to your Google Search Console and Bing Webmaster Tools accounts to see if there are additional direct clues.
You should also be candid and tell them about anything you know has been done to the site; it will help them pinpoint the cause. If you bought a tranche of links from a link seller or signed up for a dubious website promotion scheme, then tell them.
Lack of transparency will not help you; it will cost you more money in the long run, and the SEO will likely find out anyway through their investigations.
Using the webmaster search console to help identify problems
The search consoles may tell your SEO professional whether there is a specific issue with the domain, such as a manual penalty or an on-site performance problem.
The search console contains specific information about your domain, generated through the search crawl and the responses received. It will also show search traffic numbers and limited information on keywords, volumes, positions and click-through rates.
Manual Penalties – Maybe you have a manual search penalty
Search engines will usually (but not always) notify webmasters when a manual penalty has been applied. Manual penalties are applied for egregious breaches of search engine guidelines: link buying, for example, hidden text or other spammy activities that have been identified as unacceptable.
Where you have a manually applied penalty, you'll need to file a reconsideration (reinclusion) request from within the console. You'll need to outline what you have done to correct any transgressions and politely beg for mercy, promising that you'll never repeat what you've been penalised for.
Generally, manual penalties are rare, and there are often other reasons why a site's traffic has been impacted. Crawl errors are often responsible.
Let’s look at those.
Identification of Crawl Errors – Is your site generating debilitating site errors?
When a search engine visits a website, it effectively 'crawls' the pages using its spider or robot. These spiders, or bots as they are known, are simple fetch-and-parse programs that read the content of the pages and then store and classify it in the search engine's databases. The status codes returned by your web server are recorded, and the results are shown to you for analysis.
The crawl section of the search console will provide insight into how the search engine is evaluating the domain and will provide clues to any issues. Crawl errors are particularly useful, as they help us see what may be going wrong on-site and contributing to poor performance.
Poor Robots.Txt File
An example of this might be a poorly formatted robots.txt file. The robots.txt file tells search engines what should and should not be crawled. It resides on your root domain and is fetched periodically by the search bots and spiders. A mistake in it can block an entire domain from being crawled and indexed, leading to very poor search performance, so a review of this file will help identify the problem.
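To see how a single misplaced rule can shut out every crawler, here is a minimal sketch using Python's standard-library robots.txt parser. The rules and URL are illustrative placeholders, not taken from any real site.

```python
# Check whether a robots.txt rule blocks a crawler, using the stdlib parser.
from urllib.robotparser import RobotFileParser

# A blanket "Disallow: /" is a classic mistake that blocks everything.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# No page on the domain may be fetched by any compliant bot:
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))  # False
```

One stray line like this, often left over from a staging environment, is enough to remove a whole domain from the index over time.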
Server Error Status Codes
The error code section of the search console is a great means of identifying on-site performance errors.
Server status codes are generated by web servers; they are numbered and have different meanings. Depending on the error, an SEO will explain what each means and how it is impacting your traffic. The worst to have are 401 and 403, as these effectively tell the search bots "go away, you're not authorised or you're forbidden". If the bots can't read your content, then your content cannot be indexed or ranked in search.
The most common status errors are so-called 404 errors, which occur when a requested page cannot be found. The web server will usually (subject to configuration) return a generic 'page not found' response. The better 404 pages help users by suggesting alternatives.
Server error codes are a useful means of gaining insight into poor scripting or server performance generally so should always be considered as an early part of the investigation process.
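You can see the same signal a bot records with a few lines of standard-library Python. This is a minimal sketch; the commented example URL is a placeholder.

```python
# Fetch a URL and report its HTTP status code, as a search bot would record it.
import urllib.error
import urllib.request

def status_of(url: str) -> int:
    """Return the HTTP status code for a GET request to the given URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status          # e.g. 200 for a healthy page
    except urllib.error.HTTPError as err:
        return err.code                 # e.g. 403, 404 or 500

# status_of("https://example.com/missing-page") would typically return 404
```

Running a check like this over a sitemap's worth of URLs quickly surfaces the 4xx and 5xx responses that the search console will also be reporting.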
DNS errors are often transient and can occur when the host has issues relating to configuration, routing or hardware. DNS errors restrict access for anyone trying to read your content, including search bots. Persistent DNS errors will prevent your site from being seen in search, so it's important to get on top of the issue should it occur.
Server Connectivity and Performance
Sometimes your web server will struggle to perform and may have connection issues that impact page speed and content delivery. Where this occurs, it's important to address the causes and return the site to peak performance. An SEO should look at performance factors as part of their investigation; ultimately, search engines prefer the pages they return to their users to be fast-loading and functional. A poorly configured web server or script will drain server resources and put users off your site. If this happens too regularly, search engines will lose confidence and trust in your site as a resource, and your rankings may suffer.
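A crude but useful first measurement is simply timing how long a full page download takes. This sketch uses only the standard library and is a rough proxy for what a bot experiences, not a substitute for proper performance tooling; any URL you pass is up to you.

```python
# Time a complete page download as a rough page-speed indicator.
import time
import urllib.request

def fetch_seconds(url: str) -> float:
    """Return how long it takes to download the full response body."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start
```

Tracking this number for a few key pages over time makes gradual server degradation visible long before rankings react to it.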
Algorithmic filters due to Panda or Penguin
Other reasons why your site's traffic may have been impacted relate to so-called algorithmic filters. There are many such algorithms, rolled out periodically or applied on the fly. The two we'll look at here are called Panda and Penguin.
The search console isn't that useful here, as its data is limited to a 90-day range. To take a proper look we need historical traffic data over a longer timeframe, which lets us examine traffic patterns and discount things like seasonality or general growth over time.
Using Your Analytics Package to Identify Algorithmic Filters
The Panda algorithm targets low-quality or thin content and seeks to demote pages that trigger those signals. Panda has had a number of iterations over the years, and SEOs have identified the rollout dates, which can then be referenced against your website's traffic patterns. The general theory is that if a plunge in your traffic coincides with a published release date, it's fairly easy to conclude what the issue is by looking at your traffic in your analytics package.
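The date-alignment idea can be sketched in a few lines: flag any sharp day-on-day fall and check it against a table of rollout dates. The traffic figures below are made up for illustration; the Panda 4.0 date (20 May 2014) is the widely reported rollout date.

```python
# Flag traffic plunges that coincide with known algorithm rollout dates.
from datetime import date

# Hypothetical daily search traffic, as exported from an analytics package.
traffic = {
    date(2014, 5, 18): 1200,
    date(2014, 5, 19): 1190,
    date(2014, 5, 20): 610,   # sharp fall
    date(2014, 5, 21): 590,
}

# Illustrative, not exhaustive: known rollout dates to check against.
update_dates = {date(2014, 5, 20): "Panda 4.0"}

flagged = []
days = sorted(traffic)
for prev, cur in zip(days, days[1:]):
    drop = 1 - traffic[cur] / traffic[prev]
    if drop > 0.3 and cur in update_dates:     # over 30% fall in one day
        flagged.append((cur, update_dates[cur], round(drop, 2)))

print(flagged)  # the 2014-05-20 plunge lines up with Panda 4.0
```

In practice you would feed this a full analytics export and a fuller list of rollout dates, but the principle is exactly this comparison.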
It may of course also be very obvious anyway and a good SEO should be frank enough with you to say that actually, your site is appalling and you need to reevaluate your content generation model…
Sites that were built in 1999 may not meet the expectations of 2015. A good SEO company will at least discuss this with you and help you appreciate the needs of today's web users. If you are answering a web query in 2015, you need to be going above and beyond.
The Penguin algorithm relates to your link graph. Some websites have unnatural inbound link patterns, or too many links from sites considered low quality. Where this is the case, a good SEO will help you identify which links these are and put together a plan to disavow the low-quality ones.
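The end product of that clean-up work is a disavow file. Google's disavow-links tool accepts a plain-text file with one full URL or one `domain:` entry per line, and lines starting with `#` as comments. A minimal sketch that generates one from an audited list (the domains here are placeholders):

```python
# Build a disavow file from a list of low-quality linking domains.
bad_domains = ["spammy-links.example", "link-farm.example"]

lines = ["# Low-quality domains identified in the link audit"]
lines += [f"domain:{d}" for d in bad_domains]
disavow_file = "\n".join(lines)

print(disavow_file)
```

Using `domain:` entries rather than individual URLs disavows every link from that domain, which is usually what you want for an outright link farm.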
Again, your analytics package will help the SEO align your traffic with known Penguin release and refresh dates, so they can confirm whether or not your traffic fall-off is Penguin-related.
You may have recently undergone a site redesign, or your developer may have used a new technology or URL structure that impacted your site negatively. Poor metadata, duplicate page titles, non-existent page titles and poor keyword selection are just a handful of issues that may be present on a site. A good SEO company will help identify what these are and show you the way forward.
Wrapping things up
As we can see, there are many things that can contribute to the poor performance of a website in search engines: manual penalties, algorithmic filters, poor content, poor site structure and architecture, and poor hardware. Each of these can pull your site down for the queries you aspire to. A thorough examination of these issues will help you take the steps that will eventually return your site to where you'd like it to be. It's a good idea to have an SEO site audit before issues occur, as this can save many thousands of pounds in fixing subsequent problems.
PS For the marketing DIY enthusiasts, we have a range of products that can help you drive your business forward: maybe you need a manual link report to identify potential problems with your link profile, or an SEO site review to unify your thinking and know you're on the right track, or a content marketing module to give your creativity a kickstart. Finally, there's a full audit and strategy report to give you the complete perspective.