Writing for the web: what should you be considering?
Creating good content isn’t as easy as people like to think; it takes thought, intelligence and an awareness of all the factors that make content stand out. In this piece we’ll take a look at writing for the web and the best practices you should follow. We’ll think about what to do before, during and after producing our content, and learn how a few simple disciplines can help improve the work that we produce.
So you want to create something for your niche?
It’s important to understand the factors that influence performance, because without them it’s difficult to appreciate the various nuances that come together to make up the whole.
We need to give people readable content that they’ll like; content that they might be inclined to share, content that they’ll want to click on when they see it appear in the search results.
Understanding the reasons for these will help us.
It’s pointless writing content that has a very limited audience or is devoid of anything noteworthy. We need to create content that seeks to inspire, excite, inform, encourage and motivate.
From a technology perspective, we need to make sure that our content will be favourable towards the search engine algorithms too so that it is returned for the keyword phrases and queries that we’d like to see it appear for in Google or Bing.
The performance of a page in search is often influenced by four categories of factors: on-site factors, off-site factors, keyword research and competitor research.
On-site factors relate to most of the things that happen on the web server, which we or our team can control directly: website structure and architecture (navigation, site speed, crawlability, content mark-up, page URLs, domain name, hosting etc.).
Off-site factors relate to site authority — links, citations and their variety — and, to a lesser degree, social signals from social media. The more of these link signals a site has, the better it tends to perform amongst competing sites in its respective query space.
Keyword research is an important part of the content creation process as we need to consider what it is that people search for and the volumes in which they do so. It’s pointless writing a piece that has no keyword phrase or aspiration in mind. We need to think of our intended audience and give them what they need in a way that helps support our business aims and growth by looking at key phrases that have search volume.
Competitor Research is also vital in understanding our niche and what our competing brands and services are doing in the space.
The content that we write and the page URLs we create will be largely based on our product or service knowledge and what we learn in our research phase.
The Pre-Writing Phase
So, you are about to write or edit an amazing piece of content that’ll get thousands of social shares, links back to the website and visitors that’ll convert to sales.
What should you be doing, and what kind of thinking mode do you need to be in, to be as effective as you can? We’ll need to think about how others are doing things and ask how we can do it all better. We’ll need to think about who will help float our content and take notes on what works and what doesn’t. We’ll need to source images, ensure that they are well optimised and fit for purpose, and use the search engines to find things out.
Keyword and Competitor Research
Whilst much of the initial keyword research may have already been undertaken, it’s a very good idea to take a quick look at what it is you have been tasked to write about, as it will help solidify the need and give you a broader understanding of who’s already succeeding and why.
Using Social Media
Conduct a Twitter search and make note of who’s tweeting in the space.
Consider installing Chrome plugins like List Builder and Klout to help build your Twitter network and to identify potential influencers in the space. Look at other tools like PeerIndex and Kred too.
Use a tool like http://analytics.followthehashtag.com/ and take note of who’s talking about your topic in that space; they could be great advocates once you publish.
If you have other tools that work for you, use those instead or in conjunction.
Look at LinkedIn and see if there are any stand out brands or people and take a note of who they are.
Check out Pinterest and Instagram, weigh up whether your content could cater to these audiences too.
Not every piece can, but if it’s particularly visual it might have a good home on Pinterest; if it’s more about charts and numbers then perhaps not.
Use hashtags to supplement any products or content you share. In the example of a gift shop selling wedding signs, tag the image with #wedding #weddinggifts #weddingideas and related words.
The same applies for images or pages that you share on Twitter.
It’s super important to get people onside and make them aware of what it is you do.
Don’t assume it’s as obvious as you think it is.
If you are producing excellent useful content then people who have interest in the space will want to share your stuff within their networks.
Set aside some time to identify who these people are and follow them on social media just after you publish.
Image Selection and Naming
The best content pieces are a mix of text and imagery. Images are important to both content and search as they help break up big chunks of text and make a piece more visually attractive too. We’ve all heard the one about a picture painting a thousand words…
It’s best to have original images, but if you can’t source one locally then use an image bank to find one. Be sure that your image is optimised for the web.
Keep it to a decent file size. Not everyone has superfast broadband, and many of your visitors will be reading the content from a mobile device with limited rendering or download speed. Keep these factors in mind with your selection and use.
When selecting your images, think about the topic you are writing about and aim for at least one or two directly related images that can be used to increase image content relevancy for your target phrase.
In addition to social media, Google image search is a great source of traffic. Name your images in ways that are likely to help in this regard. Using wedding-sign.png is infinitely better than image_43294.png.
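As a rough sketch of the idea (the `image_filename` helper is illustrative, not part of any particular CMS or image pipeline), a descriptive, hyphen-separated name can be generated automatically:

```python
import re

def image_filename(description, ext="png"):
    """Turn a human description into a hyphen-separated,
    search-friendly file name, e.g.
    'Handmade Wedding Sign' -> 'handmade-wedding-sign.png'."""
    # Replace every run of non-alphanumeric characters with a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", description.lower()).strip("-")
    return f"{slug}.{ext}"

print(image_filename("Handmade Wedding Sign"))  # handmade-wedding-sign.png
```

Running a batch of stock images through something like this before upload keeps naming consistent across the site.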
You can of course source imagery from the web generally, but it’s best to avoid this where possible.
If you do, then it’s important to give attribution or to seek permission prior. Don’t just take it.
Using Search Engines
Search your primary keyword phrase in Google and Bing; take a look at who and what is ranking for the phrase, and take notes:
- Look at the content that is currently winning and ask yourself how you can improve upon what they’ve provided.
- Look at competitor page <title> tags and headings and note why they have appeal
- Look at the types of content being returned in the search results: images, video, text, rich snippets etc.
- Are there bloggers or Tweeters appearing in the SERP?
- Is there a structured data result that appears in the Google knowledge graph perhaps?
- Look at the snippet and note any specific calls to action
Through looking at the SERP (search engine results page) we can gauge what it is that Google likes for the query set and what we will need to do to occupy the positions and raise our visibility.
It will also show us opportunities for sharing post-production (bloggers, Tweeters, related communities).
Pre-Publish Actions and Takeaways:
- Assess the organic SERP competition.
- Assess the PPC competition and see what people paying for ads are saying.
- Ask yourself critical questions around your content plan.
- Look for opportunities within the SERPs and factor these into your content piece.
- Assess the social media space.
- Identify potential advocates for post publishing activities and to gain greater understanding of what’s happening in the keyword space.
- Name images in relevant ways, optimise image sizes, seek permission of copyright owners and give attribution where free or unpaid.
- Separate image file names with hyphens to maximise image search relevance down the line
Thinking About Content Type and Audience
We need to think about the type of content that we are creating and its intended audience. Until we’ve really looked at the former, it’s difficult to know the best way forward.
Sure, we can say that “I want to write a blog post” or “I want to write a press release” or “I want to create a guide” but to do so in the absence of exploring the keyword space first is folly. We need to look at what ranks, what gets shared and try and figure out why. This will help inform our decisions, and focus thought on what kinds of content will work best.
We can loosely categorise content into a number of camps. I’ve put together a loose grouping of six which will give you some food for thought, using the topic of gift signs to illustrate the examples.
Blog type content — Conversational, topical, aimed at a specific aspect of what we do or personality/innovation/event/niche type thing. Content that has personality with a distinct user voice that encourages user engagement via comments or social sharing perhaps.
News type content — Some content is newsy: directly informative and not overtly sales-like in tone — a piece announcing an acquisition or investment perhaps. Industry-specific, breaking-news type stuff.
Core content piece — This will be product-specific content, highly targeted at a specific aspect of what we do or the product we have. This wedding signs page is one such top-level example that we’d need to improve upon, perhaps creating a new product item: we’d expand upon what is being presented, look at creating a new search-friendly URL and talk about the different wedding signs at hand.
Guide piece — A piece of content that adds context to the general proposition. What it is, what it does, how it works, what are its benefits. “10 Top Signs For Loved Ones” might be a good example for our sign company, giving inspiration for people looking to get a loved one a gift.
Infographic — Infographics are a great way of getting shares and creating conversations. They are a visual mix of data and text and when done correctly, can garner huge attention.
Maybe this gift company could look at sales and best sellers and see what the data is saying. Is there a story to be told? What kinds of words are people entering into their personalised signs section for example.
Thought leadership — Most people like direction and look to authorities for guidance and leadership. This presents great opportunities in what is a fun, feel-good space. They might talk about ethical sourcing of materials, environmentally friendly aspects of what they do, and lesser alternatives like plastics and less biodegradable products.
Understanding The Target Audience
It’s of course critical that we know and understand our audience. Who it is we are seeking to communicate with and why. Sometimes, we might wish to talk to our Twitter audience. We know what they want and know what gets them excited. Other times, we might be reaching out to a different constituency, the general press, the trade, the creative community or simply prospective consumers.
If we know who we are targeting, we can better shape the delivery of what we say by using language and imagery that resonates.
Simple Marketing and Promotion Writing Ideas
So, what else could this gift sign company be doing?
How could we help them succeed?
Controversy — We could get a little guerrilla, shaking the tree of an established mindset that we believe is no longer relevant and challenging otherwise accepted traditions of thought and practice. We might create a little controversy and stimulate debate as a result.
Piggybacking — We might be mindful of an upcoming industry event whose coat tails we can ride to draw awareness to our value add: a big craft fair or a regional exhibition perhaps, piggybacking anticipated search volumes using prior knowledge or trend tools. We can use social media to tap into the awareness in the space, using hashtags on platforms like Instagram and Twitter.
News and Events Cycles — We might have knowledge of a general news cycle that we believe we can add to, and perhaps occupy a place in Google News by proactively creating content that’ll be a good fit for lazy journalists.
By knowing who we are aiming at and adjusting our message to suit, we can increase the likelihood that our content will be well received, be it a blog, a guide, a new product, a news piece or some fantastic thought-leadership white paper. By thinking about who it is we are talking to, we can better deliver on our goals.
Network Utilisation and Strategy
It’s a whole lot easier to get content to fly if you have a cunning plan.
Thinking about how you can leverage your network post content completion is vital to gain traction and momentum.
Have a mini brainstorm in the office “Guys and gals, I’m creating x y z, does anyone have anything to throw in to the mix?”
Marketing Hooks — what will be yours?
This is a great look at the topic which might fire a synapse or two.
We could possibly be looking to massage the ego of a well-known voice in the space, someone with an audience of their own who’ll appreciate the recognition. Someone who has a high profile and who might be willing to talk about their purchase.
We might offer users the ability to tweet, Facebook or Instagram their recent purchase at checkout: an easy click to share that allows the image to be embedded within their social feeds and seen by their networks.
Any action that amplifies the content we create is worth considering. If it raises brand awareness and creates conversations about us and our products then it’s worth doing.
Editing Pre-existing Content
Sometimes, we might need to edit an aged or out of date piece of content that gets lots of traffic already (or doesn’t).
For pages with a lot of traffic, it’s important that we don’t make too many radical changes. If a page is performing well, we wouldn’t want to change the page title or the headings too much, as doing so could lose us a fair degree of traffic.
On the other hand, we might notice that there are some great opportunities to be had, especially for a page that isn’t performing at all, in which case some radical surgery might be required.
Generally, we should check the performance of the page we are seeking to edit using our analytics package. If the traffic is awful, then we have very little to lose and should edit to our heart’s content using the knowledge we’ve acquired from our actions further up the page.
In cases where traffic is good, we should proceed with caution. We wouldn’t want to unnecessarily edit a page URL, for instance, nor would we want to change a page title or core heading (date-related aspects excepted).
There may often be cases where traffic is good and engagement is poor. In most cases the reasons should be obvious, in which case edit away and add value.
We might want to add some value (images/rich media/new data) or make subtle edits to copy. In most cases these shouldn’t present many, if any difficulties.
If we change a page URL then we of course need to put in a specific redirect so that the engines and crawlers know the new location. Talk to the tech team or SEO about the need for a 301 redirect.
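As a minimal sketch of what a 301 does (the paths and mapping here are hypothetical; in practice this lives in the web server or CMS configuration rather than application code):

```python
# Hypothetical mapping of old paths to their new, search-friendly homes.
REDIRECTS = {
    "/products/page_43294": "/wedding-signs/handmade-wedding-signs/",
}

def resolve(path):
    """Return the (status, location) pair a server might answer with.

    A 301 tells crawlers the move is permanent, so ranking signals
    are transferred to the new URL rather than lost."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/products/page_43294"))
```

The key point is the permanence: a temporary (302) redirect doesn’t carry the same signal to the engines.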
During Content Creation
Having established what it is we are writing, it’s important that our content piece has a clear page structure with a good mix of related keywords and phrases with our core keyword phrase at the centre of it all.
We’ve thought about layout, message and flow, we’ve sourced our images and have named them appropriately and optimised them for use on the web and we are ready to put it all together.
What are the key structural components of a good piece of content?
Evidence shows that pages that do well in search engines share a number of key characteristics.
- Well-thought-out page titles <title>
- Good meta descriptions with calls to action that generate a decent CTR (click-through rate)
- Good headings and hierarchy
- Keywords in the page content
- Visual Breakpoints
- Good navigation
Page Titles <title>
The page <title> is the most important part of the page. It is the part of the content that appears in the SERPs and is used by the search engines as a primary means of establishing what the content is likely to be about.
Getting back to the gift sign company example, if we were creating a content piece that was to target the phrase “Handmade Wedding Signs” then we’d look to make sure that those words were present in our title. Of course, a page title that just said “Handmade Wedding Signs” would look exceedingly bland and would miss out on other opportunities to draw the click and to speak to humans, so we’d look at how we could make that work.
Our keyword and competitor research would have already established that competing pages have various characteristics that we can use too.
For a core content piece we might use the page title “Buy Quality Handmade Wedding Signs, Environmentally Sourced”, which would of course be better than “Handmade Wedding Signs” alone.
For a blog piece we might get a little more conversational and say “5 Little Known Rules for Creating the Perfect Handmade Sign”
People also love “how to” and “what” type queries: “How Can I Make the Perfect Custom Sign for Less Than £20?” or, combined, “How Can I Make the Perfect Custom Sign for Less Than £20 — What Should I Look For?”
The variations above would also have a secondary benefit in that they’d target additional search phrases with search volume and use language that relates to what they do.
“Wedding Signs”, “Custom Signs” search query aspirations would all be assisted by the page title “Where Can I Find A Custom Wedding Sign Online — What should I look For?”
The sky is the limit, bound only by your creativity. The central message is that you need to include the primary phrase within your page title (as near to the start as makes sense) while talking to humans at the same time, gaining their attention and getting that all-important click.
Try to keep your page titles under 60 characters in length, as Google typically shows just the first 60 or so characters; anything longer is truncated. This doesn’t mean that you can’t use longer titles, but be sure to have your most important messaging and keywords within this limit.
Meta Descriptions
The <meta name="description"> tag is important as it is used by Google and Bing to surface a snippet of content in their SERPs. It sits in the <head> section of an HTML page.
Meta descriptions are a great opportunity whereby you can reinforce your message and add extra content and calls to action for the page that you are trying to attract that all important click from.
You should seek to add effective context with words that are related to the query space in a way that also talks to human beings.
Meta descriptions should be around 160 characters at most. They can be longer, but anything over 160 characters is truncated and will not be displayed.
Be sure to think about how you’d like your message to appear in the SERPs, and be mindful that the meta description will often be the search engine’s first port of call when adding supplemental information about a page.
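Putting the two limits together (roughly 60 characters for titles and 160 for descriptions, both approximate and subject to change on Google’s side), a quick sanity-check helper might look like this; the function name and exact cutoffs are illustrative:

```python
TITLE_LIMIT = 60         # approximate cutoff Google applies to titles
DESCRIPTION_LIMIT = 160  # approximate cutoff for meta descriptions

def serp_preview(title, description):
    """Trim a title/description pair roughly as a SERP would,
    marking any cut with an ellipsis."""
    def clip(text, limit):
        return text if len(text) <= limit else text[:limit - 1].rstrip() + "…"
    return clip(title, TITLE_LIMIT), clip(description, DESCRIPTION_LIMIT)
```

Checking drafts through something like this before publishing makes it obvious when your call to action would be chopped off mid-sentence.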
Page Headings <H#>
Similar to books, good web pages have good structure and are headed up by relevant headings and subheadings.
HTML hierarchy dictates that headings are numbered 1–6, with <h1> typically being the largest and most important.
Heading tags are an opportunity to arrange your content in a way that is logical and readable to your user. They’ll help break up the content and make sense of what it is you are communicating.
The <h1> tag should be the first heading on the page and should only appear once. It often closely mirrors the page <title>, which, when you think about it, makes sense from a user readability perspective.
Additional subheadings <h2> <h3> <h4> <h5> should be used throughout the document as appropriate, in a clear hierarchy.
Note: you don’t have to use all of those headings. Pages rank well in the absence of H1s, or through a mix of different headings too. Structures like H1, H2, H3, H3, H3, H4, H2, H2, H2 are often used and don’t appear to negatively impact performance to any large extent. The takeaway is that logical, structured headings are a good habit to get into and help signify important aspects of the document or places within it.
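If you want to eyeball the heading structure of an existing page, a few lines of standard-library Python will pull out the hierarchy (the sample page below is made up for illustration):

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect <h1>-<h6> tags in document order so the hierarchy can be reviewed."""
    def __init__(self):
        super().__init__()
        self.outline = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag names, so "h1".."h6" are easy to spot.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.outline.append(tag)

page = """
<h1>Handmade Wedding Signs</h1>
<h2>Why Buy Handmade?</h2>
<h3>Materials</h3>
<h2>Our Bestsellers</h2>
"""
parser = HeadingOutline()
parser.feed(page)
print(parser.outline)  # ['h1', 'h2', 'h3', 'h2']
```

A quick scan of the resulting list makes gaps or runaway subheading use obvious.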
Do’s and don’ts for headings and subheadings
- Resist the temptation to stuff your target phrase into every subheading that you use. If you do, your piece will read poorly and may be marked down for keyword stuffing!
- Ensure that your primary keyword phrase appears in your <h1> heading
- If it makes sense, use it in your first <h2> subheading too, and in subsequent or latter subheadings, but don’t overdo it.
- Look for opportunities to use keyword variations. If your content piece is targeting Birthday Signs, then be creative and think of variants that you can blend into your subheadings: “70th Birthday Wooden Signs” perhaps, or “Birthday Gift Ideas and Inspiration” or “Examples of Our Most Popular Wooden Birthday Signs”. All of these titles are related, and the search engines will get that too. By adding variance and semantic relationships to your copy, you’ll help elevate relevance and increase readability.
- Don’t overuse headings either. There’s no point having a subheading for every sentence or paragraph.
When writing your copy, try to mention your primary target phrase in the first or second sentence. Make it stand out so the reader recognises the core message early in the piece; consider the use of emphasis tags such as <strong> or <em> where appropriate.
Search engines are relatively dumb so we need to give them as many clues as we can. Page titles, Meta descriptions, headings and content mark-up are ways in which we can do this.
That said, they are getting smarter so we need to be mindful that we don’t overdo things and resist the urge to keyword stuff. Keyword stuffing will damage our ability to rank and switch our readers off so it’s important to bear that in mind and not overdo the keyword phrase thing!
When writing, we should look to use related phrases and words where we can. We want to be on the lookout for easy related ranking opportunities and send out that strong message that our content is thematically related to our niche and contains strong word correlations throughout.
Our ultimate aim, site-wide, is to be recognised as THE authority on our topic, so by using words that have semantic relationships we help ourselves achieve this. Antonyms, synonyms, hypernyms, hyponyms and meronyms will all help; use them with gusto!
The most important factor of course is to write for humans.
We can have the most amazing rankings in the world, but if our content reads like crap and people hit and run or fail to engage then we will fall at the first hurdle and will see our rankings go backwards.
Do’s and Don’ts for Content Copy
- Do not keyword stuff — write for humans. Your core keyword phrase typically shouldn’t account for more than 5–6% of your words, but this will vary across niches and isn’t set in stone.
- Do use variations of your target key phrase, think of related words you can include, think semantically.
- In most cases, try to keep your web documents to a decent readable length.
- Don’t consign yourself to TL;DR hell. Try to keep your blog posts to fewer than 1,500 words. No one wants to read a long piece of content unless it’s really useful and merits their time. If it’s going to be long, try to engage people along the way.
- Different types of content such as comprehensive “how to guides” and “white papers” may well be longer of course, simply because people are expecting highly detailed specific information.
- There’s a great piece on content length generally here https://blog.bufferapp.com/the-ideal-length-of-everything-online-according-to-science
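The 5–6% ceiling mentioned above is easy to keep an eye on with a rough density check. This is a sketch only; the simple word-matching here is far cruder than anything a search engine actually does:

```python
import re

def keyword_density(text, phrase):
    """Percentage of words in `text` that belong to occurrences
    of `phrase` (case-insensitive, punctuation ignored)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    if not words:
        return 0.0
    # Slide a window over the word list and count exact phrase matches.
    hits = sum(
        words[i:i + len(target)] == target
        for i in range(len(words) - len(target) + 1)
    )
    return 100.0 * hits * len(target) / len(words)

print(round(keyword_density(
    "Our handmade wedding signs make lovely gifts.", "wedding signs"), 1))  # 28.6
```

Treat the output as a smell test, not a target to optimise towards.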
Internal Link Reinforcement
One of the ways in which we can help ourselves is to continually reference our most important content pieces. We can do this by linking to such pieces as we create new content. If we are writing about topic x y or z and happen to use the term gift signs, then it makes sense to hyperlink that phrase and point it to the page that we would like to rank.
Less is often more, of course, so we don’t want to overdo things. Who wants to read a document with a zillion hyperlinks!? If it makes sense, then do it. If you look back over your finished article and see a way of weaving in an internal link opportunity, then do so.
Vary Your Anchor Texts
Try to add some variance to your internal links too. A page that targets birthday signs would also benefit from being linked to with variations of anchor text (the words contained in the hyperlink).
So, if we used the term birthday in our piece, then we might hyperlink that to our birthday signs page.
The same would apply for related variants. Birthday | signs | plaques | gifts etc.
Linking Out
Links are the glue of the internet; without them we’d all be a little lost. When creating your content, you’ll often link out to other sites and partners.
General rules of thumb are:
- Don’t link to spam sites! Think about who you are linking to and how — there’s no point helping a competing service with a juicy link to their competing page.
- Is there a blogger in the space with influence who has said something cool? Link to them! They’ll love it and might even talk about you.
Using Images to Break up Your Copy
You may have already sourced your images in your research phase. If you have, then great; just be sure that they address the following:
- Sources referenced and permission sought where external or unpaid for
- Optimised for fast loading times (where the platform doesn’t do so automatically)
- Solid naming conventions, with references to keywords where appropriate, separated by hyphens
By using well-named, engaging imagery that’s related to the content and theme, we can help improve performance in Google image search and raise our profile and traffic.
Page URLs
The page URL (web address) is an important element of search performance.
We should look to make sure that our page URLs contain our target keywords and/or variants.
URLs have no real length limitation, but it’s sensible to keep them relatively short and punchy as they are often displayed in the SERPs. Long page URLs are truncated, so try to limit them to 100 characters at most. It doesn’t matter if you overstep this limit, but be mindful that any characters after it will not be seen.
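The same slug discipline used for image names applies to URLs. As an illustrative sketch (`url_slug` is a made-up helper; most CMSs do something similar for you):

```python
import re

def url_slug(title, limit=100):
    """Build a short, hyphenated path segment from a title,
    trimming at a word boundary if it would exceed `limit` characters."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    if len(slug) > limit:
        # Cut at the limit, then drop the last partial word.
        slug = slug[:limit].rsplit("-", 1)[0]
    return slug

print(url_slug("Buy Quality Handmade Wedding Signs"))  # buy-quality-handmade-wedding-signs
```

Trimming at a word boundary keeps the visible URL readable rather than ending mid-word.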
Some CMSs, like WordPress, create our URLs on the fly, taking them from the page title or heading of the content piece.
The WordPress CMS presents us with an opportunity to edit our URLs.
Where appropriate, we should apply similar thinking to that of our page title selection looking for variance opportunities where they present themselves.
Google News — There is discussion within the tech team about content that is newsworthy, or aimed at a specific news element, being tagged or identified as a news piece.
Where this occurs, it’s important to ensure that a 6 figure number is appended to the end of the URL. Failure to do so will mean that the piece is unlikely to be included within Google News.
Tagging and Topics
WordPress gives us the option to tag our content and file it under various headings. Try to categorise your content into four or five relevant camps if possible. Most CMSs allow for this, so use it if it exists.
Through doing so, we create content diversity for a variety of singular topics and increase the likelihood of ranking for a variety of related terms.
After You’ve Completed Your Content
So, you’ve created your content, spell-checked the hell out of it and run it by a colleague for a sanity check. Woot! Well done. What next?
In our pre-writing phase above, we looked at the various social media opportunities and took notes on who was talking in the space.
It’s a good idea to have a post publish process that makes the most of what you learnt in your research phase and try to build up a head of steam.
Fledgling content will not fly alone, it needs the help of friends and colleagues to gain wings and help it grow.
The following list of bullet points will help your content get eyeballs and create additional impetus.
It only really applies to blog posts, special reports, white papers, guides and news items, so bear this in mind before you start. Don’t use it for boring old product pages, unless they are super awesome or innovative.
- Follow folks and brands that were tweeting or visible in your key phrase space and seek to engage with them over time.
- Use a social scoring metric as an aid in identifying who might be worth the largest investment of your time.
- Publish your post to Twitter and tag it with hashtags relevant to the topic. Determine the ideal time to do so, aiming for your largest GEO audience.
- Utilise your internal office/company network and let them know that you’ve posted and ask them to retweet, favourite and share your post.
- If appropriate, post your link to LinkedIn and any LinkedIn groups that you participate in
- Post the link to your Facebook page and consider a small post boost targeted at Facebook user cohorts interested in your space
- Evaluate any other spaces that could be good conduits for your piece. Reddit, for instance.
- Include your latest blog post in any email footers you may use
- Did your post have a cool image? Is it Pinterest or Instagram worthy? Tag and send it to these networks.
- If someone shares or retweets or comments on your stuff, then thank them for doing so. Be social.
- Monitor the performance of your post in Google analytics. Spend a few minutes each day looking at your recent posts or content pieces and see how well they are doing.
- Talk to Rob about tracking new keywords and URL performance
Rinse, refine, and repeat.
So, the recent change in how Google displays ads on its search engine has already thrown up a number of interesting outcomes, with agencies that manage large accounts reporting some standouts.
“An increase in CTR of 16% across SERPs should be pretty concerning to folks in the organic space, and frankly to advertisers as well. I’m not saying these results are instantly stealing 16% of traffic from organic results, but there’s certainly been a migration as a result of this change; how significant or insignificant is yet to be seen.” (Aaron at EliteSem)
That’s quite a big chunk, and it’s echoed by what iCrossing saw too, with big increases in CTR for the new ad slots:
- Positive click-through-rate impact for top positions (+5%) and PLA (+10%), as competition at the top right has been eliminated.
- Negative click-through-rate impact for positions 5–7 (-8%) as they moved from top right to bottom of the page.
- Negative impression impact for positions 8–10 (-69%) and click impact (-50%). However, since this segment accounted for a very small percentage of impressions in the “before” period, their loss doesn’t represent a significant impact.
There’s no doubt a slew of these across the web. Look at any account with a large enough dataset and you’ll likely see similar patterns.
But what does this really mean for organic? It’s pretty obvious what it means for PPC. In the short term, for competitive queries, the new position-four ad slot seems to be doing a sterling job of stealing organic click share. If CTRs are up across ad slots, then it follows that available click share MUST be down for organic, even if we account for the loss of side ads, right?
I was talking with a client yesterday about conversion rates on site.
We had all been a little perplexed as to why conversion rates had dropped off of late, and had tried a variety of things to identify and reverse the trend.
We looked at the usual suspects of onsite changes, page speed, competitor activity, sector innovations etc., and did a degree of head scratching trying to establish what was going on. Most channel traffic was up, organic especially. The view was that maybe rankings had decreased for competitive head terms (nope), or that direct and referral traffic had increased due to PR activity and was impacting conversion rates due to lower buyer intent (a fact, but also nope).
The client noticed that the conversion problem had occurred around the 22nd of February, which funnily enough was around the time that Google rolled out its new land grab. Aha! The smoking gun.
What was really interesting (and surprising) was that the inclusion of this new ad spot appears to have impacted the click-through on high-converting pages for competitive search terms. Effectively, for every competitive position attained, visibility has dropped by an order of at least one position.
Is it really the case that people collectively no longer care about ads in Google as they once did? Has Google created such a neat and compelling ad product that users are now more drawn to the ad than they would be to the organic result? Are the ads even more relevant today? Is all that SERP diversity of images, videos, knowledge graph, news results and the like just a massive pain in the Goolies? Are ads the quicker route for commercial intent!? Maybe!
Of course, I’m surmising from the data witnessed in one account. It may not necessarily be the same for every commercial query, and determining what is and isn’t a commercial query isn’t a walk in the park either. Just because a query doesn’t have ‘buy’ or ‘book’ in the string doesn’t mean that it’s an informational-intent query.
It’s only when you begin to dig into your conversion data locally that you’ll even begin to notice, and even when you have your aha moment you’ll be none the wiser as to how to fix it.
In short, the only fix that matters is to gain increased visibility for your commercial-intent queries, and the only way you are going to do that in the world of “Google Four Ad Slots” is to buy ads.
Sure, you can up your activity in your other channels and up your efforts targeting queries of lesser commercial intent and create more wow moments in your PR and general marketing efforts but make no mistake. Those organic opportunities are continually diminishing as Google seek to eat more of that organic pie.
For those interested, it might also be worth taking a little look at CTR generally and at a few of the tactics Google has employed over the years.
Looking at CTR historically
If you look at click-throughs by position over the years you’ll see an interesting picture. Many of us will have read the various click-through studies detailing how position #1 gets x%, position #2 y%, position #3 z%, tailing off the further you go down the SERP.
Here’s an old graph from Internet Marketing Ninjas showing the Optify data.
This is old of course and came from the days when there was a max of two ads above the fold at the top.
However, it does show the general picture, and variations over the years show similar curves. It’s pretty safe to say that with the advances in PPC ads since (smart links, stars, better ad copy, blah blah), those numbers and their respective share have likely diminished as ad clicks and knowledge-graph-type distractions have gained click share.
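To make that concrete, here’s a back-of-envelope sketch of what slipping a single position costs in clicks. The CTR curve below is illustrative placeholder data in the spirit of those studies, not the actual Optify figures.

```python
# Illustrative organic CTR-by-position curve, in the spirit of the
# click-through studies mentioned above. These percentages are
# placeholders, NOT the actual Optify figures.
ctr = {1: 0.31, 2: 0.14, 3: 0.10, 4: 0.07, 5: 0.05, 6: 0.04, 7: 0.03}

def click_loss(position: int, monthly_searches: int) -> int:
    """Clicks lost per month if a result slips one position down the SERP."""
    old = ctr[position]
    new = ctr.get(position + 1, 0.0)
    return round(monthly_searches * (old - new))

# A #1 ranking that now behaves like a #2 on a 10k-searches/month query:
print(click_loss(1, 10_000))  # 1700 clicks a month gone
```

The steep drop between positions one and two is why “visibility has dropped by an order of at least one position” hurts so much more at the top of the page.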
Eye tracking and clicks
Heatmaps show us that generally, much of our attention is taken by the space above the fold.
A page loads, we scan it, see what we need and click it, and many of the studies produced have helped inform ad placement, nav placement, button placement and the like.
This eye tracking study below shows the Google of old (2005) and the Google of 2015. The golden triangle versus the um…red guy with no arms and legs.
What’s really interesting is the whole background colour change in the ad slot in the image to the right. Note the background is some kind of distinctive yellowish colour.
Do a search today, and that colour distinction is no longer there. The only differentiator is the word “Ad” and that’s diluted by other distractions like ad links and gold stars.
Many of the features that Google used to show for its organic results, user rating stars for example are now seen in its ads, but increasingly, not in its organic results.
It would seem that, increasingly, attention is taken away from the organic portion at every opportunity. One could be forgiven for concluding that Google sought to confuse the consumer by continually shifting such features around and blurring the lines between organic and paid. After all, we aren’t stupid, are we? We don’t need to see the ads with a clearly defined different background colour, do we?
Some might say that it would appear that if it’s commercial and you monetise it, then the Google of today wants you to pay for those clicks.
For businesses seeking visibility for commercial queries, Google is effectively a pay-for-inclusion engine today. If you want visibility, then they want you to pay for it.
It’s a risk-laden strategy. AltaVista did the same in 1998 and killed itself.
Users didn’t want ads shoved in their faces and users left in droves, enticed by the thing that was all Googley.
Google aren’t stupid and have learnt from the mistakes of their predecessors. They do lots of testing and use feature creep to change things. Revolutionaries they are not.
I’ll leave you to draw your own conclusions around the bait and switch tactics and overlaps of paid serps versus organics. There’s no reason why they’d seduce users with rich snippets, only to snatch them away and leave them hanging around in their paid results, no reason at all.
If you are seeing similar things in your campaigns, decreased conversions whilst organic traffic has increased, and it fits in with these date ranges, do let me know in the comments.
Just to be clear, I didn’t personally identify the reason for reduced conversions. A team member at the client put forward the hypothesis, and the whole four-ad-slot scenario seems to fit. I’d love to say who that is, but client confidentiality and all that stuff… Hat tip Nick!
Hello – I’m excited to announce the release of some really useful SEO products for 2016.
The products are aimed at marketers and business owners and lazy SEOs who’d rather not do the work themselves. Presently, there are four to choose from, but I’ll be developing more as time allows.
It’s a bit of a departure from the usual SEO product suite announcement in that none of these products are produced via automation or some clever backend api integration.
The reports created will of course use a suite of the best tools in the business. For starters most will use a combination of Kerboo, MajesticSEO, SEMRush, Google, Bing and Moz – we also use a few other top-secret ones too but if we told you what they were then we might have to tickle you to death.
The nature of the reports produced means that you’ll have to wait at least a few days for whatever you buy. Sometimes you’ll have to wait longer, dependent upon what you’ve ordered and the number of others waiting for the same. Look for the status update on the product pages for the latest turnaround times.
As I said, there are four new products.
The products are all different and tailored to the specific client that requires them. There’s no template, no fluff, no sausage machine in action.
To go too much into the finer details of each would be to spoil the surprise and delight of your purchase.
What I can say is that I love what I do and have been doing it for quite some time now (20 years OMG). I’ll provide you with actionable insights that will make a difference to your understanding of your business and niche. I’ll give you ideas and inspiration and will show you how to fix any general silliness you’ve managed to find yourself doing. I won’t tell you about anything you know already and I won’t kill you with charts and lists and intangibles.
You’ll find phraseology like – “This part of your site is sub optimal and my recommendation is that you change this line of code to this line of code” or “An analysis of your market shows that you have some major content opportunities at hand, my advice is that you do X Y and Z as a priority…”
I hope not to have to write stuff like “The majority of your backlinks appear to come from a suburb of Afghanistan, whereas you aspire to rank in the bustling community of NYC…” I’d prefer not to work with numpties if I can avoid it, so if that’s you then erm…sorry.
There are rare occasions when it’s clear that there’s very little to say or add.
If you are one of these fortunate people then accept my apologies in advance as I decline your request. Why not go spend your cash on nice holiday or give it to charity instead?
That’s it! Happy 2016 to you!
PS: For the referral-minded among you, there’s an affiliate program full of half-decent commission for completed sales.
PPS: For a limited time, enter BoxMeUp at the checkout for an additional 20% off.
It’s a fair question, and one that will get different responses from different companies.
Ultimately, your SEO will be looking to identify and unblock any bottlenecks and help return your domain’s search engine visibility for the queries that are important to your business.
In this post, we are going to look at some of the typical aspects that a reputable SEO company should be looking at if you experience a sudden stop or gradual drop off in traffic to your website from search engines.
Where has my search engine traffic disappeared to?
Businesses that turn to SEO companies for help will often do so on the back of a crisis.
They may have seen a gradual decline in search engine traffic, or a sudden drop that has a huge impact on the sales or enquiries that matter to their bottom line. Such events are of course worrying, and require investigation to see what the problem is and how best to identify and present solutions.
You’ll need to give your SEO as much information as you can. They’ll need access to your analytics package to view past traffic performance, and to your Google and Bing search consoles to see if there are additional direct clues.
You should also be candid with them and tell them about anything you know has been done to the site, to help them identify issues for you. If you bought a tranche of links from a link seller or signed up for a dubious website promotion strategy, then tell them.
Lack of transparency will not help you and will cost you more money in the long run and the SEO will likely find out anyway through their investigations.
Using the search console to help identify problems
The search console may tell your SEO professional if there’s a specific issue relating to the domain due to a manual penalty or an onsite performance issue.
The search console contains specific information about your domain, generated through the search crawl and the responses received. It will also show search traffic numbers and limited information around keywords, volumes, positions and click-through rates.
Manual Penalties – Maybe you have a manual search penalty
Search engines will usually (but not always) notify webmasters if a manual penalty has been applied. A manual penalty is applied for egregious abuse of search engine guidelines: link buying, for example, hidden text, or other spammy activities that have been identified as unacceptable.
Where you have a manually applied penalty, you’ll need to file a reinclusion request from within the console. You’ll need to outline what you have done to correct any transgressions and politely beg for mercy, promising that you’ll never do what you’ve been penalised for again.
Generally, manual penalties are rare and there are often other reasons why a site’s traffic has been impacted. Crawl errors are often responsible.
Let’s look at those.
Identification of Crawl Errors – Is your site generating debilitating site errors?
When a search engine visits a website, it effectively ‘crawls’ the pages using its search engine spider or robot. These spiders or bots as they are known are simple fetch and grab programs that read the content of the pages and then store and classify them in their databases. The codes returned by your web server are recorded and the outputs are then shown to you for analysis.
The crawl aspect of the search console will provide insights into how the search engine is evaluating the domain and will provide clues to any issues. Crawl errors are very useful as they help us see what may be going wrong onsite and contributing to poor performance.
Poor Robots.Txt File
An example of this might be a poorly formatted robots.txt file. The robots.txt file is a means of telling the search engines what should and what should not be indexed. It resides on your root domain and is accessed periodically by the search bots and spiders. Mistakes in these can often block an entire domain from being indexed, leading to very poor performance in search. A review of this file will help identify a problem.
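As an illustration of how such a mistake can be caught, Python’s standard `urllib.robotparser` can sanity-check a robots.txt body before it goes live. The file contents and URLs below are hypothetical examples.

```python
# Sanity-check a robots.txt body with Python's standard library before
# it goes live. The file contents and URLs here are hypothetical.
from urllib.robotparser import RobotFileParser

def is_allowed(robots_txt: str, agent: str, url: str) -> bool:
    """Parse a robots.txt body and check whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

# A common mistake: a stray "Disallow: /" blocks the entire domain.
broken = "User-agent: *\nDisallow: /\n"
print(is_allowed(broken, "Googlebot", "https://example.com/products/"))   # False

# The intended file: block only the admin area, leave everything else open.
intended = "User-agent: *\nDisallow: /admin/\n"
print(is_allowed(intended, "Googlebot", "https://example.com/products/")) # True
```

Running a handful of important URLs through a check like this is a quick way to confirm that a new robots.txt won’t accidentally deindex the whole site.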
HTTP Error Status Codes
The error code section of the search console is a great means of identifying on-site performance errors.
HTTP status codes are generated by web servers; they are numbered and have different meanings. Depending on the code, an SEO can advise and explain what each means and how it might be impacting your traffic. The worst to see would be 401 or 403, as these effectively say to the search bots “go away, you’re forbidden or not authorised”. If the bots can’t read your content, then your content cannot be indexed or ranked in search.
More common are the so-called 404 errors. These occur when a page that is requested cannot be found. The web server will often (subject to config) return a generic page that says the page was not found. The better ones are useful to users, giving supplemental help that enables people to find alternatives.
Server error codes are a useful means of gaining insight into poor scripting or server performance generally so should always be considered as an early part of the investigation process.
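The groupings discussed above can be sketched as a rough triage helper. This is a hypothetical illustration of how a crawler-eye view of status codes might be categorised, not any real crawler’s logic.

```python
# A rough triage of HTTP status codes as a crawler might see them.
# The groupings mirror the discussion above; this is a hypothetical
# helper, not any real crawler's logic.
def triage_status(code: int) -> str:
    if code in (401, 403):
        return "blocked"       # bot forbidden/not authorised: content can't be indexed
    if code == 404:
        return "not found"     # missing page: serve a helpful custom 404
    if 500 <= code < 600:
        return "server error"  # scripting or hosting problem: investigate urgently
    if 300 <= code < 400:
        return "redirect"      # fine if intentional; long chains waste crawl budget
    if 200 <= code < 300:
        return "ok"
    return "other"

for code in (200, 301, 403, 404, 500):
    print(code, triage_status(code))
```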
DNS errors are often transient and can occur where the host server has issues relating to configuration, routing or hardware. DNS errors will restrict access for anyone looking to read your content, and that includes search bots. Persistent DNS errors will prevent your site being seen in search, so it’s important to get on top of the issue should it occur.
Server Connectivity and Performance
Sometimes, your web server will struggle to perform and might have connection issues that impact upon page speed and content delivery. Where this occurs, it’s important that you address the causes and return the site to peak performance. An SEO should look at performance factors as part of their investigation as ultimately, search engines would prefer any pages that they return to their users to be fast loading and functional. A poorly configured web server or script will drain server resources and switch users off to your site. If this happens with too much regularity, then search engines will lose confidence and trust in your site as a resource and your rankings may be impacted.
Algorithmic filters due to Panda or Penguin
Other reasons why your site’s traffic may have been impacted relate to so-called algorithmic filters. There are many types of algorithm, and they are rolled out periodically or generated on the fly. The two we’ll look at here are called Panda and Penguin.
The search console isn’t that useful with regard to these, as the date ranges we like to use to review such things are limited to 90 days. To take a good look we need historical traffic data over a longer timeframe, as this enables us to look at traffic patterns and discount things like seasonality or general growth over time.
Using Your Analytics Package to Identify Algorithmic Filters
The Panda algorithm is aimed at low-quality or thin content and seeks to demote pages that are considered to trigger these signals. Panda has had a number of iterations over the years, and SEOs have identified the dates, which can then be referenced against website traffic patterns. The general theory is that if drops in your traffic coincide with the published release dates, then it’s pretty easy to conclude what the issue is by looking at your traffic within your analytics package.
It may of course also be very obvious anyway and a good SEO should be frank enough with you to say that actually, your site is appalling and you need to reevaluate your content generation model…
Sites that were built in 1999 may not necessarily meet the expectations of 2015 perhaps. A good SEO company will at least discuss this with you and help you appreciate the needs of today’s web users. If you are answering a web query in 2015, then you need to be going above and beyond.
The Penguin algorithm relates to your link graph. Some websites have unnatural inbound link patterns or have too many links that are considered to be from low-quality sites. Where this is the case, a good SEO will help you identify what these are and will be able to help with a plan to disavow any low-quality links.
Again, your analytics package will help the SEO align your traffic with known Penguin release and refresh dates so that they can confirm whether or not your traffic fall-off is Penguin-related.
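The date-alignment idea described for Panda and Penguin can be sketched crudely. The update names, dates and traffic series below are invented for illustration only; real analysis needs longer date ranges and seasonality adjustments.

```python
# Crude date alignment of traffic falls against published update dates.
# The update names, dates and traffic figures below are invented for
# illustration only.
from datetime import date

updates = {"Panda 4.x": date(2015, 7, 17), "Penguin 3.0": date(2014, 10, 17)}
traffic = {date(2015, 7, 15): 900, date(2015, 7, 16): 880,
           date(2015, 7, 18): 430, date(2015, 7, 19): 410}

def drops_near(updates, traffic, threshold=0.3, window_days=3):
    """Flag updates that sit within `window_days` of a >threshold daily fall."""
    days = sorted(traffic)
    flagged = []
    for name, when in updates.items():
        for a, b in zip(days, days[1:]):
            fall = 1 - traffic[b] / traffic[a]
            if fall > threshold and abs((b - when).days) <= window_days:
                flagged.append(name)
                break
    return flagged

print(drops_near(updates, traffic))  # ['Panda 4.x']
```

A coincidence of dates is a clue, not proof; it simply tells you which filter to investigate first.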
You may have recently undergone a site redesign, or your developer may have used a new technology or URL structure that impacted your site in a negative way. Poor metadata, duplicate page titles, non-existent page titles and poor keyword selection are just a handful of issues that may be present on site. A good SEO company will help identify what these are and show you the way forward.
Wrapping things up
As we can see, there are many things that can contribute to the poor performance of websites in search engines: manual penalties, algorithmic filters, poor content, poor site structure and architecture, poor hardware. Each of these can pull your site down for the queries you aspire to. A thorough examination of these issues will help you take the steps that will eventually return your site to where you’d like it to be. It’s a good idea to have an SEO site audit before issues occur, as this can save many thousands of pounds in fixing subsequent issues.
PS: For the marketing DIY enthusiasts, we have a range of products that can help you drive your business forward. Maybe you need a manual link report to identify potential problems with your link profile, or an SEO site review to unify your thinking and know you’re on the right track, or a content marketing module to give your creativity a kickstart. And finally, there’s a full audit and strategy report to give you that full-on perspective.
One of the great things about working for yourself is that subject to resource you can virtually do what you like.
I spend far too many hours messing around with what I’ve learnt over the years and applying aspects that will offer limited return. I guess I do it because it’s fun and it sates a curiosity and if I’m really lucky it sometimes causes me to stumble on something of real value.
We all read mountains of stuff about conversions and attribution and the challenges faced in matching up the various channels to their respective ROI pots. People will naturally gravitate to positions that effectively back up the department they’re responsible for, so it’s no surprise to read all manner of conflicting viewpoints making the case for the relative efficacy of channel A or tactic B.
The best way to understand things is of course to pull them all apart and put them back together again, often in the wrong places just to see what happens. Record the results and draw a few conclusions. Rinse repeat until you’re bored or until you’re happy with what you have.
Most of today’s analytics suites are built around cookies and a bit of embedded script on a page somewhere. For those who don’t know (and I suspect a few of you reading this will, so apols to you guys), when we view a web page on a device the web server has access to a number of environment variables. Not every web page utilises all of these, as they’re too much hassle (for most) to code into their projects, and for most, analytics packages like GA or Omniture are as good if not better for what they need.
Attribution modelling is pretty much covered in most analytics packages, but as referenced above it’s all about the setup of the funnel and the interpretation of results: what message you need, and who you need to tailor it to. SEO is an amazing channel, and it’s no surprise that Google, for example, systematically seek to disassemble the ease of measurement whilst introducing new features at the same time. It’s pretty easy to lose people in technical theory, especially if we don’t all speak with the same understandings. HSTS super cookies, super cookies, cross-domain tracking and cross-device tracking cookies are just a few examples that most folk will struggle with conceptually.
Anyways, I’ve gone off track a wee bit, so apologies…
So, what have I been playing with and how is it of use potentially?
If we have a big domain with lots of users who come to our site and buy or use and then go away and come back again then we can pretty much begin to measure what they are doing, frequency, visitor length, page views and all the standard stuff that analytics packages will tell us.
Thousands of domains don’t have user accounts, and for ecommerce sites especially this is a huge lost opportunity. Checkout systems are rightly cautious in enabling folk to purchase without the need for an account (it’s easier to convert folk from the purchase email anyway; incentives etc.).
If we have users who are account holders and who return frequently, then we can begin to model behaviour and do a whole lot more useful stuff with tracking.
If we record (locally) specific details about the devices used, along with environment variables such as screen size, colour depth, resolution, IP addresses used, referers, mouse behaviours, GEO data and all those things that are unique to them, then can we not begin to model the behaviours of those who aren’t logged in but display similar behaviours, and begin to assign them to user-type pots perhaps? Yes we can.
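As a toy sketch of those “pots”, a handful of environment variables can be hashed into a coarse device fingerprint and visits grouped by it. The fields and values here are illustrative assumptions, not a tested identity model, and real-world fingerprinting raises consent and privacy questions.

```python
# Toy sketch of the "pots" idea: hash a few environment variables into a
# coarse device fingerprint and group visits by it. Fields and values are
# illustrative assumptions, not a tested identity model.
import hashlib
from collections import defaultdict

FIELDS = ("user_agent", "screen", "color_depth", "timezone", "language")

def fingerprint(visit: dict) -> str:
    """Reduce a visit's environment variables to a short stable hash."""
    raw = "|".join(str(visit.get(k, "")) for k in FIELDS)
    return hashlib.sha256(raw.encode()).hexdigest()[:12]

def pot_visits(visits):
    """Group visits into pots keyed by device fingerprint."""
    pots = defaultdict(list)
    for v in visits:
        pots[fingerprint(v)].append(v)
    return dict(pots)

visits = [
    {"user_agent": "iPhone", "screen": "375x667", "color_depth": 24,
     "timezone": "GMT", "language": "en-GB", "ip": "81.2.0.1"},
    {"user_agent": "iPhone", "screen": "375x667", "color_depth": 24,
     "timezone": "GMT", "language": "en-GB", "ip": "10.0.0.7"},  # same device, new IP
    {"user_agent": "Mac", "screen": "2560x1600", "color_depth": 30,
     "timezone": "GMT", "language": "en-GB", "ip": "10.0.0.7"},
]
pots = pot_visits(visits)
print(len(pots))  # the iPhone (seen twice) and the Mac land in separate pots
```

Note that the IP address is deliberately left out of the fingerprint fields, which is exactly why the work-wifi visit and the train visit fall into the same pot.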
We might, for example, know that user A (let’s call him John) originally turned up from Google and landed on a page that sold Triumph Rocket Touring backrests.
A very specific page with words relevant to backrest, Triumph, Rocket and Touring. All of the meta and page data, urls etc. were pretty tight in terms of KW accuracy, so despite Google hogging all of the query data for themselves we could pretty much determine that John searched Google for a Triumph Touring backrest, or at least a subtle variation.
We can assume that John either went straight to Google himself or that someone suggested he search on Google. Whatever way it’s diced, we know that he came from Google and that he used his iPhone to do so.
He didn’t purchase though, and we didn’t know who he was. He was at work on the office wifi, and he wasn’t ready to commit to the purchase as he was in research mode. He looked again on the way home, this time on the train from an EDGE or 3G connection as he hurtled through the burbs.
Later that day when he got home he opened his iPad and searched Google again, or maybe he used the link he emailed from his phone earlier and went straight to the page. His wife meanwhile was sat on her Mac, or PC even. John talked to her about how his back hurt and how he wanted a backrest for his bike. John’s wife’s a bit of a bossy boots, so she asked him to ping her the link via iMessage. The page looked amazing on her retina-screen super-expensive Mac, and after much interrogation she agreed that it was a good purchase decision. Great, says John, and proceeds to make the purchase on the Mac.
The vendor, some days later, is looking at the purchases and tracking who came from where and what. He sees this isolated purchase that came from a Mac: one page view of the product and a purchase within seconds, no dilly-dallying at all. He sees that the credit card info was from Mrs P Whatsherface (the details stored in John’s wife’s digital wallet).
On the face of things, the vendor has no real way of determining who to attribute the sale to. His ill-configured analytics package attributes it to the direct visitor pot, and the vendor concludes that it was either from WOM or from that amazeballs local motorcycle magazine campaign he paid extortionate money for just days prior. After all, he sees quite a few of these, so they must be from his offline marketing efforts.
In any case, he’s kind of happy, he’s made a sale. He’s even going to renew his motorcycle magazine advert as maybe it’s working well after all. 50 sales of this type already this month…
Meanwhile, the day after, John is on the train to work. He’s on his iPhone again, fiddling around, going through emails, and reads the follow-up email about his backrest purchase. He clicks the link excitedly and logs in to his account on the motorcycle vendor’s website. He has a little browse and he’s off again.
So, what can we deduce from this little story? What lessons are there for the vendor?
At John’s first visit from his iPhone, the vendor’s server or analytics package should have segmented John’s visit into a pot or database and recorded the various aspects: IP address, device type, referer, length of stay.
It would have dropped a little cookie too.
When John then returned whilst on the train, it could have begun to match some of this data; it could have seen the cookie and said aha!
It might have noticed the different IP addresses and said aha again!
It might even have noted the different ISPs and GEO-locational stuff and said aha again, and then it could have seen those Mac purchase variables and concluded something different entirely.
It could have learnt that there was a whole pre-purchase journey that did indeed start with Google, and that when it ran a similar back-reference model across a multitude of similar purchases there were similar behaviours.
He’d have saved a small fortune on that crappy motorcycle mag ad also.
So, this is what I’m doing at the moment. Playing with these kinds of factors and seeking to create pots or tables that record specific user and device behaviour and record the various aspects of what they get up to. I’m in danger of making this a TL;DR post so I’ll shut up for now, but if you’re interested in some of the specifics of how it might work or indeed, if you have any ideas yourself then I’m all ears.
Facebook has enormous power in this regard, but that’s a post for another day perhaps.
Moral of the story? Create accounts, convert your visitors and track everything and analyse retrospectively too.
Is that a Good Link or a Bad Link?
I played with a new tool this morning. It was some kind of link evaluation tool.
It purported to tell you whether a link from a URL was good or bad or somewhere in the middle.
Cool, I thought.
So I gave it a go and popped in six URLs. All came up with wildly wacky results, all were deemed to be spam, and all suggested I should do something funny with them and run away screaming.
The Magic Box of Gluttony in the Land of Glut
A world without search
Once upon a time there was a magical world called Glut – Everything happened in the world of Glut, the people within it did all manner of things. They built pink castles from weather resistant marshmallows and cool lakes made out of lemonade and beer where hamburger flavoured fish swam. Some folks knew how to make really fast cars that ran on magic beans made by their friends in the forest of emeralds.
It all seemed ideal in the world of Glut but progress was slow. Few knew how to fish for the hamburger flavoured fish and the magic beans that grew in the forest of Emeralds were known but to the people of Fark. The bottom line was that news traveled slowly in Glut, information was often controlled by the powerful and where it wasn’t, it was difficult for merchants to gain wide reach or appeal for their ideas and products.
Building a search engine to conquer all
Up on a hill next to a mountain in a place called Gooleg there lived two special wizards.
They were clever wizards backed by the powers of InvestorLand who knew that everywhere they went in the world, people had ideas that they wanted to share so they went to a little known place called Altavistaland and took on the mighty wizard Inktomi where they learnt the secrets of Retrivicus Informanicus. They went to Microland and bought the ingredients required and built a big magic box.