Google Subscribed Links
There’s a little bit of buzz about today over Google’s Subscribed Links feature, so I thought it might be handy to share my robwatts view on it: a view informed by three perspectives, those of a Google user, a site owner and an SEO.
I’ll stick to three classifications: the good, the not so good, and the tell-it-like-I-see-it reality.
From a user perspective it could be a handy thing: stumble across a site that displays the ‘subscribe to links’ badge, and I know that if I subsequently search on a related keyword the site owner has added to the Google system, then provided I’m logged in to my Google account, I’ll get a trusted result in the SERP to a site I’ve used before, kindly highlighted and formatted to stand out from the page. It saves me having to bookmark lots of sites and refer back to things of old. I’ll just find a good site, see its Google badge and hit it. Done, subscribed.
From the perspective of a site owner, a thing like this gives me an excellent opportunity to promote whatever it is I do and inject highly targeted keyword links to specific URLs that, for my subscribed user base, are guaranteed to appear in a good position for whatever keywords I choose to target in my TSV text or XML feed. I can get to position 4 for any related keyword I want, provided of course I can win the numbers game (subscribers) and have the content to go with it. I have a potentially huge incentive to get as many people as I possibly can to sign up for this. I can lower both my PPC and SEO costs and save a bucketload of cash if I manage to crack it and hit it spot on.
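To make that concrete, a feed along these lines might look something like the tab-separated sketch below. Note that the column layout, keywords and example.com URLs here are my own illustration of the idea, not Google’s official feed spec:

```
# Hypothetical layout: keywords <TAB> title <TAB> snippet <TAB> URL
widget reviews	Acme Widget Reviews	Independent reviews of every widget we stock	http://www.example.com/reviews
buy widgets	Acme Widget Store	Next-day widget delivery across the UK	http://www.example.com/store
```

Each line maps a search phrase the site owner wants to own to the exact URL and blurb that should show up for their subscribers.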
From the perspective of an SEO/SEM, a thing like this is a useful opportunity to demonstrate to clients that I have a broad understanding of the range of complexities and angles required to succeed in the Internet of 2008. If my client happens to have a huge database with some kick-ass products too, then the whole process of getting their users to sign up to their subscribed links will of course inform my overall SEO strategy and be a part of any subsequent site rebuild. If you assume that Google is always on the lookout for signals that are harder to manipulate, then this really could be one very big signal indeed.
The not so good
Of course, to every good there is a not so good. What I don’t like is this: I have to be logged in to my Google account to experience the benefits. Google will be in a fabulous position to track my searches, the types of sites I visit and the types of sites I choose to buy from or subscribe to. If I don’t have an account and happen to stumble upon one of these sites pushing this thing, then if I want to participate I have to sign up for an account. I can’t do so anonymously; I have to join the borg collective and sign up to being tracked.
As a site owner I am pushed into a little bit of a dilemma. Easy, apparently cost-free traffic might appear to be a good thing, yet there is a cost attached. I have to install Google code on my site, which then gives Google insights into my business and the visitors who land on my pages. I also provide Google with potential new account holders, with no guarantees about how it’ll use that relationship in the future. If I want those coveted positions then it makes sense that I take steps to build my user base and get those casual searchers back and clicking on those subscription buttons. Yet I have no guarantee that, having built up a following via Google, Google won’t introduce options that’ll require some kind of payment.
A world where SERP results are informed solely on the basis of logged-in user actions, where people subscribe to every site they visit and use fully personalised SERPs for practically every vertical, would obviously impact upon the traditional SEO model. Link building – forget it. Ranking reports – forget them. Anchor text – scrap that. HTML formatting – forget it.
A site owner could in theory completely and utterly circumvent the need for traditional SEO. A system that allows site owners themselves to decide what their XML feeds are relevant for is potentially huge. The Google back end might even add features and offer tools that suggest relevant keywords to target for each feed or text file; it could use such data to inform its algorithm and push out sites that had no such profile, especially in the more competitive niches, which is exactly where things like this will be targeted most.
Of course, the reality is that some of the things above will come to pass and some will not. The truth of the matter is that take-up of such a thing will probably be very low. Most people will swing merrily along with their existing set of habits and suspicions and will keep on doing things the way they always have.
We need to think about why the search engines are pushing such things and look for some of the reasons behind it. The number one reason is of course money. Sure, they probably do want to ‘improve their users’ experience’ too, yet at the same time they also want a little more control and say-so over who gets to put what in where and how. By getting site owners into a direct relationship and delivering traffic, they help set up a platform for subsequent monetisation and consolidation of the search marketing pie, taking a bigger cut of spend that currently goes to SEO. By demonstrating that SEO is not too effective, and by converting as many people as possible to personalised models, they undermine some of the attraction that traditional SEO offers. What good is an SEO ranking report if everybody’s results differ based upon their user profile? How can an SEO firm demonstrate the impact of their actions if ultimately there is no sure way of measuring such actions? Why employ an SEO to advise you on markup and keyword density and inbound links and a whole lot of other paraphernalia, when you can simply upload a feed and choose your own keywords to rank for, possibly assisted by a Google back-end web page optimiser tool advising this, that or the other?
From a user perspective: I’ll certainly use this, and when logged in on certain PCs I might even subscribe to a site or two, especially if they are related to me or a client or are in some way exceptionally useful.
From a site owner POV, heck, why not, what’s to lose? I already use Google Analytics; I don’t particularly care too much what they do or don’t know about my visitors, it’s really not a concern. It could be a positive, it could be a negative. If I have 20,000 visitors per day and no one subscribes, does it mean that I have a crappy resource, just because a site in a similar vertical with similar visitor numbers gets people signing up to this? Conversely, will my site that has this code and gets the subscribers win out in the SERPs, just because the site of a competitor who kicks my arse in other ways doesn’t?
As an SEO – being the adaptive bunch that we are – I’ll probably see it as a selective opportunity tool. If a site has good content, then I’ll advise it as part of the overall mix. I’ll apprise a client of the pros and the cons. I’ll tell them how it could potentially be used. I’ll tell them all about quality scores and quality signals, of Google’s aversion to SEO tactics and its general dislike and disdain for SEO’s ability to detract from their PPC revenues. I’ll tell them and show them, like I always have, the best methods for using it to their advantage. That’s what I’m paid to do, after all.
Bring it on Googiebaby 🙂
I was doing a little search this evening and found this amongst the results on the 1st page.
I thought it might be something related to me being signed in to a Google account so I signed out and tried again. Same result. So I opened up IE7 and same result.
I tried the same query on Google.com but it didn’t replicate.
Who is this mystery user=016597473608235241540, I wondered? No one, it seems, and there are a few more of them too. A little site:google.com/coop/preview query reveals just 152 results, which in the grand scheme of things is tiny. The question, however, is why? Why spider custom results and include them in a SERP, however small the sample?
The result page itself has the distance in kilometres but no link to any defined map or directions/distance page. Just two links to Wikipedia – crap really, not a good user experience.
I’m sure there’s a better result out there than this one.
I attended SES London on Thursday, my first since way back in 2002, or was that 2003? Christ, I can’t remember!
Customers Prospects and
The first session I went to was a presentation by Brian Eisenberg of Future Now entitled Redefining the Customer. Loosely speaking, Brian hit on important aspects to consider when designing a site: usability issues, seeing site visitors as citizens rather than ‘users’, and how small changes can really have huge impacts. One funny thing during this speech was the woman sat next to me who started to snore, whilst waking intermittently to agree and nod with what he was saying! Bizarre person. Anyhow, on the whole a good session, if a little too ‘markety’ for my liking.
The second session was entitled Dynamic Websites: Beyond the Basics and for me, whilst it didn’t teach me much of anything I didn’t already know, was hugely entertaining, due in large part to the contribution of Mikkel Demib Svendsen, who was an absolute hoot, delivering his part of the session in an informative but humorous way.
Then there was Ralph Tegtmeier, aka Fantomaster, who in fairness had a hard act to follow but made the key points about controlling PageRank and ensuring that you adopt an optimal approach. I wasn’t that impressed by Ralph’s contribution, as for me at least he could have got a whole lot more specific and focused on one particular aspect, as opposed to just sticking up lots of theoretical boxes with arrows pointing to them and not saying very much else on the wheres and whys of it all. I got the feeling it was more a case of ‘if you want to know more about how to do this then come and talk to me and maybe we can do business’, but hey.
The third guy in the session, Kristjan Mar Hauksson of Nordic eMarketing, had run out of time and zipped through some top-level stuff around how optimizing your site and its structure can help its bottom line and drive more sales. Besides lots of ‘no shit, Sherlock’ moments, overall it was a to-the-point ‘here’s what you should be doing, and come and see me to learn how we can help you do the same’ kind of approach.
After lunch I went to Beyond Linkbait: Getting Authoritative Online Mentions. The speakers were Alan Webb, CEO of Abakus Internet Marketing; Mikkel deMib Svendsen, Creative Director of deMib.com; and Brian Turner, Director of Britecorp Ltd. Alan Webb, or Webby as he has been known over the years, gave an excellent rundown of tactics and methods that you should use when building your links and getting those all-important social votes. I actually learnt something from Alan too, which is very cool and will be used to good effect over the coming weeks and months. To share: it’s the linkdomain:site.com “keyword keyword” -site:site.com method of using Yahoo to find out who is linking to you thematically. For those who know what I’m on about, I’m sure you’ll appreciate the practicalities of this.
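For anyone unfamiliar with those operators, here’s a made-up instance of the query (example.com and the phrase are mine, purely for illustration):

```
linkdomain:example.com "blue widgets" -site:example.com
```

Run through Yahoo search, this asks for pages that link to example.com, contain the phrase “blue widgets”, and are not hosted on example.com itself. In other words, it surfaces your thematically relevant external linkers, which is exactly the list you want when prospecting for more of the same.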
Mikkel deMib Svendsen was again on good form using a mix of humour and outrage to deliver a message that covered some of the more controversial link acquisition methods. Whilst a little blackhat, the important message he was getting out there was that as a site owner or site admin at least, you really do need to be aware of these methods and more importantly the means for closing such loops.
Brian Turner delivered a social link node presentation, in which he gave an illustration of how websites are social nodes, how some are more active than others, and how, if you use them in the right way and create the right conditions in the right regions of your vertical, you can harness the power that comes from these and use it to boost your core product or service. By and large I heard what he was saying, but felt that the whole idea of buying communities and gaining mindshare publicly is very hard to achieve and needs careful nurturing and TLC. Why? Well, the idea of building a strong niche and using communities to help achieve that is all well and good, yet there is a fine line between getting a community to participate and drive love towards your business, and alienating everyone to the point that they all up sticks and piss off out of it. This graph from Alexa showing what happened when SEO Chat was acquired sort of illustrates what can happen to a community if that community suddenly feels that they are contributing to the bottom line of some would-be megacorp.
So yeah, I think you can build it and grow it, but it’s difficult to just buy it and expect it to carry on as before.
The next session I attended was entitled Local Search Marketing Tactics, which to be honest I really don’t know why I attended, as it was very low-level stuff that I could have read about online. Still, handy for those who wanted to know about getting into the local search results and searching locally too.
Web Metrics and Analytics
My final session of the day was the Web Analytics &amp; Measuring Success session, which covered the whole metrics thing from the perspective of people who had developed, commented upon and used them regularly. It was moderated by Mike Grehan, SES London Co-Chair and Founder and CEO of Searchvisible Ltd.
Andrew Goodman focused on Google Analytics and showed a series of graphs and trend data highlighting how web analytics can generally be used to identify key user actions and draw benefits to your site as a result of that knowledge.
It was to be a common theme. Stephen Turner elaborated on the concept of ‘segmenting’, which for the unaware amongst you is the means of being able to compare and contrast metric data and determine with increased clarity whether user action a or b was more effective because of measurement c or d. This can be pretty powerful in reporting how people behave on your domain. You could, for example, look at the behaviours of people from different geographic regions and fine-tune a page or section as a result. I use a tool called IndexTools daily, which gives a similar level of granularity, enabling you to drill down to very specific aspects of user visits, both individually and as a group.
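As a toy sketch of what segmenting boils down to, here is the idea in a few lines of Python. The visit records, field names and regions are all invented for illustration, not pulled from any real analytics package:

```python
# Segmenting, minimally: group visits by some attribute and compare a
# metric (here, conversion rate) across the resulting groups.
from collections import defaultdict

# Invented sample data: each record is one visit.
visits = [
    {"region": "UK", "converted": True},
    {"region": "UK", "converted": False},
    {"region": "UK", "converted": False},
    {"region": "US", "converted": True},
    {"region": "US", "converted": True},
    {"region": "US", "converted": False},
]

def conversion_by_segment(visits, key):
    """Group visits by `key` and return the conversion rate per group."""
    totals = defaultdict(int)
    conversions = defaultdict(int)
    for visit in visits:
        totals[visit[key]] += 1
        if visit["converted"]:
            conversions[visit[key]] += 1
    return {seg: conversions[seg] / totals[seg] for seg in totals}

rates = conversion_by_segment(visits, "region")
# US converts at 2/3, UK at 1/3 on this sample, so you'd look at what
# the UK-facing pages are doing differently and fine-tune accordingly.
```

A real package obviously does far more (time windows, multi-dimensional segments, statistical significance), but the compare-one-metric-across-groups core is the same.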
Demo Zlotin from San Diego gave an interesting account. I particularly liked his shelf analogy, whereby he showed a graphic of a SERP with the top five positions inhabited or influenced by a company with vested interests and strengths within each domain. His segmentation point being that, by using your referrer data and knowing your presence, you can quickly determine how effective or ineffective your recent strategies have been.
Frank Watson aka aussiewebmaster talked about how the knowledge and use of web analytics had helped his company save millions over the years. The message being track track and track again.
Overall, the day at SES was good. It was good to be back in Islington again being born and bred and whatnot. I met people who I’ve only ever seen before online and it’s always good to get a feel for what is happening out there on the ground. To be honest, my day visit doesn’t really do the thing any justice as by and large these events are more of a social networking opportunity the majority of which happen in the hotel bar and surrounding pubs and restaurants of an evening. Still, it’s always good to get out of the office, who knows maybe I’ll manage a longer visit elsewhere some other time.
It’s good to rant
I’ve probably written other stuff like this in the past, but hey, I might say it a little differently this time, so I’ll say it again and see how it comes out. You can’t beat a good rant! 😀
We all know already, I don’t need to preach to the converted, that amongst the inhabitants of this planet of ours there are these people called the spam police, who just so happen to have a lot of Internet user market share and naturally enough want to hold on to as much of the monetary pie that comes from this for themselves. In terms of the whole search economy they’ve been very clever indeed. What they’ve done is this: they’ve demonised the whole concept of buying links. They’ve made it seem like this hugely unfair, unethical thing that gives people an unfair bump in what is supposedly an otherwise fair algorithmic system.
*Non-SEO bod note: links drive the search economy; they push sites up the search engine rankings. Without them, sites would not rank for jack.
Here is the news: it isn’t a fair algorithmic system! Seriously, I shit you not! It’s based on a string of metrics that give those at the head of the race a distinct advantage. If you want to catch up then you’ve got to do things to compete. Acquiring links is one such way. However, by buying such links you are seen as manipulating the index, and you risk some form of penalty, especially if you’re ratted out or investigated for not spending enough on some PPC program.
See, the thing here is that competitors, or those wanting to be on the first page of a SERP, just don’t have the time to wait around for years of brand building and so-called ‘natural’ link building to take effect. They need to compete today, not tomorrow. They want the same competitive advantage afforded to their competitors in positions 1–10 of a SERP, and provided no one dies and no laws are broken, they’ll pretty much pay what it takes to get there. PPC won’t do it. PPC is dead, spend-once money, whereas SEO is a real investment in actions and factors that will make a tangible difference to their positions in some algorithmic, link-dependent system. Yet having said all that, it needs to be said that it isn’t you or I who decides what is spam. The search engines decide what is and what isn’t spam; it is, as they like to say, ‘their index’. Link buying is spam because they say so. Yet the irony is that link buying actually improves their indices! Funny.
The search engines will often say that it’s all about safeguarding the integrity of their index or stopping those pesky evil spammers, but the reality is that those arguments don’t really hold water, as at the end of it all it’s about money and advertising and where those advertising budgets are spent. Most search engines with PPC programs believe that such money is better spent with them on their PPC programs, contributing to their bottom line. I can’t blame them for this; they are, after all, a business with a profit mission, just like you and I. Yet as SEOs with clients to rank and bottom lines to maintain and an Internet to play with, we still have to get our links from somewhere. We can buy them, sure, do secret deals, set up little networks that no mofo knows nothing about. We can duck and dive and bob and weave and… yeah, we can get very tired playing a cat-and-mouse game of hide the link source. The more competitive the niche, the more we will see others either rat us out or tittle-tattle at a search forum or some moody blog somewhere. You know the drill, those link Bertie Smalls characters giving it “ooh, how did these guys get to be number 1, so quick, so fast…” tittle tattle tittle. Shame on the lot of them really, but hey, you can’t stop people saying what they want to say, especially if they are desperate for a link or two, or are being pushed to do so, selling their negative SEO services to some party with a vested interest.
Anyways, I digress, yet again. What I wanted to say is that it isn’t so hard to get links, and lots of them too. You just have to be more imaginative and creative than the next guy. You can linkbait your way into a link or two. Controversialise yourself (check my Bertie Smalls reference earlier) and get all the jealous naysayers cooing and oohing, you can spam people’s blogs with your crappy comments and make out that you give a stuff about what they wrote on their dofollow blog, or you can use your brain instead and do something productive with your life.
Don’t get me wrong; buying links is productive, there’s nothing criminal about it except in the eyes of the search engine. No one looks at what they get up to, no one is allowed to look into their activities or question their profit motives, they have carte blanche to do what they like with the online activities of the planet, yet in fairness no one is forcing people to use their platforms either, so I suppose that at some point be it through law or natural boredom with it all, it’ll balance itself out.
Damn I did it again (digressed) getting back on topic.
Writing and stimulating buzz and getting links. Shit, for some verticals it must appear to be damn damn hard.
How can anyone really get excited about some of the commercial crap out there? But there’s the rub. Speaking with a colleague the other day about some project or other, I remarked something along the lines of ‘yeah right, well, I find it hard to get excited about that sort of stuff (finance)’. And why? Well, because in most cases nearly everything you read about it is piss-poor or boring, to me at least. If you want me to get excited about it, then you’ve got to hit my buttons and speak to me.
You’ve only got to look at TV and see how they approach it to see it for what it is. Humour is a good approach (Nationwide’s mortgage TV ads – brand new customers only), as are cheeky, chirpy youngsters singing and dancing (Halifax Building Society). The serious message is that they recognise that money and saving and banking IS boring shit (for most), yet they’ve got to turn people on somehow, so what do they do? They appeal to our emotions and build associations and feel-good factors designed to make us look at them in different ways. We’ve all got to bank, after all, and if we are young and looking for somewhere to bank then we might just be more inclined to bank with an organisation that resonates than with one that does not. Guess what? It’s the very same with links too! If we can get people to invest in us and in what we write, then we get those links.
I’m sure you get it by now, so I won’t bang on any more about it, other than to say that really, whoever you are, whatever you are doing, with a little creativity and some traditional marketing approaches you’ll most certainly be able to fire those synapses and get people talking about your customers and their brands. By doing it right in the online world, you’ll get a bucketload of links back in return. You won’t escape the effects of negative SEO, but it might make your job easier in the longer term.