Monthly Archives: February 2008

Google Subscribed Links – The Good, The Not So Good and The Reality

Google Subscribed Links

There's a little bit of buzz about today over Google's subscribed links feature. I thought it might be handy to share my robwatts view on it, the view being one informed by three perspectives: a Google user, a site owner and an SEO.

I'll stick to three classifications: the good, the not so good and the tell-it-like-I-see-it reality.

The Good

User

From a user perspective it could be a handy thing: stumble across a site that displays the 'subscribe to links' badge and know that subsequently, if I search on a related keyword the site owner has added to the Google system, then provided I'm logged in to my Google account I'll get a trusted result in the SERP to a site I've used before, kindly highlighted and formatted to stand out from the page. It saves me having to bookmark lots of sites and refer back to things of old. I'll just find a good site, see its Google badge and hit it. Done, subscribed.

Site Owner

From the perspective of a site owner, a thing like this gives me an excellent opportunity to promote whatever it is I do and inject highly targeted keyword links to specific URLs which, for my subscribed user base, are guaranteed to appear in a good position for whatever keywords I choose to target in my TSV text or XML feed. I can get to position 4 for any related keyword I want, provided of course I can win the numbers game (subscribers) and have the content to go with it. I have a potentially huge incentive to get as many people as I possibly can to sign up for this. I can lower both my PPC and SEO costs and save a bucketload of cash if I manage to crack it and hit it spot on.
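
To make the mechanics concrete, a feed is essentially a mapping of trigger keywords to the result you want your subscribers to see. The lines below are purely my own illustration of that idea; the real column layout and field names are defined in Google's Subscribed Links documentation and will differ from this made-up example (pipes here standing in for tabs):

    buy blue widgets | Blue Widgets from Example Co | Our full range of blue widgets, in stock today | http://www.example.com/blue-widgets
    widget repairs   | Widget Repairs               | Same day widget repairs nationwide             | http://www.example.com/repairs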

SEO

From the perspective of an SEO/SEM, a thing like this is a useful opportunity to demonstrate to clients that I have a broad understanding of the range of complexities and angles required to succeed in the Internet of 2008. If my client happens to have a huge database with some kick-ass products too, then the whole process of getting their users to sign up to their subscribed links will of course inform my overall SEO strategy and be a part of any subsequent site rebuild. If you assume that Google is always on the lookout for signals that are harder to manipulate, then this really could be just one very big signal indeed.

The Not So Good

User

Of course, to every good there is a not so good. What I don't like is this: I have to be logged in to my Google account to experience the benefits. Google will be in a fabulous position to track my searches, the types of sites I visit and the types of sites I choose to buy from or subscribe to. If I don't have an account and happen to stumble upon one of these sites pushing this thing, then if I want to participate I have to sign up for an account. I can't do so anonymously; I have to join the borg collective and sign up to being tracked.

Site Owner

As a site owner I am pushed into a bit of a dilemma. Easy, apparently cost-free traffic might appear to be a good thing, yet there is a cost attached. I have to install Google code on my site, which then gives Google insights into my business and the visitors who land on my pages. I also provide Google with potential new account holders, with no guarantees about how their actions will be used in the future. If I want those coveted positions then it makes sense to take steps to build my user base and get those casual searchers back and clicking on those subscription buttons. Nor is there any guarantee that, having built up a following via Google, Google won't introduce options that require some kind of payment.

SEO

A world where SERP results are informed solely on the basis of logged-in user actions, where people subscribe to every site they visit and get fully personalised SERPs for practically every vertical, would obviously impact upon the traditional SEO model. Link building – forget it. Ranking reports – forget them. Anchor text – scrap that. HTML formatting – forget it.

A site owner could in theory completely and utterly circumvent the need for traditional SEO. A system that allows site owners themselves to decide what their XML feeds are relevant for is potentially huge. The Google back end might even add features and offer tools that suggest relevant keywords to target for each feed or text file. It could use such data to inform its algorithm and push out sites that had no such profile, especially in the more competitive niches, which is exactly where things like this will be targeted most.

Getting realistic

Of course, the reality is that some of the things above will come to pass and some will not. The truth of the matter is that take-up of such a thing will probably be very low. Most people will swing merrily along with their existing set of habits and suspicions and will keep on doing things the way they always have.

We need to think about why the search engines are pushing such things and look at some of the reasons behind it. The number one reason is, of course, money. Sure, they probably do want to 'improve their users' experience' too, yet at the same time they also want a little more control and say-so over who gets to put what in where and how. By getting site owners into a direct relationship and delivering traffic, they help set up a platform for subsequent monetisation and consolidation of the search marketing pie, taking a bigger cut of the spend that currently goes to SEO. By demonstrating that SEO is not that effective, and by converting as many people as possible to personalised models, they undermine some of the attraction that traditional SEO offers. What good is an SEO ranking report if everybody's results differ based upon their user profile? How can an SEO firm demonstrate the impact of its actions if ultimately there is no sure way of measuring them? Why employ an SEO to advise you on markup, keyword density, inbound links and a whole lot of other paraphernalia, when you can simply upload a feed and choose your own keywords to rank for, possibly assisted by a Google back-end web page optimiser tool advising this, that or the other?

From a user perspective, I'll certainly use this, and when logged in on certain PCs I might even subscribe to a site or two, especially if they relate to me or a client or are in some way exceptionally useful.

From a site owner POV, heck, why not, what's to lose? I already use Google Analytics; I don't particularly care too much what they do or don't know about my visitors, it's really not a concern. It could be a positive, it could be a negative. If I have 20,000 visitors per day and no one subscribes, does it mean I have a crappy resource just because a site in a similar vertical with similar visitor numbers gets people signing up to this? Conversely, will my site that has this code and gets the subscribers win out in the SERPs, just because the site of a competitor who kicks my arse in other ways doesn't?

As an SEO, and SEOs being the adaptive bunch that we are, I'll probably see it as a selective opportunity tool. If a site has good content, then I'll advise it as part of the overall mix. I'll apprise a client of the pros and cons. I'll tell them how it could potentially be used. I'll tell them all about quality scores and quality signals, of Google's aversion to SEO tactics and its general dislike and disdain for SEO's ability to detract from its PPC revenues. I'll tell them and show them, like I always have, the best methods for using it to their advantage. That's what I'm paid to do, after all.

Bring it on Googiebaby :)

Google spidering its own custom search results?

I was doing a little search this evening and found this amongst the results on the 1st page.

I thought it might be something related to me being signed in to a Google account so I signed out and tried again. Same result. So I opened up IE7 and same result.

I tried the same query on Google.com but it didn’t replicate.

Who is this mystery user=016597473608235241540, I wondered? No one in particular, it seems, and there are a few more of them too. A little site:google.com/coop/preview query reveals just 152 results, which in the grand scheme of things is tiny. The question, however, is why? Why spider custom search results and include them in a SERP, however small the sample?

Noindex perhaps? How about robots.txt, even?
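
Either standard mechanism would do the job if Google wanted these preview pages kept out of its own index; something along these lines, with the /coop/preview path simply mirroring the URLs above:

    # in robots.txt on google.com
    User-agent: *
    Disallow: /coop/preview

    <!-- or on the preview pages themselves -->
    <meta name="robots" content="noindex">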

The result page itself has the distance in kilometres but no link to any defined map or directions/distance page. Just two links to Wikipedia – crap really, not a good user experience.

I’m sure there’s a better result out there than this one.

SES London 2008 Day 3

I attended SES London on Thursday, my first since way back in 2002, or was that 2003? Christ, I can't remember!

Customers, Prospects and Drug Users

The first session I went to was a presentation by Brian Eisenberg of Future Now entitled Redefining the Customer. Loosely speaking, Brian hit on important aspects to consider when designing a site: usability issues, seeing site visitors as citizens rather than 'users', and how small changes can really have huge impacts. One funny thing during this speech was the woman sat next to me who started to snore, whilst intermittently waking to agree and nod with what he was saying! Bizarre person. Anyhow, on the whole a good session, if a little too 'markety' for my liking.

Dynamic Websites

The second session was entitled Dynamic Websites: Beyond the Basics and, for me, whilst it didn't teach me much of anything I didn't know already, it was hugely entertaining, due in large part to the contribution of Mikkel deMib Svendsen, who was an absolute hoot, delivering his part of the session in an informative but humorous way.

Then there was Ralph Tegtmeier, aka Fantomaster, who in fairness had a hard act to follow but made the key points about controlling PageRank and ensuring that you adopt an optimal approach. I wasn't that impressed by Ralph's contribution, as for me at least he could have gotten a whole lot more specific and focused on one particular aspect, as opposed to just sticking up lots of theoretical boxes with arrows pointing to them and not saying very much else on the wheres and whys of it all. I got the feeling it was more a case of 'if you want to know more about how to do this then come and talk to me and maybe we can do business', but hey.

The third guy in the session, Kristjan Mar Hauksson of Nordic eMarketing, had run out of time and zipped through some top-level stuff around how optimising your site and its structure can help its bottom line and drive more sales. Besides lots of 'no shit Sherlock' moments, overall it was a to-the-point, here's-what-you-should-be-doing-and-come-and-see-me-to-learn-how-we-can-help-you-do-the-same kind of approach.

Linkbaiting

After lunch I went to Beyond Linkbait: Getting Authoritative Online Mentions. The speakers were Alan Webb, CEO of Abakus Internet Marketing, Mikkel deMib Svendsen, Creative Director of deMib.com, and Brian Turner, Director of Britecorp Ltd. Alan Webb, or Webby as he has been known over the years, gave an excellent rundown of tactics and methods you should use when building your links and getting those all-important social votes. I actually learnt something from Alan too, which is very cool and will be put to good effect over the coming weeks and months. To share, it's the linkdomain:site.com "keyword keyword" -site:site.com method of using Yahoo to find out who is linking to you thematically. For those who know what I'm on about, I'm sure you'll appreciate the practicalities of this.
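
For anyone unfamiliar with the operators, the pattern on Yahoo search looks like this, with example.com and the phrase standing in for your own domain and theme:

    linkdomain:example.com "blue widgets" -site:example.com

In plain terms: pages anywhere on the web that link to example.com and mention the phrase "blue widgets", excluding example.com's own pages. In other words, who links to you thematically.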

Mikkel deMib Svendsen was again on good form using a mix of humour and outrage to deliver a message that covered some of the more controversial link acquisition methods. Whilst a little blackhat, the important message he was getting out there was that as a site owner or site admin at least, you really do need to be aware of these methods and more importantly the means for closing such loops.

Brian Turner delivered a social link node presentation, illustrating how websites are social nodes, how some are more active than others, and how, if you use them in the right way and create the right conditions in the right regions of your vertical, you can harness the power that comes from them and use it to boost your core product or service. By and large I heard what he was saying, but felt that the whole idea of buying communities and gaining mindshare publicly is very hard to achieve and needs careful nurturing and TLC. Why? Well, the idea of building a strong niche and using communities to help achieve that is all well and good, yet there is a fine line between getting a community to participate and drive love towards your business, and alienating everyone to the point that they all up sticks and piss off out of it. This graph from Alexa showing what happened when SEO Chat was acquired sort of illustrates what can happen to a community if that community suddenly feels that it is contributing to the bottom line of some would-be megacorp.

So yeah, I think you can build it and grow it, but it’s difficult to just buy it and expect it to carry on as before.

Local Search

The next session I attended was entitled Local Search Marketing Tactics, which to be honest I really don't know why I attended, as it was very low-level stuff that I could have read about online. Still, handy for those who wanted to know about getting into the local search results and searching locally too.

Web Metrics and Analytics

My final session of the day was the Web Analytics & Measuring Success session, which covered the whole metrics thing from the perspective of people who had developed, commented upon and used these tools regularly. It was moderated by Mike Grehan, SES London Co-Chair and Founder and CEO of Searchvisible Ltd.

The speakers were Frank Watson, CEO of Kangamurra Media; Stephen Turner, CTO of ClickTracks/Lyris; Andrew Goodman, Principal of Page Zero Media; and Dema Zlotin, Founder & VP of Strategic Search at SEMDirector.

Andrew Goodman focused on Google Analytics and showed a series of graphs and trend data highlighting how web analytics in general can be used to identify key user actions and bring benefits to your site as a result of that knowledge.

It was to be a common theme. Stephen Turner elaborated on the concept of 'segmenting', which for the unaware amongst you is the means of comparing and contrasting metric data to determine with increased clarity whether user action A or B was more effective according to measurement C or D. This can be pretty powerful in reporting how people behave on your domain. You could, for example, look at the behaviour of people from different geographic regions and fine-tune a page or section as a result. I use a tool called IndexTools daily, which gives a similar level of granularity, enabling you to drill down to very specific aspects of user visits, both individually and as a group.
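
As a toy illustration of the idea (nothing to do with ClickTracks or IndexTools specifically, and the data and field names here are invented), segmenting boils down to grouping your visit data by some dimension and comparing a metric across the groups:

    from collections import defaultdict

    # Hypothetical visit records; a real analytics tool pulls these from its own logs.
    visits = [
        {"region": "UK", "converted": True},
        {"region": "UK", "converted": False},
        {"region": "US", "converted": True},
        {"region": "US", "converted": True},
    ]

    totals = defaultdict(int)       # visits per segment
    conversions = defaultdict(int)  # conversions per segment
    for v in visits:
        totals[v["region"]] += 1
        conversions[v["region"]] += v["converted"]

    for region, count in totals.items():
        print(f"{region}: {conversions[region] / count:.0%} conversion over {count} visits")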

Dema Zlotin from San Diego gave an interesting account. I particularly liked his shelf analogy, whereby he showed a graphic of a SERP with the top five positions inhabited or influenced by a company with vested interests and strengths within each domain. His segmentation point being that, by using your referrer data and knowing your presence, you can quickly determine how effective or ineffective your recent strategies have been.

Frank Watson, aka aussiewebmaster, talked about how the knowledge and use of web analytics had helped his company save millions over the years. The message being: track, track and track again.

Wrapping up

Overall, the day at SES was good. It was good to be back in Islington again, being born and bred there and whatnot. I met people I've only ever seen online before, and it's always good to get a feel for what is happening out there on the ground. To be honest, my day visit doesn't really do the thing justice, as by and large these events are more of a social networking opportunity, the majority of which happens in the hotel bar and the surrounding pubs and restaurants of an evening. Still, it's always good to get out of the office; who knows, maybe I'll manage a longer visit elsewhere some other time.

Buying links without a credit card

It’s good to rant

I've probably written other stuff like this in the past, but hey, I might say it a little differently this time, so I'll say it again and see how it comes out. You can't beat a good rant! :D

We all know already, I don't need to preach to the converted, that amongst the inhabitants of this planet of ours there are these people called the spam police, who just so happen to have a lot of Internet user market share and, naturally enough, want to hold on to as much of the monetary pie that comes from it as they can. In terms of the whole search economy they've been very clever indeed. What they've done is this: they've demonised the whole concept of buying links. They've made it seem like this hugely unfair, unethical thing that gives people an unfair bump in what is supposedly an otherwise fair algorithmic system.

*Non-SEO bod note: links drive the search economy; they push sites up the search engine rankings. Without them, sites would not rank for jack.

Here is the news – it isn't a fair algorithmic system! Seriously, I shit you not! It's based on a string of metrics that give those at the head of the race a distinct advantage. If you want to catch up then you've got to do things to compete. Acquiring links is one such way. However, by buying such links you are seen as manipulating the index and risk some form of penalty, especially if you're ratted out or investigated for not spending enough on some PPC program.

See, the thing here is that competitors, or those wanting to be on the first page of a SERP, just don't have the time to wait around for years of brand and so-called 'natural' link building to take effect. They need to compete today, not tomorrow. They want the sam