
Buying and Selling Links to Rank on Google in 2013

Everyone buys and sells links

So, if buying links that pass PageRank is against Google’s terms of service, and doing so means a potential ding to your rankings, what should you do? Not buy links?

But wait, what IS buying links exactly? Where is the line drawn?

Are the links that came about from the hours of research put into a piece investigating or highlighting a topic of common interest bought links, or are they freely earned links?

It’s a serious question.


Slapped with a bouquet of barbed links

Spam spam spam!

I was going to say something about Interflora, but it’s pretty much been said by virtually the whole SEO world now, so I won’t. I will, however, talk about the various issues arising, as there’s always value in pulling that apart.

In terms of penalties, with tens of thousands of brands out there, one or two getting dinged every eight months or so is hardly earth-shattering (unless you’re the brand, of course), but imagine if Google dinged a brand every week or every other week?

What then?

Today, the scale of Google being spammed across most verticals by brands of all descriptions is HUGE.

Few brands ranking in Google today for hundreds or thousands of keywords have a totally clean profile; in fact, it’s fair to say that most will be more than a little grubby, especially if they’ve used companies in the past that advocated any of the tactics Google has since frowned upon.

Few will hold their hands up, and most will vehemently protest that their tactics are Google-compliant and blah blah blah… What else can they realistically say?

Just go and look at who’s ranking in your favourite vertical and say with your hand on your heart that company X hasn’t used a tactic that, under a microscope, looks at least slightly questionable. It’s all in the interpretation and, of course, in who’s doing the interpreting.

Is the unwritten contract between Google and Webmasters broken?

I’m writing this on a phone, so forgive the formatting and the lack of links and screenshots; I’ll tidy up later.

There’s a lot of change at Google these days, a lot of activity in spaces that Google were once content to monetize with ads. The clear separation that once existed between Google’s organic results and its paid ads inventory is becoming increasingly blurred as it pushes on into query spaces that were once the preserve of a diverse bunch of web publishers.

Unwritten Contract? WTF – Explain Yourself Man


Dear Google, Stop Trying to Control the World’s Information

Yet Another Woe No More Analytics Post

So, Google decided to take the gloves off and twist the screw that little bit harder on organic search. Caution: I suspect I might curse and swear and rant a little, but hey, you can always hit the back button 🙂

I’m not going to rant about the outrageousness of it all, as that’s been said by all and sundry. If you’ve landed here and don’t know what I’m talking about, then the short explanation is that Google have made a move in the name of privacy but have added a pretty hefty “by the way” clause that’s sending shock waves through the online marketing community.

Put simply, if you are one of those people who enjoys crunching numbers and delivering actionable insights derived from the user queries that bring people to a domain, that’s all about to change, as you will no longer be able to determine the query part of the journey. All you’ll know is that they arrived on your site from Google. If it’s a paid click then no worries there; Google will allow that to stay, as it’s valuable to the advertiser and useful to Google.

Valuable in the sense that advertisers need to know how their ad spend on Google converts (no point spending money if you don’t know how well it performs), and useful to Google because if people don’t spend money, their whole house falls. Google isn’t interested in how your organic campaigns perform or convert. There’s no money in it for them.
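To make the mechanics concrete, here’s a minimal sketch of what analytics tools were effectively doing before this change: pulling the q= parameter out of the Google referrer URL. The URLs and the function name are my own illustrative examples, not anyone’s actual implementation; once secure search strips the query, the same lookup simply returns nothing and the visit gets bucketed as “(not provided)”.

```
# A sketch of referrer-based keyword recovery; the example URLs are made up.
from urllib.parse import urlparse, parse_qs

def organic_keyword(referrer):
    """Return the search query embedded in a Google referrer, if present."""
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    values = parse_qs(parsed.query).get("q")
    return values[0] if values else None  # secure search sends no q= at all

print(organic_keyword("http://www.google.com/search?q=blue+widgets"))  # 'blue widgets'
print(organic_keyword("https://www.google.com/"))                      # None -> "(not provided)"
```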

Old Mahalo Had a Farm SE SEO

Don’t feed the pigs excrement

I was thinking about the recent Farmer update, about some of the things said around it, how the algo might work, and how new or existing farmers might keep on feeding the pigs and chickens. A side win is that it also helps one refocus one’s efforts through prudent little implementations and tweaks that might help engagement and perhaps insulate against similar future changes. You can never afford to rest on your laurels in someone else’s playground. We might think that this web thing is open and accessible to all, but for today at least Google is still the de facto gateway, and for that reason alone any business intent on getting traffic from them would be foolish not to sit up and take note.

Are the Sheep Happy? Be a good Shepherd

Kate’s post here http://www.distilled.co.uk/blog/ppc/google-bounce-rates-the-untold-story/ reminded me of past considerations of bounce rates and the masses of misunderstanding out there around the issue. I’d both heard and read people going on about bounce rate as a quality metric, as if it were some one-size-fits-all thing that applied carte blanche to every web page out there. As Kate rightly says, different pages have different outcomes. If user A gets what they want and leaves within a short time, then the less informed amongst us might be forgiven for sniffing and thinking: crap page, hit and run, poor user experience.

Yet of course this is patent nonsense, as the page in question might have exactly what the user wanted, requiring no more time or interaction than hitting the red X or the back button. Some sites, like blogs, often have a one-hit-wonder effect, whether the page is shared through a social network or arrived at through a search engine query. The user visits with the express intent of reading about that particular issue and that’s that.

They don’t want to go deep and read about a lot of indirectly related topics, as their focus is elsewhere. Old-style forum threads, in comparison, have much lower bounce rates, due in the main to things like pagination or the general time difference between search indexing and the user’s visit. Lots of page visits with very short dwell times followed by a rapid exit might be a signal of a poor user experience. OTOH, it might also be the opposite (a photo gallery, for example). The truth is that unless there’s some like-for-like, standardised site of a similar type to compare against, it’s very difficult to determine algorithmically what is and isn’t a poor user experience based on single metrics like bounce rate or time on site.
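To put a number on why the single metric misleads, here’s a minimal sketch with invented visit data: the same sitewide bounce figure reads completely differently once you segment it by page type, which is exactly the “different pages have different outcomes” point.

```
# Invented sessions: (page_type, pages_viewed_in_session)
from collections import defaultdict

visits = [
    ("blog_post", 1), ("blog_post", 1), ("blog_post", 2),
    ("category", 1), ("category", 4), ("category", 5),
]

stats = defaultdict(lambda: [0, 0])  # page_type -> [bounces, total sessions]
for page_type, pages_viewed in visits:
    stats[page_type][1] += 1
    if pages_viewed == 1:            # a "bounce": the session saw only one page
        stats[page_type][0] += 1

for page_type, (bounces, total) in stats.items():
    print(f"{page_type}: bounce rate {bounces / total:.0%} over {total} sessions")
```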

There are lots of other examples with differing outcomes, most of which I’m sure the experienced Internet user has encountered at one point or other, and I’ve kind of veered off the main point a little, as this isn’t directly related to the content farm thing; at least not in the totality of reasons why you’d get your arse kicked in this update. It does nonetheless bring to mind the core of what you should be considering when bringing people to your site and making them happy. Give them a shitty user experience they don’t want to come back to, or start ranking for everything they search for, and they’ll begin to complain about it. If they complain in sufficient numbers, then sooner or later you might just be toast. Thinking about shit like the above gets you back on track.

Elsewhere on the farm..

A thread at WebmasterWorld http://www.webmasterworld.com/google/4276279.htm cites the Cutts and Singhal http://www.wired.com/epicenter/2011/03/the-panda-that-hates-farms/all/1 post on Wired, which is full of interesting little nuggets.

From an algo-watcher’s perspective it’s fascinating stuff, full of little clues and perhaps the odd red herring. Yet much as I snark, the truth is that in many ways it’s full of things that should really be common sense to the accomplished webmasters of this world. A look at the list from Sistrix http://www.sistrix.com/blog/985-google-farmer-update-quest-for-quality.html shows the various winners and losers.

Outside quality raters were involved at the beginning

…we used our standard evaluation system that we’ve developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: “Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?”

The cynic in me had already covered the ground of: hmm, how many low-quality Q&A-type sites are out there, and how long would it really take a multi-billion-dollar corporation to task a team of individuals with seeking out and identifying crap sites, or sites that were clearly just taking the piss a little with ads and stuff like that? How long would it then take to run the sites through a bunch of quality raters http://www.beussery.com/blog/index.php/2008/03/new-google-spam-recognition-guide-for-quality-rater-reviewed/ and score them across the various metrics? So this kind of reinforces that as fact 🙂
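For what that rater pass might look like in the simplest possible terms, here’s a minimal sketch with invented sites and ratings: each rater answers a handful of yes/no questions of the kind quoted from the Wired piece, and sites get ranked by their average score. None of this is Google’s actual process, just an illustration of how quickly such scoring can be aggregated.

```
# Invented ratings: each inner list is one rater's yes(1)/no(0) answers to
# questions like "Would you trust this site with your credit card?",
# "Would you trust medicine prescribed by this site?", "Does it look authoritative?"
ratings = {
    "site-a.example": [[1, 1, 1], [1, 1, 0]],
    "site-b.example": [[0, 0, 0], [0, 1, 0]],
}

def site_score(answer_sets):
    answers = [answer for rater in answer_sets for answer in rater]
    return sum(answers) / len(answers)

for site in sorted(ratings, key=lambda s: site_score(ratings[s]), reverse=True):
    print(f"{site}: {site_score(ratings[site]):.2f}")
```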


Excessive ads were part of the early definition

There was an engineer who came up with a rigorous set of questions, everything from, “Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?”

If you look at some of the sites involved prior to getting Google-thumped, you’ll see that a lot of them were indeed rife with AdSense and ads from other networks (some still are). It wouldn’t be so difficult to have a script look for such instances and then determine a threshold above which you get issued with a nice pair of lead boots to weigh you down.
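As an illustration only, a script of the kind described might do nothing fancier than count obvious ad markup in a page’s HTML and flag anything over some cut-off. The markers and the threshold below are my own guesses for the sketch, not anything Google has published.

```
# Count a few well-known ad markers in raw HTML and flag ad-heavy pages.
import re

AD_MARKERS = [
    r"googlesyndication\.com",   # AdSense loader script
    r"google_ad_client",         # AdSense config variable
    r"doubleclick\.net",         # display ad network
]

def ad_density(html):
    """Count occurrences of known ad markers in raw HTML."""
    return sum(len(re.findall(marker, html, re.IGNORECASE)) for marker in AD_MARKERS)

def looks_ad_heavy(html, threshold=5):
    return ad_density(html) > threshold

sample = "<script src='//pagead2.googlesyndication.com/pagead/show_ads.js'></script>" * 6
print(ad_density(sample), looks_ad_heavy(sample))  # 6 True
```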

The update is algorithmic, not manual

…we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons.

This part is of course all the more interesting, as it more or less says that on one side are a bunch of sites with lots of quality signals and on the other are sites with far fewer. I’m not going to sit here and dissect the strategies of all those bumped, but there really is gold in them thar hills. Sure, there are anomalies. Mahalo has been hit despite a big PR push around its recent change in approach; the powers that be have, IMO, decided that a continual get-out-of-jail-free card just wasn’t in their PR interests. eHow, that much-maligned repository of textual verbosity, has survived the cut; no doubt someone demanded that their media http://www.demandmedia.com/ was worthy of a little more time http://www.fastcompany.com/1723737/did-demand-media-ipo-just-in-time.
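Purely to illustrate the classifier idea in that quote, here’s a minimal sketch with invented features and labels: fit a simple model on sites already judged good or bad, then score new ones against the learned boundary. The features (ad density, average words per page, share of duplicate pages) are my own stand-ins, not Google’s signals.

```
# A toy quality classifier on made-up data; nothing here reflects Google's algo.
from sklearn.linear_model import LogisticRegression

# columns: [ad_density, avg_words_per_page, share_of_duplicate_pages]
X = [
    [0.05, 1200, 0.02],  # reference/news style sites -> labelled high quality (1)
    [0.10,  900, 0.05],
    [0.60,  250, 0.70],  # thin, ad-heavy farms -> labelled low quality (0)
    [0.75,  180, 0.85],
]
y = [1, 1, 0, 0]

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.50, 300, 0.60]]))          # likely [0]
print(clf.predict_proba([[0.08, 1000, 0.03]]))   # high probability of class 1
```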

Some people (aka spammers) will no doubt have seen the opportunities that these ructions present and will have been up bright and early, repositioning downgraded content into new, loftier placeholders. Lessons will have been learnt, content will take account of the things said by Messrs Cutts and Singhal, and the show will roll on. Only time will tell if Google has done enough to slay the beast of public scrutiny; these things come in cycles, and for now at least the monster seems to have been given a bit to chew on.

Bing, Google, Cookie Jars and Data Scrapes

Bing Bong Bell, Google’s in Da Hell

There’s been a lot of noise about Bing and Google this week regarding Bing stealing Google results. Matt Cutts is at the centre of it, having had a bit of a ding-dong with Harry Shum (see video below, 40 mins long) over Google’s assertion that Bing have been a little sneaky and have scraped/stolen/reverse-engineered/indexed Google SERPs.

I’ve no axe to grind with Matt or Google but on this one they seem to have got it wrong and misread the landscape.

Emulators emulate emulators emulate emulators…

Here’s what I don’t get: Google have most certainly copied features from other parts of the web, including features used by search competitors, so I’m surprised that it’s such a big deal to learn that a competitor might be doing the same, albeit clandestinely.

Some are suggesting that the timing of the announcement to DS was also a bit snarky, in that it was timed to collide with a Bing announcement. Maybe this is just coincidence, but from what I’ve read it wouldn’t be the first time that Google has sought to steal Bing’s thunder either, hence the various blogs of apparent indignation.

Like for Like

A while back I read that Bing was going to factor things like FB likes into its algo for logged-in users. If Google decided to do the same (without FB’s knowledge), and used, say, GA or the Google Toolbar to do so, would they not be doing a similar thing to Bing? Would toolbar users be aware of any subtle change referenced in a previous, possibly unread EULA? I’d suspect not, and ultimately very few would know.

If Google used such data and its SERPs improved as a result, then who’d even know? If questioned, surely Google would say that they use over xxx signals to rank their pages, including social data. If pressed, they’d also say that the exact mechanics of what they use and how are a closely kept secret. In other words, they’d say mind your own business, we aren’t telling.

I make reference to the EULA as Matt made a big thing of it in the video above, suggesting that Bing users wouldn’t be aware that their data would be used in this way. In this regard, I think it reasonable to point out that Google hasn’t ever forced a Toolbar update on me telling me that they’d changed some aspect of what data gets extracted or what it’s used for.

In that context, is there really a massive difference between what Bing says it is doing and what Google says has been stolen!? I’m not being an a$$, I’m just genuinely curious as to why Google would be so surprised to learn that Bing might have a huge dictionary of words and might just look to grab the odd ‘new string’ via clickstream metrics that users of their services opted into, and then use it to improve what they already do. Products iterate, programmers seek to improve, and automated, scalable means are a good way of doing so; heck, Google itself uses a similar approach to improve its algos and weed out rubbish.
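For the sake of illustration, here’s a minimal sketch of how that kind of clickstream signal could be folded in: aggregate (query, clicked URL) pairs from opted-in users and surface the most-clicked destination for query strings the engine hasn’t seen before. The events, query strings, and URLs below are made up.

```
# Toy clickstream aggregation over invented (query, clicked_url) events.
from collections import Counter, defaultdict

clickstream = [
    ("zxqvbn gadget manual", "http://example.com/some-obscure-page"),
    ("zxqvbn gadget manual", "http://example.com/some-obscure-page"),
    ("zxqvbn gadget manual", "http://example.org/other-page"),
    ("blue widgets", "http://widgets.example.com/blue"),
]

clicks_by_query = defaultdict(Counter)
for query, url in clickstream:
    clicks_by_query[query][url] += 1

def best_guess(query):
    """Return the most-clicked URL for a query, if we've seen one."""
    counts = clicks_by_query.get(query)
    return counts.most_common(1)[0][0] if counts else None

print(best_guess("zxqvbn gadget manual"))  # the URL most users clicked for that rare string
```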

I certainly get how it’s probably a matter of pride for the chaps at Google, as it does appear that Bing is using Google technology to augment its existing datasets via users on a Goog platform. But put in the context of how Google has used the tech and information of everybody else the world over to grow a world-beating company delivering fantasmagorical profits, it does begin to look a little pot-kettle-esque.

Some might be forgiven for concluding that Google was taking a kicking in the public press re: spam and that this was a handy and timely deflection. In this regard I wasn’t surprised to see the guys in the video being pretty anti-Google; they appeared to double-team Matt, with Blekko gaining excellent capital from the whole deal.

I’m sure this one will roll on.

Google to allow gambling ads on its UK AdWords SERPs

Pretty big news from Latitude: Google are to allow gambling ads on UK AdWords!

Due to recent changes in the legislation surrounding online gambling advertising, Google has decided that, as of tomorrow, gambling PPC adverts will be allowed to show in the UK. They have been speaking directly to clients to ensure all the correct licensing information has been processed in time, and there will be a frantic scrap for both clients and their respective agencies to get their accounts set up and live in time for tomorrow, with no guarantee they’ll manage it.

Interesting times indeed; a huge move which will shake up the market no end. There’s all sorts of speculation as to why, the obvious answer being: it’s about the money, dummy 🙂

What next? Alcohol, PR0N? Lean times call for harsh actions, especially when you have a profit announcement in the offing.
