Hi! I'm Jennifer - known as 'PotPieGirl' online. Welcome to my blog! I just want to let you know that this post may contain affiliate links which means, at no cost to YOU, that I might receive compensation if you purchase something through a link on my site. In the blogging world that's called "affiliate marketing", and it's a very common way bloggers make money by sharing products they love (like I do). Yes, that's how many blogs make money. For example, this blogger makes $50,000+ a MONTH with her blog - blogging is a solid home-business idea for just about everyone.
Thinking About Starting Your OWN Blog? If YOU'D like to learn how to make money blogging with affiliate marketing like I do, feel free to read my Free Blogging 101 E-Course here.
Perhaps All ‘Content Farms’ are NOT the Same
Ever since this whole Google algo update/Farmer update mess started, I’ve noticed that SOME article directory sites I am familiar with got nailed…and others didn’t. From a basic perspective, they’re all pretty much the same, right? They all put out mass-volumes of user-generated content every day…and make their money from ads ON those web pages. So why didn’t this latest algorithm update with Google cause ALL of these sites to lose rankings and beloved Google traffic? They’re all considered content farms, right? Want to hear MY theory on it and see my research?
Was this algo change REALLY aimed directly at the content farms?
When Google announced this algo update, they didn’t SAY they were targeting content farms. The “Farmer” update name came from inside the SEO world, not from Google. In fact, Google folks call it the “Big Panda”. Apparently one of the key engineers came up with the breakthrough behind it a few months ago, so internally they nicknamed the update after him (his name is Panda).
When Google posted their official announcement about this algo roll-out, they said –
“This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
I think people immediately assumed it was aimed at Content Farms only, since Blekko (a new search engine) had announced a ban of some content farms from their site in late January/early February (see info on that here) – so I guess folks just thought that’s exactly what Google was doing, too.
According to an article on Wired.com, when Google rolled out Caffeine at the end of 2009 (a Google update of sorts that improved Google’s ability to find and index content very quickly), they found their index was growing VERY quickly…and this latest update is pretty much meant to thin out the index and get some of the “junk” out of the way (in an attempt to get the “better” content ranking above the “shallow” content).
Google also said in that announcement (emphasis mine):
“…But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries…”
That statement leads ME to believe that they were targeting certain weak areas. Yes, 11.8% of search queries is a LOT of queries – but it still sounds like a specifically-targeted update to me.
In other words, they are going after certain query spaces – doesn’t it sound that way to you? If it was ME targeting certain query spaces, I’d certainly go after the problem areas…the queries that seem to generate the most “junk” in the Google index.
I’d want to thin out the over-saturated areas, wouldn’t you?
Yes, there is a big reason why I am pointing out that “11.8% of queries” part 😉
What I want to talk about is how some article directory sites started losing massive search engine traffic after this update – and others didn’t (in fact, some are doing BETTER). If “they’re all the same”, then why didn’t this algorithm change treat them all the same?
The sites I want to talk about
I’ve been carefully watching multiple sources for info and insight into this latest Google update, and it’s been quite an experience! I took long looks at lists of sites that others report as the “winners” and the “losers” and then picked some sites that I am quite familiar with and feel my readers are familiar with, too.
Those sites are:
Here is a quick video I made to give you a visual of what has happened to the search engine traffic for these 5 sites listed above so you can SEE what I am talking about.
(video is a little choppy at the very start but straightens out – sorry!)
What in the world could cause eHow to be doing BETTER, Squidoo hanging about the same as pre-update, and the rest bottoming out? Aren’t all these sites very similar? Wouldn’t you think that the algo would pertain to ALL of them?
Something different about these sites that isn’t so obvious
I’ve had a sneaking feeling about what could be causing issues for the sites that have had the carpet pulled out from under them since this algo update. Now, it’s not very obvious unless you get to poking around. While I have been poking around, I couldn’t come up with some really good examples to prove my point.
Then, on the EzineArticles blog, some “really good examples” were sitting right in front of me…so I ran with it.
Do y’all remember, back in July 2009 when Squidoo began enforcing some MAJOR rule changes for their site? As of July 2009, they no longer accepted content on certain subjects. Remember that? Remember the topics? (here’s their list, in case you’re curious)
Squidoo decided on their own to stop allowing topics they considered “junk” – topics that brought a lot of onsite spam with them, too. The Squidoo site was also majorly over-saturated in these topics.
That period in Squidoo’s history has kept creeping into my mind as I’ve been reading about this latest algo change…and watching my own Squidoo traffic and rankings remain pretty darn stable while users on other sites (like EzineArticles and HubPages) are reporting MAJOR negative changes.
And then there’s eHow.com… still coasting along and enjoying better traffic.
What in the WORLD is going on?
Then I came across this post on the EzineArticles blog and there, right in front of me, were my “test samples” for crunching some numbers on my theory.
On the EzineArticles blog, they asked members what they should do about content they have that is in over-saturated niches.
Then, they listed out these 7 keyword phrases/topics:
1. Penis Enlargement
2. Get Your Ex Back
3. Acai Berry
4. Reverse Cell Phone Lookup
5. Credit Card Debt Relief
6. Male Enhancement Pill
7. TV for PC
For those of you who lived through the Squidoo policy changes in 2009, do those topics look familiar to you?
So – I decided to take those exact 7 phrases and see how each site was doing for those topics/keywords.
It was a very interesting and insightful little test from my perspective.
How I ran this test
Now, for the record, this is FAR from an in-depth, highly scientific test, ok? This is me taking some test samples as an example to give us something to think about regarding this algo change and the sites we are familiar with that got hit.
I want to also point out that each time I run these sample searches in Google, I get something different (gotta love it). Sometimes a BIG difference, sometimes a minor difference….but, in true Google-style to prevent us from knowing TOO much, they’re different – so YOUR searches might be different, too. This is NOT an exact, scientific test – just showing some examples to start the conversation.
First thing I did was check Google to see how many pages each site had in the Google index. To do this, I simply typed (using HubPages as an example):

site:hubpages.com

and then wrote down the number.
According to my searches on Google, each site had the following number of pages in the Google index:
Then, I did the same type of site:hubpages.com search and isolated my results to “last month” – meaning “how many pages did Google index on each of these sites in the last month?”
The ‘Last Month’ results turned out like this:
That’s a LOT of content in a month, isn’t it?
Then, I took each of the 7 phrases that the blog post on EzineArticles listed, put it into quotes and did a search like this (again, using HubPages as an example):
site:hubpages.com “Reverse Cell Phone Lookup”
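In case you want to repeat these lookups yourself, queries like the one above can be built programmatically. This is just a rough sketch under my own assumptions – the `tbs=qdr:m` parameter is the commonly used “past month” filter, though Google doesn’t officially document it, and the function name is mine:

```python
from urllib.parse import urlencode

def google_query_url(site, phrase=None, past_month=False):
    """Build a Google search URL for a site: query, optionally limited
    to an exact phrase and/or pages crawled in the last month."""
    q = f"site:{site}"
    if phrase:
        q += f' "{phrase}"'          # quotes force an exact-phrase match
    params = {"q": q}
    if past_month:
        params["tbs"] = "qdr:m"      # "past month" filter (unofficial parameter)
    return "https://www.google.com/search?" + urlencode(params)

print(google_query_url("hubpages.com", "Reverse Cell Phone Lookup", past_month=True))
```

Paste the resulting URL into your browser and jot down the reported result count – that’s all I was doing, just by hand.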
Then, I tallied the number of urls each site had in the complete index, and did a search again for the same phrases, but in the last month.
By searching this way I was asking Google –
“How many urls do you have in your index for this site that have this exact phrase on them?”
“How many urls do you have in your index that you have crawled/found in the last month that have this exact phrase on them?”
I then jotted down the number for each phrase, added them all up for each site, and got a % of urls indexed number – both for ALL urls in the index for that site…and for urls found/crawled in the last month.
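The tallying step is simple enough that a few lines of code capture it. This is just a sketch with made-up numbers (the phrases and counts below are mine, not real query results) – and note that summing per-phrase counts can double-count a page that contains more than one phrase, so the percentage is really a rough upper bound:

```python
def saturation_pct(phrase_counts, total_indexed):
    """Percent of a site's indexed urls that matched the sample phrases.

    phrase_counts: result count from each site:domain "phrase" search
    total_indexed: result count from the plain site:domain search

    Note: a page containing two of the phrases gets counted twice,
    so treat the result as a rough upper bound.
    """
    return sum(phrase_counts.values()) / total_indexed * 100

# Hypothetical numbers, just to show the arithmetic:
counts = {"acai berry": 150, "get your ex back": 60}
print(round(saturation_pct(counts, 3000), 2))  # 210 of 3,000 urls -> 7.0
```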
Ready for the results?
The Results I Came Up With
Remember now, these numbers are only representing SEVEN specific keyword phrases – that is IT. Seven keyword phrases is a very low test sample, but the results sure speak volumes.
Results when searching All URLS from the site in the Google index:
Holy WOW! 10.38% of the urls Google has for EzineArticles.com have at least ONE of those 7 keyword phrases on them?
When you think of ALL the topics over at EzineArticles…and all the possible combinations of keyword phrases, these SEVEN PHRASES take up over 10% of their site?
Over 2 MILLION urls in Google from EzineArticles with just one of these 7 sample phrases on them.
Even with Squidoo’s policy changes back in 2009, they still show over 2% of urls in the Google index with at least one of those phrases on it.
To be fair, many of those urls that Google still has are locked lenses (pages Squidoo has taken steps to remove). However, considering how Squidoo WAS looking in the Google search engine for these phrases BEFORE their policy changes, this is a HUGE difference.
And, when I thought about that, I realized that I should look at all this from a more recent perspective. How many urls found in the last month for each site…and how many had at least one of those 7 phrases on them?
Results when searching only urls from the site added/crawled in the Last Month
In the last month, EzineArticles has had close to 39,000 urls found/crawled into the Google index that have one of these 7 phrases on them. That means 2.82% of the EzineArticles.com urls Google has found/crawled in the last month have one of these phrases on them.
That is almost 39 THOUSAND web pages in the Google index in the past month with one of those 7 phrases on them – from ONE SITE.
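Quick sanity check on those two numbers – if 39,000 urls is 2.82% of what Google crawled from the site last month, the implied monthly crawl total works out to roughly 1.4 million urls. That total is my own back-calculation, not a figure EzineArticles or Google published:

```python
# Back-calculating the implied monthly crawl total from the figures above:
phrase_urls = 39_000        # urls crawled last month with one of the 7 phrases
pct = 2.82                  # percentage reported above
total_last_month = phrase_urls / (pct / 100)
print(f"{total_last_month:,.0f}")   # roughly 1.4 million urls in one month
```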
How can that NOT be a problem?
As you can also see, Squidoo is doing a pretty good job of keeping those topics/phrases OFF their site (nice job, Squidoo!) – and eHow has the lowest percentage over-all.
Tell me this –
Could it be a coincidence that eHow is still fine, if not BETTER, with their Google rankings and traffic since this algo update rolled out AND they have a very low percentage – almost NO pages on site – with these exact phrases on them?
The Kicker For me
HubPages was the kicker for me….
They seemed to be “ok” for their over-all index presence regarding these 7 keywords…and definitely good for the last month… so what’s going on with them? They are having one heck of a time with traffic since this update rolled out.
Yes, it’s still possible that the threshold for this potential signal would still snag HubPages – and yes, it’s possible that it’s simply a matter of me not choosing the RIGHT phrases to see what HubPages REALLY has on these topics…
Or – could it be something else?
Is it the on-page advertisements?
I really don’t think the ads on their own are an issue. Besides, all these sites are monetized, so advertisements, like AdSense ads, are not an isolating factor that makes one site unique from another.
What I think DOES come into play is the presentation of those ads – the way on-page ads are presented can create an immediate negative perception of a web page for someone arriving from a Google search. That negative first impression could easily cause a search visitor to leave quickly (i.e., cause a high bounce rate). It also affects “time on site”, which I feel is a signal that comes into play. Both of these are metrics that Google kindly keeps up with for us inside our Google Analytics accounts (meaning they already track those things, so the theory that they use those metrics as ranking signals is pretty darn realistic).
When I open an EzineArticles article, I am instantly greeted with lots of ads all around the block of content….and the content itself just comes across as “words to read” – no images or anything to make it eye-appealing. Granted, that’s just my opinion, but if others feel that way and quickly leave the site, it could have an effect on things, don’t you think?
I certainly think so.
With HubPages, I am wondering if their loss after this update might also have to do with all the “no-follow” links that are on the site. Unless you work your tail off and improve your author score to a certain level, all your out-bound links in your Hubs are no-followed. I can imagine that is a LOT of no-follow links going off site.
Now, don’t get me wrong, I completely understand why they do this and I respect their reasoning.
BUT – from a search engine’s “perspective”, if a site doesn’t trust the sites it links TO, why should that site be trusted?
Are they hoarding their PageRank juice – or do they just not trust the sites they allow their content to link out to? Why in the world would you continuously put content on your site and NOT trust the sites it links to? I don’t think that is something that would HELP rankings….but is it HURTING their rankings?
I know, I know… Wikipedia does the same thing – but apparently Google doesn’t consider Wikipedia a low-value site – and I don’t think they put Wikipedia and HubPages in the same class at all, do you?
Outbound links can be pretty important – especially when your content links out to other related content that Google “likes”. The web is built on links… Google crawls the web and finds new content via links. In short, the web IS links, if you think about it.
Anyway – Those are just my thoughts on something that could be causing HubPages to not fit in as obviously with my “saturated keywords” theory – yet still be losing traffic as a result of this latest Google update.
I have a funny feeling that the fallout from this ‘Farmer Update’ is far from over. Up to this point, this algo update has only rolled out in the US. When it hits more areas of the world, it could get even more interesting.
Sadly, a lot of other sites were casualties of this update…and they are sites that probably shouldn’t have been caught by these new filters. Google will be tweaking as it gets feedback, but I feel safe to say that it might get worse before it gets better.
What Should EzineArticles Do About This?
EzineArticles has a tough decision to make regarding those 7 topics they listed out on their blog. They are asking for feedback on WHAT they should do moving forward about those over-saturated niches.
I do not envy them right now – this is a BIG decision for the folks at EzineArticles… it could get pretty messy.
I am of the opinion that they might want to (pardon the expression) “man up” like Squidoo did as a proactive measure, and get rid of that content the best they can, and do all they can to prevent more from coming on site in the future.
I KNOW there are people out there that will HATE that opinion of mine. But let me tell you this, I have a good bit of content out there on EzineArticles in one of those niches – and I’d lose content on EzineArticles too. (and no, it’s not a topic about making something….uhhh….bigger, ok? lol!)
But for the good of the site as a whole, it might be a good choice in the long run. Just my 2 cents on that – and hey, Ezinearticles DID ask what we thought =)
So – what do y’all think?
Are you as surprised as I was to see the % of pages each site has with at least one of those 7 keyword phrases on it?
Do you think this could have anything to do with sites that are losing traffic and rankings – and those that are NOT?
I spent a lot of hours putting all this together…and I am STILL shocked by it. How in the world can 2.82% – or 3.48% – of the urls ONE site had crawled in the last MONTH have one of those 7 phrases on them?
How can that NOT be a problem?
As an aside regarding a topic we’ve been discussing over the last 2 days….
If you missed the live webinar, you can watch a replay of the webinar here (no opt in or anything – just click and watch. Might want to give the video a moment or two to load)