Google Un-Slap Case Study
Pandas, Penguins, backlinks, anchor texts… whoever you blame, chances are that you’ve experienced a Google slap or two if you’ve been doing SEO for a while.
The same is true for me: several of my niche sites that were previously ranking and earning well were slapped from their SERP perches at one point or another. Is it possible to recover a slapped site and get back to previous rankings and traffic levels?
I don’t know, but with this case study, I intend to find out.
To Niche or Not to Niche
First, let me clarify the term “niche site”, as I use it. To do so, I’ll borrow Pat Flynn’s terms “involved marketing” and “uninvolved marketing”.
To me, a niche site is a site created mainly because I found some promising keywords on a certain topic. I outsource most of the work done on niche sites, rather than doing it myself. And while I do always aim to create good websites with good content (some details on this here), I’m not personally involved with any of the niche sites. The sites are not associated with my name, I don’t update them regularly, I don’t create personal videos for them, etc. The sites are built and updated for a while and then I move on.
That, to me, is the main difference between a “niche site” and a website such as IM Impact, where I am continuously and personally involved.
The initial situation is that for much of 2010 and part of 2011, I created and developed different niche sites of varying quality. Some were very cheaply outsourced, while others were more carefully crafted.
For a while, affiliate commissions and AdSense checks were my main sources of income and my sites were ranking in the top 3 spots for practically all of their target keywords (ah, the good ol’ days…).
Over the course of the last 12 months or so, my sites started dropping. At first, my crappy sites got slapped and this didn’t surprise me at all. But unfortunately, while my more carefully created, higher-quality sites weathered the storms of the Panda updates, they also fell victim to slaps once the Penguin and further Google updates came along.
I suspect that my crappy sites (i.e. sites I wouldn’t want to visit myself, because the content is rubbish) are beyond recovery, unless I completely rewrite everything and do a 100% overhaul of the sites. However, some of my higher-quality sites would “deserve” higher rankings, since they really do serve the purpose of their target keywords very well and since they lack low-quality or filler content. For this case study, I will try to recover two of these higher-quality sites.
What We Know
Let’s look at what we know about the current state of Google rankings and the recent updates.
A big focus of the early Panda updates was on-site content. We know that having excessive duplicate content, thin pages with tons of advertisement and low-quality “filler” pages can lead to a site being slapped. This is how Google got rid of content farms. So, the first step in any recovery is to make sure we don’t have any of the above content types on a site.
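As a rough first pass at that audit, you could flag any pages with a suspiciously low word count. Here’s a minimal sketch; the 300-word threshold is my own guess (Google publishes no such number), and it assumes you’ve already extracted each page’s body text with the markup stripped:

```python
MIN_WORDS = 300  # hypothetical cut-off for "thin" content; adjust to taste

def flag_thin_pages(pages, min_words=MIN_WORDS):
    """Return the slugs of pages whose word count falls below min_words.

    `pages` maps a page slug to its extracted body text (markup stripped).
    """
    thin = []
    for slug, text in pages.items():
        if len(text.split()) < min_words:
            thin.append(slug)
    return sorted(thin)

if __name__ == "__main__":
    # Made-up example pages
    sample = {
        "knitting-basics": "word " * 500,   # healthy article
        "tag-archive-page": "word " * 40,   # thin filler page
    }
    print(flag_thin_pages(sample))  # ['tag-archive-page']
```

A word count alone won’t catch duplicate or ad-stuffed pages, of course, but it’s a quick way to surface the worst filler before a manual review.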
We also know that user engagement on the site, as well as social signals, has become a lot more important recently. For more detail on this, check out the new rules of SEO.
The bottom line is that your page needs to keep visitors interested if you want any hope of sticking in the top positions once you reach them. Conversely, if your site is more engaging than the competing ones, you’ll see your rankings rise.
Backlinks and Anchor Text
We know that blog networks, previously one of the most effective backlink sources, have been targeted by Google. Some of the big ones were wiped out with large deindexation sweeps. While some blog networks may remain untouched, we have to assume that they are all on the chopping block, even if the axe may not have come down yet.
Based mainly on this great analysis, we can also be pretty sure that Google changed the way it looks at backlinks and anchor text. It used to be fine to create links with your target keyword used almost exclusively as the anchor text. Now, you need to mix up the anchor text and ideally keep the target keyword well below 50%. Basically, don’t try to get your target keyword as the anchor text anymore; rely almost exclusively on your on-page content to signal which keyword the page matches best.
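If you want to see how one-sided a link profile is, you can tally the anchors from a backlink export and compute what share the exact target keyword gets. A minimal sketch, with made-up anchor texts standing in for a real backlink report:

```python
from collections import Counter

def anchor_share(anchors, target):
    """Fraction of backlinks whose anchor text is exactly the target keyword."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return counts[target.lower()] / total if total else 0.0

if __name__ == "__main__":
    # Hypothetical anchor texts pulled from a backlink export
    anchors = ["learn to knit", "learn to knit", "click here",
               "iknit.com", "this knitting guide", "learn to knit"]
    share = anchor_share(anchors, "learn to knit")
    print(f"{share:.0%} of anchors are the exact target keyword")  # 50%
```

By the logic above, a profile where the exact keyword dominates is the kind of one-dimensional pattern to avoid; naked URLs, brand names and generic phrases should make up most of the mix.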
The Recovery Plan
Based on the above, I’ve come up with the following recovery plan, which I will test on two sites.
Step 1: New Domain
First, I will register a new domain.
Why? Because I suspect it’s easier to get good content ranked on a new domain than to try and recover a domain that has thousands of identical-anchor-text links pointing at it.
It’s possible that there’s no actual penalty for those kinds of links and that, instead, they just “don’t count”. But I’m not sure about this. My suspicion is that for sites that don’t have very high authority, there is an actual penalty for a one-dimensional link profile.
The second, and perhaps more important reason, is keywords in the domain name. Both of the sites I want to recover have long, keyword-based domain names. I suspect that short, brand-type domain names are a better choice. Even if Google itself doesn’t care about domain names, the name shows up in the Google listing and short, brandable names are more trustworthy and may get a better CTR than long keyword domains. After all, what are you more likely to click in the Google results: a page on iknit.com or one on learnhowtoknitateacosy.info?
Step 2: No Redirect
Will I redesign the old sites on the new domains and then do a 301 redirect? No.
Because a 301 redirect tells Google: this is the same site as before, but in a new location. So, Google’s reaction would logically be: we hated the site before and we still hate it now. In other words, I don’t see any reason why a 301 redirect wouldn’t pass the penalty on from one site to the other, as long as the content on the old and new sites is very similar.
Step 3: Mild Rewrite and Redesign
I will redesign the websites, with usability in mind. Both of the sites in question are already quite nicely and effectively designed, so I don’t think any radical changes need to be made.
The same goes for the content. I will rewrite some of the main content, but in a fashion of “polishing up” rather than completely rewriting. I want to give Google something fresh to index and cache, but since the content is already good, I don’t think it should be necessary to completely revamp it.
In addition, I’ll also add some fresh content (i.e. completely new articles). Again, this is to give Google something fresh to index and to show them that the site is “alive”.
Step 4: Social Signals
This is something I still need to do some research on. Basically, I want to get some social signals going to the new sites, but I’m not yet sure how best to do that. One thing’s for sure: I don’t want to do any spamming. Crappy fiverr services and the like are out of the question.
Step 5: ????
I’m not yet entirely sure about how to transition from old to new sites. Maybe I’ll link from the old to the new sites. I’ll definitely remove the articles that I rewrite and add to the new sites, as I don’t want there to be near-duplicates out there. I might just scrap the old sites entirely.
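One way to sanity-check that a rewritten article has drifted far enough from its original to avoid being seen as a near-duplicate is to compare word-shingle overlap between the two versions. A minimal sketch; what counts as “too similar” is an arbitrary judgment call, not anything Google has published:

```python
def shingles(text, k=3):
    """Set of k-word shingles (consecutive word tuples) from a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

if __name__ == "__main__":
    # Hypothetical before/after snippets of a rewritten article
    old = "how to knit a tea cosy in five easy steps"
    new = "knit a cosy for your teapot with this simple five step guide"
    print(round(jaccard(old, new), 2))  # prints 0.0 -- no shared 3-word shingles
```

A score near 1.0 means the rewrite is essentially a copy; something well below, say, 0.5 suggests the two versions would no longer read as the same page.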
Step 6: Profit!
In case any of the above works out, I’m hoping to get rankings and traffic back.
Now, you’re probably thinking…
Shane, you’re being an idiot.
No, I’m not. Well, I’m probably not.
You’re saying that to “recover” a site, you’ll move it to a new domain and rewrite some of it? That’s just creating a completely new site!
There is a significant difference: I already know that the content converts well and that the target keywords bring in lots of quality traffic. This makes it very different from starting a site completely from scratch.
Also, much of the content was ranking well on its own merits, without too much off-page SEO, before the sitewide slap came in. The slap was mainly caused by thousands of anchor-text links going to the sites.
Will you privately register the new domains and move them to their own c-class IPs, log in through proxies and wear a ninja-mask at all times?
I’m not trying to hide what I’m doing from Google. I also won’t rewrite the content to the extent where it isn’t clear that it’s mostly the same content. I’m trying to acknowledge that there are some issues with the old sites, fix those issues and start over. This should be cool with Google. But then again, who the heck can predict what Google is up to?
What if this doesn’t work?
Tough luck. This is a “live” case study, in that it’s happening as I’m writing this. I don’t get to cherry pick and only talk about good results. If this doesn’t work, then at least you’ll know what to avoid.
Can you provide more details on what exactly you’re doing?
Stay tuned for part two of this case study! I’ll be going into more detail on what exactly I’m up to and report on the results, as things progress.
To get a more general idea of what I’ll be up to, also check some of my recent posts such as the New Traffic Paradigm, this post about conversion rates and this one about mailing lists. The principles discussed in those posts, many of which emphasize the importance of not being dependent on Google alone, will play an important role in all the updates and changes I make to the sites for this case study.
Ultimately, unslapping the sites and getting Google traffic back is a secondary goal. The primary goal is to make the sites profitable again.
If you have any input, ideas, questions or comments about this case study, please let me know by leaving a comment!