With so many sites affected by Google’s Panda algorithm, I thought I’d share some insight into the techniques I used to pinpoint the incident and the actions we took to recover traffic.
Panda, as you may know, is the code-name of one of Google’s recent updates to its ranking algorithm. Its primary focus is on sites heavy with advertising, sites with what Google considers low-quality content, and sites trying to fool the system with keyword stuffing. However, it affected many legitimate sites as well, as was the case with a site I was asked to help with.
The site was using many of the standard SEO practices that were the norm, prior to Panda and Penguin that is. The content, and its reuse, was legitimate: it was there to provide guidance and support to users as they navigated the website, and SEO keyword phrases were used to help build solid PageRank without making the content read unnaturally. So far so good, until we noticed a sudden dip in traffic that didn’t go away. In hindsight, a sure sign that you’ve been hit. But before jumping to conclusions, it’s best to rule out other causes: check with your hosting provider, look for broken links, review backlinks, and the like.
So after ruling out anything server-related and basic site maintenance, we started researching whether there was a way to identify when Google released algorithm changes. If we could match a release with our traffic loss, it would be a very good sign that the site was indeed a Panda victim. We did find a number of sites that track the releases, but the one I found most helpful was:
By matching the dip in traffic with the releases from Google, we were able to make a correlation. Just a word of caution, though: in researching this, it seems Google will often make changes and then roll them back without notice … just testing the waters, I guess.
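If you want to do this check programmatically rather than by eyeballing charts, here is a minimal sketch of the idea. It assumes a daily-traffic CSV export from your analytics tool (hypothetical columns `date` and `visits`) and a hand-maintained list of published update dates; it simply flags any update date that is followed by a sharp week-over-week drop.

```python
import csv
from datetime import datetime, timedelta

# Hypothetical CSV export from your analytics tool: date,visits
def load_traffic(path):
    traffic = {}
    with open(path) as f:
        for row in csv.DictReader(f):
            traffic[datetime.strptime(row["date"], "%Y-%m-%d").date()] = int(row["visits"])
    return traffic

def weekly_average(traffic, end_date, days=7):
    window = [traffic.get(end_date - timedelta(d), 0) for d in range(days)]
    return sum(window) / days

def flag_suspect_updates(traffic, update_dates, drop_threshold=0.25):
    """Flag algorithm-update dates where traffic fell sharply the following week."""
    suspects = []
    for update in update_dates:
        before = weekly_average(traffic, update)
        after = weekly_average(traffic, update + timedelta(days=7))
        if before and (before - after) / before >= drop_threshold:
            suspects.append((update, before, after))
    return suspects

if __name__ == "__main__":
    traffic = load_traffic("daily_visits.csv")
    # Update dates gathered by hand from the published change logs (illustrative only)
    updates = [datetime(2011, 2, 24).date(), datetime(2011, 4, 11).date()]
    for update, before, after in flag_suspect_updates(traffic, updates):
        print(f"{update}: weekly average fell from {before:.0f} to {after:.0f}")
```

Nothing fancy, but it turns “the dip looks like it lines up” into a number you can defend.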
Now that we had a clear hit on the time frame, it was time to get to work on a recovery.
Recovery Steps
In research on Panda, we identified some clear site issues:
- Overuse of SEO phrases
- Reuse of support content
- A few suspicious backlinks
We believe these three were the primary culprits.
So what to do? From researching others who’ve been hit, it seems there is limited success in recovering traffic. So, with some trepidation, we put our plan together and got to work.
Site Updates
First we reviewed the entire site, following the typical call-to-action path we designed for the user, and noted every place with similar or reused content. Then we reviewed all pages for our target keyword phrases, listing each phrase, where it appeared, and how those pages compared with the ones already identified. This gave us a complete list of pages to work on.
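A simple way to build that list programmatically is to pull the text of each page and score both the overlap between pages and the density of your target phrases. The sketch below assumes a small, hand-maintained list of URLs and phrases (the `example.com` addresses are placeholders); it uses `difflib` for a rough similarity ratio, which is crude but good enough to surface the worst offenders.

```python
import difflib
import re
from urllib.request import urlopen

def page_text(url):
    """Fetch a page and crudely strip tags; a real audit would use an HTML parser."""
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    return re.sub(r"<[^>]+>", " ", html).lower()

def audit(urls, phrases, similarity_threshold=0.6):
    texts = {url: page_text(url) for url in urls}

    # Pages whose body text is largely the same (reused support content)
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            ratio = difflib.SequenceMatcher(None, texts[a], texts[b]).ratio()
            if ratio >= similarity_threshold:
                print(f"REUSED  {a} <-> {b}  (similarity {ratio:.2f})")

    # Pages where a target phrase appears suspiciously often
    for url, text in texts.items():
        words = max(len(text.split()), 1)
        for phrase in phrases:
            count = text.count(phrase.lower())
            density = count * len(phrase.split()) / words
            if density > 0.03:  # arbitrary cut-off, tune for your own site
                print(f"HEAVY   {url}  '{phrase}' x{count} ({density:.1%} of words)")

audit(
    ["https://example.com/", "https://example.com/help", "https://example.com/pricing"],
    ["widget repair service", "best widget repair"],
)
```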
To limit redundant content, we created a single help site for the process the user needed to follow, referenced back to it from all relevant pages, and removed the redundant content. Then, on all the marked pages where the SEO looked heavy, we spent some quality time rewriting the content. The focus was on maintaining readability and usefulness while sprinkling key phrases judiciously, and on each page we limited the repetition of key SEO phrases. We also reviewed all backlinks, removing any that looked like they might run afoul of Panda.
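For the backlink pass we worked from an exported list of inbound links (Google Webmaster Tools and most SEO tools can produce one). Here is a sketch of the kind of triage script you could use, with hypothetical CSV columns `source_url` and `anchor_text`: it flags links whose anchor text exactly matches one of your money phrases, and domains that link to you an unusual number of times, both common red flags in this era.

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def triage_backlinks(path, money_phrases):
    """Read an exported backlink CSV (hypothetical columns: source_url, anchor_text)
    and flag exact-match anchors plus domains that link to us unusually often."""
    domains = Counter()
    flagged = []
    with open(path) as f:
        for row in csv.DictReader(f):
            domains[urlparse(row["source_url"]).netloc] += 1
            if row["anchor_text"].strip().lower() in money_phrases:
                flagged.append((row["source_url"], row["anchor_text"]))

    print("Exact-match anchors to review:")
    for url, anchor in flagged:
        print(f"  {url}  ->  '{anchor}'")

    print("\nDomains with many links (possible link networks):")
    for domain, count in domains.most_common(10):
        if count > 5:
            print(f"  {domain}: {count} links")

triage_backlinks("backlinks.csv", {"widget repair service", "best widget repair"})
```

The output is just a worklist; deciding which links to remove or disavow is still a judgment call.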
Finally, we resubmitted our sitemap to Google (and to Bing and Yahoo as well).
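Resubmitting through each engine’s webmaster console works fine, but it can also be scripted. A minimal sketch, assuming the sitemap “ping” endpoints each engine documents for this purpose (check the current webmaster documentation before relying on them):

```python
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://example.com/sitemap.xml"  # your sitemap location

# Sitemap ping endpoints as documented by the engines; verify before use.
PING_ENDPOINTS = [
    "https://www.google.com/ping?sitemap={}",
    "https://www.bing.com/ping?sitemap={}",
]

for endpoint in PING_ENDPOINTS:
    url = endpoint.format(quote(SITEMAP_URL, safe=""))
    with urlopen(url) as response:
        print(f"{url} -> HTTP {response.status}")
```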
The Results
While it took some time, several weeks in fact, we did finally start to reverse the loss in traffic. After a few months, about 75% of the lost traffic was back. We continue to monitor Google’s changes in the hope of staying in front of future issues. So while not perfect, we did have some good results.
On Your Own
While there are many good references for recovering from, and avoiding, a Panda hit, below is one that I found useful, from WordTracker:
I also found this piece from SEO Theory useful for a deeper understanding of the Panda algorithm: