A while ago I wrote about some websites that were hit by a Google Panda Update. One of our readers also took a beating, but has since made a full recovery and left some great comments about his experience. He shared so much I thought it deserved an entire blog post. So I asked him if he’d be willing to share his story with all of you. Thankfully, he obliged…
I am just a little guy with a hobby website that aims to help people cook Italian food like my Italian grandmother did. I grew up Italian American and was lucky enough to spend time with my grandmother and learn how to cook like she did. There are many grandma secrets to Italian cooking, which I reveal to help others.
I’ve worked hard on this website for over 10 years, slowly building unique content, and Google loved me. Google and I had a nice relationship. Then came the Panda, in April 2011. That is when Google broke up with me. Slapped me hard in the face and just left me standing alone and heartbroken. I did not know what I said or did to upset Google so. Google just up and left me. I lost 80% of my traffic overnight. Poof!
Once I picked myself up off the floor, I started to research just what the heck was going on! It was like a nightmare that I know a lot of other webmasters have gone through. There was much speculation across the forums but mostly, everyone was clueless about the Panda. What to do?
It was unbelievable what I had to go through to recover. In the end all the hard work was worth it. But during the process I thought I was going to go crazy and just totally lose it! I almost gave up completely and just ditched the entire website, but then something inside me said, “No way am I going to let 10 years of hard work just disappear!” I felt like Rocky getting ready for the big fight. A complete underdog going against a champion!
The good news for me is I was officially released from the Panda in April 2012, almost exactly one year later.
Panda 2.0 is when I got slapped and I was not released until Panda 3.5.
I have laid out “some” of the steps I took during that horrible year of trial and testing to make this happen. Maybe this will help others and save them from Panda and Penguin hell.
What was so painful about this slap is that I felt I was unfairly slapped. After reading all the articles about what might cause a Google Panda slap, I realized I was doing none of those things. I had all original content and was not practicing any evil black hat SEO stuff.
In the end I did see what may have caused the slap. Part of it was Google’s fault and part of it was mine. (I will share what was my fault below.) One of the biggest problems I was running into was that as soon as Panda came out, Google could no longer tell who the authors of the original content were, or at least was having a very hard time doing so. Spun articles and copied content had gotten so bad that Google really had a hard time figuring out who the original authors were. In many cases, Google was giving original author credit to newer articles which were copies of my original content! Wow!
So here we go, some steps I went through to get released from the Panda:
1.) I condensed 500 pages down to 70 content-packed pages! – This was a massive undertaking that nearly took me out completely! Innocently, I had a lot of duplicate content because of the nature of my content and the way I was presenting it. I had a lot of recipe photos with commentary that were divided up into parts: part 1, tons of photos with commentary; part 2, more of the same; part 3, more of the same; etc. Each part was a separate page. So condensing all the separate parts into one long page was an important change.
2.) I don’t have a single page on my site with thin content – I carefully went through every single page on my website and threw out any pages that had thin content. Especially pages where the template navigation was more than the actual page content! Also see step 1 above. That helped me get rid of thin content as well.
3.) I shaved off a whole bunch of advertising, no above-the-fold advertising – This definitely cost me a lot of income from the site, but better to work on the traffic and then figure out the income later. I have since been able to “carefully” add advertising back on the pages, but slowly and not too much. I was guilty of having too much advertising on my pages. Yes, greed was there.
4.) I worked feverishly to speed up ALL pages on the site – I used page speed testing tools extensively on every single page of my site, which helped me a great deal by flagging areas that were slowing things down.
5.) I cleaned up my old HTML and used CSS formatting for all my text and menus – This was a problem of my own doing. I had very messy code because the site was so old. I just kept updating along the way with different HTML programs, never really overhauling all the code. So I had a bit of work to do cleaning up all my old code. This also helped a lot in the area of speed.
6.) I added a custom 404 page and 301 redirects. (THIS WAS EXTREMELY IMPORTANT!) – For those who do not know about this: a custom 404 page catches visitors who are trying to reach a page that no longer exists on the site and helps them navigate to the new content. Serving a proper 404 status also helps Google quickly figure out which pages I have done away with. This was something I was lacking and was a major help! It was also extremely important because I had deleted so many pages. Hundreds of deleted pages! I needed to make sure Google figured that out quickly!
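On an Apache shared host, a custom 404 page usually comes down to one line in the site’s .htaccess file. A minimal sketch (the page path is a placeholder, not the author’s actual file):

```apache
# Serve a custom "not found" page for any missing URL
# (/not-found.html is a hypothetical path)
ErrorDocument 404 /not-found.html
```

The key detail is that the missing URLs keep returning a real 404 status; redirecting them all to the homepage would look like a "soft 404" and defeat the purpose of telling Google those pages are gone.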
I also added a 301 redirect. A 301 redirect points everything to the non-www version of my domain or vice versa. In my case I picked the non-www version because that is what most of my backlinks were pointing to. This was a huge fix as well. I did not even know this was an issue until much studying up on the subject. Most shared webhosting packages keep both the www and non-www versions of the domain live, which produces duplicate content in the eyes of Google. Extremely important to get this fixed! You also want to 301 redirect any index.html page (or whatever your homepage format is) to just http://domain.com. Again, this was a duplicate content problem: for example, http://domain.com/index.html and http://domain.com were being indexed as two separate pages. The 301 redirect cures this problem. This is called “canonicalization”. Here is how Matt Cutts breaks it down: SEO advice: url canonicalization.
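On an Apache host with mod_rewrite enabled, the two redirects described above might look roughly like this in .htaccess (domain.com stands in for your actual domain; this is a sketch, not the author’s exact rules):

```apache
RewriteEngine On

# 301 redirect the www version to the non-www version of the domain
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]

# 301 redirect /index.html to the bare homepage URL
RewriteRule ^index\.html$ http://domain.com/ [R=301,L]
```

If you pick the www version instead, the same idea applies with the condition and target swapped.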
Google is trying to make the internet smaller so it is more manageable to index. If you think about it, you could cut the number of duplicate URLs to index in half if everyone picked a preferred domain and redirected to it. I personally think this is one of their hidden agendas.
7.) I optimized every single image on the site – A laborious process that was worth the effort for faster loading pages.
8.) I found all the websites/blogs I could that had copied my content and had them remove it – I cannot explain how painful this was to correct. This took months of hard work! The biggest culprits were blogs created on GoDaddy servers. There was no contact information on the blogs, so I had to work directly with the webhost provider. It took several emails and a lot of patience for each blog in order to get the copied content removed. This was all necessary so I could get Google to recognize that I was actually the original author!
9.) I had to get very social! – I created a Facebook page and a Google+ page for the website and have been active on both, heavily active on the Facebook page. I have a static website and was not about to redesign the entire site into a new blog format so there could be comment interaction, etc. So to get the static website social, I had to create social pages that were interconnected with my static website and carefully add social buttons throughout the site. I had to figure out how much social to add. Too much of this “on page social activity” can slow down page load times significantly.
10.) Added noindex to many pages that I felt Google did not need to index – This was a trial-and-error process, and also a lot of research to figure out what Google considers to be a page with thin content.
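On a static site, keeping a page out of the index usually means adding a robots meta tag to that page’s head. A minimal example of the kind of tag involved:

```html
<!-- In the <head> of any page search engines should skip:
     do not index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">
</html>
```

Using “noindex, follow” (rather than “noindex, nofollow”) lets crawlers still pass through to the pages it links to.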
11.) De-optimized where I had over-optimized – I also worked hard on de-optimizing content where I had over-optimized in the past for SEO purposes. I was having great success with SEO optimization, so naturally I got greedy and kept on optimizing. This was one of the things that slapped me! I was too keyword-dense on many pages. I had to fix this. Google now much prefers natural content.
12.) Added rel=”author” tags and linked to my Google+ account for original authorship – Matt Cutts explains the rel=”author” tag here: Matt Cutts of Google on rel=”author”. This was extremely helpful in getting Google to see me as the original author of my content. Also some good information on this here: Google Webmaster Tools Authorship and Using Authorship to Stand Out in Google SERPs.
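At the time, this markup amounted to a link from each content page to the author’s Google+ profile, with the profile linking back to the site from its “Contributor to” section. A sketch of the on-page side (the numeric profile ID below is a made-up placeholder):

```html
<!-- On each content page: link to the author's Google+ profile
     with rel="author" (the profile ID is a placeholder) -->
<a href="https://plus.google.com/112233445566778899000?rel=author">Anthony</a>
```

The two-way link (page to profile, profile back to site) is what let Google tie the content to a verified author.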
At first my reaction to the Google Panda slap was anger and frustration. However, now being on the other side of this I am very grateful to Google for this. It forced me to overhaul my website from the ground up which it definitely needed.
Happy Cooking, Happy Times and Share The Love! ~ Anthony