I would like to share what I have found to be successful when it comes to post-Panda on-page optimization. Some of the strategies listed below are site-wide and some require drilling down to specific pages, all in an effort to improve your website's overall trust and relevance.
If your site has experienced a recent drop on Google, or you are wondering how to improve your on-page optimization, consider implementing the 10 strategies below.
1. Spell Check Top Traffic Pages – Spell check the top 10 ranking or most-trafficked pages of your site. It's probably better to spell check your whole site, but if time or software is an issue, at least spell check your top organic landing pages. Both Chrome and Firefox have extensions/add-ons to help you spell check specific pages. If you would like to spell check an entire website, you can use something such as Internet Marketing Ninjas' Free Online Spell Check Tool.
*This tip comes straight from Google's Webmaster Central Blog post, "More guidance on building high-quality sites," which asks, "Does this article have spelling, stylistic, or factual errors?" In fact, many of the strategies below come from that post.
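If you'd rather script a rough version of this yourself, the core idea is simple: pull the visible text out of a page and flag every word that isn't in a dictionary you supply. A minimal sketch (the word set passed in is a tiny stand-in; in practice you'd load a real dictionary file, and the tools mentioned above are far less crude):

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text from HTML, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False
    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def misspelled_words(html, dictionary):
    """Return words on the page that are absent from `dictionary`."""
    parser = TextExtractor()
    parser.feed(html)
    words = re.findall(r"[a-z']+", " ".join(parser.chunks).lower())
    return [w for w in words if w not in dictionary]
```

Run it over your top organic landing pages first, exactly as the tip suggests, and review the flagged words by hand (proper nouns and brand names will show up as false positives).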
2. Identify & Fix the Pages with the Worst Bounce Rates – Pick your top pages with the worst bounce rates. Ignore pages showing 100% bounce on only one or two visits; focus on pages that bring in at least 5% of your total traffic and have bounce rates higher than your site average.
When in Google Analytics, go to Content > Site Content > Landing Pages, then click on Secondary Dimension, go to Traffic Sources and select "Keywords". This way you can see which keyword/page combinations have the highest bounce rates. If you notice a combination with a high bounce rate, start thinking about why. Ask yourself, "If I were searching this term, what are all the possible things I could be looking for?" Then ask, "Does this page satisfy all those queries?" Make adjustments. Also try to figure out whether there is any way to entice visitors to click through to another page of your website. (Something more alluring than the back button.)
3. Clean Up All Broken Links, Redirect Links & Crawl Errors – Use Google Webmaster Tools, Screaming Frog, Raven Tools, Bing Webmaster Tools, or whatever tool-set you prefer to find all the broken URLs on your website. First, find any broken URLs that are still linked to from other pages within your website or on other sites you own, and change those links to point to the new URLs; then 301 redirect the old URLs to the new ones. (Or do it the other way around.) All that matters is that you clean up all internally linked broken URLs and recognized crawl errors so the bots can crawl your website smoothly, without any issues. While you are at it, make sure your XML sitemap isn't referencing any of your old/broken URLs, then resubmit it to Google.
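The check-every-link loop behind those tools can be sketched in a few lines. Below, the "is this URL broken?" test is injectable, so the reporting logic can be exercised offline; `http_broken` is a hypothetical helper name of mine, not part of any of the tools mentioned:

```python
import urllib.error
import urllib.request

def http_broken(url, timeout=5):
    """True if the URL returns an error status or is unreachable."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status >= 400
    except (urllib.error.URLError, ValueError):
        return True

def find_broken_internal_links(pages, is_broken):
    """Map each page to the links on it that point at broken URLs.

    `pages` maps a page URL to the list of URLs it links to;
    `is_broken` is a callable (in practice `http_broken`, here
    swappable so the sketch can run without network access).
    """
    report = {}
    for page, links in pages.items():
        dead = [url for url in links if is_broken(url)]
        if dead:
            report[page] = dead
    return report
```

The report tells you exactly which pages still carry links that need updating before (or after) you put the 301 redirects in place.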
4. Fatten Up Unindexed & Thin Content Pages – After you make those changes and resubmit your XML sitemap to GWT, record how many pages were indexed and how many weren't. Figure out which pages weren't indexed (either with a tool or a site: search in Google) and make sure those pages are 100% unique and have at least 250 words of content. In fact, make sure any page with only a little content has at least 250 words. This may be hard for some pages, like your contact page, but if you look around, you'll notice many top-quality sites have figured out ways to address that.
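A rough sketch of that thin-content audit, assuming you already have each page's HTML in hand. The 250-word floor is the threshold mentioned above, and "visible words" here is only a tag-stripping approximation, not a rendering-accurate count:

```python
import re

def visible_word_count(html):
    """Rough word count after stripping script/style blocks and tags."""
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return len(re.findall(r"\w+", text))

def thin_pages(pages, minimum=250):
    """Return URLs whose visible content falls below `minimum` words.

    `pages` maps each URL to its HTML source.
    """
    return [url for url, html in pages.items()
            if visible_word_count(html) < minimum]
```

Cross-reference the flagged URLs against your unindexed pages; the overlap is usually where the fattening-up effort pays off first.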
5. LSI Content Implementation – If you are targeting a certain key phrase, make sure the page targeting that phrase has all the necessary contextual terms to back it up. You can use Google Suggest, Ubersuggest, your analytics stats, keyword tools, etc. to help you identify which supporting terms you need. This strategy will also improve your overall visibility and your chances of ranking for long-tail phrases.
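One quick way to audit this is to check which of your supporting terms never actually appear in the page copy. A sketch, where the support terms are whatever list you gathered from the keyword tools above:

```python
import re

def missing_support_terms(html, support_terms):
    """Return the supporting terms that never appear in the page copy.

    Tags are stripped crudely; matching is a plain case-insensitive
    substring check, so it won't catch stemmed variants.
    """
    text = re.sub(r"(?s)<[^>]+>", " ", html).lower()
    return [t for t in support_terms if t.lower() not in text]
```

Anything the function returns is a candidate to work naturally into the copy; don't just bolt the terms on, or you trade a thin-content problem for a stuffing one.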
6. Google Authorship & Structured Data – We are all noticing that Google Authorship, rich snippets and structured data are taking over the SERPs. I don't think you can even type in a keyword without at least one result showing some form of microdata.
I'm not sure how much this affects your rankings, but there are many correlation studies suggesting +1s are a ranking factor, and author pictures and ratings clearly increase CTR. Either way, it is something I incorporated into all my recoveries. Plus, Google is making strides in encouraging webmasters to take advantage of it, through GWT and by creating structured data tools. It'd be crazy to ignore.
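For reference, the microdata mentioned above is just schema.org attributes layered onto your existing HTML. A minimal sketch of an Article marked up this way (all values are placeholders, and authorship itself is wired up separately, with a rel="author" link to your Google+ profile):

```html
<div itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Post-Panda On-Page Optimization</h1>
  <span itemprop="author">Jane Doe</span>
  <meta itemprop="datePublished" content="2013-06-01">
  <div itemprop="articleBody">
    <p>Article copy goes here...</p>
  </div>
</div>
```

Run the result through Google's structured data testing tool to confirm the properties are being picked up before you roll it out site-wide.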
7. Eliminate All Duplicate Content Issues – Run your website through an on-page analysis tool to identify all pages with duplicate content. I use Raven Tools' Site Auditor for this, and for duplicate, short, and long title tags and meta descriptions (for sites up to 1,000 pages). You could also use GWT for the latter (but not for duplicate content). Once you've cleaned up all duplicate content within your website, use a tool like PlagSpotter or Copyscape to find other websites with your content. Either send an email demanding that the content be removed, or change your content so that it's 100% unique and new. Depending on your business size, budget, legal issues and the cost of producing your content, you could find yourself debating whether to take further action. I had one webmaster respond, "What are you going to do about it? How about you change yours." Dang, really?
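For finding near-duplicates within your own site before reaching for the paid tools, comparing word shingles (overlapping k-word sequences) between pages works well as a first pass. A sketch:

```python
import re

def shingles(text, k=5):
    """Set of overlapping k-word sequences (shingles) from the text."""
    words = re.findall(r"\w+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Compare every pair of pages and review anything scoring above a threshold you pick (somewhere around 0.3–0.5 is a reasonable starting point for review, though that cutoff is my own rule of thumb, not a standard).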
8. Re-examine Google Webmaster Guidelines – You'd be surprised, if you revisited the whole Google Webmaster Guidelines document, how much of it has changed. In theory Google hasn't really changed its stance on the major factors; however, it's a completely new search engine, and the guidelines now specifically call out new and old factors alike. Make sure your website meets all the guidelines. You should be confident enough to invite a Google rep to visit your website and determine where it should rank.
9. Analyze Inbound Link Anchor Text & Bounce Rate – Use your backlink analysis tool to identify your keyword anchor text distribution. Record your highest non-branded anchor text percentages, then compare them in Google Analytics: check the bounce rates of the terms with the highest percentages. If a high percentage of your links use a certain keyword and visitors arriving via that keyword bounce heavily, that is an obvious red flag. You're pretty much telling Google, "Look, this is what I want to rank for, even though I'm not about that, and since my visitors say I'm not really all about that term, I guess I'm trying to trick you!"
An easy way I like to look at it is this: if you have 1,000 backlinks saying a certain term, you had better have a really low bounce rate for the next 1,000 visits that come via that term. Of course this isn't a true measurement of link value and keyword correlation; it's just something to keep in mind when optimizing your website's inbound linking and bounce rate correlation. I'll discuss in more detail how to tie this in with referral traffic and bounce rate in my post-Penguin tips post.
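Once you've exported a backlink list and per-keyword bounce rates, the sanity check described above is easy to automate. A sketch, where the 20% anchor share and 70% bounce thresholds are arbitrary illustrations of mine, not rules:

```python
from collections import Counter

def anchor_distribution(backlinks):
    """Percentage share of each anchor text across all backlinks."""
    counts = Counter(anchor.lower() for anchor in backlinks)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 1) for anchor, n in counts.items()}

def red_flags(backlinks, bounce_by_keyword,
              anchor_threshold=20.0, bounce_threshold=70.0):
    """Anchors that dominate the link profile while the matching
    keyword also bounces badly in analytics."""
    dist = anchor_distribution(backlinks)
    return [a for a, pct in dist.items()
            if pct >= anchor_threshold
            and bounce_by_keyword.get(a, 0) >= bounce_threshold]
```

Anything this flags is a keyword where your link profile promises something your landing page apparently doesn't deliver, which is exactly the mismatch the tip warns about.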
10. Revamp Outdated Content – If you have any content that is obviously outdated (either because it includes dates or because the tips/news are no longer applicable), consider revamping it. Replace that page with new content and move the old content to a different page (with a reference to it), add a disclaimer above the content noting any changes since it was written, or add a "last updated" note with a link to the latest version.
All that matters is that users who land on that page don't think your business is outdated, or that what they find is old and misleading. Also, when revamping the content, keep an eye out for LSI terms that are relevant today but weren't when you originally created it.
I’m on a roll… I might as well tell you the other 2 tips, even though I only promised 10.
Bonus Tip #1 – Make your content so good that it gets shared on social networks – I'm not going to elaborate on this. It's pretty straightforward, and we've heard it a hundred times before.
Bonus Tip #2 – Speed up your website – Another tip that I can’t take the credit for.
Google told us this…
“You may have heard that here at Google, we’re obsessed with speed in our products and on the web. As part of that effort, today we’re including a new signal in our search ranking algorithms: site speed.”
...plain and simple. They even created a tool to help us speed up our sites. That's how serious they are about page speed.
I'm still on the fence about this one, because I've had a couple of sites where that's all I fixed and, voilà, rankings improved, and several other websites where speeding things up made no difference at all. But what Google says, goes!
As you may have noticed, many of these strategies aren't rocket science or even "advanced" SEO techniques. Many of them are so basic we usually forget about them and skip right over them. However, the fact remains that Google has been telling us what it wants from a website for over a decade now. Even though techniques and strategies have evolved, the basics still remain.