Deoptimising - a new approach to SEO

I'll start this post with the assumption that you have experienced problems with Google's new ranking algorithms, i.e. sometimes your website appears at the end of the search results (around position 950), at position 6, 30 or 60, or its pages are listed as supplemental. A possible reason is that it has lost its TrustRank and has the so-called -950 penalty applied at runtime. Here is what to do in order to escape it.

Check whether you need to deoptimize:
If you want to see all the non-supplemental pages from your site, just type into Google:

Then, to see just the supplemental pages from your site, type:

Keep the ratio of supplemental to total pages below 50%.

Also try the automated supplemental ratio tool:
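To make the 50% rule concrete, here's a tiny Python sketch that computes the ratio from the two result counts you read off the site: searches above. The page counts below are placeholders, not real measurements.

```python
def supplemental_ratio(supplemental_pages: int, total_pages: int) -> float:
    """Return the share of supplemental pages as a percentage."""
    if total_pages == 0:
        return 0.0
    return 100.0 * supplemental_pages / total_pages

# Placeholder counts: 120 supplemental results out of 400 total results.
ratio = supplemental_ratio(supplemental_pages=120, total_pages=400)
print(f"Supplemental ratio: {ratio:.1f}%")  # aim to keep this below 50%
```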

What to do next:
1. Validate your website's HTML (e.g. with the W3C markup validator).

2. If you use plenty of H1, H2, H3 tags, remove most of them, or replace 'H1' with a styled 'H2', or with 'span' and 'strong' tags.

3. Don't use the same text in your 'title', 'h1' and 'meta description' tags.
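As a rough way to audit this, you could compare the three fields programmatically. The sketch below is regex-based and illustrative only (a real audit would want a proper HTML parser, and it assumes the meta tag's attributes appear in name-then-content order); the sample page is made up.

```python
import re

def extract_metadata(html: str) -> dict:
    """Pull the title, first h1 and meta description out of a page."""
    def first(pattern: str) -> str:
        m = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
        return m.group(1).strip().lower() if m else ""
    return {
        "title": first(r"<title>(.*?)</title>"),
        "h1": first(r"<h1[^>]*>(.*?)</h1>"),
        # assumes name= comes before content=, which is not guaranteed
        "description": first(r'<meta\s+name="description"\s+content="(.*?)"'),
    }

def has_duplicate_metadata(html: str) -> bool:
    """True if any two of the three fields carry the same text."""
    values = [v for v in extract_metadata(html).values() if v]
    return len(values) != len(set(values))

# Made-up page that repeats the same text in all three places.
page = """<html><head><title>Blue Widgets</title>
<meta name="description" content="Blue Widgets"></head>
<body><h1>Blue Widgets</h1></body></html>"""
print(has_duplicate_metadata(page))  # True: all three repeat the same text
```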

4. Your site's internal navigation uses the same linking structure on every page; using identical 'title' attributes in that menu throughout your site is considered spam.

5. Your affiliate/referral links should differ in anchor text. Check that their 'title' attributes are unique and avoid keyword stuffing there.
Pay special attention to 'title' and 'alt' attributes: if they are overstuffed, Googlebot will just place the first few lines of your page as the description in its search results, which turns out to be your repeated website heading information.
Solution: examine what your search results look like (use the site: operator), see exactly what Google has indexed, and make changes accordingly, i.e. trim the 'title' attributes.

6. Remove any static content from the bottom of your website's pages, especially outbound links.

7. Check whether your affiliate links are thematic. Remove those that are not related to your site's theme, or add rel="nofollow" to them. If your site displays RSS feeds, be sure to add rel="nofollow" to those links as well.
Update: Try this tool to find out whether you are linking to bad-neighbourhood websites:
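If you have many outbound links, a bulk rewrite can save time. This is a hedged, regex-based sketch only: MY_DOMAIN and the sample link are hypothetical, and regex HTML rewriting is fragile, so treat it as an illustration rather than a production tool.

```python
import re

MY_DOMAIN = "example.com"  # hypothetical; replace with your own domain

def nofollow_external_links(html: str) -> str:
    """Add rel="nofollow" to every absolute outbound <a> tag."""
    def rewrite(match: re.Match) -> str:
        tag = match.group(0)
        if MY_DOMAIN in tag or "rel=" in tag:
            return tag  # internal link, or rel already set: leave as-is
        return tag[:-1] + ' rel="nofollow">'  # re-open the closing '>'
    # Only absolute http(s) links are considered; relative links are internal.
    return re.sub(r'<a\s[^>]*href="https?://[^"]+"[^>]*>', rewrite, html)

html = '<a href="http://partner-site.net/offer">deal</a>'
print(nofollow_external_links(html))
# -> <a href="http://partner-site.net/offer" rel="nofollow">deal</a>
```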

8. Lower your content's keyword density
Keyword density is an important factor if you're serious about SEO. Once you have great content, the crucial part is being able to present it to the right audience.
Having a keyword density above a threshold of, for example, 2-4% will mark your content as thin, and it won't compete with other websites in the SERPs. You can check your website with my keyword density analyzer, and I hope it helps you find the ideal keyword density percentage. Any comments and suggestions are welcome.

- Check your navigation (post archives): too many links to posts with keywords in their titles raise the page's overall keyword density, so be careful.
- When using forms, also look over your hidden field values: do not use keywords there - it's an easily overlooked issue.

- Also don't forget to check whether your content could be detected as potential spam:
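A minimal sketch of the density check described in step 8: count how often each word occurs relative to the total word count and flag anything above the 2-4% range mentioned above. The sample text is made up to show an obviously stuffed page.

```python
import re
from collections import Counter

def keyword_density(text: str) -> dict:
    """Map each word to its share of the total word count, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {word: 100.0 * n / total for word, n in Counter(words).items()}

def overstuffed(text: str, threshold: float = 4.0) -> list:
    """Words whose density exceeds the given percentage threshold."""
    return [w for w, d in keyword_density(text).items() if d > threshold]

# Made-up, deliberately stuffed sample (10 words, 'cheap' appears 3 times).
sample = "cheap widgets, cheap widgets for sale, buy cheap widgets today"
print(keyword_density(sample)["cheap"])     # 30.0 - far above the 2-4% range
print(overstuffed(sample, threshold=20.0))  # ['cheap', 'widgets']
```

A tiny sample like this inflates every word's density, so the 20% threshold in the example is just for demonstration; on a real page the default 4% is the interesting line.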

9. Limit the usage of 'title' attributes in your <a href> tags, as well as <b>/<strong> tags - they add weight to the content presented in the SERPs.

These might sound like drastic measures, but I've already managed to rescue 3 websites using the above techniques. So experiment and see what happens. Wait, and hopefully you'll soon be out of Google's supplemental index too.

For more information look at:

Is your website in Google's sandbox?

Duplicate website content and how to avoid it

Update: If you've tried everything above but still aren't satisfied with the results:
1. Load your navigation/advertising section via an AJAX request, not purely via JavaScript.

2. Check Google's Webmaster Tools and fix any potential duplicate-content issues.

3. Check all of your subdomains for supplemental results and fix them as soon as possible.

by Nevyan Neykov


  1. Anonymous 4:24 PM

    Great tips, worked for me. My site was suffering from the -950 penalty for 7 days... I took your advice and boom, back higher than ever - thanks!

  2. I'm glad that it has helped you!