How to recover when your website has been de-indexed

While searching for e-commerce news this week, I found a problem that gave me a little laugh. The Australian government’s site advising businesses on how to get online has a major flaw in its own portal. Do a search for the site and you’ll find it has dropped in Google’s rankings. Look at the listing and you’ll see a message from Google saying it can’t display the site’s description because of the site’s robots.txt file. In other words, the site has blocked Google with its own settings. A site that should be #1 in the rankings is currently sitting at sixth, and is likely to drop further.

It’s a good idea to search for your own site at least weekly, because if it suddenly starts dropping in the rankings you’ll want to know why. A blocking directive in your robots.txt file denies all robots the ability to crawl your content, and that includes Google’s crawler. You don’t even have to add the directive deliberately. Some plugins write to robots.txt as part of their setup, so you could have the problem without ever knowing it. You’ll only know when you see the bad results.
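To illustrate, a robots.txt that blocks everything typically looks like the sketch below. The directive names are part of the robots.txt convention; the file itself is an example, not taken from the site in question. The wildcard user-agent matches every crawler, and the slash tells them the whole site is off limits.

    # robots.txt -- this version blocks ALL crawlers, including Google's
    User-agent: *
    Disallow: /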

Luckily, the check is simple. Do a search once a week for your site using Google’s site: operator (for example, site:yourdomain.com). Your site should come up first in the results. If you’re not first on the page, you’ve got a problem: it’s a sign that Google either doesn’t like your site or can’t access it.
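If you’d rather not rely on remembering a weekly manual search, the check can be automated. Here’s a minimal Python sketch using the standard library’s urllib.robotparser to test whether Googlebot is allowed to crawl your home page; example.com is a placeholder for your own domain.

    import urllib.robotparser

    SITE = "https://www.example.com"  # placeholder -- substitute your own domain

    # Fetch and parse the site's robots.txt file
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(SITE + "/robots.txt")
    rp.read()

    # Ask whether Google's crawler may fetch the home page
    if rp.can_fetch("Googlebot", SITE + "/"):
        print("Googlebot is allowed to crawl the home page.")
    else:
        print("WARNING: robots.txt is blocking Googlebot -- check your site!")

Run on a schedule (say, a daily cron job), a script like this flags the problem long before you notice your rankings slipping.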

The good news is that repairing the damage is a quick process. One of my clients went from 12th to sixth in a single day just by removing the blocking directive from their robots.txt file. If you do accidentally damage your site’s rankings, the damage is easy to fix if you catch it right away.
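For comparison, here is a sketch of a harmless robots.txt that lets everything through. An empty Disallow line means nothing is blocked, so every crawler can reach every page.

    # robots.txt -- this version allows all crawlers
    User-agent: *
    Disallow: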

For more information, visit the StewArt Media website.

Jim Stewart is a leading expert in search engine optimisation. His business, StewArt Media, has worked with clients including Mars, M2 and the City of Melbourne.
