Some Research Using Webmaster Tools


Post 22 of 100 posts

On day 20 of this blog challenge I was dismayed to find that my Google ranking had actually dropped lower than it had been before I started!

So I hatched a three-point plan to get to the bottom of the issue. The first part of the plan was to look at Google's Webmaster Tools and see if there was any information there that could help me understand the reason for the drop.

The first thing to check is Site Messages: have Google sent me any warnings about penalties? Answer: No. Which is what I expected, because I don't participate in any dodgy SEO practices.

Second, do they have any HTML Improvement suggestions? Answer: No. To quote them: "We didn't detect any content issues with your site". So it would appear that everything is hunky dory with the underlying code as far as Google is concerned.

Next, double-check Site Settings. This is where you tell Google the preferred name of your website, i.e. with or without the www prefix. You need to set this, or Google will treat the two addresses as different sites with exactly the same content, and you can be penalised for duplicate content. I always go for the one without www, it's just extra letters to write! Anyway, yes, that setting was still set as it should be.
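The www/non-www canonicalisation above is usually enforced with a permanent redirect on the server. As a minimal sketch (the function name and example.com domain are my own, purely for illustration), the redirect target can be computed like this:

```python
def canonical_url(host, path):
    """Build the redirect target for a request: strip a leading 'www.'
    so that every address lands on the bare (preferred) domain."""
    if host.startswith("www."):
        host = host[len("www."):]
    return f"http://{host}{path}"

# A request to www.example.com/about would be 301-redirected to:
print(canonical_url("www.example.com", "/about"))  # http://example.com/about
```

In practice you would wire this into your web server's redirect rules (e.g. a 301 redirect in Apache or nginx config) rather than application code, but the mapping is the same.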

Next, let's look at the crawler information.

First, have there been any Crawl Errors? Answer: No. No DNS errors, no web server connectivity issues, and no problems fetching robots.txt in the last 90 days. So no issues there.

Then the Crawl Stats: the site has been visited on a regular basis, so that looks OK. Ah ha! Now then, what is this? The third graph, "Time spent downloading a page", shows a definite rise at the start of May: from an average of about 600 milliseconds before May to roughly double that, 1200 milliseconds, during May. Could this be the culprit? I had my suspicions that the site's speed might have something to do with it, and I am going to look into that in more detail tomorrow.

Next, Security Issues. And I'm pleased to read "we haven't detected any security issues with your site's content". Good!

OK, now let's see what Sitemap information is held in Webmaster Tools. (The sitemap is an XML file listing all the web pages on your site that you want indexing.) It says it processed my sitemap quite recently, on May 21st, with 76 pages Submitted and only 36 Indexed. So, a glimmer of hope: I've added a lot of pages and Google knows about them, but maybe it hasn't processed them all yet?
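For anyone who hasn't seen one, a sitemap is just a list of <url> entries in the sitemaps.org namespace. Here's a minimal sketch (with a made-up example.com sitemap) that counts the entries — i.e. the "Submitted" figure Webmaster Tools reports:

```python
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace, per the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text):
    """Count the <url> entries in a sitemap document."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f"{SITEMAP_NS}url"))

# A tiny example sitemap with two pages:
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/blog/</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 2
```

In my case the real sitemap would show 76 such entries, of which Google has so far chosen to index 36.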

But then when I look at Google Index => Index Status it shows 67 pages indexed. Index Status shows all the pages associated with my site, yet only 36 are indexed from my sitemap. So what are all the others? I needed to investigate.

To find out I used the Google search operator site:. When you follow this with the URL of a site, Google will only return results from within that site. For example, a search for site:bbc.co.uk website design would only return pages from the BBC's website containing the words "website" and "design" (178,000 of them). And if you just search for site:bbc.co.uk on its own, it should show you how many pages there are on the BBC's website (18,800,000, blimey!).

So I did a site: search on my own domain and was surprised to get 147 results. That's a lot more than the Index Status number. So what were all those extra URLs?

Well, when developing other sites I have sometimes built them as subdomains of my own domain, just for convenience. But what I should have done is put a noindex meta tag in the header of each site while it was being developed, so that Google would ignore it. I overlooked this at the time for several clients. So now I had some old, outdated, in some cases blank pages associated with my domain. Not good.
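The tag I should have added is the standard robots meta tag, <meta name="robots" content="noindex">, in each page's <head>. As a minimal sketch (class and function names are my own), here's how you might check a page's HTML for it, using only Python's standard library:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Scans HTML for a <meta name="robots"> tag whose content
    includes the 'noindex' directive."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name", "") or "").lower() == "robots" and \
               "noindex" in (attrs.get("content", "") or "").lower():
                self.noindex = True

def has_noindex(html):
    """Return True if the page tells search engines not to index it."""
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex

page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
print(has_noindex(page))  # True
```

Running something like this over a development subdomain before launch would have caught my oversight.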

So I took action. I went through all the links and set up redirects so that they point at a relevant page on my site rather than at blank stub pages. I don't know if this will have any effect on my Google ranking, because, as stated earlier, subdomain addresses are regarded as separate websites, so I don't think it will, but it needed doing anyway for tidiness.

So, tomorrow, I shall look into the issue of my site's speed.

The daily efforts of a web developer

What has kept this web developer awake today?

4 hours 5 minutes Arranging user permissions on a client's website
3 hours 43 minutes Writing and updating journal entries and tinkering with my own website
6 minutes Hootsuite, not so distracting
3 minutes Email
1 minute Arranging user permissions on another client's website

Total: 8 hours 3 minutes

Today seemed very bitty, and I seemed to get distracted a lot, but not by Twitter: by ideas :-)

Exercise: 1 hour of yoga. Om.

Tomorrow: As stated above I am going to investigate website speed on my site and other people's.



Author: Mike Nuttall

Mike has been web designing, programming and building web applications in Leeds for many years. He founded Onsitenow in 2009 and has been helping clients turn business ideas into on-line reality ever since. Mike can be followed on Twitter and has a profile on Google+.

  1. Amy

    May 25, 2014 at 12:04 AM

    What an interesting breakdown. Not great to hear about the ranking drop however. Was there an update that has coincided with the 100 blogs challenge? Hope your tweaks work. Will look forward to further updates!

  2. Mike Nuttall

    May 25, 2014 at 06:45 AM

    Hi Amy,

I don't know if there was a major Google algorithm update that coincided with the #100blogs, I should check that out too! That would be a bit of a coincidence, but yes, worth checking out, for sure....

So I just checked an algorithm-change tracker and there is a suggestion that there was some kind of update on March 24/25, which is a while before I started.

So it looks more like something I have done to the site, and I do come across something in the follow-on post from this one.