Reviewing My Website With Google's Webmaster Guidelines Part 3

27th day of the epic 100 posts in 100 days challenge

This is my third session of working through Google's Webmaster Guidelines and checking over the health of my website.

I have worked through points 1 to 5 of the Design and content guidelines in previous posts: part 1 & part 2.

6. "Use text to display important information, and use the 'alt' tag in images it convey information about the image"

I'm pretty safe on this one; it's a habit I have been in ever since I started building websites. That's especially true because one of my first jobs was at Leeds University, where they had very strict rules about accessibility, i.e. websites had to be made as accessible as possible for students with disabilities. One of those guidelines was not to use images for navigation or important information, and always to use the alt tag. It's just the right thing to do, and in the UK it is enshrined in law in SENDA, the Special Educational Needs and Disability Act 2001.

This gives me another idea for a tool I could write: one that scans a website's image tags and checks/outputs all their alt attributes. There must be stuff out there that does this already (but it's nice to have your own too), so I need to do some research...
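As a starting point, a minimal sketch of such a tool might look something like this, assuming Python with the requests and beautifulsoup4 packages installed (the URL is just a placeholder):

# Minimal sketch: list every <img> on a page and report the state of its alt attribute.
# Assumes Python 3 with the requests and beautifulsoup4 packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # hypothetical page to scan

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for img in soup.find_all("img"):
    src = img.get("src", "(no src)")
    alt = img.get("alt")
    if alt is None:
        print(f"MISSING alt: {src}")
    elif not alt.strip():
        print(f"EMPTY alt:   {src}")
    else:
        print(f"OK ({alt!r}): {src}")

Run against a single page it just lists every image and whether its alt text is present; a proper tool would crawl the whole site.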

7. "Make sure that your <title> elements and ALT attributes are descriptive and accurate."

This is another habit I have acquired from having built websites for so many years.

8. "Check for broken links and correct HTML."

The second part of this is something I used to spend a lot of time on. You can validate your HTML with the W3C Validation Service: you just enter your website's address and it goes through your HTML code, checking that it is all correct.

It used to be a real pain in the neck. You would build a website and spend hours making sure it worked in all the different web browsers (a pain-in-the-neck task in itself). Then you would validate your code with the W3C and find lots of errors. You would correct the errors, then find that the site didn't work in Internet Explorer, so you'd sort that out, then it would stop working in Safari. Finally you would get them all working again, re-validate, and you'd have more errors! It was a frustrating vicious circle. So the best policy was to start validating early on and keep doing it on a regular basis, so you catch errors early, before they become intractable.
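These days you can also script the check rather than pasting your address into the web form. This is only a rough sketch, assuming the W3C Nu HTML Checker's public endpoint at validator.w3.org/nu/ still accepts a doc parameter with JSON output (the URL is a placeholder):

# Rough sketch: ask the W3C Nu HTML Checker to validate a page and print its messages.
# Assumes the public endpoint and its doc / out=json parameters; the URL is a placeholder.
import requests

URL = "https://example.com/"  # hypothetical page to validate

resp = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": URL, "out": "json"},
    headers={"User-Agent": "site-health-check (personal tool)"},
    timeout=30,
)
for msg in resp.json().get("messages", []):
    print(msg.get("type"), "-", msg.get("message"))

An empty messages list means a clean pass, which makes it easy to bolt this onto a regular scheduled check.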

Then I saw this video "Does Googlebot Care About Valid HTML?" from Matt Cutts, head of Google's Webspam team:

 

He basically says that there are already too many sites that have good content but don't have perfectly valid HTML, and that if Google penalized them it would be losing good, useful content from its search results, so it doesn't. He says there are other good reasons for having valid HTML, but if you have a few errors you won't be penalized in the search rankings for it.

So I became a bit lax about checking, and I can't remember the last time I checked my site, so it was with some trepidation that I went to the validator... and... success! My site passed inspection without any errors, which was rather pleasing after quite a while of not checking and having made quite a few changes.

Broken links, oh no!

The first part of point 8 of the guidelines is to check for broken links. So I went to the W3C again, where they have their Link Checker, put my website address in and... it came up with lots of broken links. Yikes! That can't be right!

It said there were 26 broken links with "Status: 500 Internal Server Error". So I checked the links it said were broken, and they were not broken! That's a bit confusing and slightly worrying, because the W3C is reputable and I would have thought their software would be reliable. So even though their results are clearly wrong, they must be wrong for a reason. And if Googlebot is using similar technology, it could mistakenly think I have broken links too!
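One way to double-check a flagged link yourself is to request it directly and compare methods. Some servers answer HEAD requests, or requests from unfamiliar user agents, differently from a normal browser visit, which might explain false errors like these, though that's only a guess. A quick sketch (the URL is a placeholder):

# Quick sketch: re-test a link that a checker flagged, trying both HEAD and GET
# with a browser-like User-Agent. The URL is a placeholder.
import requests

URL = "https://example.com/some-flagged-link"  # hypothetical flagged link

for method in ("HEAD", "GET"):
    resp = requests.request(
        method,
        URL,
        headers={"User-Agent": "Mozilla/5.0 (manual link re-check)"},
        allow_redirects=True,
        timeout=10,
    )
    print(method, resp.status_code)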

On reflection I don't think I need to be too worried about this: if it were happening when Googlebot visits, something would have shown up in Webmaster Tools. I googled to see if other people have had the problem as well, and a few have, though not exactly the same.

So, disheartened with those results, I next tried OnlineDomainTools, who have a similar link-checking tool. I input my website address and it came up with more sensible results: it checked 579 links and found 4 pages with 6 broken links between them. Genuine broken links this time, and something I could take action on.

I also came across a product called Link Sleuth, which has a cheesy website but looks genuine. You have to download it to use it, though, and I'm always suspicious about software you have to download, as it may contain trojans and viruses. I think it should be OK because I have a vague memory of seeing it recommended somewhere else (was it in a Gareth Mailer blog post?), but I will put it on an old, hardly used Windows machine just to be on the safe side.

So I installed Link Sleuth on an old machine, and it came up with a good, comprehensive report of not just broken links but also links that have been redirected (so I should replace their old addresses with the new ones). Lots of actionable items there...

These last results made me realise that this check is worth doing on a regular basis as old blog posts can contain links all over the place and you have no idea how well maintained those links are.
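A home-grown version of that regular check could be quite simple: fetch a page, pull out its links, and report anything broken or redirected. Here is a minimal single-page sketch, assuming Python with the requests and beautifulsoup4 packages (the start URL is a placeholder, and a proper tool would crawl the whole site):

# Minimal sketch: check every link on one page and report broken (4xx/5xx or
# unreachable) and redirected (3xx) links. The start URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

START = "https://example.com/"  # hypothetical page whose links we check

soup = BeautifulSoup(requests.get(START, timeout=10).text, "html.parser")
links = {urljoin(START, a["href"]) for a in soup.find_all("a", href=True)
         if a["href"].startswith(("http", "/"))}

for link in sorted(links):
    try:
        resp = requests.get(link, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"BROKEN   {link} ({exc})")
        continue
    if resp.status_code >= 400:
        print(f"BROKEN   {link} ({resp.status_code})")
    elif 300 <= resp.status_code < 400:
        print(f"REDIRECT {link} -> {resp.headers.get('Location')}")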

Enough for today!

 

What has this web coder coded today

My website shenanigans:

4 hours 12 minutes Networking: The Entrepreneurs' Mutual Support Network, preparation and participation.
3 hours 50 minutes This website: looking into caching issues, writing blog, writing script to clear web cache, creating blog template, and checking website ranking.
1 hour 25 minutes Client work: forms and layout, phone calls
32 minutes Research: caching
25 minutes Admin: new hub and scanner to sort out, email

Total: 10 hours 55 minutes

Felt like a lazy day today even though it wasn't!

Exercise: 45 min session, press-ups, dips, arm curls and sit-ups

Tomorrow: Restart stagnating project.


Author: Mike Nuttall

Mike has been web designing, programming and building web applications in Leeds for many years. He founded Onsitenow in 2009 and has been helping clients turn business ideas into on-line reality ever since. Mike can be followed on Twitter and has a profile on Google+.