Useful Command Line Tools For Linux Users To Help Find How Fast Their Website Loads


In an article in my journal I discuss how the speed of your website can affect your Google ranking, and I mention that I use some scripts to measure the speeds of my sites (and other people's).

I use a couple of Linux tools, curl and ab, combine them in a bash shell script, and run that from a crontab.

This command will get your Time To First Byte (TTFB) value which, it appears, is a ranking factor (see this article).

You will need curl installed. Using it like this simply gives you the time, in seconds, until the first byte of information is received from your web server.

curl -s -o /dev/null -w "%{time_starttransfer}\n" http:/ /

(there should be no gap in the URL; the gap is only there to stop my text editor turning it into a link)
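As a side note (my own addition, not from the original article), curl's -w option exposes several other timing variables besides time_starttransfer, so you can break the TTFB down into its phases. A sketch, with a placeholder URL and all times in seconds:

```shell
# Break the response time into its phases (all values in seconds).
# https://example.com/ is a placeholder -- swap in your own site.
curl -s -o /dev/null -w "DNS lookup:  %{time_namelookup}
TCP connect: %{time_connect}
TTFB:        %{time_starttransfer}
Total:       %{time_total}
" https://example.com/
```

The difference between time_connect and time_starttransfer is roughly the time your server spent generating the page, which is the part you can actually tune.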

Another similar tool is ab (Apache benchmarking). It is designed more for load testing, but you can also use it to get an idea of how quickly your website loads:

ab -S -n 1 -c 1 http:/ /

-n — the number of requests you want to make against the URL

-c — the number of requests to run concurrently

-S — removes the standard deviation information from the output.

For server load testing with ab you would usually use something like -n 1000 -c 10, pushing the numbers up to really test the limits of your server. But I'm not using it for that; I just want an idea of the time it takes to get the web page. Since I'm only making one request at a time there is no deviation to measure, so I use the -S switch to drop the standard deviation information.
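To show what the awk in the script further down is actually pulling out: ab prints a "Connection Times" table whose columns are min, mean, [+/-sd], median and max, so the 3rd field of the "Total:" line is the mean total time. A sketch using a captured sample of that output (the values here are made up for illustration, in milliseconds):

```shell
# A hand-made sample of ab's "Connection Times (ms)" section.
sample='Connect:        0    1   0.3      1       2
Processing:    40   45   3.1     44      60
Total:         41   46   3.2     45      61'

# Pull the 3rd field (the mean) from the "Total:" line.
mean=$(printf '%s\n' "$sample" | awk '/Total:/ {print $3}')
echo "$mean"   # prints 46 for this sample
```

With -n 1 the min, mean, median and max are all the same single measurement, which is another reason the deviation columns are useless here.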

But how do you know the values you get are representative? There might be lots of traffic on your web connection, or the server might be under heavy load from somewhere else.

Good question: you need to run the commands over a period of time. To do this I used a bash shell script, saved the results to a MySQL database, and ran the script every 5 minutes using cron.
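For reference, an every-5-minutes cron entry looks like this (the script path here is my own hypothetical example, not from the original article):

```shell
# crontab -e
# m    h  dom mon dow  command
*/5  *  *   *   *    /home/youruser/bin/timing-script.sh >/dev/null 2>&1
```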

For curl:


thisTime1=$(curl -s -o /dev/null -w "%{time_starttransfer}" http:/ /)
thisTime2=$(curl -s -o /dev/null -w "%{time_starttransfer}" http:/ /)

query="insert into timings_TTFB values(NULL,$thisTime1,$thisTime2)"

/usr/bin/mysql -uYourUserName -pYourPassword -DYour-DB-Name << eof
$query
eof

(In my database I have set the first field as a timestamp, hence the NULL.)
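A table along these lines would behave that way — the column names and types here are my own guess at a matching schema, not the article's actual table definition:

```sql
-- Hypothetical schema. With classic MySQL TIMESTAMP behaviour, inserting
-- NULL into the first column makes MySQL fill in the current time.
CREATE TABLE timings_TTFB (
  recorded_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
  home_page   DECIMAL(10,6),
  blank_page  DECIMAL(10,6)
);
```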

This script lets me see the difference in TTFB between my home page and a blank page, which shows how much the snippets I'm calling slow down delivery (an un-cached getResources in my case!).

For ab:

thisTime1=$(ab -S -n 1 -c 1 http:/ / | awk '/Total:/ {print $3}')
thisTime2=$(ab -S -n 1 -c 1 http:/ / | awk '/Total:/ {print $3}')

query="insert into timings values(NULL,$thisTime1,$thisTime2)"

/usr/bin/mysql -uYourUserName -pYourPassword -DYour-DB-Name << eof
$query
eof

In this script I am able to see the difference in the time it takes to receive the content from my home page, compared to just a blank page.
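Once a few days of results have built up, a simple query shows the averages and smooths out one-off traffic spikes. The column names here are my own assumption (the article doesn't give the schema):

```sql
-- Column names are assumed, not from the original article.
SELECT AVG(home_page)  AS avg_home,
       AVG(blank_page) AS avg_blank
FROM timings;
```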


Author: Mike Nuttall

Mike has been web designing, programming and building web applications in Leeds for many years. He founded Onsitenow in 2009 and has been helping clients turn business ideas into on-line reality ever since. Mike can be followed on Twitter and has a profile on Google+.
