Sometimes it is useful to know whether your internet connection is running at the best possible speed. This could be just for bragging rights, but more often it is for checking throughput and diagnosing connection problems.

The trouble with most speed testing sites is that they require a browser to perform the test. That is fine for the home user or even a Windows server environment, but what about all those Linux hosts, or other systems where you only have a terminal connection?

Well, here is the simple solution:

wget -O /dev/null http://filetodownloadforthe.test

What does this command do? wget is a command-line tool for downloading files from the internet. It can do a great deal more, but here we are using it to grab one specific file from a site.

The first argument is -O (a capital O), which tells wget where to write the downloaded file, in this case /dev/null. This is a special file that is equivalent to nothing: anything written to it is discarded, like dropping things into a black hole. We are effectively not saving the file at all, just downloading and throwing it away. The final part is the file itself.
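A quick way to convince yourself that /dev/null really does discard its input, with nothing wget-specific involved:

```shell
# Write something to /dev/null...
echo "one hundred megabytes of test data" > /dev/null

# ...then read it back: /dev/null is always empty, so this reports 0 bytes.
wc -c < /dev/null
```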

To do reliable speed testing you need two things: a reliable host and a file of a known, standard size. It's hard to benchmark if you're grabbing random files.
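If you want a single number rather than wget's progress bar, you can time the download yourself and divide by the known file size. A minimal sketch; the URL and the 100MB size are assumptions, so substitute any fixed-size file from a host you trust:

```shell
#!/bin/sh
# Hypothetical test file: substitute any fixed-size file you trust.
URL="http://example.com/100MB.zip"
SIZE_MB=100

START=$(date +%s)
wget -q -O /dev/null "$URL"
END=$(date +%s)

ELAPSED=$((END - START))
# Avoid dividing by zero if the download finishes within a second.
[ "$ELAPSED" -eq 0 ] && ELAPSED=1

echo "Downloaded ${SIZE_MB}MB in ${ELAPSED}s (~$((SIZE_MB / ELAPSED)) MB/s)"
```

Run it a few times and at different times of day; a single sample tells you very little about a shared link.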

Here are a bunch of files you might want to try:

Let's see the full command for a 100MB file and the results.

wget -O /dev/null

--2018-07-12 12:02:47--
Connecting to||:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 104857600 (100M) [application/zip]
Saving to: ‘/dev/null’

/dev/null                       100%[====================================================>] 100.00M   725KB/s    in 2m 37s

2018-07-12 12:05:24 (653 KB/s) - ‘/dev/null’ saved [104857600/104857600]
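If you only want the average-speed figure wget prints in parentheses at the end, for example to log repeated runs, you can pull it out of the output. One sketch, assuming the same placeholder URL as above; note that wget writes its progress to stderr, so it has to be redirected first:

```shell
# Extract the "(653 KB/s)" style average-speed figure from wget's stderr.
# http://example.com/100MB.zip is a placeholder URL.
wget -O /dev/null http://example.com/100MB.zip 2>&1 \
  | grep -oE '\([0-9.]+ [KM]B/s\)'
```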