
Command line download tools

December 13, 2011
By Administrator in All, Linux administration

This article was inspired by a blog post (http://geekmondo.tumblr.com/) I recently ran into about the axel program. It made me realize that I had always used the wget command to download files over HTTP on the Linux command line and had never looked for an alternative. Sure, wget is a great, proven, do-it-all tool, but is it really the best option for simple HTTP file downloads? I decided to run a very basic test, comparing the download speed of several command line download tools: wget, axel, curl and aria2c.

I placed a 33MB file on an old box set up as a web server on my local network. Then I downloaded the file with each of the tools above and noted the download times (a rough sketch of the procedure is shown below).
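The commands below are only an illustrative sketch of how the timing was done; the URL is the placeholder used throughout this article, and each tool is simply run against the same file and timed with the shell's time builtin:

time wget http://example.com/test.tar      # single connection
time axel http://example.com/test.tar      # multiple connections by default
time curl -O http://example.com/test.tar   # -O keeps the remote file name
time aria2c "http://example.com/test.tar"  # multiple connections by default

Below are the results: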

wget:

user@host:~$ wget http://example.com/test.tar
--2011-12-13 01:23:02--  http://example.com/test.tar
Resolving example.com... 192.168.32.21
Connecting to example.com|192.168.32.21|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 33629601 (32M) [application/x-tar]
Saving to: `test.tar.2'

100%[=====================================>]33,629,601 2.45M/s in 13s     

2011-12-13 01:23:15 (2.52 MB/s) -
`test.tar.2' saved [33629601/33629601]

axel:

user@host:~$ axel http://example.com/test.tar
Initializing download: http://example.com/test.tar
File size: 33629601 bytes
Opening output file test.tar.2
Starting download

[  0%] ......... ......... ......... ......... ......... [  87.3KB/s]
[  0%] ......... ......... ......... ......... ......... [ 174.1KB/s]
[  0%] ......... ......... ......... ......... ......... [ 231.7KB/s]
.
.
[ 99%] ......... ......... ......... ......... ......... [2424.1KB/s]
[ 99%] ......... ......... ......... ......... ......... [2424.0KB/s]
[ 99%] ......... ......... ......... ......... ......... [2424.2KB/s]
[ 99%] ......... ......... ......... ......... ......... [2423.8KB/s]
[100%]  .......... .......... .......... .......... .

Downloaded 32.1 megabytes in 13 seconds. (2424.09 KB/s)

curl:

user@host:~$ curl http://example.com/test.tar -o test.tar
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 32.0M  100 32.0M    0     0  2563k      0  0:00:12  0:00:12 --:--:-- 2488k

aria2c:

user@host:~$ date
Tue Dec 13 01:25:31 EET 2011
user@host:~$ aria2c "http://example.com/test.tar"
[#1 SIZE:30.4MiB/32.0MiB(94%) CN:5 SPD:1.1MiBs ETA:01s]                                                                                                      
2011-12-13 01:25:47.829360 NOTICE -
Download complete: /home/user/test.tar

Download Results:
gid|stat|avg speed  |path/URI
===+====+===========+========================================
  1|  OK|   2.1MiB/s|/home/user/test.tar

Status Legend:
 (OK):download completed.

As can easily be seen, none of the command line download tools wins this test. So why are axel and aria2c, for example, advertised as considerably faster than wget? Those programs download a file by establishing multiple connections to the server and using each connection to fetch a portion of the file. They thereby work around the per-connection bandwidth limit imposed on most Internet web servers. No such limitation was in place on my local web server, which is why all of the programs produced the same result.
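For reference, the number of connections can also be requested explicitly. The commands below are only an illustrative sketch using the same placeholder URL; the connection counts are arbitrary:

axel -n 4 http://example.com/test.tar           # -n: number of connections to open
aria2c -x 4 -s 4 http://example.com/test.tar    # -x: max connections per server, -s: split into that many pieces
wget http://example.com/test.tar                # wget always downloads over a single connection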

Let’s run the following additional test:

wget:

user@host:~$ wget http://ftp.kernel.org/../linux-2.4.17.tar.bz2
--2011-12-13 02:10:59--  http://ftp.kernel.org/../linux-2.4.17.tar.bz2
Resolving ftp.kernel.org... 149.20.4.69
Connecting to ftp.kernel.org|149.20.4.69|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 23841012 (23M) [application/x-bzip2]
Saving to: `linux-2.4.17.tar.bz2.1'

100%[=====================================>]23,841,012 704K/s in 67s     

2011-12-13 02:12:07 (348 KB/s) -
`linux-2.4.17.tar.bz2.1' saved [23841012/23841012]

axel:

user@host:~$ axel http://ftp.kernel.org/../linux-2.4.17.tar.bz2
Initializing download: http://ftp.kernel.org/../linux-2.4.17.tar.bz2
File size: 23841012 bytes
Opening output file linux-2.4.17.tar.bz2.0
Starting download

[  0%] ......... ......... ......... ......... ......... [  66.4KB/s]
[  0%] ......... ......... ......... ......... ......... [ 132.5KB/s]
.
.
[100%] ......... ......... ......... ......... ......... [ 793.8KB/s]
[100%] .......... .......... .......... ..
Connection 0 finished

Downloaded 22.7 megabytes in 29 seconds. (793.50 KB/s)

This time, axel turned out to be more than twice as fast as wget when downloading the same remote file. This is where multi-connection downloading proves effective. Axel and aria2c behave in a similar way, the latter offering a wider range of configuration options, and both open multiple connections to the server by default. Curl can also fetch a file over multiple connections, although this has to be scripted by hand using HTTP range requests. It is important to note that if the server does not throttle individual connections, it is more effective to download the file over a single HTTP connection, especially if the file is large.
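As a rough illustration of what axel and aria2c do internally, here is a sketch of a manual two-connection download scripted with curl range requests. It assumes the server honours the Range header; the URL and the 33,629,601-byte size are taken from the first test:

# Fetch the file in two halves over two simultaneous connections,
# then stitch the parts back together.
curl -r 0-16814800        -o part1 http://example.com/test.tar &
curl -r 16814801-33629600 -o part2 http://example.com/test.tar &
wait
cat part1 part2 > test.tar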

To sum up, in some cases it can indeed be more effective to use an alternative program to download files. You should, however, make sure that the most suitable options are used for the tool you choose.



5 Responses

  1. Can we resume downloads via axel?
    Like the -c option in wget, is there an option to resume downloads through axel?

    • Administrator, December 22, 2011 @ 6:41 pm

      Yes. Resuming an unfinished download is the default behavior of axel. No specific options are necessary :)
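      For example (just a sketch, using the placeholder URL from the article), re-running the same command is enough and axel picks the transfer up where it left off:

      axel http://example.com/test.tar   # interrupted part-way through
      axel http://example.com/test.tar   # run again: the download resumes automatically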

  2. Prescilla, June 5, 2012 @ 6:05 am

    Thank you for this informative post.

    Just a few questions here.

    Do I need to explicitly specify the number of connections with the -n option, or does it open multiple connections by default?

    How do I change the output filename? In wget I use -O; does this apply to axel too?

    As for resuming downloads, do I need to pass any additional options, such as -c in wget?

    • Administrator, June 6, 2012 @ 10:55 pm

      Hello, for axel, you can specify the desired output filename using either of the following options: --output=filename or -o filename.

      There is no need to use any options for resuming an unfinished download as axel will do that by default.

      The default number of connections axel opens is specified in the main configuration file, /etc/axelrc. On my Ubuntu system, it is set to 4. You can either edit the configuration file or use the -n option to specify the number of connections for each individual download.
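      A quick sketch of both approaches (the URL and values here are only placeholders):

      # Per-download: choose the output name and the number of connections explicitly
      axel -n 8 -o kernel.tar.bz2 http://example.com/linux-2.4.17.tar.bz2

      # System-wide: the connection-count line in /etc/axelrc (option name as in axel's sample config)
      num_connections = 4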

  3. Wow, it’s cool. What about curl? I often use it to download, I think it’s better than wget. Can you compare axel with curl? Thanks :D


