69

I've just upgraded my computer hardware (CPU + motherboard + graphics card + memory + hard disk), so I need to install a new OS. I tried to download debian-6.0.6-amd64-netinst.iso with the wget command, but the speed is so slow I cannot bear it: 4 KB/s ~ 17 KB/s, slow as a crawling turtle, and even slower if I use Chrome.

I've read wget's help output, and it seems there are no options that could make it any faster.

Is there any way to make wget faster? Or is it possible to make it download with multiple threads?

PS: my bandwidth is 4M. I use this command:

wget -c http://hammurabi.acc.umu.se/debian-cd/6.0.6/amd64/iso-cd/debian-6.0.6-amd64-netinst.iso
Peachy
Teifi
  • wget just uses your connection. So if it's slow, that's your connection to the server. Maybe you are slow, maybe the server is. BTW, 4 Mbit = 0.5 MB/s, not to mention loss, etc. – Dr_Bunsen Nov 07 '12 at 10:03
  • `@Dr_Bunsen` thank you for your advice. I tried the command that `@Gufran` suggested, `axel`; compared with `wget`, `axel` is faster than ever. I think in most situations the bottlenecks of my download speed are **1**. something occupying the bandwidth (as you said: **I'm slow**), **2**. single-threading, **3**. the server being slow. But I can do nothing about points **1 & 3**. – Teifi Nov 08 '12 at 02:39
  • @Teifi If multi-threaded `axel` performs better than wget against the same remote server, one possibility is that the latency between your box and the remote server is very high. Check your ping to the remote server. – John Siu Jan 08 '13 at 04:20
  • Have you tried HTTrack? http://www.httrack.com/page/1/en/index.html – amanthethy Aug 07 '14 at 03:23
  • See https://stackoverflow.com/questions/3430810/multiple-simultaneous-downloads-using-wget – rogerdpack Dec 25 '21 at 20:29

2 Answers

90

Why not try axel? It is a fully fledged command-line downloader.

Install axel and start a download with

axel -a -n [Num_of_Thread] link1 link2 link3 ...

where '[Num_of_Thread]' is the number of parallel connections to create for each link you want to download.

-a just shows an improved progress bar.

Unlike many other download managers, Axel downloads all the data directly to the destination file, using one single thread. This saves some time at the end because the program doesn't have to concatenate all the downloaded parts.
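
For example, to grab the ISO from the question over 10 connections (the thread count here is just an illustration, not a recommendation from axel's documentation), the command would look something like:

axel -a -n 10 http://hammurabi.acc.umu.se/debian-cd/6.0.6/amd64/iso-cd/debian-6.0.6-amd64-netinst.iso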

Eliah Kagan
Gufran
  • I am wondering if there is a way, maybe via an alias and wrapper, to use axel when it is available instead of curl or wget, unless there is more than one parameter on the command line. – sorin Mar 06 '15 at 09:23
  • I think this proposal is not sufficient for the download of one file. Please correct me if necessary. My attempt is here: http://askubuntu.com/q/813483/25388 – Léo Léopold Hertz 준영 Aug 17 '16 at 18:26
  • Thank you SO MUCH for this :) – rjurney Feb 23 '20 at 23:00
  • Re: "I think this proposal is not sufficient for the download of one file". It's actually the opposite: axel is designed for multi-threaded download of a single file, especially where multiple source links exist. – Cieniek May 18 '22 at 10:25
  • axel worked great! My network connection was also an issue, but when using a different VPN it worked at 7 MB per second, downloaded over 32 threads as indicated. Add -k if you need to allow self-signed certificates. – justdan23 Mar 20 '23 at 15:04
82

I tried axel upon Gufran's recommendation but it hugely disappointed me. My goal was to find a CLI replacement for DownThemAll because it hogs the CPU and hard disk and slows the entire system down, even on an 8-core Mac Pro. I also wanted a multithreaded replacement for wget and curl, not some kludge of a script that runs multiple instances of them. So I searched further and found what I think right now is the ultimate, most modern multithreaded CLI downloader there is: aria2.

The big problem I had with axel was that it 'faked' downloading files over SSL. I caught it doing that with tcpdump. It was downloading https links as ordinary http. That really pissed me off, and if I hadn't checked, I would have had a false sense of security. I doubt that many people know about this serious breach in security.

Getting back to aria2, it is more advanced than any other downloader. It supports the HTTP(S), FTP, BitTorrent, and Metalink protocols, is multiplatform, and is a download guerrilla. It maxes out my ISP's bandwidth with no load on the CPU or hard disk, unlike DTA. The man page is gigantic; I will never use more than a few of its many options. And oh, BTW, I checked its SSL performance with tcpdump, and it is solid, not fake. I wrote a script that mimics DTA's behavior, if not its convenience.

The basic command I use to get max bandwidth is

aria2c --file-allocation=none -c -x 10 -s 10 -d "mydir" URL

-c allows continuation of the download if it gets interrupted, -x 10 and -s 10 allow up to 10 connections per server, and -d "mydir" outputs the file to the directory mydir.
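
Plugging in the ISO from the question, a concrete invocation might look like this (the directory name debian-iso is just an example of mine):

aria2c --file-allocation=none -c -x 10 -s 10 -d "debian-iso" http://hammurabi.acc.umu.se/debian-cd/6.0.6/amd64/iso-cd/debian-6.0.6-amd64-netinst.iso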

aria2files.sh:

#!/bin/bash

filename="$1" # get filename from command line argument
currdir="."   # default download directory until a directory line is read

while read -r line
do
    if [ "$line" ] # skip blank lines
    then
        if [[ "$line" =~ (https?|ftp):// ]] # line contains a URL, download file
        then
            echo "URL: '$line'"
            aria2c --file-allocation=none -c -x 10 -s 10 -d "$currdir" "$line"
        else # line contains a directory name, create directory if not already present
            echo "Directory: '$line'"
            currdir="$line"
            if [ ! -d "$currdir" ]
            then
                mkdir -p "$currdir" # '-p' enables creation of nested directories in one command
            fi
        fi
    fi
done < "$filename"

It reads a text file of the format:

files.txt:

directory 1
url1
url2
…
directory 2/subdirectory/sub-subdirectory/…
url3
url4
…
…
…

The script reads the filename from the command line:

aria2files.sh files.txt

It creates the directories and downloads to them. It can create nested directories as shown in the second example.

For more details see my post Bash script to download files from URLs to specified directories listed in a text file.

hmj6jmh
  • Can you apply your method here too? http://askubuntu.com/q/813483/25388 My unsuccessful attempt: `aria2c -x10 -s10 http://horatio.cs.nyu.edu/mit/tiny/data/tiny_images.bin`. – Léo Léopold Hertz 준영 Aug 17 '16 at 18:29
  • Thanks @hmj6jmh! For the record, on a Raspberry Pi Model 3: `wget -4 -c` gives me ~40 KB/sec (`87300K .......... .......... 11% 38.7K 4h28m`), while the same file downloaded with `aria2c --disable-ipv6 -c` gives ~250 KB/sec (`144MiB/717MiB(20%) CN:1 DL:250KiB ETA:39m3s]`). – tuk0z Mar 20 '18 at 13:58
  • If you (like me) want to avoid Sourceforge, aria2 is part of the repositories, so you can install it with `sudo apt install aria2`. – Bar Oct 23 '18 at 13:44
  • Good answer. For sites that force HTTPS-only, `axel` straight-up doesn't work. It seems to have been updated to use HTTPS, but that version isn't in my repositories yet. `aria2` worked well for me. – WindowsEscapist Dec 01 '18 at 21:34
  • To change the default download behavior, you can put these options in an aria2.conf file at `$XDG_CONFIG_HOME/aria2/aria2.conf`, where XDG_CONFIG_HOME is typically ~/.config/ on Unix systems. See `man aria2c` for the available options you can add to the aria2.conf file; a sketch follows below this comment list. – Samir Jul 15 '22 at 21:19
  • Great answer. Works like a charm on RHEL9 (with EPEL repository). Made me download an ISO at 50MB/s instead of 6MB/s with curl. – Orsiris de Jong May 30 '23 at 10:06
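
As a rough sketch of such a config file, reusing the defaults from the command in this answer (option names are the long forms from `man aria2c` without the leading --; verify them against your aria2 version):

# $XDG_CONFIG_HOME/aria2/aria2.conf (typically ~/.config/aria2/aria2.conf)
# One option per line, long name without the leading "--"
continue=true
max-connection-per-server=10
split=10
file-allocation=none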