
I am looking for a command-line tool that can download multiple URLs with multiple threads, e.g.

wget2 -n 5 http://stackoverflow.com/ http://askubuntu.com/ http://bobo.com/

Where -n = number of threads. I have come across Axel, but when I give it multiple URLs, it only downloads one.

I will be downloading HTML files.

Alvar
Kohjah Breese
    Here are some answers: http://stackoverflow.com/questions/3430810/wget-download-with-multiple-connection-simultaneously and here is the same question on ask ubuntu: http://askubuntu.com/questions/214018/how-to-make-wget-faster-or-multithreading – Evenbit GmbH Nov 27 '13 at 13:44

2 Answers


Aria2 is the best solution for this if you want a CLI. Aria2 supports multiple connections, multiple threads and multiple sources.

Another benefit of Aria2 is that it works as a plugin for uGet, so you can use the power of Aria2 with a nice, easy-to-use GUI.

Aria2 - CLI - http://aria2.sourceforge.net/

uGet - GUI - http://ugetdm.com

  • The number of connections is adjustable in the GUI when adding a download.
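For the CLI side, here is a sketch of how multiple distinct URLs might be fetched with aria2; the flags (-i, -j, -Z) are from aria2's manual, and the URLs and filename are placeholders, so treat this as an untested illustration rather than a definitive recipe:

```shell
# Put one URL per line in an input file (example URLs):
printf '%s\n' 'http://stackoverflow.com/' 'http://askubuntu.com/' > urls.txt

if command -v aria2c >/dev/null; then
  # -i reads one download per line; -j caps how many run concurrently.
  aria2c -j 5 -i urls.txt

  # Distinct URLs on the command line need -Z (--force-sequential);
  # without it, aria2 treats multiple URLs as mirrors of a single file.
  aria2c -Z -j 5 http://stackoverflow.com/ http://askubuntu.com/
fi
```

The -Z behavior is worth knowing: it is the likely reason a plain aria2c invocation with two URLs appears to download only the first file.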

Update: based on OP's batch needs

uGet supports batch downloads via .txt, .html, the clipboard and many more methods. While admittedly not a CLI, I think it solves the problem quite well. I created a video tutorial to explain the various methods; the GUI has changed since that recording, but the functionality is still relevant.

Michael Tunnell
  • Do you know the command to download multiple URLS? I have tried: aria2c -c http://www.google.com http://google.co.uk/zyzpage And again it only downloads the first URL. – Kohjah Breese Nov 27 '13 at 16:25
  • I don't know because I use uGet but here is the aria2 examples page from their wiki. http://aria2.sourceforge.net/manual/en/html/aria2c.html#example – Michael Tunnell Dec 09 '13 at 07:15
  • This answer is stub. Can you please add code examples. – Léo Léopold Hertz 준영 Aug 17 '16 at 18:10
  • @Masi it's not a stub. Aria2 does the multi-connection and multi-threading by default. uGet provides this functionality through a GUI so no code to show. I'll edit the answer anyway. – Michael Tunnell Aug 19 '16 at 15:14

Neither the suggestions above nor those in the linked answers accept two distinct URLs; they only accept multiple URLs that are mirrors of the same file.

I've found a few programs that do this:

The best is puf (apt-get install puf); use it as puf url1 url2, etc.

Then there is HTTrack, which requires a lot of tinkering and has some limits I can't get past (speed and connection limits).

DownThemAll for Firefox is very good if you don't need a command line app.

UPDATE

I've since found puf has a tendency to crash. The best solution is to create a .txt file with URLs on new lines, e.g.

http://google.com/
http://yahoo.com/

Save that as urls.txt (for example), then run the command:

cat urls.txt | xargs -n 1 -P 10 wget -q

-n 1 passes one URL from the file to each wget invocation.

-P specifies the number of downloads to run in parallel (note the capital P).
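The parallel fan-out can be exercised without touching the network by substituting echo for wget; this sketch uses the same example URLs and shows one invocation per line of the file:

```shell
# Build the URL list, one per line, as above:
printf '%s\n' 'http://google.com/' 'http://yahoo.com/' > urls.txt

# echo stands in for "wget -q" so the pattern runs offline:
# -n 1 hands one URL to each invocation, -P 10 runs up to 10 at once.
xargs -n 1 -P 10 echo fetching < urls.txt
```

With -P greater than 1 the lines may print in any order, since the invocations run concurrently.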

Kohjah Breese