Both curl and wget are used for downloading files and the like. What are the notable differences one might take into account in order to choose one over the other?
-
1Define "proper use" – A.B. Sep 25 '15 at 18:14
-
16Regarding the vote to close as opinion-based: how does asking about the difference between two tools encourage opinion-based answers? – Michael Martin-Smucker Sep 25 '15 at 19:26
-
1@MichaelMartin-Smucker "Suppose I need to download a web page lets say www.google.com, should I look for wget or curl ?" what is that if not opinion-based? – muru Sep 26 '15 at 04:18
-
2@muru, It depends on what you mean by "download a web page". If you mean send an http request and receive back the results, curl is ideal. If you mean download a page, other pages it links to, and related assets, curl will not work, and wget is the only option. There are factual differences between them which make this not opinion based. – barbecue Sep 26 '15 at 14:49
-
1The fact that the question is unclear does not make it de facto opinion based. There are factual, non-opinion differences between the tools, and a question about which should be used would be reasonably assumed to be a request for information about such differences. – barbecue Sep 26 '15 at 15:43
-
@barbecue if question is unclear, still vote to close it. – muru Sep 26 '15 at 15:47
-
@barbecue further, the part for defining "proper use" was itself added after someone requested clarification. If the question is in fact about a single web page from an HTTP server, the question is opinion-based. – muru Sep 26 '15 at 15:49
-
1@muru "answers to this question will tend to be almost entirely based on opinions, rather than facts" yet all of the answers here seem to provide valuable insight based on verifiable differences between the tools. – Michael Martin-Smucker Sep 27 '15 at 04:03
-
@MichaelMartin-Smucker because only two answers provide recommendations. A.B. says: "After you've defined "proper use", use wget," (because recursive, yet the question talks of *one page*). The second answer says: "Use it according to your requirement and which feature you want to use along with downloading." ... which is common sense. The rest of these answers and the other answers are just feature comparisons. – muru Sep 27 '15 at 04:06
-
1@muru you seem to be focusing on the "download... www.google.com" example, which was added to the question later. Given the title, the true question is "What is the difference...?" so of course the answers focus on feature comparisons. – Michael Martin-Smucker Sep 27 '15 at 04:35
-
@MichaelMartin-Smucker if the title is all there is, why do we have a question body? That part was added later as a *clarification*, which speaks volumes about the question's quality in the first place. – muru Sep 27 '15 at 04:35
-
4@muru I guess we just have different interpretations of the sentiment behind the question. In my interpretation, the sentiment is: "I've heard of these two tools which seem to do the same thing. What is the difference between them, and when should I choose one over the other? What about in this specific case?" ...which seems like a reasonable question to me. – Michael Martin-Smucker Sep 27 '15 at 04:54
-
1Over on [unix.se]: [What is the difference between curl and wget?](http://unix.stackexchange.com/questions/47434/what-is-the-difference-between-curl-and-wget) – muru Sep 27 '15 at 04:59
4 Answers
After you've defined "proper use", use wget.
Why? Here's why:
Recursive! wget's major strong side compared to curl is its ability to download recursively, or even just download everything that is referred to from a remote resource, be it an HTML page or an FTP directory listing.
Shamelessly copied from here:
curl
- Library. curl is powered by libcurl, a cross-platform library with a stable API that can be used by each and everyone. This difference is major, since it creates a completely different attitude toward how to do things internally. It is also slightly harder to make a library than a "mere" command-line tool.
- Pipes. curl works more like the traditional Unix cat command: it sends more stuff to stdout and reads more from stdin, in an "everything is a pipe" manner. wget is more like cp, using the same analogy.
- Single shot. curl is basically made to do single-shot transfers of data. It transfers just the URLs that the user specifies, and contains no recursive downloading logic nor any sort of HTML parser.
- More protocols. curl supports FTP, FTPS, Gopher, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, LDAP, LDAPS, FILE, POP3, IMAP, SMB/CIFS, SMTP, RTMP and RTSP. wget supports only HTTP, HTTPS and FTP.
- More portable. curl builds and runs on many more platforms than wget, for example OS/400, TPF and other more "exotic" platforms that aren't straightforward Unix clones.
- More SSL libraries and SSL support. curl can be built with one of eleven (11!) different SSL/TLS libraries, and it offers more control and wider support for protocol details. curl supports public key pinning.
- HTTP auth. curl supports more HTTP authentication methods, especially over HTTP proxies: Basic, Digest, NTLM and Negotiate.
- SOCKS. curl supports several SOCKS protocol versions for proxy access.
- Bidirectional. curl offers upload and sending capabilities; wget only offers plain HTTP POST support.
- curl supports HTTP multipart/form-data sending, which allows users to do HTTP "uploads" and, in general, emulate browsers and do HTTP automation to a wider extent.
- curl supports gzip and deflate Content-Encoding and does automatic decompression.
- curl offers and performs decompression of Transfer-Encoded HTTP; wget doesn't.
- curl supports HTTP/2 and does dual-stack connects using Happy Eyeballs.
- Much more developer activity. While this can be debated, I consider three metrics here: mailing list activity, source code commit frequency and release frequency. Anyone following these two projects can see that the curl project has a much higher pace in all these areas, and it has been so for 10+ years. Compare on openhub.
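The cat-like pipe behavior described above can be tried without a network by pointing curl at a `file://` URL. This is a minimal sketch; the `/tmp` path is arbitrary, and the snippet falls back gracefully if curl isn't installed:

```shell
# curl writes the response body to stdout by default, so it composes
# with pipes; file:// lets us demonstrate that without any network.
printf 'hello from curl\n' > /tmp/curl_pipe_demo.txt

if command -v curl >/dev/null 2>&1; then
  body=$(curl -s "file:///tmp/curl_pipe_demo.txt")
else
  body='hello from curl'   # fallback so the sketch still runs without curl
fi
echo "$body"

# wget writes to a local file by default (cp-like); to make it stream
# to stdout like curl does, you must ask explicitly:
#   wget -qO- https://example.com/ | grep -io '<title>[^<]*</title>'
```

The same single-URL, single-shot model applies to every curl invocation: it transfers exactly the URLs given and nothing more.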
wget
- wget is command line only. There's no library.
- Recursive! wget's major strong side compared to curl is its ability to download recursively, or even just download everything that is referred to from a remote resource, be it an HTML page or an FTP directory listing.
- Older. wget has traces back to 1995, while curl can be tracked back no earlier than the end of 1996.
- GPL. wget is 100% GPL v3; curl is MIT licensed.
- GNU. wget is part of the GNU project and all copyrights are assigned to the FSF. The curl project is entirely stand-alone and independent, with no parent organization at all and almost all copyrights owned by Daniel.
- wget requires no extra options to simply download a remote URL to a local file, while curl requires -o or -O.
- wget supports the Public Suffix List for handling cookie domains; curl does not.
- wget supports only GnuTLS or OpenSSL for SSL/TLS.
- wget supports Basic auth as the only auth type over an HTTP proxy.
- wget has no SOCKS support.
- Its ability to recover from a prematurely broken transfer and continue downloading has no counterpart in curl.
- wget can be typed in using only the left hand on a qwerty keyboard!
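On the recursion and resume points: recursive retrieval really is wget-only, while resuming exists in both tools (as a comment below points out, curl has `-C -`). Both resume the same way, by sending a Range header starting at the size of the partial local file. A sketch with placeholder URLs; only the offset computation actually runs:

```shell
# Recursive mirroring is wget-only territory (example.com is a placeholder):
#   wget --recursive --level=2 --page-requisites --convert-links https://example.com/
#
# Resuming a broken download works in both tools:
#   wget -c      https://example.com/big.iso   # continue from local file size
#   curl -C - -O https://example.com/big.iso   # '-C -' auto-detects the offset
#
# The offset both tools send in the Range header is just the byte count
# of the partial file already on disk:
printf '12345' > /tmp/partial_download.bin
offset=$(wc -c < /tmp/partial_download.bin | tr -d ' ')
echo "resume from byte $offset"
```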
-
1"The curl project is entirely stand-alone and independent with no organization parenting at all with almost all copyrights owned by Daniel." Not sure that's a bad thing... – barbecue Sep 26 '15 at 15:44
-
"Its ability to recover from a prematurely broken transfer and continue downloading has no counterpart in curl." Did I understand this sentence wrong? Isn't `curl -C -` continuing broken download? – Siyuan Ren Sep 27 '15 at 05:42
-
2Why do you recommend *wget*? Even after looking at the comparisons which you wrote yourself, *curl* sounds to be far superior. *`wget can be typed in using only the left hand on a qwerty keyboard!`*, *`wget requires no extra options to simply download a remote URL to a local file, while curl requires -o or -O.`* **WTF??** – Anmol Singh Jaggi Jul 08 '16 at 02:48
-
2Size matters too: on a fresh ubuntu image, wget size is 2M vs. curl 5M packed (with deps, x3 total unpacked) – Eran W Jun 13 '17 at 04:32
There are many tools that can download, like curl, snarf, wget, pavuk, fget, fetch, lftp, aria2, HTTrack, etc. Use whichever fits your requirements and the features you want along with downloading.
Check the feature table and choose accordingly.
Curl:
- curl supports more protocols: FTP, FTPS, HTTP, HTTPS, SCP, SFTP, TFTP, TELNET, DICT, LDAP, LDAPS, FILE, POP3, IMAP, SMTP, RTMP and RTSP.
- curl supports more SSL libraries.
- curl supports more HTTP authentication methods, especially over HTTP proxies: Basic, Digest, NTLM and Negotiate.
- curl is powered by libcurl, a cross-platform library with a stable API that can be used by each and everyone.
Wget:
- wget supports only HTTP, HTTPS and FTP.
- wget supports only GnuTLS or OpenSSL for SSL/TLS.
- wget supports Basic auth as the only auth type over an HTTP proxy.
- wget is a command-line tool only; it has no library.
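To make the proxy-auth difference concrete: Basic, the only scheme wget speaks through an HTTP proxy, is just a base64-encoded user:password pair, while curl can negotiate stronger schemes. A sketch with invented credentials and proxy host (the curl flags themselves are real); only the encoding line actually runs:

```shell
# curl can use stronger proxy-auth schemes (host and credentials invented):
#   curl --proxy-digest -U alice:secret -x http://proxy:3128 https://example.com/
#   curl --proxy-ntlm   -U alice:secret -x http://proxy:3128 https://example.com/
#
# Basic auth is merely base64("user:password") in a request header:
cred=$(printf 'alice:secret' | base64)
echo "Proxy-Authorization: Basic $cred"
```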
Important resources for more info:
Here is a good explanation of curl vs. wget.
Table of features: Compare cURL Features with Other Download Tools
Details of curl's supported features: Features -- what can curl do
Details of wget's supported features: wget features
They have much functionality in common, but curl has more options. For wget it may occasionally be sufficient to leaf through `man wget`, but for curl I need to study this webpage in a browser. I believe anything your browser can do, curl can do as well.
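On the "anything your browser can do" claim, here is a hedged sketch of curl imitating a browser form submission; the host, cookie value, and field contents are invented placeholders, and only the encoding step actually runs:

```shell
# A browser-like request with curl (all values are placeholders):
#   curl -s https://example.com/search \
#        -A 'Mozilla/5.0' \
#        -b 'session=abc123' \
#        --data-urlencode 'q=curl vs wget'
#
# --data-urlencode percent-encodes each field the way a browser form
# submission would; e.g. spaces become %20:
q='curl vs wget'
encoded=$(printf '%s' "$q" | sed 's/ /%20/g')
echo "q=$encoded"
```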
There's another difference between wget and curl which I think is significant.
Wget is a stand-alone command line utility that's intended primarily for retrieving internet content quickly and simply.
Curl, on the other hand, is basically a terminal front end for the powerful libcurl library. Libcurl provides a very powerful set of tools for working with URLs in all their forms and flavors, and is available for almost all languages and platforms. Curl basically gives you the ability to use this library in shell scripts.