Normally I use GNU wget to grab websites recursively.

But I want to fetch some websites with lots of Unicode URLs for various languages and I'd like to keep the Unicode in the filenames. (Here's an example.)

wget has the command-line switches --local-encoding and --remote-encoding, but they don't seem to support Windows' native filesystem encoding, UTF-16. I suspect this is because wget is designed for *nix, which never uses UTF-16 as the system encoding for filenames, and the Windows ports of wget are unofficial.
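For illustration, this is roughly the invocation those switches imply (a sketch only: http://example.com/ is a placeholder, and --restrict-file-names=nocontrol is the related switch that stops wget percent-escaping non-ASCII bytes in filenames):

```
rem Recursive fetch with IRI handling forced to UTF-8; wget's
rem encoding switches accept byte-oriented encodings, not UTF-16.
wget --recursive --local-encoding=UTF-8 --remote-encoding=UTF-8 ^
     --restrict-file-names=nocontrol http://example.com/
```

On *nix this writes UTF-8 filenames to disk; the unofficial Windows builds appear to pass those bytes through the narrow (ANSI) file APIs, which seems to be where the mangling happens.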

How can I do this under Windows and keep the files using correct Unicode?

hippietrail
  • @DavidPostill: I removed the shopping part of the question. It should now be as acceptable as other questions here such as http://superuser.com/questions/14403/how-can-i-download-an-entire-website – hippietrail Mar 21 '16 at 09:34
  • Much better. Vote to close removed ;) – DavidPostill Mar 21 '16 at 09:35
  • Have you tried [cygwin](https://cygwin.com/cgi-bin2/package-cat.cgi?file=x86_64%2Fwget%2Fwget-1.17.1-1&grep=wget)'s `wget`? – DavidPostill Mar 21 '16 at 09:38
  • I might try that. I'm not sure if this netbook's puny internal drive has enough space left to install cygwin. I doubt I'd use it much for other stuff these days. – hippietrail Mar 21 '16 at 09:40
  • The minimum install is only 1.4 GB (see [What is the minimal packages list for Cygwin?](http://superuser.com/q/879627)). Adding `wget` won't add very much to that. – DavidPostill Mar 21 '16 at 09:42
  • @DavidPostill: Should we close this old one? http://superuser.com/questions/67604/free-mac-os-x-application-for-downloading-an-entire-website – hippietrail Mar 21 '16 at 09:44
  • Yes. I've voted to close. – DavidPostill Mar 21 '16 at 09:45
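For anyone following the Cygwin suggestion in the comments above, wget can be added without a full desktop install by running Cygwin's setup program unattended. A minimal sketch, assuming setup-x86_64.exe is already downloaded and using a mirror of your choice (the kernel.org URL is just one example):

```
rem Unattended install of just the wget package and its dependencies.
setup-x86_64.exe --quiet-mode --packages wget ^
    --site http://mirrors.kernel.org/sourceware/cygwin/
```

Cygwin's wget is a normal *nix build, and Cygwin itself translates its UTF-8 paths to Windows' native UTF-16 when creating files, which is presumably the point of the suggestion.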

0 Answers