
OK, so I've got wget 1.12 on Windows 7, and I can do basic downloads with it.

The site I'm trying to download: http://www.minsterfm.co.uk

and all images on it are stored externally at http://cml.sad.ukrd.com/image/

How can I download the site along with the external images, and ideally keep all files with their original extensions, without converting .php files to .htm?

I would appreciate any help, since I'm new to wget.

avenas8808
  • What have you managed to do this far? – Kvisle Oct 31 '11 at 11:05
  • possible duplicate of [Make wget download page resources on a different domain](http://superuser.com/questions/129085/make-wget-download-page-resources-on-a-different-domain) – Daniel Beck Oct 31 '11 at 11:41
  • Does this answer your question? [How can I download an entire website?](https://superuser.com/questions/14403/how-can-i-download-an-entire-website) – Henke Jan 18 '21 at 13:54

2 Answers


The manual tells us:

Actually, to download a single page and all its requisites (even if they exist on separate websites), and make sure the lot displays properly locally, this author likes to use a few options in addition to ‘-p’:

wget -E -H -k -K -p http://the.site.com

You'll have to combine that with some of the recursive download options (such as -r and -l). You may also want to use --wait=xx, --limit-rate=xxK and -U agent-string to avoid being blacklisted by the server.
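
For the site in the question, a possible combined invocation might look like this (the depth, wait time, rate limit, and user-agent string are only example values to adjust; -D restricts host spanning to the two domains involved):

wget -r -l 2 -p -E -H -k -K -D minsterfm.co.uk,cml.sad.ukrd.com --wait=1 --limit-rate=200k -U "Mozilla/5.0" http://www.minsterfm.co.uk

Note that -E appends an .html suffix to pages served as text/html, so if you want downloaded .php files to keep their original names, leave that option out.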

Renaud


I've used BlackWidow for downloading sites recursively on Windows.

It has the following features, but is not free:

  • Scripting Engine
  • User Friendly
  • NetSpy (Network Spy)
  • SnapShot (Web page snap shot)
  • Windows Explorer like site view
  • Powerful scan filters
  • Expandable parser
  • Wildcards & Regular Expressions
Sirex