
I need a utility that can download the contents of a website recursively. For example, I have a website URL whose page contains 10 hyperlinks. Using the utility, I should be able to download the contents of those 10 hyperlinks to my local system.

Please let me know if you are aware of any such utility.

Apps

1 Answer

I would suggest looking at wget.

Reference: http://www.linuxjournal.com/content/downloading-entire-web-site-wget
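A minimal sketch of the kind of invocation the linked article describes, using documented wget flags. The URL and the `./mirror` output directory are placeholders; adjust `--level` to control how deep wget follows links:

```shell
#!/bin/sh
# Recursive download with wget:
#   --recursive          follow hyperlinks found in the pages
#   --level=1            descend only one level (i.e. the links on the start page)
#   --page-requisites    also fetch images/CSS needed to render each page
#   --convert-links      rewrite links so the local copy is browsable offline
#   --no-parent          never climb above the starting directory
#   --directory-prefix   where to save the mirror (placeholder path)
# "https://example.com/" stands in for the real site URL.
wget --recursive --level=1 --page-requisites --convert-links \
     --no-parent --directory-prefix=./mirror https://example.com/
```

For a full mirror rather than one level, drop `--level=1` (or use wget's `--mirror` shorthand).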

Jack