
I am trying to download a web page, including everything it needs. I have tried several methods, and so far the one that works best is Firefox's built-in "Save Page" feature; however, it does not let me include one or more levels of links. I also tried HTTrack, but it fails to include the JS files.

How can I configure HTTrack to download the JS files as well? Or is there a better tool for this?

Thanks

nest
  • Possible duplicate: [How do you use wget to mirror a site 1 level deep, recovering JS/CSS resources?](http://superuser.com/questions/341960/how-do-you-use-wget-to-mirror-a-site-1-level-deep-recovering-js-css-resources?rq=1) – Ƭᴇcʜιᴇ007 Jan 20 '15 at 16:12
  • possible duplicate of [Rip a website via HTTP to download images, HTML and CSS](http://superuser.com/questions/130306/rip-a-website-via-http-to-download-images-html-and-css) – Ƭᴇcʜιᴇ007 Jan 20 '15 at 16:13
  • 1
    Does this answer your question? [How can I download an entire website?](https://superuser.com/questions/14403/how-can-i-download-an-entire-website) – Henke Jan 18 '21 at 15:10

2 Answers


If you are on a Mac, install brew, then use brew to install wget; finally, use the wget command to download the page:

wget -p -k -e robots=off -U 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.6) Gecko/20070802 SeaMonkey/1.1.4' https://www.google.com
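
Since the question also asks for one or more levels of links, a possible variation (a sketch assuming GNU wget; the URL is a placeholder) adds recursion limited to one level:

wget -r -l 1 -p -k -E -e robots=off -U 'Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.6) Gecko/20070802 SeaMonkey/1.1.4' https://www.example.com

Here -r -l 1 follows links one level deep, -p fetches each page's requisites (CSS, JS, images), -k rewrites links for local browsing, and -E appends .html to files that need it. If the scripts are served from a different host, adding -H (span hosts) may also be required.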
Byron

On the web page, view its source, then copy it and paste it into Notepad. For the other resources, open the CSS and JS links found in the page source and save each one according to its file extension.
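
A rough sketch of automating this manual step with curl, grep, and wget, assuming GNU tools and that the page references its CSS/JS with absolute URLs (the URL is a placeholder; relative URLs would first need to be resolved against the page address):

curl -s https://www.example.com/ \
  | grep -oE '(href|src)="[^"]+\.(css|js)"' \
  | cut -d'"' -f2 \
  | while read -r url; do
      # fetch each stylesheet/script and save it under its own file name
      wget -q "$url"
    done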

Chirag