
I want to save a webpage completely, so that when I open Firefox offline I can navigate to it and browse its contents. I used wget for this:

wget -m website.com

wget -r website.com
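For offline viewing, the mirror usually needs two extra options beyond `-m`/`-r`: `--page-requisites`, so images, CSS, and JS are fetched too, and `--convert-links`, so links are rewritten to point at the local copies instead of the live site. A sketch, with `website.com` standing in for the real address:

```shell
# Mirror a site so it works offline:
#   --mirror            recursive download with timestamping (implies -r)
#   --page-requisites   also fetch images, CSS, and JS the pages need
#   --convert-links     rewrite links to point at the local copies
#   --adjust-extension  save pages with an .html suffix
wget --mirror --page-requisites --convert-links --adjust-extension website.com
```

Even then, typing the original URL into an offline browser will still fail, because the browser tries to resolve the domain; open the saved files via File -> Open File or a `file:///` path instead.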

But when I open Firefox offline and type the website's address, I get "Server not found". How can I fix this, or find another way that works?

Thank you in advance!

Snow
  • Use File -> OpenFile (ctrl-o) to navigate to saved content. Alternatively, use [offline mode](http://www.wikihow.com/Work-Offline-in-Mozilla-Firefox). I don't think typing URLs will do much when offline. – mikewhatever Nov 21 '16 at 17:19
  • Did you check this? http://askubuntu.com/questions/96516/how-can-i-cache-specific-web-pages-for-offline-viewing?rq=1 – M. Becerra Nov 21 '16 at 17:47
  • Is saving the page as `html`, with images and style included, good enough for you? – M. Becerra Nov 21 '16 at 17:48
  • Read `man wget`. I use the `--no-parent --relative --page-requisites --convert-links -nv -t 3 --waitretry=6 --random-wait` options, myself. YMMV. Read `man wget`. – waltinator Nov 21 '16 at 18:42
  • @M.Becerra I cannot archive my website in Scrapbook, because Scrapbook is not showing up anywhere in Mozilla after I installed it. Can you show me how to use it? – Snow Nov 21 '16 at 20:14
  • @M.Becerra I managed to capture the page, but it still doesn't work offline – Snow Nov 21 '16 at 20:23

1 Answer


Use WebHTTrack Website Copier

Description: copy websites to your computer (httrack with a web interface). WebHTTrack is an offline browser utility that downloads a website from the Internet to a local directory, recursively rebuilding all directories and fetching the HTML, images, and other files from the server, all through a step-by-step web interface.

Install it with

sudo apt-get install webhttrack
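Once installed, running `webhttrack` opens the step-by-step wizard in your browser; the same engine can also be driven directly from the command line as `httrack`. A sketch (the output path here is illustrative):

```shell
# Launch the step-by-step wizard in your default browser...
webhttrack

# ...or mirror a site directly from the command line:
#   -O   output directory for the local copy
httrack "http://website.com/" -O ~/websites/website.com
```

Either way, the result is a browsable local copy whose internal links already point at the downloaded files, so it can be opened in Firefox from disk without any network connection.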

A good description of the functionality can be found on this link.

abu_bua