I really need a command to download everything from a website so I can browse it locally without an Internet connection. The site is arnos.gr, which has a lot of content on it. I have an account for logging in.
- Depending on the website, this might not be possible at all. wget -mk will scrape all images, HTML, and JavaScript files and replace absolute links to the website with relative links (see the long-form sketch after these comments). However, modern sites serve a lot of content via JavaScript requests, and you will not be able to access that content. – Jozef Legény Feb 11 '14 at 13:22
- Here is a similar question with an answer: http://superuser.com/questions/405669/save-static-version-of-a-webpage-to-be-available-offline – Malt Feb 11 '14 at 15:51
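For reference, here is a long-form sketch of the command Jozef describes, with two extra options that often help offline browsing. All four are standard GNU wget flags; whether they suffice for this particular site is untested:

    # Long-form equivalent of `wget -mk`, plus two helpers:
    # --page-requisites also fetches the CSS, images, and scripts each page embeds;
    # --adjust-extension saves HTML with a .html suffix so the local copy
    # opens cleanly in a browser.
    wget --mirror --convert-links --page-requisites --adjust-extension arnos.gr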
1 Answer
-1
In Linux only: run the command wget -mk arnos.gr, then wait. However, I agree with Jozef Legény that the JavaScript may cause problems.
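Since the site requires a login, the mirror alone may not reach members-only pages. Below is a minimal sketch assuming a cookie-based form login; the login URL and the user/pass field names are hypothetical placeholders, not taken from arnos.gr, so substitute the real ones from the site's login form:

    # Step 1: log in once and save the session cookie.
    # The URL and the form field names here are hypothetical -- inspect
    # the site's actual login form and substitute the real values.
    wget --save-cookies cookies.txt --keep-session-cookies \
         --post-data 'user=YOUR_USER&pass=YOUR_PASS' \
         https://arnos.gr/login -O /dev/null

    # Step 2: mirror the site while presenting the saved cookie.
    wget -mk --load-cookies cookies.txt arnos.gr

If the site uses HTTP basic authentication instead, wget's --user and --password options can replace the cookie step.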
ultrapenguin
- Dude, you can get wget for Windows (it's in GnuWin32), but -1 'cos there are going to be issues, as Jozef mentioned and as you yourself said quoting Jozef, so I don't think this is satisfactory as an answer; that's precisely why Jozef didn't post it as one, just as a comment. The questioner gave the site he is trying as an example. You've just repeated Jozef's comment and made it an answer, but it's not an answer because it's not going to do it right at all. And your 'answer' doesn't add anything to what has already been said: wget -mk was already suggested and you know it. – barlop Feb 11 '14 at 15:21