
I have a webpage that basically consists of a list of links to other pages. Using wget I would like to download all the pages listed.

Using "wget -r -l1 URL" I basically get what I want.

But how can I do the same if the list is split over several pages (with URLs ending in "?page=3", "?page=4", and so on)?

user1583209

1 Answer


If you know the number of pages, you could use a for-loop:

for i in {1..5}; do wget -r -l1 "URL?page=$i"; done
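
If the number of pages is not known in advance, one option (a sketch, assuming the server returns an HTTP error such as 404 for a non-existent page, so that wget exits with a non-zero status) is to loop until wget fails:

page=1
while wget -r -l1 "URL?page=$page"; do page=$((page + 1)); done

If the server instead returns an empty 200 page past the last one, you would need to check the downloaded content rather than the exit status.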
etagenklo
  • Thanks. That could work. In fact I have several such lists, but I could perhaps just pick some very large number and see what happens when the for loop hits a non-existent page. –  Apr 06 '13 at 14:35