
I have a text file of URLs, each in the form of:

http://domain.com/a/b/c/d.jpg

I want to download all of these, but save each file under the name:

c_d.jpg

In other words, I want to save each file under its original filename, prefixed by the name of its parent directory.

How would I go about doing this on Windows?

I'm fine with using a command-line tool such as wget or curl; just give me the arguments.

Thanks.

user294732
  • http://linux.die.net/man/1/wget http://linux.die.net/man/1/curl http://superuser.com/questions/362152/native-alternative-to-wget-in-windows-powershell –  Jul 08 '14 at 23:34
  • That is completely unhelpful. I know these tools exist. I'm asking how to get any one of them to do what I described. – user294732 Jul 08 '14 at 23:57

1 Answer


I'm not sure how to do it in a pure Windows environment, but under Cygwin you could try this (requires bash, sed, and wget):

while read -r link; do wget "$link" -O "$(echo "$link" | sed 's|.*/\([^/]*\)/\([^/]*\)$|\1_\2|')"; done < links.txt

where links.txt is your file of URLs. The sed expression captures the last two path components of each link (c and d.jpg in your example) and joins them with an underscore to build the filename passed to wget -O.

Of course, you can tweak the sed expression to convert the link into a filename in any way you like.
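
If you'd rather not build the name with sed at all, here's a minimal sketch of the same idea using bash parameter expansion and curl instead of wget (assuming curl is also available in your Cygwin environment, and links.txt is the same file of URLs):

# Build the c_d.jpg name from the last two path components of each URL.
while read -r link; do
  file=${link##*/}                 # drop everything up to the last slash -> d.jpg
  dir=${link%/*}; dir=${dir##*/}   # parent directory name -> c
  curl -o "${dir}_${file}" "$link" # save as c_d.jpg
done < links.txt

Parameter expansion avoids spawning a sed process for every URL, and keeping the URL quoted the whole time makes the quoting less error-prone.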

Cheers

loluengo