
We are testing our server.

I am a beginner to shell scripting.

I need a bash shell script that reads a text file containing URLs, queries each URL against the server, and checks whether the application server returns a 200 OK or a 302 Redirect response.

Can you point me to any tutorial or link for doing this?

user2434

3 Answers

for url in `cat textfile`; do curl -s -o /dev/null --write-out '%{http_code}' "$url" 2>&1; echo -e "\n"; done

The explanation:

for url in `cat textfile`

Read all the entries in your text file with the cat command and loop over them, making the current entry available through the url variable.

curl -s -o /dev/null --write-out '%{http_code}' "$url" 2>&1

Call cURL and suppress the progress output with -s, send any other output to /dev/null (a black hole you can throw anything into) with the -o switch, then use --write-out '%{http_code}' to print the return code for each request, pass the current link with "$url", and redirect STDERR to STDOUT with 2>&1 (in case you do get any errors). Then...

; echo -e "\n"

Print a newline (the %{http_code} output does not include one), regardless of whether the last statement (that whole cURL call) failed.

; done

Finish the loop.

This probably isn't what you wanted though, since it only prints out the return codes.
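If you want something closer to what the question asks for (explicitly flag URLs that return 200 or 302), here is a minimal sketch along the same lines, assuming the URLs sit one per line in a file called urls.txt (a name I made up):

    #!/bin/bash
    # Read URLs one per line from urls.txt (hypothetical file name) and
    # report whether each one came back as 200 OK or 302 Found.
    while read -r url; do
        # -s: silent, -o /dev/null: discard the body, --write-out: print only the status code
        code=$(curl -s -o /dev/null --write-out '%{http_code}' "$url")
        case "$code" in
            200|302) echo "OK   $code $url" ;;
            *)       echo "FAIL $code $url" ;;
        esac
    done < urls.txt

Using while read instead of for over cat also avoids word-splitting surprises if a line ever contains spaces.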

qweet

Use the wget command with the --spider option. Note that this may send a HEAD request instead of a GET, which, depending on the web site's configuration, can produce incorrect results. Another alternative is to use curl.
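A minimal sketch of that approach (http://example.com/ is just a placeholder); -S makes wget print the server's response headers, which go to STDERR, so they are redirected into STDOUT before grepping out the status line:

    # --spider: check the URL without downloading the body
    # -S / --server-response: print the HTTP response headers (written to STDERR)
    wget --spider -S "http://example.com/" 2>&1 | grep "HTTP/"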

Artem Koshelev

Follow-up to @qweet

Use curl with the -K, --config <config file> command-line parameter. Each URL in this config file must be defined with the url parameter:

url="page/to/get"

Lazy Badger