Checking HTTP responses – Linux bash
How to check HTTP response codes in Linux under bash. I have a text file containing a list of URLs like these:
http://books.google.com
http://bing.com/translator
http://moz.com
I want to check the HTTP response codes of these URLs using bash and curl. The input file with the URLs must use Linux line endings (LF, not the DOS/Windows CRLF). If it does not, convert it first:
tr -d '\r' < dosfile.txt > output.file
# rename the converted file
mv output.file input.txt
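If you are not sure whether the file needs converting at all, a quick check (a small sketch, using the dosfile.txt name from the example above) is to look for carriage-return characters:

# Count lines that still contain a carriage return (CR); 0 means the file is already in Linux format.
grep -c $'\r' dosfile.txt

# Alternatively, "file" reports "with CRLF line terminators" for DOS-style files.
file dosfile.txt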
Here is a script that tests each URL and saves the results to a CSV file:
#!/bin/bash
rm -f out.csv   # remove previous results (comment this out when resuming an interrupted run)
START=1         # first line of input.txt to process; raise this to resume a broken run
i=0
while read -r LINE
do
  url=$(echo -e "$LINE" | sed 's/"/\\"/g')   # escape any double quotes in the URL for the CSV output
  echo $((++i))                              # print the current line number as a progress indicator
  [ $i -lt $START ] && continue              # skip lines already processed in a previous run
  code=$(curl -I -g "$url" 2>/dev/null | head -n 1 | awk '{print $2}')
  echo "\"$url\",\"$code\"" >> out.csv
done < ./input.txt   # input.txt must use Linux line endings
exit 0
…and voilà, done! out.csv now contains one "URL","code" pair per line.
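As a side note (this is a minimal sketch, not the script above), curl can also print just the numeric status code directly through its -w '%{http_code}' option, which avoids parsing the header line; the loop below assumes the same input.txt and out.csv names:

# -I sends a HEAD request like the script above, -o /dev/null discards the headers,
# -s silences the progress bar, and -w '%{http_code}' prints only the numeric status code.
while read -r url
do
  code=$(curl -sI -o /dev/null -w '%{http_code}' "$url")
  echo "\"$url\",\"$code\"" >> out.csv
done < ./input.txt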
You can interrupt the script at any time and later resume checking the rest of the URL list. In line 3 of the script, change START=1 to START=111 if the run stopped on line 111 of the input file (and comment out the rm -f out.csv line so the results already collected are kept).
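If you are not sure where the previous run stopped, counting the rows already written to out.csv tells you how many URLs were processed (a small helper, assuming the file names used above):

# Number of URLs already checked; resume from the next line.
done=$(wc -l < out.csv)
echo "resume with START=$((done + 1))"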