I'm trying to find a way to use wget to log the list of redirected website URLs to one file. For example:
www.website.com/1234
now redirects to www.newsite.com/a2as4sdf6nonsense
and
www.website.com/1235
now redirects to www.newsite.com/ab6haq7ah8nonsense
Wget does print the redirect, but doesn't log the new location. This is what I get in the terminal:
HTTP request sent, awaiting response... 301 Moved Permanently
Location: http://www.newsite.com/a2as4sdf6
...
I would just like to capture that new URL to a file.
I was using something like this:
for i in `seq 1 9999`; do
wget http://www.website.com/$i -O output.txt
done
But this writes the source code of each page to that file. I want to capture only the redirect info, and I'd like to append a new line to the same output file for each URL it retrieves.
I would like the output to look something like:
www.website.com/1234 www.newsite.com/a2as4sdf6nonsense
www.website.com/1235 www.newsite.com/ab6haq7ah8nonsense
...
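For what it's worth, here's a sketch of the kind of thing I imagine might work, assuming wget's -S/--server-response and --max-redirect options behave the way I think they do (I haven't verified this end to end; the URL and output filename are just placeholders from my example):

```shell
# Untested sketch: --server-response prints the response headers to
# stderr, --max-redirect=0 makes wget stop at the first redirect
# instead of following it, and sed extracts the Location header.
for i in $(seq 1 9999); do
  url="http://www.website.com/$i"
  new=$(wget --server-response --max-redirect=0 -O /dev/null "$url" 2>&1 \
        | sed -n 's/^ *Location: *//p')
  echo "$url $new" >> redirects.txt
done
```

The idea is that each pass appends one "old-URL new-URL" line to redirects.txt, which is the format I'm after. I'm not sure this is the cleanest approach, though.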