
I am migrating a PHP site, and someone has hardcoded all the image links into a function call: display_image('http://whatever.com/images/xyz.jpg').

I can easily use TextMate to convert all of these to http://whatever.com/images/xyz.jpg.

But I also need to bring the images down with them, for example with wget -i images.txt.

So I need to write a bash script that compiles images.txt with all the links, to save me doing this manually, because there are a lot of them!

Any help you can give on this is greatly appreciated.


2 Answers


I found a one-liner that should work (replace index.php with your source file):

wget $(grep -P -o 'http:(\.|-|\/|\w)*\.(gif|jpg|png|bmp)' index.php)
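
If you want the intermediate images.txt from the question, here is a rough sketch along the same lines (assuming the hardcoded calls live in *.php files under the current directory; adjust the glob and the regex to your layout):

#!/usr/bin/env bash
# Pull every image URL out of the PHP sources into images.txt,
# then fetch them all in one pass.
grep -rhPo --include='*.php' 'http:(\.|-|\/|\w)*\.(gif|jpg|png|bmp)' . \
    | sort -u > images.txt
wget -i images.txt

The sort -u deduplicates, so each image is only downloaded once even if it is referenced from several pages.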
answered 2012-07-24T08:21:03.903

If you wget the file via a web server, won't you get the output of the PHP script? That will contain img tags, which you can extract using xml_grep or some such tool.
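
For instance, a sketch with xmllint (http://whatever.com/page.php is a made-up URL standing in for one of your pages, and this assumes the src attributes hold absolute http URLs):

# Fetch the rendered HTML, pull out every <img> src, then download them.
wget -q -O page.html http://whatever.com/page.php
xmllint --html --xpath '//img/@src' page.html 2>/dev/null \
    | grep -o 'http[^"]*' > images.txt
wget -i images.txt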

answered 2012-07-24T08:11:52.763