William Jiang

JavaScript, PHP, Node, Perl, LAMP Web Developer – http://williamjxj.com; https://github.com/williamjxj?tab=repositories

vi: wget images from site

Sometimes when we visit a website, we are impressed by its images and want to download them. Downloading them manually is tedious, so here is a quick way to grab them all on a Linux platform.

  • First, use Firefox’s ‘Web Developer’ extension to get the generated source: ‘View Source’ -> ‘View Generated Source’.
  • Save that source to a file in a Linux/Cygwin environment, such as $HOME/images/src_file.
  • Then process src_file to extract the images as follows:
# 0. Change into the images/ folder.
$ cd images/

# 1. Pull the lines that reference images out of the HTML source.
$ grep -i '\.png' src_file > img_file
$ grep -i '\.jpg' src_file >> img_file
$ grep -i '\.jpeg' src_file >> img_file
$ grep -i '\.gif' src_file >> img_file
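
If you prefer a single command, the four greps can be collapsed into one extended-regex grep; the pattern below is just a sketch covering the same extensions, so add others if your page uses them:
$ grep -Ei '\.(png|jpe?g|gif)' src_file > img_file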

# 2. Trim each line down to the bare image URL in vi.
$ vi img_file
:1,$s/.*http/http/g
:1,$s/".*//g

# 3. Use wget to download the images.
$ for i in $(cat img_file); do wget "$i"; done
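
Alternatively, wget can read the URL list straight from the file with its -i option, which avoids the shell loop entirely:
$ wget -i img_file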

The whole task is done within a minute, no matter how many images there are or how large they are. The key is wget, an excellent tool for retrieving files over HTTP, HTTPS and FTP, the most widely used Internet protocols.
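
For pages whose image markup is not generated by JavaScript, wget alone may be enough and the source-saving steps can be skipped: it can crawl a page one level deep and keep only files matching an accept list. This is only a sketch; the URL is a placeholder, and -H lets it fetch images hosted on other domains:
$ wget -r -l1 -nd -H -A png,jpg,jpeg,gif http://example.com/some-page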
