Googliser

a fast BASH multiple-image downloader


4th January 2021: This repo is inactive until a way can be found to request new pages in Google's "endless-page" of search results. This is beyond my limited web abilities, so I'm hoping someone out there knows how to do this. If so, please contact me and work on Googliser can resume!

This is a BASH script to perform fast image downloads sourced from Google Images, based on a specified search-phrase. It's a web-page scraper that sources a list of original image URLs and sends them to Wget (or cURL) to download in parallel. Optionally, it can then combine them using ImageMagick's montage into a single gallery image.

This is an expansion upon a solution provided by ShellFish and has been updated to handle Google's various page-code changes from April 2016 to the present.

Big thanks to MBtech, stevemart and dardo82 for their work on macOS compatibility and coding some great new script features. Cheers guys!


Installation

Via Wget:

$ bash <(wget -qO- git.io/get-googliser)

or cURL:

$ bash <(curl -skL git.io/get-googliser)

Workflow

  1. The user supplies a search-phrase and other optional parameters on the command-line.

  2. A sub-directory with the name of this search-phrase is created below the current directory.

  3. Google Images is queried and the results saved.

  4. The results are parsed and all image links are extracted and saved to a URL list file. Any links for YouTube and Vimeo are removed.

  5. The script iterates through this URL list and downloads the first [n]umber of available images. Up to 1,000 images can be requested. Up to 512 images can be downloaded in parallel (concurrently). If an image is unavailable, it's skipped and downloading continues until the required number of images have been downloaded.

  6. Optionally, a thumbnail gallery image is built using ImageMagick's montage into a PNG file (see below for examples).
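Steps 4 and 5 above can be sketched in plain shell. This is an illustrative sketch only, not googliser's actual code; the link list and hosts are made up:

```shell
# Hypothetical list of image links extracted from the saved results page
all_links='https://example.com/a.jpg
https://www.youtube.com/watch?v=abc123
https://example.com/b.png
https://vimeo.com/987654
https://example.com/c.gif'

n=2    # number of images requested (--number)

# Step 4: remove YouTube and Vimeo links from the URL list
url_list=$(printf '%s\n' "$all_links" | grep -Ev 'youtube\.com|vimeo\.com')

# Step 5: take the first n remaining URLs for download
wanted=$(printf '%s\n' "$url_list" | head -n "$n")

printf '%s\n' "$wanted"
```

The real script additionally retries failures and keeps walking down the list until the requested count is reached.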


Compatibility

googliser is fully supported on Manjaro & Ubuntu. Debian, Fedora Workstation and macOS may require some extra binaries. If you install it as per the installation notes above, all dependencies will be checked and installed.

If you prefer to install these manually:

Debian:

$ sudo apt install imagemagick

Fedora:

$ sudo yum install ImageMagick

macOS:

$ ruby -e "$(curl -fsSL git.io/get-brew)"
$ brew install coreutils ghostscript gnu-sed imagemagick gnu-getopt bash-completion
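If you'd rather verify dependencies yourself before running the script, a loop over `command -v` is enough. This is a minimal sketch, not the installer's actual check, and the tool list here is illustrative rather than googliser's full dependency list:

```shell
# Report any required external tool that is not on the PATH
missing=''
for tool in bash grep sed; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done

if [ -n "$missing" ]; then
    echo "missing:$missing"
else
    echo "all dependencies found"
fi
```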

Outputs

These sample images have been scaled down for easier distribution.

$ googliser --phrase "puppies" --title 'Puppies!' --number 25 --upper-size 100000 -G

[sample gallery image: puppies]

$ googliser -p "kittens" -T 'Kittens!' -n16 --gallery compact

[sample gallery image: kittens]

$ googliser -n 380 -p "cows" -u 250000 -l 10000 -SG

[sample gallery image: cows]


Usage

$ googliser -p [TEXT] -dEGhLqsSz [PARAMETERS] FILE,PATH,TEXT,INTEGER,PRESET ...

Allowable parameters are indicated with a hyphen followed by a single character, or in long form with two hyphens and the full option name. Single-character options can be concatenated, e.g. -dDEhLNqsSz. Parameters can be specified as follows:

Required:

-p [STRING] or --phrase [STRING]
The search-phrase to look for. Enclose whitespace in quotes e.g. --phrase "small brown cows"

Optional:

-a [PRESET] or --aspect-ratio [PRESET]
The shape of the image to download. Preset values are:

  • tall
  • square
  • wide
  • panoramic

-b [INTEGER] or --border-pixels [INTEGER]
Thickness of border surrounding the generated gallery image in pixels. Default is 30. Enter 0 for no border.

--colour [PRESET] or --color [PRESET]
The dominant image colour. Specify like --colour green. Default is 'any'. Preset values are:

  • any
  • full (colour images only)
  • black-white or bw
  • transparent or clear
  • red
  • orange
  • yellow
  • green
  • teal or cyan
  • blue
  • purple or magenta
  • pink
  • white
  • gray or grey
  • black
  • brown

-d or --debug
Put the debug log into the image sub-directory afterward. If selected, debugging output is appended to 'debug.log' in the image sub-directory. This file is always created in the temporary build directory. Great for discovering the external commands and parameters used!

-E or --exact-search
Perform an exact search only. Disregard Google suggestions and loose matches. Default is to perform a loose search.

--exclude-links [FILE]
Successfully downloaded image URLs will be saved into this file (if specified). Specify this file again for future searches to ensure the same links are not reused.
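The mechanics of this exclusion can be approximated with `grep`: treat each previously saved URL as a whole-line fixed string and filter it out of the next run's candidates. A sketch with made-up URLs, not googliser's internal code:

```shell
# Links saved by a previous run (via a hypothetical --exclude-links file)
exclude_file=$(mktemp)
printf '%s\n' 'https://example.com/a.jpg' 'https://example.com/b.jpg' > "$exclude_file"

# Candidate links found by the next search
candidates='https://example.com/a.jpg
https://example.com/new.jpg
https://example.com/b.jpg'

# -v invert match, -x match whole line, -F fixed strings (no regex)
fresh=$(printf '%s\n' "$candidates" | grep -vxF -f "$exclude_file")
printf '%s\n' "$fresh"

rm -f "$exclude_file"
```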

--exclude-words [STRING]
A comma separated list (without spaces) of words that you want to exclude from the search.

--format [PRESET]
Only download images encoded in this file format. Preset values are:

  • jpg
  • png
  • gif
  • bmp
  • svg
  • webp
  • ico
  • craw

-G
Create a thumbnail gallery.

--gallery=background-trans
Create a thumbnail gallery with a transparent background.

--gallery=compact
Create a thumbnail gallery in 'condensed' mode. No padding between each thumbnail. More efficient, but images are cropped. The default (non-condensed) leaves some space between each thumbnail, and each image retains its original aspect-ratio.

--gallery=delete-after
Create a thumbnail gallery, then delete the downloaded images. Default is to retain these image files.

-h or --help
Display the complete parameter list.

--input-links [FILE]
Put a list of URLs in a text file, then specify the file here. googliser will attempt to download the target of each URL; a Google search will not be performed. Images will be downloaded into the specified output path, or a path derived from a provided phrase or gallery title.

-i [FILE] or --input-phrases [FILE]
Put your search phrases into a text file then specify the file here. googliser will download images matching each phrase in the file, ignoring any line starting with a #. One phrase per line.
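The file format is simple enough to sketch: one phrase per line, with `#` lines skipped. This is an illustrative approximation of the parsing, not googliser's actual code, and the file contents are made up:

```shell
# Build a sample phrases file (name is hypothetical)
phrase_file=$(mktemp)
cat > "$phrase_file" <<'EOF'
# farm animals
small brown cows
# pets
puppies
EOF

# Keep every line that does not start with '#'
phrases=$(grep -v '^#' "$phrase_file")
printf '%s\n' "$phrases"

rm -f "$phrase_file"
```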

-l [INTEGER] or --lower-size [INTEGER]
Only download image files larger than this many bytes. Some servers do not report a byte file-size, so these will be downloaded anyway and checked afterward (unless --skip-no-size is specified). Default is 2,000 bytes. This setting is useful for skipping files sent by servers that claim to have a JPG, but send HTML instead.

-L or --links-only
Only get image file URLs, don't download any images. Default is to compile a list of image file URLs, then download them.

-m [PRESET] or --minimum-pixels [PRESET]
Only download images with at least this many pixels. Preset values are:

  • qsvga (400 x 300)
  • vga (640 x 480)
  • svga (800 x 600)
  • xga (1024 x 768)
  • 2mp (1600 x 1200)
  • 4mp (2272 x 1704)
  • 6mp (2816 x 2112)
  • 8mp (3264 x 2448)
  • 10mp (3648 x 2736)
  • 12mp (4096 x 3072)
  • 15mp (4480 x 3360)
  • 20mp (5120 x 3840)
  • 40mp (7216 x 5412)
  • 70mp (9600 x 7200)
  • large
  • medium
  • icon

-n [INTEGER] or --number [INTEGER]
Number of images to download. Default is 36. Maximum is 1,000.

--no-colour or --no-color
Runtime display in bland, uncoloured text. Default will brighten your day. :)

-o [PATH] or --output [PATH]
The output directory. If unspecified, the search phrase is used. Enclose whitespace in quotes.

-P [INTEGER] or --parallel [INTEGER]
How many parallel image downloads? Default is 64. Maximum is 512. Use 0 for maximum.
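Parallel downloading of this kind can be approximated with `xargs -P`. In this sketch `echo` stands in for the real Wget/cURL invocation so it runs anywhere; it is not googliser's actual download loop:

```shell
urls='https://example.com/1.jpg
https://example.com/2.jpg
https://example.com/3.jpg'

# -P 4 runs up to 4 jobs concurrently; {} is replaced with each URL
results=$(printf '%s\n' "$urls" | xargs -P 4 -I {} echo "fetched {}")
printf '%s\n' "$results"
```

With a real downloader in place of `echo`, raising -P increases concurrency at the cost of more simultaneous connections to the remote servers.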

-q or --quiet
Suppress stdout. stderr is still shown.

--random
Download a single random image. Use -n or --number to set the size of the image pool from which the random image is picked.

-R [PRESET] or --recent [PRESET]
Only get images published this far back in time. Default is 'any'. Preset values are:

  • any
  • hour
  • day
  • week
  • month
  • year

--reindex-rename
Downloaded image files are reindexed and renamed into a contiguous block. Note: this breaks the 1:1 relationship between URLs and downloaded file names.
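The reindexing idea can be sketched as renaming whatever files survive into a contiguous, zero-padded sequence. Directory and file names here are hypothetical, and this is not googliser's actual renaming code:

```shell
# Simulate a download directory with gaps left by failed downloads
workdir=$(mktemp -d)
touch "$workdir/image-003.jpg" "$workdir/image-007.jpg" "$workdir/image-019.jpg"

# Renumber files contiguously; the sorted glob guarantees each target
# index is never higher than the source index, so no name collisions
i=0
for f in "$workdir"/image-*.jpg; do
    i=$((i + 1))
    mv "$f" "$workdir/$(printf 'image-%03d.jpg' "$i")"
done

renamed=$(ls "$workdir")
printf '%s\n' "$renamed"
rm -rf "$workdir"
```

As the option's note says, after this the file names no longer correspond 1:1 with entries in the URL list.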

-r [INTEGER] or --retries [INTEGER]
Number of download retries for each image. Default is 3. Maximum is 100.

--safesearch-off
Disable Google's SafeSearch content-filtering. Default is enabled.

-s or --save-links
Put the URL results file into the image sub-directory afterward. If selected, the URL list will be found in 'download.links.list' in the image sub-directory. This file is always created in the temporary build directory.

--sites [STRING]
A comma separated list (without spaces) of sites or domains to which the image search is restricted.

-S or --skip-no-size
Some servers do not report a byte file-size, so this parameter will ensure these image files are not downloaded. Specifying this will speed up downloading but will generate more failures.

--thumbnails [STRING]
Specify the maximum dimensions of thumbnails used in the gallery image. Width-by-height in pixels. Default is 400x400. If also using condensed-mode -C --condensed, this setting de
