“Mini How To” Download an entire site with wget.
Ever wanted to download every image on a website? Like wallpapers, but you don’t want to spend the time downloading them one by one?
There is an easy way using Terminal.
The website myhdwallpaper.com has 9000+ HD wallpapers, but I really don’t have the time to click through every single one and save the ones I want by hand.
The simple solution is a recursive download (a plain wget would only fetch the front page; the -r flag tells it to follow the links too):
wget -r http://myhdwallpaper.com
This would download the entire website in a flash (depending on your internet connection). But what if we want to download it without the webmaster noticing right away? (A good webmaster will notice no matter what.) In that case we can use the following command:
wget --wait=20 --limit-rate=20K -r -p -U Mozilla http://myhdwallpaper.com
Let’s break that command down:
--wait=20
will make wget wait 20 seconds before downloading a new file.
--limit-rate=20K
will limit the download rate to 20 kilobytes per second.
-r
download recursively, following links into the site’s subpages and directories.
-p
get all images, stylesheets, and other files needed to properly display each HTML page (an image-only variation is shown after this list).
-U Mozilla
identify as “Mozilla” in the User-Agent header instead of the default Wget/version string, so the requests look like they come from a web browser.
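Since the whole point was grabbing wallpapers, you can go one step further and keep only the image files. Here’s a sketch: the extension list is an assumption, so adjust -A (short for --accept) to whatever formats the site actually serves:
wget --wait=20 --limit-rate=20K -r -U Mozilla -A jpg,jpeg,png http://myhdwallpaper.com
wget still has to fetch the HTML pages to discover the links, but it deletes anything that doesn’t match the accept list once it’s done parsing. Add -nd if you’d rather have every file land in one flat directory instead of a copy of the site’s folder structure.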
Check out:
man wget
and
wget --help
for more options.
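And if you’re hunting for one specific option, piping the help text through grep beats scrolling through the whole man page:
wget --help | grep -i wait
That lists every wait-related option, including --random-wait, which varies the delay between requests.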