wget is the best ftp client, automated warez fetcher, automated porn fetcher and automated everything fetcher in the world. And it even compiles on HP-UX.

Cool things to try are the --recursive (-r) option on any "free sex" page, and to wget your own ftp server while having /usr/home/ftpd/backup/ as your current working directory.
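
A rough sketch of that second trick (the hostname is made up; -m is wget's mirror shortcut, i.e. recursive retrieval with infinite depth and timestamping):

cd /usr/home/ftpd/backup/
wget -m ftp://ftp.my.own.box/pub/
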
wget can give us warez and pr0n boys a particularly sweet feeling when you manage to latch onto a site's naming scheme and write a nice slurpfile full of URLs, all of the form:
http://vast.porn.site.com/people-fucking-with/stones/pic0001.jpg
http://vast.porn.site.com/people-fucking-with/stones/pic0002.jpg
http://vast.porn.site.com/people-fucking-with/stones/pic0003.jpg
...
and then fire up that bad boy with

wget -t 30 -i slurpfile

Just sit back and watch the stuff roll in ...
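
If typing the slurpfile out by hand sounds like work, a quick shell loop will knock it together for you (the count of 500 is plucked out of thin air, naturally; the -t 30 above just tells wget to retry each URL up to 30 times before giving up):

for i in `seq -f "%04g" 1 500`; do
    echo "http://vast.porn.site.com/people-fucking-with/stones/pic$i.jpg"
done > slurpfile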

As to baffo's example in the previous WU, a much easier way of doing that (no need for a slurpfile) would be:

wget -r -A jpg http://vast.porn.site.com/people-fucking-with/stones/

If you give it the root directory, it will grab everything under that directory. The -r option means recursive (it will follow links and go into subdirectories and stuff). The "-A jpg" means "accept (only) jpg files". Another useful parameter is -l# (hyphen lowercase-L, then a number), which specifies how many levels deep to go; e.g. -l2 follows hyperlinks two levels deep.
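
Putting those flags together, something like the following (same imaginary site) grabs only jpgs, stops two links deep, and the -np (--no-parent) keeps wget from wandering up out of the stones/ directory:

wget -r -l2 -np -A jpg,jpeg http://vast.porn.site.com/people-fucking-with/stones/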
