I have this client that posted several hundred files on their internal web site that I needed to download to my laptop for testing purposes. I started downloading them one by one and thought there had to be a better way to do this without installing some big application. With a little Google searching I found a great open source utility called wget that provides everything I needed from a command line (which had the added bonus of being able to easily script it).
An example of the command to recursively download a web site is shown below:
wget -l2 -r -k http://www.siteyouwanttoget.com/folder1
- The -l parameter tells wget how many levels deep to recurse (I only needed two levels in my example)
- The -r parameter tells wget to download recursively
- The -k parameter tells wget to convert absolute links into relative ones, so the downloaded copy has no dependencies on the original site
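Since one of the attractions was being able to script it, here is a minimal sketch of a batch file that runs the same command once per folder. The file name folders.txt and the folder names inside it are placeholders I made up for this example; substitute your own list and site URL:

@echo off
rem Read folder names from folders.txt (one per line, a hypothetical
rem list for this example) and mirror each one two levels deep,
rem converting links for local viewing.
for /f %%F in (folders.txt) do wget -l2 -r -k http://www.siteyouwanttoget.com/%%F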
So there you have it. One simple, small 162 KB EXE that does exactly what I needed (it also does a lot more than this). Have fun!