now, all the geeks in the crowd will be familiar with wget - a very handy tool for downloading files, websites, etc. from the net. well, here are two handy options that you might not know about:
1. downloading a list of files: if there's a bunch of files you want, put the urls in a text file, one per line, and save it. then do this:
wget --input-file=/path/to/yourfile.txt
wget will then fetch them all, one after the other.
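here's a quick sketch of building such a list - the host, directory, and file names are all made up for the example:

```shell
# hypothetical example: generate a list of sequentially numbered urls
# (the host and file names here are invented)
for i in 1 2 3; do
  echo "https://example.com/photos/img$i.jpg"
done > urls.txt

cat urls.txt
# then fetch them all with: wget --input-file=urls.txt
```

handy when the files are numbered - a little loop beats typing out each url by hand.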
2. throttling: you've got a super fast net connection - but you don't want to hog all of it. well, throttling is your friend: you can limit the bandwidth that wget uses, which is kinder to the webserver too:
--limit-rate=amount
e.g. wget --limit-rate=20k http://apple.com/something.dmg will cap the download at 20 KB/sec (use m for MB/sec).
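the two tips combine naturally, too - a sketch (printed rather than run here, since actually running it would download files; urls.txt is a hypothetical list file like the one in the first tip):

```shell
# hypothetical: fetch every url in the list, capped at 20 KB/sec overall
echo "wget --limit-rate=20k --input-file=urls.txt"
```

nice for leaving a big batch running overnight without strangling the connection for everyone else.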
so, there you go. isn't wget a great tool?