If the owners of a website want people to download many files, they might make it easy by setting up anonymous FTP, so that you can connect via ftp or some GUI tool and get all the files in a particular directory.
You can try wget, but the owners of the website might not want massive downloading, so they may have made it hard to connect via wget. See the next post by ofnuts. Anyway, it might work, so try it.
As an example, you can try the following command, which should download a PNG file illustrating a tool that makes USB boot drives by cloning iso files and compressed image files.
Code:
wget 'https://help.ubuntu.com/community/mkusb/pictures?action=AttachFile&do=get&target=13-toggle-USB-only_show-all-drives.png'
This works but gives you an ugly file name. The following command, using the -O option, saves the file under a nicer name
Code:
wget -O 13-toggle-USB-only_show-all-drives.png 'https://help.ubuntu.com/community/mkusb/pictures?action=AttachFile&do=get&target=13-toggle-USB-only_show-all-drives.png'
It is hard to get several files with one command when the files are stored like that, because you must first create a list of the target names and then loop over that list.
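As a sketch of that list-then-loop approach (the base URL is the mkusb wiki address from the example above; the file 'targets' and its single sample entry are placeholders you would fill in with your own target names):

```shell
#!/bin/sh
# Build full AttachFile-style URLs from bare target names.
# 'targets' holds one file name per line; here we seed it with
# the sample PNG name from the example above.
base='https://help.ubuntu.com/community/mkusb/pictures?action=AttachFile&do=get&target='
printf '%s\n' '13-toggle-USB-only_show-all-drives.png' > targets
while read -r name; do
  printf '%s%s\n' "$base" "$name"
  # To actually download each file with a nice local name, uncomment:
  # wget -O "$name" "$base$name"
done < targets
```

The loop only prints the URLs here; removing the comment in front of the wget line turns it into a real downloader.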
Another example shows that it can be easier when the files are listed plainly, as at the web page http://phillw.net/isos/linux-tools/mkusb/
Cut and paste the directory listing into an editor, remove everything except the file names, and you get
Code:
md5sum.txt.asc
mkUSB-quick-start-manual-74.pdf
mkUSB-quick-start-manual.pdf
mkusb
mkusb-old
mkusb-old-plus-minor-fix
mkusb74
which you save as the file 'list'. Then this command will download all those files
Code:
while read -r i; do wget "http://phillw.net/isos/linux-tools/mkusb/$i"; done < list
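As an alternative sketch (same phillw.net base URL as above; the sample 'list' entries are taken from the listing), you can turn the file 'list' into a list of full URLs with sed and then let wget read them all with its -i option:

```shell
#!/bin/sh
# Prepend the base URL to every line of 'list', producing 'urls'.
# Sample list with two of the names from the listing above; in
# practice you would use the real 'list' file you saved.
printf '%s\n' 'md5sum.txt.asc' 'mkusb74' > list
sed 's|^|http://phillw.net/isos/linux-tools/mkusb/|' list > urls
cat urls
# wget -i urls   # uncomment to actually fetch every URL in one run
```

The advantage over the while loop is that wget runs once for the whole batch instead of once per file.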