
View Full Version : help regarding wget



anuajay1988
February 3rd, 2012, 09:58 AM
Sir, I am downloading some files sequentially from an FTP server with the help of a shell script. Sometimes a particular file has not been uploaded yet at that time and will only be uploaded after a few minutes.

At present the script skips that file and proceeds to download the rest of the files.
I want it to wait or retry until that file is uploaded, and then proceed further. Thanks.

Lars Noodén
February 3rd, 2012, 12:50 PM
You'll need to write a script that knows the file names and downloads them one by one, waiting if one is unavailable. You can do that fairly easily in bash or perl. However, wget by itself will not be enough in that regard. It will need to be wrapped in a script.
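A minimal sketch of such a wrapper, assuming the URLs come one per line on standard input. The fetch_retry name and the FETCH_CMD / RETRY_DELAY variables are invented here for illustration (they let you swap out wget or the delay), not part of wget itself:

```shell
#!/bin/bash
# fetch_retry: keep invoking the fetch command on one URL until it
# succeeds.  FETCH_CMD and RETRY_DELAY are hypothetical knobs,
# defaulting to "wget -q" and 60 seconds.
FETCH_CMD=${FETCH_CMD:-"wget -q"}
RETRY_DELAY=${RETRY_DELAY:-60}

fetch_retry() {
    local url=$1
    until $FETCH_CMD "$url"; do
        echo "still waiting for $url ..." >&2
        sleep "$RETRY_DELAY"
    done
}

# read URLs one per line from stdin and fetch each in order,
# blocking on any file that isn't available yet
while read -r url; do
    fetch_retry "$url"
done
```

The important point is that the retry loop wraps wget: wget's own --tries option only retries transient network errors, not a file that doesn't exist on the server yet.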

ofnuts
February 3rd, 2012, 03:08 PM
You'll need to write a script that knows the file names and downloads them one by one, waiting if one is unavailable. You can do that fairly easily in bash or perl. However, wget by itself will not be enough in that regard. It will need to be wrapped in a script.
A more efficient way is to skip the file that isn't available, process the next one, and at the end of the pass, compare the list of URLs with the list of obtained files, and rerun for the missing file(s).

anuajay1988
February 6th, 2012, 09:15 AM
A more efficient way is to skip the file that isn't available, process the next one, and at the end of the pass, compare the list of URLs with the list of obtained files, and rerun for the missing file(s).
Can I get the script for that?

ofnuts
February 6th, 2012, 11:23 AM
Can I get the script for that?
Something like this:

#! /bin/bash

function localfile
{
    # extracts the local file name from a URL
    # (a barebones version; you may have to roll your own)
    basename "$1"
}

while read -r url
do
    # echo the URL if the corresponding local file isn't there
    [[ ! -f $(localfile "$url") ]] && echo "$url"
done
called as:

missed <original.lst >missing-ones.lst
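Putting the pieces together, the whole thing can be driven by a loop that repeats download passes until nothing is missing. This is only a sketch: the filter_missing and download_until_done names and the todo.lst scratch file are placeholders invented here, and wget's -i flag is used to read URLs from a list file:

```shell
#!/bin/bash
# filter_missing: read URLs on stdin, print only those whose
# corresponding local file does not exist yet (same idea as the
# "missed" script above, inlined as a function).
filter_missing() {
    while read -r url; do
        [[ ! -f $(basename "$url") ]] && echo "$url"
    done
}

# download_until_done: run a wget pass over a URL list, then rerun
# on whatever is still missing, until the list is empty.
# "$1" is the original list file; todo.lst is a scratch file.
download_until_done() {
    cp "$1" todo.lst
    while [ -s todo.lst ]; do
        wget -i todo.lst                # one pass; unavailable files are skipped
        filter_missing <todo.lst >still.lst
        mv still.lst todo.lst
        [ -s todo.lst ] && sleep 60    # give the server time before retrying
    done
    rm -f todo.lst
}

# usage: download_until_done original.lst
```

Compared with retrying each file in place, this keeps the downloads flowing and only circles back for the stragglers at the end of each pass.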

anuajay1988
February 15th, 2012, 05:44 AM
It didn't work...