Re: sync files (very huge directory of files)
Thanks so much for the rsync suggestion.
I tried out rsync through the grsync GUI and it worked perfectly. What amazed me was the speed, even on the initial sync: it went through the 73,000 files (36,500 in each directory) in only 40 seconds (that was according to rsync's own timer, though it sure felt more like 25). On the second run, after the initial sync, it took only 6 seconds. (For anyone reading this, keep in mind that I was syncing to a nearly identical folder, not an empty one.)
The simulation (dry-run) mode was cool because I could see what rsync would do before actually executing anything. (It was also a way to "try out" the program before committing.)
Also, thanks for the other tips, because I think I'll be using the --backup option for my programming code, which I've yet to sync. I don't know if I can do that in the GUI, but I can drop back to the terminal easily. (There's a "make backups" checkbox in the advanced options of the GUI, but I haven't checked whether it's the same as the --backup option from the command line, though I suspect it is.)
Thanks again and thanks a heap!
edited: changed 100,000 to 73,000 (I had forgotten that I did a major clean-up and also used fslint to eliminate hordes of duplicate files before I tested rsync/grsync)
Last edited by ClarkePeters; May 11th, 2010 at 07:46 PM.
Reason: clear up misinformation
the power of X: xhtml, xforms, xslt, xml