I need to back up a client's server running Ubuntu Server 12.04 (I think...) to an external drive. Someone in the past set up a command for them that tars the entire drive to a backup file, which they then move to a backup drive. I'm asking for help on a few items:
- Please review the backup steps they take now and tell me whether they are sufficient for a bare-metal restore, assuming the server's hard drive fails or we move it to another machine altogether. I'm trying to test that now, but at 100GB the backup file takes some time to move around and attempt a restore. (more below)
- I'm going to attempt restoring this to a virtual machine in VirtualBox. That should work, right?
- What is the best long-term backup strategy for this? My priority is minimal downtime, though a few hours is alright. My feeling is I should be able to set up incremental backups using three or four external drives they'd cycle off-site weekly. A RAID 1 mirror is on my list too. So, what's the best way to set up incremental backups on Ubuntu Server? Note: I have a test server here and tried Bacula, but couldn't figure it out even after reading a bunch of tutorials. Does this or another application do what I want, especially considering multiple drives and incremental backups? If not, what scheme do you all recommend?
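From my reading, the usual incremental scheme is rsync snapshots with hard links. Is something like this the right direction? (This is just a sketch demoed on throwaway directories; in real use SRC would be the data to protect and BACKUP_ROOT a directory on whichever external drive is plugged in.)

```shell
# Throwaway demo of hard-link snapshots; SRC and BACKUP_ROOT are stand-ins.
SRC=$(mktemp -d)
BACKUP_ROOT=$(mktemp -d)
echo "unchanged" > "$SRC/a.txt"

snapshot() {
    # One named snapshot; unchanged files are hard-linked against the previous
    # snapshot, so each run only stores what actually changed.
    if [ -L "$BACKUP_ROOT/latest" ]; then
        rsync -a --delete --link-dest="$BACKUP_ROOT/latest" "$SRC/" "$BACKUP_ROOT/$1/"
    else
        rsync -a "$SRC/" "$BACKUP_ROOT/$1/"
    fi
    rm -f "$BACKUP_ROOT/latest"
    ln -s "$BACKUP_ROOT/$1" "$BACKUP_ROOT/latest"
}

snapshot week1
echo "new file" > "$SRC/b.txt"
snapshot week2
# week2/a.txt is a hard link to week1/a.txt (same inode), so the second
# snapshot costs almost no extra space.
```

Each external drive would hold its own BACKUP_ROOT, so cycling drives weekly just means each drive's own snapshot chain gets updated when it's plugged in. I gather rsnapshot packages up this same scheme, if scripting it by hand is a pain.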
More info on the first item:
Here are the steps they currently take to make a backup:
- Plug in backup drive, navigate to Webmin, mount the external drive under /mnt
- through Putty (remote from another machine):
Code:
cd /mnt
tar cvpzf backup.tgz --exclude=/proc --exclude=/lost+found --exclude=/backup.tgz --exclude=/mnt --exclude=/sys /
- when complete, they copy the backup.tgz file to a folder named by date
- unmount drive and walk off-site with it.
So, does this look good enough for now? They've never tested a restore, and that's what I'll try once I have this file copied. Any advice for setting this up in a VM as if it were a bare-metal restore?
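For that restore test, the mechanics I plan to try look like the snippet below. It's demoed here on throwaway directories; on the real VM the destination would be the freshly partitioned root filesystem mounted from a live CD, and the device names in the comments are assumptions.

```shell
# Demo of the restore mechanics on throwaway directories. On the real VM,
# DEST would be the new root partition mounted from a live CD, e.g.:
#   mount /dev/sda1 /mnt/restore        # device name is an assumption
SRC=$(mktemp -d)            # stands in for the backed-up system
DEST=$(mktemp -d)           # stands in for the mounted new root
mkdir -p "$SRC/etc"
echo "myserver" > "$SRC/etc/hostname"
tar cpzf "$SRC.tgz" -C "$SRC" .

# Extract preserving permissions; --numeric-owner avoids UID/GID remapping
# if the live CD's user database differs from the server's:
tar xpzf "$SRC.tgz" -C "$DEST" --numeric-owner
# Recreate the directories the backup command excluded:
mkdir -p "$DEST/proc" "$DEST/sys" "$DEST/mnt" "$DEST/lost+found"
# On the real disk, finish by reinstalling GRUB so the VM can boot, e.g.:
#   grub-install --boot-directory=/mnt/restore/boot /dev/sda
```

Does that match what people here do for a bare-metal restore from a tar backup, or am I missing steps?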
Thanks for any tips or advice.