Originally Posted by ActionParsnip
Try an fsck from live CD / USB. You don't need to use fancy backup software to make a backup. Just copy important data to another drive (preferably external) or external storage system when you need to. I use a cron'd cp command which runs weekly. It's crude but absolutely works.
For most end-user data, this is absolutely true. Only a small number of exceptions exist, and I can't say whether any given user will hit them. But if they use ssh, scp, sftp, rsync, or any of the 50 other ssh-based connection tools, the answer is likely yes: a plain cp will break those, because ssh refuses to use key files with loose permissions.
For some end-user files in $HOME, a cp to NTFS will drop the permissions entirely, so at restore time those things will be broken.
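If restoring from an NTFS copy leaves ssh refusing your keys, putting the permissions back is quick. A minimal sketch, assuming the default key file names:
Code:
# ssh ignores private keys that anyone but the owner can read:
chmod 700 ~/.ssh
chmod 600 ~/.ssh/id_rsa ~/.ssh/authorized_keys
chmod 644 ~/.ssh/id_rsa.pub ~/.ssh/known_hosts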
For all system files, cp will drop owner, group, and permissions, and will probably miss some files due to access restrictions that keep end-users out - even if native Linux file systems are on the target disk.
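If you do use cp on system files, running it through sudo with -a at least preserves ownership and permissions when the target is a native Linux file system. A sketch; the target path is just an example:
Code:
# -a implies -r and --preserve=all; running as root lets cp read
# protected files and keep their owner and group:
sudo cp -a /etc /mnt/backup/etc-copy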
Having 1 copy, with no versions, of our files covers maybe 80% of what we need backups for. But if you can script a cp to run weekly, then you can script an rsync, which will be much quicker. And if you can script an rsync, you've gotten to where scripting rdiff-backup is basically the same thing, for a single end-user.
Code:
# a cp backup script:
cp -rp "$HOME" /mnt/backup/

# a simple rsync backup script:
rsync -av "$HOME" /mnt/backup/

# a simple, single-user, rdiff-backup script w/ 90 days of versioned backups (if run daily):
rdiff-backup --exclude-special-files "$HOME" /mnt/backup/
rdiff-backup --remove-older-than 90d --force /mnt/backup/
The first 2 scripts create a single copy and always add new files; they never delete files from the backup, so things you removed months ago linger there forever. That can be a problem.
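If a true mirror is wanted, rsync's --delete flag is the usual answer - it removes files from the backup that no longer exist at the source, so treat it with respect; a file deleted by mistake vanishes from the backup too:
Code:
# mirror $HOME exactly; anything deleted at the source is deleted in the backup:
rsync -av --delete "$HOME" /mnt/backup/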
rdiff-backup creates versioned backups, efficiently. The 2nd through 90th runs will be faster than rsync. You'll have 90 days of versioned backups, probably using about 1.3x the original amount of storage. Say your $HOME has 20G; then the 90 days of backups will take about 26G. Seems like a bargain to me. The --force option is there so the prune can remove multiple increments at once - everything older than 90 days, even if we did 5 backups in 1 day.
None of the scripts above are careful about excluding files that nobody needs in their backups - Firefox or Chromium cache files, for example. Best to start out capturing too much rather than not enough.
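As a starting point, rdiff-backup accepts --exclude patterns. A sketch; the cache paths below are typical but vary by browser version, so check your own $HOME first:
Code:
rdiff-backup --exclude-special-files \
    --exclude "$HOME/.cache" \
    --exclude "$HOME/.mozilla/firefox/*/Cache" \
    "$HOME" /mnt/backup/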
All of the backup methods above create a mirror directory tree, so to restore the last backup made we don't need any fancy tools. cp, nautilus, caja, or any other file manager can be used - and that includes the rdiff-backup area. Just don't change any files or directories manually inside /mnt/backup/; checksums for each file and directory are maintained there, and we wouldn't want to break those.
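Pulling back something older than the last run does need the tool, but it's one command. A sketch, assuming we want a file as it was 10 days ago; the file name is hypothetical:
Code:
# -r / --restore-as-of accepts a time like 10D (days) or a date like 2015-06-01:
rdiff-backup -r 10D /mnt/backup/Documents/report.odt /tmp/report-10-days-ago.odt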
Anyway, good, versioned backups of a single end-user's data aren't really THAT hard.
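Scheduling is the same trick quoted above. A sketch, assuming the rdiff-backup lines are saved in a hypothetical ~/bin/backup.sh, added with crontab -e:
Code:
# user crontab entry: run the backup script daily at 03:00
0 3 * * * "$HOME/bin/backup.sh"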
System backups require more thought and must be run as root/sudo to have access to protected files. What needs to be included and what absolutely must be excluded is a little different, too. In rdiff-backup, the --exclude-special-files option prevents trying to back up a FIFO (named pipe), device node, or other special file that could provide never-ending data. rsync has options for that too - rsync has hundreds of options.
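As a rough sketch of a system-level run - the exclude list is a typical starting point, not gospel, and the target path is only an example:
Code:
# pseudo file systems and the backup target itself must be excluded,
# or the backup recurses and balloons:
sudo rdiff-backup --exclude-special-files \
    --exclude /proc --exclude /sys --exclude /dev \
    --exclude /run --exclude /tmp --exclude /mnt \
    --exclude /media --exclude /var/tmp \
    / /mnt/backup/system/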