Hi,
I'm doing backups with Acronis 2012 Home, but on one machine it won't compress: the image size equals the data size.
Is there any way to defragment an Ubuntu 10.04 ext4 machine before creating a backup image?
I've tried shutdown -F -r now (which forces an fsck on reboot), but it doesn't help.
Have a look at PING for backing up entire disks. It copies only the in-use areas and compresses the backup. It runs off a CD using an embedded Linux, so all system files can be backed up as well. It backs up to a local disk or over the network, and speeds are quite acceptable, depending on CPU and device speeds. You could also just boot the embedded Linux and use tar/gzip to back up specific directories.
It is no good for backing up 24/7 servers as the machine has to be booted off the CD.
Good news is that it is totally free and works for Linux and Windows.
See here:
ping.windowsdream.com/
The amount of compression will vary depending on the type of file: already-compressed files will not shrink much. My backups of mixed data see about 30% compression. This may sound trite, but did you check that compression is turned on for that one client?
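To illustrate the point about file types, here is a quick sketch using gzip (not Acronis, but deflate-style compression behaves similarly): highly repetitive data shrinks dramatically, while random or already-compressed data barely shrinks at all. File names under /tmp are arbitrary examples.

```shell
# 1 MB of highly compressible data vs. 1 MB of incompressible data.
head -c 1000000 /dev/zero    > /tmp/zeros.bin
head -c 1000000 /dev/urandom > /tmp/random.bin

# Compress both, keeping the originals.
gzip -c /tmp/zeros.bin  > /tmp/zeros.bin.gz
gzip -c /tmp/random.bin > /tmp/random.bin.gz

# The zeros compress to about a kilobyte; the random data stays near 1 MB.
ls -l /tmp/zeros.bin.gz /tmp/random.bin.gz
```

If your image is full of already-compressed content (media files, archives, or encrypted data), near-zero compression is expected behaviour rather than a fault.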
As to defragging, Linux does not generally need to be defragmented, and I doubt a tool is available. I was told many years ago that when you copy a file on Linux it is defragmented during the copy. I don't know whether that was true then or is true now, but I do know that in 8 years of running Linux I have never defragged a computer.
Any file system that allocates files automatically will fragment over time. Linux reduces fragmentation to some extent by allocating the closest free block when one is required. However, if a file grows over a long period, like a log file, many other blocks can be inserted in between, causing fragmentation. This improves from ext2 to ext3 to ext4, but can never be eliminated.
Defrag tools exist for ext2 but not for ext3 or ext4 (which are fairly safe from fragmentation).
All non-mainframe OSes suffer from the lack of file-allocation utilities that allow pre-allocation of files and thus full control over disk organisation. On mainframes a lot of time is spent placing related files in specific relationships to each other, at specific disk addresses, to improve performance. Because the in-use as well as future extents are pre-allocated, mainframe file systems do not fragment.
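As a side note, Linux does offer per-file pre-allocation, even if not the whole-disk placement control described above: on ext4 the fallocate(1) utility from util-linux reserves a file's extents up front, so later writes into the file are less likely to fragment. A small sketch (the path is illustrative):

```shell
# Reserve 10 MiB in one go; on ext4 the kernel tries to allocate the
# extents contiguously before any data is ever written into the file.
fallocate -l 10M /tmp/prealloc.img

# The full size is reserved immediately:
ls -lh /tmp/prealloc.img
```

This is commonly used for databases, VM disk images, and download clients for exactly this reason.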
A simple (but time-consuming) way to defrag would be to tar all the files to a different disk, delete them, and then untar back to the original disk, since this rewrites each file contiguously.
First, thanks a lot for all the answers.
Perhaps this problem appeared after migrating (converting) ext3 to ext4:
http://www.debian-administration.org/articles/643
I've done a fresh Ubuntu installation with ext4 and now it works well.
There is also a defrag tool for Ubuntu 12.04 and newer: https://apps.ubuntu.com/cat/applications/hdd-ranger/
Sorry for my bad English.
Quite interesting.
Has anyone already tried this defrag tool?