Re: Linux backup

Malcolm Kay wrote:

Some weeks ago I enquired here about 'dump' for
use with ext3 file systems, and was strongly advised
that Linux and 'dump' don't play well together.

Reading the arguments including Linus Torvalds's comment
' Right now, the cpio/tar/xxx solutions are definitely the best ones, and will work on multiple filesystems (another limitation of "dump"). Whatever problems they have, they are still better than the _guaranteed_(*) data corruptions of "dump".'
I was and am still convinced that 'dump' is not the way to
go under Linux.


So I've spent some time scripting to fold 'tar' backups of our recently acquired Linux machines into the backup system that uses 'dump' for our Unix machines.
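(For reference, the essence of such a per-filesystem tar run is something like the sketch below; the option set, snapshot file and paths are illustrative guesses only, not the actual script.)

  # Sketch of one per-filesystem, level-0, gzip-compressed tar run,
  # roughly analogous to "dump -0".  Options and paths are indicative,
  # not the real script; the snapshot file location is an assumption.
  LEVEL=0
  SNAP=/var/lib/backup/root.snar        # GNU tar incremental state file
  /bin/tar --one-file-system \
           --listed-incremental="$SNAP" \
           -czf "/data/pstar/root-$LEVEL-z.tgz" /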

Yesterday I ran this for the first time on one of the Linux
machines and found the backup aborted with the following error in the log file:
/bin/tar: /home/thi/OM5438/test.hir1: file changed as we read it
/bin/tar: Error exit delayed from previous errors
Backup /data/pstar/root-0-z.tgz FAILED at Wed 18 Aug 2004 15:29:20 CST


So 'dump' leads to corrupt backups, and 'tar' leads to aborted backups.
The abort message is undoubtedly correct -- the file in question is a
temporary file used during circuit simulation analysis. Individual
simulation runs can take from a few seconds up to a week, so it is not
practical to shut everything down for backup. (If it were, then
partitions could be unmounted for backup and the principal problem
cited against 'dump' would disappear.) Such files are not crucial to
the backup. If tar simply skipped them, or marked them as corrupt in
the archive while correctly preserving the rest of the file system,
that would be satisfactory -- but instead it aborts.



So is there some way to get 'tar' to continue when the odd file or two
exhibits this sort of problem? I know about the option
--ignore-failed-read ("don't exit with non-zero status on unreadable
files"), but from my reading of the man page it is not relevant to this
problem.
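One thing that may help (a sketch only, and worth checking against the tar version you actually have): GNU tar documents distinct exit statuses, with 1 meaning "some files differ" -- which covers "file changed as we read it" -- and 2 reserved for fatal errors. If your wrapper treats any non-zero status as FAILED, you could let status 1 through as a warning instead. The paths and messages below are illustrative, not taken from your script:

  # Sketch: treat "file changed as we read it" as a warning, not a failure.
  # Assumes GNU tar's documented convention: 0 = OK, 1 = some files
  # differ (includes files changed during the read), 2 = fatal error.
  /bin/tar -czf /data/pstar/root-0-z.tgz -C / home
  status=$?
  case $status in
    0) echo "Backup OK" ;;
    1) echo "Backup OK with warnings: some files changed while being read" ;;
    *) echo "Backup FAILED (tar exit status $status)"; exit 1 ;;
  esac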

Does 'cpio' have the same problem?


No. cpio version 2.5 does a lot of complaining about spurious errors, but it does not abort, and as far as I know it copies all files correctly. I use it for all my backups from Samba and Win2k shares.
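For concreteness, the shape of the pipeline I mean is roughly the following; the paths, archive name and logging here are examples, not copied from my actual setup:

  # Sketch: find feeds file names to cpio in copy-out mode, using the
  # portable "newc" header format, with the result gzipped.
  # --null / -print0 keep odd file names safe; paths are examples only.
  find /home -xdev -depth -print0 \
    | cpio -o --null -H newc 2>>/var/log/backup-cpio.err \
    | gzip -c > /data/pstar/home-0-z.cpio.gz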

Some have suggested 'amanda', but my understanding is that it is
just a wrapper that optionally uses 'dump' or 'tar', so that seems
to take us nowhere.

What else is out there for backup? I am not looking for a complete
backup system; just a reasonably reliable backup utility that can be
used so that the Linux machines can be incorporated into the backup
system that works well for our Unix machines.


Some advice please.

Malcolm Kay





HTH,
Peter Smith


