Re: Back up entire system

In article <MSGID_7223=3a2300=2f101=40racenet.org_464ee2f7@fidonet.org>,
Naota <sean@tcob1.net> wrote:
| Originally to: All
| 
| On Tue, 25 Nov 2003 11:53:55 -0800, John C. Linford wrote:
| 
| > Thanks.  I'll give that a try, but there could be a problem:  The files
| > generated by dd are huge, even after gzip.  How can I break them apart
| > so they will span multiple backup disks?  Say 700MB chunks for CDs?
| 
| I'd suggest using the split utility:
| 
| split -b 700m --verbose my_bigass_backup.tar.gz my_bigass_backup.tar.gz.
| 
| Rejoin them with cat.
| 
| cat my_bigass_backup.tar.gz.a* > my_bigass_backup.tar.gz

As another approach, I split my backups into blocks of a fixed maximum size.
I start with a file listing the filenames and their sizes, and run a perl
script I wrote to generate a series of list files, each holding names
whose sizes add up to no more than the max. It can also add a fixed
overhead per file, to allow for the per-file overhead of tar, cpio, etc.
You can then write CDs, DVDs, or tapes as suits your needs.
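
Something along these lines (just a minimal sketch of the idea, not my
actual script; the "size<TAB>filename" input format and the vol.NNN
output names are only placeholders):

#!/usr/bin/perl
# Sketch: read "size<TAB>filename" lines on stdin and write volume list
# files (vol.001, vol.002, ...) whose file sizes, plus a fixed per-file
# overhead, add up to no more than a maximum volume size.
use strict;
use warnings;

my $max      = 700 * 1024 * 1024;   # bytes per volume (700MB CD)
my $overhead = 1024;                # assumed per-file tar/cpio overhead

my ($vol, $used, $out) = (0, $max, undef);

while (my $line = <STDIN>) {
    chomp $line;
    my ($size, $name) = split /\t/, $line, 2;
    my $cost = $size + $overhead;
    if ($used + $cost > $max) {      # this file won't fit: start a new volume
        $vol++;
        my $file = sprintf "vol.%03d", $vol;
        open $out, '>', $file or die "$file: $!";
        $used = 0;
    }
    print $out "$name\n";            # a file bigger than $max still gets
    $used += $cost;                  # its own (oversized) volume
}

You could feed it with something like "find /home -type f -printf
'%s\t%p\n'" (GNU find), then hand each vol.NNN list to tar, e.g.
"tar -cf vol001.tar -T vol.001", and write each archive to its own
disc or tape.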

It all started with backing up a 20MB 3B1 to 400KB floppies LONG ago.
-- 
bill davidsen <davidsen@tmr.com>
  CTO, TMR Associates, Inc
Doing interesting things with little computers since 1979.
