Problem with remote backup using buffer

Hi All,

I am trying to do a remote backup between two Linux servers connected by
a Cat 6 cable over gigabit interfaces. Without buffering, I can do the
backup easily with the following command:

rdump -0auqn -L /sdc1 -f operator@xxxxxxxxxxx:/dev/nst0 /sdc1

However, the achievable throughput is only around 16 MB/s, which is very
low compared to the throughput my backup system supports. I came across
the buffering technique developed at Imperial College London, which is
supposed to make remote backups faster and more reliable. However,
whenever I try to use it I get a buffer write error. This is the command
I am using:

dump -0auqn -L /sdc1 -f - /sdc1 | rsh -l operator 192.168.1.2 "buffer -p 75 -o /dev/nst0"

The error message is:

buffer (writer): write of data failed: Bad address
bytes to write=10240, bytes written=-1, total written          0K
DUMP: Broken pipe
DUMP: The ENTIRE dump is aborted.
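Since "Bad address" (EFAULT) is reported by buffer's writer, I suspect the
write to the tape device rather than the network. To narrow it down, I was
planning to check on the remote host whether /dev/nst0 accepts a raw write
of that size at all, independent of buffer (just a sketch; bs=10240 matches
the failing write size above):

```shell
# On the remote host (192.168.1.2): try a single raw 10 KiB write to the
# drive, then query its status. If this also fails, the problem is in the
# tape driver / block-size setup, not in buffer itself.
dd if=/dev/zero of=/dev/nst0 bs=10240 count=1
mt -f /dev/nst0 status
```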

I am not sure what is going wrong. Any help in this regard would be
highly appreciated.
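If it turns out to be block-size related, one variant I was thinking of
trying is buffer's blocked-output mode, so writes to the tape are padded
to fixed blocks. This is a guess, not a confirmed fix; the -s/-m values
are arbitrary and the exact option syntax should be checked against the
buffer(1) man page:

```shell
dump -0auqn -L /sdc1 -f - /sdc1 | \
  rsh -l operator 192.168.1.2 "buffer -s 32k -m 8m -p 75 -B -o /dev/nst0"
```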

Thanks in advance.

JS
-
To unsubscribe from this list: send the line "unsubscribe linux-config" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
