Best practices for copying lots of files machine-to-machine




An entire filesystem (~180 GB) needs to be copied from one local Linux machine to another. Since both systems are on the same local subnet, there's no need for encryption.

I've done this sort of thing a few times in the past, in different ways, but I wanted to get input from others on what's worked best for them.

One consideration is that the source filesystem contains quite a few hardlinks and symlinks, and of course I want to preserve these, along with all timestamps, ownerships, and permissions. Maintaining the integrity of this metadata, and of the files themselves, is of course the top priority.

Speed is also a consideration, but having done this before, I find it even more important to have a running progress report or log, so I can see how the session is proceeding, estimate how much longer it will take, and spot it if something hangs.

One other consideration: There isn't much disk space left on the source machine, so creating a tar file, even compressed, isn't an option.
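Since encryption isn't needed, one thing I've been considering is streaming a tar archive straight over the wire with netcat, so no intermediate file ever touches the source disk. A rough sketch (the -l syntax varies between netcat variants, port 7000 is arbitrary, and pv is an optional extra for a byte count / rate display):

```shell
# On the receiver, start listening first; -p may or may not be needed
# depending on the netcat variant:
nc -l -p 7000 | tar -xpf - -C /dstfs

# On the sender: tar preserves hard links, symlinks, perms, and mtimes,
# and the archive is streamed, never written to local disk.
tar -cf - -C /srcfs . | pv | nc destbox 7000
```

The downside versus rsync is that an interrupted run can't be resumed, and there's no per-file verification, so it's more attractive for a one-shot bulk copy than for anything that might need a retry.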

What relevant methods have you been impressed by?



_______________________________________________
CentOS mailing list
CentOS@xxxxxxxxxx
https://lists.centos.org/mailman/listinfo/centos


