Problem backing up millions of files


 



I have a Linux FTP server that I need to back up. It has 4.5 million files
and uses 900 GB of disk space.
The backup client takes around 35 hours to do an incremental backup. A simple
rsync to an NFS mount (on a Data Domain appliance) takes around 25 hours. The
system takes far too long just to enumerate the files in the directory before
rsync even starts transferring, even though few files have changed.
Does anyone have an idea of a better way to back up 4.5 million files?
I am looking for any rsync option that can speed up the process.
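One approach worth sketching (not from the original post, just a common workaround): skip rsync's full tree walk entirely by using find's -newer test against a timestamp file left by the previous run, then feed only the changed files to rsync via --files-from. The paths SRC, DEST, and STAMP below are illustrative assumptions.

```shell
#!/bin/sh
# Sketch: incremental backup without scanning all 4.5M files on every run.
# SRC, DEST, and STAMP are placeholder paths -- adjust for the real setup.
SRC=/srv/ftp
DEST=backuphost:/backups/ftp
STAMP=/var/run/ftp-backup.stamp

# Collect only files modified since the last recorded run.
find "$SRC" -type f -newer "$STAMP" > /tmp/changed.list

# Transfer just those files; --files-from avoids rsync walking the
# whole tree. Entries in the list are absolute, so the source root is /.
rsync -a --files-from=/tmp/changed.list / "$DEST"

# Record this run's time for the next incremental pass.
touch "$STAMP"
```

The trade-off: this catches new and modified files quickly, but unlike a full rsync it will not notice deletions, so an occasional full sync is still needed.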
 
Thank you,

 


Jai Rangi


 

-- 
redhat-list mailing list
unsubscribe mailto:redhat-list-request@xxxxxxxxxx?subject=unsubscribe
https://www.redhat.com/mailman/listinfo/redhat-list

