Re: Deduplication data for CentOS?




Deduplication with ZFS takes a lot of RAM.

I would not yet trust any of the Linux ZFS projects for data that I
wanted to keep long term.
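To put "a lot of RAM" in rough numbers: a back-of-the-envelope sketch, assuming the commonly cited figure of roughly 320 bytes of in-core dedup table (DDT) entry per unique block and the default 128 KiB recordsize (both are rules of thumb, not from this thread):

```python
# Rough estimate of ZFS dedup table (DDT) RAM needs.
# ASSUMPTIONS (not from the thread): ~320 bytes of RAM per unique
# block is the commonly cited in-core DDT entry size, and the pool
# uses the default 128 KiB recordsize with all blocks unique.

def ddt_ram_bytes(pool_bytes, recordsize=128 * 1024, entry_bytes=320):
    """Rough RAM needed to hold the whole dedup table in core."""
    unique_blocks = pool_bytes // recordsize
    return unique_blocks * entry_bytes

one_tib = 2**40
print(ddt_ram_bytes(one_tib) / 2**30, "GiB of DDT per TiB of unique data")
```

Under these assumptions, each TiB of unique 128 KiB blocks wants about 2.5 GiB of RAM just for the dedup table; smaller recordsizes push that up fast.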

On Mon, Aug 27, 2012 at 8:26 AM, Les Mikesell <lesmikesell@xxxxxxxxx> wrote:
> On Mon, Aug 27, 2012 at 9:23 AM, John R Pierce <pierce@xxxxxxxxxxxx> wrote:
>> On 08/27/12 4:55 AM, Rainer Traut wrote:
>>> Is there any working solution for deduplication of data for CentOS?
>>> We are trying to find a solution for our backup server, which runs a
>>> bash script invoking xdelta(3). But having this functionality in the
>>> fs would be much friendlier...
>>
>> BackupPC does exactly this. It's not a generalized solution for
>> deduplicating a file system; instead, it's a backup system, designed
>> to back up multiple targets, that implements deduplication on the
>> backup tree it maintains.
>
> Not _exactly_, but maybe close enough, and it is very easy to install
> and try.  BackupPC will use rsync for transfers and thus only uses
> bandwidth for the differences, but it uses hardlinks to deduplicate
> the storage.  It will find and link duplicate content even from
> different sources, but the complete file must be identical.  It does
> not store deltas, so large files that change even slightly between
> backups end up stored as complete copies (with optional compression).
>
> --
>    Les Mikesell
>      lesmikesell@xxxxxxxxx
> _______________________________________________
> CentOS mailing list
> CentOS@xxxxxxxxxx
> http://lists.centos.org/mailman/listinfo/centos

