git-repack & big files

Hello,

first, I do know git is not optimized for big files, and that's fine.
But on my machine with 3 GB of RAM it is able to successfully back up
my home directory, which contains, among other things, several files of
several hundred megabytes each. And I like that a lot.

Since it does perfectly well what it is not optimized for... I then
wonder why it does not do what it declares: if I run git-repack with
the --window-memory parameter set to, for instance, "100m", it takes
hundreds and hundreds of MB of memory until it runs out of memory,
fails a malloc and aborts.
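
For the record, the invocation is roughly the following; -a and -d are
just my usual flags, the only part that matters here is --window-memory:

    git repack -a -d --window-memory=100m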
So, two questions:

1) is there a bug, is the documentation of that parameter a bit too
optimistic, or did I just misunderstand it?

2) do I have any hope that, at the current state of development, my
500+ MB mailboxes with relatively small changes over time can be
archived smartly (i.e. as diffs) by git? If I understand correctly, the
git-bigfiles project would just "solve" my problem by not computing
deltas for big files at all.
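
If it helps, this is how I have been checking whether the big files
actually end up deltified; the pack file name is of course specific to
my repository, and sorting on the third column just brings the largest
objects to the bottom:

    # columns: SHA-1, type, size, packed size, offset, then depth/base for deltas
    git verify-pack -v .git/objects/pack/pack-*.idx | sort -k3 -n | tail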

thanks for the clarifications

Pietro


 Just for the record: the backup is done through gibak:
http://eigenclass.org/hiki/gibak-0.3.0

 git version 1:1.7.2.3-2.2 on Debian

 git-bigfiles: http://caca.zoy.org/wiki/git-bigfiles


