Sam Hocevar wrote:
> As stated several times by Linus and others, Git was not designed
> to handle large files. My stance on the issue is that before trying
> to optimise operations so that they perform well on large files,
> too, Git should usually avoid such operations, especially
> deltification. One notable exception would be someone storing their
> mailbox in Git, where deltification is a major space saver. But
> usually, these large files are binary blobs that do not benefit
> from delta search (or even compression).

Yeah, in this case, I *know* that my binary blobs are completely
different, and it's just a waste of time for git to come to the same
conclusion. I'd be perfectly willing to have some knob I could turn
that would tell git this (the closest existing one I know of is
sketched at the end of this mail).

> Since I also need to handle large files (80 GiB repository), I am
> cleaning up some fixes I did, which can be seen in the git-bigfiles
> project (http://caca.zoy.org/wiki/git-bigfiles). I have not yet
> tried to change git-push (because I submit through git-p4), but I
> hope to address it, too. As time goes on, I believe some of them
> could make it into mainstream Git.

I'd almost be willing to help. I know the basic premise of how git
works, but the devil is in the details, and I don't have time right
now to learn the internals. Yet another thing to add to my todo list.

> In your particular case, I would suggest setting pack.packSizeLimit
> to something lower. This would reduce the time spent generating a
> new pack file if the problem were to happen again.

Yeah, saw that one, but *after* I had this problem. The default, if
not set, is unlimited, which in this case is definitely *not* what we
want.
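(For the archives: the knob I meant above mostly exists already, in
the form of the `delta' attribute from gitattributes(5). A minimal
sketch, assuming the big blobs can be matched by path pattern; the
*.bin and *.iso patterns are placeholders, not anything from this
thread:)

    # .gitattributes -- tell git-pack-objects to skip the delta
    # search entirely for these paths; "binary" additionally marks
    # them as non-textual (shorthand for -diff -merge -text).
    *.bin   -delta binary
    *.iso   -delta binary

With -delta set, those blobs never enter the delta window, so the
repack cost for them comes down to the zlib pass (which should in
turn be tunable via pack.compression, though I have not benchmarked
that).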
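(And for completeness, capping the pack size after the fact is a
one-liner; 512m below is an arbitrary example value:)

    # Cap each generated pack at ~512 MiB (example value); repacks
    # then emit several smaller packs instead of one huge file.
    $ git config pack.packSizeLimit 512m

    # The same cap can be applied to a one-off repack:
    $ git repack -a -d --max-pack-size=512m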