> As for how to estimate entropy, isn't that just a matter of feeding it
> through zlib and comparing the output size to the input size? Especially
> if we're already about to feed it through zlib anyway... In other
> words, feed (an initial part of) the data through zlib, and if the
> compression ratio so far looks good, keep going and write out the
> compressed object, otherwise abort zlib and write out the original
> object with compression level 0.

This is probably off topic now, but as the OP, I'd like to mention that
I tried setting pack.compression = 0 and it did not solve my memory
issues. So it seems to be the packing itself that is sucking up all the
memory -- not the compression.

Thanks for all the insightful replies!

-Ken
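
P.S. Just to make the quoted idea concrete, here is a rough sketch of
the probe-then-decide approach. This is purely illustrative and not
git's actual code: PROBE_SIZE and RATIO_CUTOFF are made-up knobs, and
a real caller would go on to deflate the whole object at the normal
level, or at Z_NO_COMPRESSION (level 0) in the incompressible case.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zlib.h>

#define PROBE_SIZE   4096    /* bytes of input to sample -- made-up value */
#define RATIO_CUTOFF 0.90    /* store uncompressed above this -- made-up value */

/* Deflate a prefix of the data and report whether it compresses well. */
static int worth_compressing(const unsigned char *data, size_t len)
{
	unsigned char out[PROBE_SIZE + 64]; /* room for worst-case expansion */
	size_t probe = len < PROBE_SIZE ? len : PROBE_SIZE;
	z_stream s;
	double ratio;

	if (!probe)
		return 0;
	memset(&s, 0, sizeof(s));
	if (deflateInit(&s, Z_BEST_SPEED) != Z_OK)
		return 1;                   /* can't probe; just compress */

	s.next_in = (unsigned char *)data;  /* zlib won't modify the input */
	s.avail_in = probe;
	s.next_out = out;
	s.avail_out = sizeof(out);
	if (deflate(&s, Z_FINISH) == Z_STREAM_END)
		ratio = (double)s.total_out / (double)probe;
	else
		ratio = 1.0;                /* output buffer filled: incompressible */
	deflateEnd(&s);

	return ratio < RATIO_CUTOFF;
}

int main(void)
{
	static unsigned char text[8192], noise[8192];
	memset(text, 'a', sizeof(text));          /* highly compressible */
	for (size_t i = 0; i < sizeof(noise); i++)
		noise[i] = (unsigned char)rand(); /* roughly incompressible */

	printf("text:  %s\n",
	       worth_compressing(text, sizeof(text)) ? "compress" : "store");
	printf("noise: %s\n",
	       worth_compressing(noise, sizeof(noise)) ? "compress" : "store");
	return 0;
}

(Build with "cc probe.c -lz". Probing at Z_BEST_SPEED keeps the cost of
a wrong first guess low; the threshold is a pure tuning question.)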