On Mon, Jul 24, 2017 at 02:58:38PM +1000, Andrew Ardill wrote:

> On 24 July 2017 at 13:45, Farshid Zavareh <fhzavareh@xxxxxxxxx> wrote:
> > I'll probably test this myself, but would modifying and committing a 4GB
> > text file actually add 4GB to the repository's size? I anticipate that it
> > won't, since Git keeps track of the changes only, instead of storing a copy
> > of the whole file (whereas this is not the case with binary files, hence the
> > need for LFS).
>
> I decided to do a little test myself. I added three versions of the same
> data set (sometimes slightly different cuts of the parent data set,
> which I don't have), each between 2 and 4GB in size.
> Each time I added a new version it added ~500MB to the repository, and
> operations on the repository took 35-45 seconds to complete.
> Running `git gc` compressed the objects fairly well, saving ~400MB of
> space. I would imagine that even more space would be saved
> (proportionally) if there were a lot more similar files in the repo.

Did you tweak core.bigFileThreshold? Git won't actually try to find
deltas on files larger than that (512MB by default). So you might be
seeing just the effects of zlib compression, and not deltas.

You can always check the delta status after a gc by running:

  git rev-list --objects --all |
  git cat-file --batch-check='%(objectsize:disk) %(objectsize) %(deltabase) %(rest)'

That should give you a sense of how much you're saving due to zlib (by
comparing the first two numbers for a copy that isn't a delta; i.e.,
with an all-zeros delta base) and how much due to deltas (how much
smaller the first number is for an entry that _is_ a delta).

-Peff
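[Editor's note: to turn that per-object output into aggregate totals rather than eyeballing each line, the batch-check output can be post-processed. A minimal sketch, not part of the original message; the function name and the sample lines are illustrative:]

```python
# Summarize lines of the form produced by:
#   git cat-file --batch-check='%(objectsize:disk) %(objectsize) %(deltabase) %(rest)'
# i.e. "<disk-size> <size> <deltabase> <path>".
# An all-zeros deltabase means the object is stored whole (zlib only);
# any other deltabase means it is stored as a delta against that object.

def summarize(lines):
    # totals[kind] = [bytes on disk, uncompressed bytes]
    totals = {"whole": [0, 0], "delta": [0, 0]}
    for line in lines:
        disk, size, base = line.split()[:3]
        # Works for both SHA-1 (40 zeros) and SHA-256 (64 zeros) repos.
        kind = "whole" if set(base) == {"0"} else "delta"
        totals[kind][0] += int(disk)
        totals[kind][1] += int(size)
    return totals

# Hypothetical sample output for two revisions of the same blob:
sample = [
    "1200 5000 0000000000000000000000000000000000000000 data.csv",
    "300 5100 1234567890abcdef1234567890abcdef12345678 data.csv",
]
print(summarize(sample))
# -> {'whole': [1200, 5000], 'delta': [300, 5100]}
```

Here the zlib savings for the whole copy are 5000 - 1200 bytes, and the delta copy of a ~5KB blob takes only 300 bytes on disk.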