On Wed, Mar 18, 2015 at 5:55 PM, Graham Hay <grahamrhay@xxxxxxxxx> wrote:
> We have a fairly large repo (~2.4GB), mainly due to binary resources
> (for an iOS app). I know this can generally be a problem, but I have a
> specific question.
>
> If I cut a branch, edit a few (non-binary) files, and push, what
> should be uploaded? I assumed it was just the diff (I know whole
> compressed files are used; I mean the differences between my branch
> and where I cut it from). Is that correct?
>
> Because when I push, it grinds to a halt at the 20% mark and feels
> like it's trying to push the entire repo. If I run "git diff --stat
> --cached origin/foo" I see the files I would expect (i.e. just those
> that have changed). If I run "git format-patch origin/foo..foo" the
> patch files total 1.7MB, which should upload in just a few seconds,
> but I've had pushes take over an hour. I'm using git 2.2.2 on Mac OS X
> (Mavericks), and ssh (git@xxxxxxxxxx).
>
> Am I "doing it wrong"? Is this the expected behaviour? If not, is
> there anything I can do to debug it?

It would help if you pasted the push output. For example, does it stop at
20% on the "Compressing objects" line or the "Writing objects" line? How
many total objects does it report?

Another question is how big these binary files are on average. Git
considers a file "big" if its size is 512MB or more (see
core.bigFileThreshold). If your binary files are mostly under this limit,
but still big enough, then git may still try to compare new objects with
them to find the smallest "diff" to send. If that's the case, you could
lower core.bigFileThreshold so that it covers these binary files.
--
Duy
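As a rough sketch of the debugging steps discussed above: counting the objects in the push range shows what git will actually send (an edited binary appears as a whole new blob, not a diff), and lowering core.bigFileThreshold makes git store and send blobs above that size without delta search. The throwaway repo, the branch names, and the 40m value are illustrative assumptions; in practice you would run the last three commands in your own clone against origin/foo.

```shell
# Throwaway repo so the commands below are runnable as-is; "base"
# stands in for origin/foo (the point where the branch was cut).
cd "$(mktemp -d)"
git init -q
git checkout -q -b foo
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m base
git branch base
echo change > file.txt && git add file.txt
git -c user.email=a@b -c user.name=a commit -q -m change

# 1. Count the objects git would send for this push range; a renamed
#    or edited binary shows up here as a whole new blob.
git rev-list --objects base..foo | wc -l

# 2. If the binaries are, say, ~50MB each, lower the threshold
#    (default 512m, an assumed example value here) so git stores
#    them whole instead of searching for deltas against other blobs:
git config core.bigFileThreshold 40m
git config core.bigFileThreshold    # -> 40m
```

Running the real push with GIT_TRACE=1 set in the environment also shows which phase (compressing vs. writing) the time is going into.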