John <john@xxxxxxxxxxxxxxxxx> writes:

> We're seeing serious performance issues with repos that store media
> files, even relatively small files. For example, a web site with less
> than 100 MB of images can take minutes to commit, push, or pull when
> images have changed.
>
> Our first guess was that git is repeatedly attempting to
> compress/decompress data that had already been compressed. We tried
> these configuration settings (shooting in the dark) to no avail:
>
> core.compression 0   ## Docs say this disables compression. Didn't seem to work.
> pack.depth 1         ## Unclear what this does.
> pack.window 0        ## No idea what this does.
> gc.auto 0            ## We hope this disables automatic packing.
>
> Our guess that re-compression is to blame may not even be valid, since
> we can manually re-compress these files in seconds, not minutes.
>
> Is there a trick to getting git to simply "copy files as is"? In
> other words, don't attempt to compress them, don't attempt to "diff"
> them, just store/copy/transfer the files as-is?

See the description of the `delta` attribute in the gitattributes
manpage. Unset it for files that you don't want git to attempt
(binary) deltification against.

P.S. There is also the git-bigfiles project that might be of interest
to you.
-- 
Jakub Narebski
Poland
ShadeHawk on #git
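A minimal sketch of such a .gitattributes, assuming common image
extensions (adjust the patterns to whatever media types your repo
actually stores):

```shell
# Sketch: unset the "delta" attribute for media files so git does not
# attempt binary deltification on them when packing. The extensions
# below are assumptions -- list the types present in your repository.
cat >> .gitattributes <<'EOF'
*.jpg -delta
*.png -delta
*.gif -delta
EOF
```

Commit the .gitattributes file so the setting travels with the repo;
note this only skips delta search during packing, it does not disable
zlib compression of the objects themselves.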