I'm working with a repository containing a large number of revisions (it has the full 2.6 kernel history) along with a bunch of large tarfiles of prebuilt firmware images. These tarfiles are about 50MB each and are updated frequently.

I'm trying to figure out how to limit memory consumption when fetching from this repo. The repo lives on a network drive, and during the "remote: Compressing objects: ..." phase, git-pack-objects quickly grows to 3GB or more.

I've tried setting pack.windowMemory = 512m and pack.deltaCacheLimit = 512m. Any other suggestions? This is running git 1.5.4.3.

Thanks,
David
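For reference, here is roughly what I've set so far, plus a few related knobs I'm considering but haven't verified help. The repo path is made up, and the extra values are guesses, not tested recommendations; note the settings have to go on the repository that pack-objects actually runs in (the remote side, in my case):

```shell
# Demonstrating in a throwaway bare repo; on the real server you'd cd
# into the existing repository instead (path below is illustrative).
repo=$(mktemp -d)/firmware.git
git init --bare -q "$repo"
cd "$repo"

# What I've already set:
git config pack.windowMemory 512m     # cap memory per delta-search window
git config pack.deltaCacheLimit 512m  # per-entry delta cache cap (bytes)

# Other knobs I'm considering (untested guesses):
git config pack.deltaCacheSize 256m      # total delta cache, not per-entry
git config pack.window 5                 # smaller delta search window
git config pack.depth 10                 # shallower delta chains
git config core.packedGitWindowSize 32m  # smaller pack mmap windows
git config core.packedGitLimit 256m      # cap total mapped pack data

# Possibly skip delta compression for the big tarfiles entirely,
# via the "delta" gitattribute (info/attributes in a bare repo):
echo '*.tar -delta' >> info/attributes
```

I'm not sure whether pack.deltaCacheLimit (per-delta) or pack.deltaCacheSize (total cache) is the one that matters here, which is why both appear above.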