Linus Torvalds, Wed, Sep 05, 2007 09:09:27 +0200:
> I personally repack everything way more often than is necessary, and
> I had kind of assumed that people did it that way, but I was
> apparently wrong. Comments?

I do it from time to time. Seldom in working repositories, because
they usually come and go before they have a chance to accumulate
enough loose objects.

I do a partial repack (git repack -d) after every import from the p4
repo, because every snapshot of it is an ugly mess, changing files all
over the tree. Sometimes also after I have merged a big chunk with the
p4 repo and sent it over (the process involves a rebase). It is
usually a conscious decision when to do a repack or gc.

The repack time is seldom a problem: it is fast enough even on Windows
(and I do have big repos and binary objects). The gc causes my
machines to swap, though, some of them heavily, so there my repos stay
partially packed for longer. I do use .keep packs for this reason (and
because Windows, or Cygwin, or both, have more problems with big files
than they have with small ones).

I used to clone repos with "-s", but quickly stopped after a few
broken histories (such clones borrow objects from the source
repository, so pruning there can break them). This also taught me to
think before running "git gc" or "git repack -a -d". On rare occasions
I even use "git repack -a -d -l" and "git pack-refs" separately.

This was all specific to my day job. At home, on Linux systems, I just
run git-gc whenever I please, without even thinking why. It mostly
finishes in less than a minute (the kernel: ~40-50 sec on my P4
2.6GHz, 1GB).
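
For reference, the routine above boils down to roughly these commands
(a sketch, not a prescription; how often each one runs varies per
repo):

    # after a p4 import: pack the new loose objects and delete the
    # loose copies, leaving existing packs (and .keep packs) untouched
    git repack -d

    # occasionally: collapse everything into a single pack; -l packs
    # only local objects, so objects borrowed via a "-s" clone's
    # alternates stay where they are
    git repack -a -d -l

    # pack refs into .git/packed-refs (--all would pack every ref)
    git pack-refs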
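
Setting up a .keep pack needs no special command: git leaves a pack
alone whenever a matching .keep file sits next to it. A minimal
sketch, with a made-up pack name:

    cd .git/objects/pack
    # "git repack -a -d" will neither repack nor delete this pack now,
    # which keeps the big file out of every future repack
    touch pack-0123456789abcdef0123456789abcdef01234567.keep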