Re: pack operation is thrashing my server


On Aug 11, 2008, at 15:10, Andi Kleen wrote:

> As a quick workaround you could try it with a 32-bit git executable?
> (assuming you have a distribution with proper multilib support)
>
> I think the right fix would be to make git throttle itself (not
> use mmap, use very small defaults, etc.) on low-memory systems.
> It could take a look at /proc/meminfo for this.
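Andi's /proc/meminfo idea can be tried by hand before any git change; a minimal sketch (not git's actual code, just an illustration of the lookup git could do):

```shell
# Read total physical memory from /proc/meminfo (Linux only).
# A low number here is what a self-throttling git would key off of,
# e.g. to pick a smaller pack window and delta cache.
awk '/^MemTotal:/ { print $2 " kB" }' /proc/meminfo
```

The same file also exposes MemFree and Cached, which might be better signals than MemTotal on a loaded box.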

I've always felt that keeping largish objects (say anything >1MB)
loose makes perfect sense. These objects are accessed infrequently,
and they are often binary or otherwise poor candidates for the delta
algorithm.

Many repositories are mostly well-behaved, with a large number of text
files that aren't overly large and compress/diff well. However, a few
huge files often creep in. These might be a 30 MB Word or PDF document
(with lots of images, of course), a bunch of artwork, or some random
.tgz files with required tools.
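To see whether such files have already crept into a repository, git's own verify-pack can list every packed object with its size (the pack path below is the standard layout; adjust if your repository lives elsewhere):

```shell
# Ten largest objects in the repository's packs, sorted by the
# uncompressed object size (third column of `git verify-pack -v`).
git verify-pack -v .git/objects/pack/pack-*.idx |
    sort -k3 -n -r | head -10
```

Objects near the top that are megabytes in size are exactly the oddballs in question.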

Regardless of their origin, the presence of such files in real-world
SCMs is a given, and they can ruin performance even if they're hardly
ever accessed or updated. If we left such oddball objects loose, the
pack would be much smaller, easier to generate, and faster to use, and
there should be no memory usage issues.

  -Geert
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
