On Tue, 12 Feb 2008, Johannes Schindelin wrote:

> Hi,
>
> On Sun, 10 Feb 2008, Johannes Schindelin wrote:
>
> > $ /usr/bin/time git repack -a -d -f --window=150 --depth=150
> > Counting objects: 2477715, done.
> > Compressing objects: 19% (481551/2411764)
> > Compressing objects: 19% (482333/2411764)
> > fatal: Out of memory, malloc failed411764)
> > Command exited with non-zero status 1
> > 7118.37user 54.15system 2:01:44elapsed 98%CPU (0avgtext+0avgdata
> > 0maxresident)k
> > 0inputs+0outputs (29834major+17122977minor)pagefaults 0swaps
>
> I made the window memory limit much smaller (512 megabytes), and it is
> still running after 27 hours:
>
> > Compressing objects: 20% (484132/2411764)
>
> However, it seems that it only worked on about 4000 objects in the last
> 20(!) hours.  So the first 19% were relatively quick; the next percent,
> not at all.

Yeah... this repo is really a pain to repack.

I have access to an 8-processor machine with 8GB of RAM, and all my
repack attempts so far were killed after using too much memory, despite
the window memory limit.  Those were threaded repack attempts, so the
first 98% was really quick (less than 15 minutes), but then all threads
converged on a small fraction of the object space which appears to cause
problems.  And then, I presume, I ran into the same threaded memory
fragmentation issue.

It might be worth attaching gdb to the process and extracting a sample
of the object SHA1s populating the delta window when the slowdown
occurs, to see what they actually are.

I'm attempting a single-threaded repack now.

Nicolas
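For readers following along, the two limits discussed in this thread (the delta window memory cap and single-threaded delta search) can also be set persistently via `pack.windowMemory` and `pack.threads`, which are real git config knobs. The sketch below sets them in a throwaway repository; the 512m value mirrors the limit mentioned above and is illustrative, not a recommendation:

```shell
# Sketch only: exercise the limits discussed above in a scratch repo.
repo=$(mktemp -d)
git init -q "$repo"
cd "$repo"
git config user.email "test@example.com"   # local identity so commit works
git config user.name "Test"

# Cap the delta window memory (per thread) and force a single
# pack-objects thread, matching the single-threaded repack attempt.
git config pack.windowMemory 512m
git config pack.threads 1

# Aggressive repack equivalent to the command quoted above; harmless
# here since the repo contains only one empty commit.
git commit --allow-empty -q -m "seed"
git repack -a -d -f -q --window=150 --depth=150
```

The same effect can be had per-invocation with `git repack --window-memory=512m --threads=1`.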