Re: [PATCH] Speedup recursive by flushing index only once for all entries

On Thu, 11 Jan 2007, Alex Riesen wrote:
> 
> It must have been a large leak, as I really have seen the memory usage
> drop significantly.

I really think it was about 6MB (or whatever your index file size was) for 
every single resolved file. I think merge-recursive used to flush the 
index file every time it resolved something, and every flush would 
basically leak the whole buffer used to write the index.
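
For what it's worth, here is a minimal, self-contained C sketch of the 
before/after pattern being described. It is not the actual merge-recursive 
code; names like flush_index and index_state are made up purely for 
illustration of "flush per entry and leak the write buffer" versus "flush 
once at the end":

	#include <stdio.h>
	#include <stdlib.h>
	#include <string.h>

	struct index_state {
		size_t size;
		char *entries;		/* serialized index contents */
	};

	/*
	 * Hypothetical stand-in for "write the in-core index to disk".
	 * It returns a freshly malloc'd serialization buffer that the
	 * caller must free -- forgetting to do so is the leak.
	 */
	static char *flush_index(const struct index_state *idx, const char *path)
	{
		char *buf = malloc(idx->size);
		FILE *f;

		if (!buf)
			return NULL;
		memcpy(buf, idx->entries, idx->size);
		f = fopen(path, "wb");
		if (f) {
			fwrite(buf, 1, idx->size, f);
			fclose(f);
		}
		return buf;
	}

	int main(void)
	{
		struct index_state idx;
		char *buf;
		int i, nr_resolved = 1000;

		idx.size = 6 * 1024 * 1024;	/* ~6MB, like the index discussed */
		idx.entries = calloc(1, idx.size);
		if (!idx.entries)
			return 1;

		/* Old pattern: flush after every resolved entry and never
		 * free the write buffer -- roughly index-size bytes leaked
		 * per entry. */
		for (i = 0; i < nr_resolved; i++) {
			buf = flush_index(&idx, "index.tmp");
			/* free(buf); <-- the missing free that made each flush a leak */
			(void)buf;
		}

		/* New pattern: resolve everything in core, flush once at the end. */
		buf = flush_index(&idx, "index.tmp");
		free(buf);

		free(idx.entries);
		return 0;
	}

With the old pattern the process accumulates roughly nr_resolved times the 
index size in dead allocations; flushing once keeps it to a single buffer.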

Anyway, 40-50 sec on a fairly weak laptop for a 44k-file merge sounds like 
git doesn't have to be totally embarrassed. I'm not saying we shouldn't be 
able to do it faster, but it's at least _possible_ that a lot of the time 
is now spent doing real work (i.e. maybe you actually have a fair 
amount of file-level merging? Maybe it's 40-50 sec because there's some 
amount of real IO going on, and a fair amount of real work done too?)

			Linus
