Re: Performance impact of a large number of commits

On Fri, 2008-10-24 at 13:11 -0700, david@xxxxxxx wrote:
> > git commit explicitly (i.e., walking the tree to stat files for finding
> > changes is not necessary).
> 
> I suspect that your limits would be filesystem/OS limits more than git 
> limits
> 
> at 5-10 files/commit you are going to be creating 0.5-1M files/day; even 
> spread across 256 directories this is going to be a _lot_ of files.
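
That works out to roughly 2,000-4,000 new object files per fanout
directory per day from the blobs alone. A quick sketch of the
arithmetic (the 100,000 commits/day rate is an assumption inferred
from the quoted figures, not something stated in the thread):

# Back-of-the-envelope: loose objects per fanout directory per day,
# counting blobs only. Each commit also adds one commit object and
# at least one tree object on top of this.
COMMITS_PER_DAY = 100_000         # assumption inferred from the quoted figures
FANOUT_DIRS = 256                 # .git/objects/00 .. .git/objects/ff

for blobs_per_commit in (5, 10):  # from the quoted message
    blobs_per_day = COMMITS_PER_DAY * blobs_per_commit
    print(f"{blobs_per_commit} files/commit: {blobs_per_day:,} blobs/day, "
          f"~{blobs_per_day / FANOUT_DIRS:,.0f} per fanout directory")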

The files are organized so that no more than ~1,000 files end up in each
directory. Will Git create a directory containing a larger number of
object files? I can see that this would be a problem in our use case.
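
For context, the 256 directories are Git's loose-object fanout under
.git/objects/, keyed by the first two hex digits of each object's
SHA-1, so they fill evenly regardless of how the work tree itself is
laid out. A minimal sketch of the mapping (the header is the format
Git uses for loose blobs; real Git also zlib-compresses the object
before writing it):

import hashlib

def loose_object_path(data: bytes) -> str:
    # Git hashes the header "blob <size>\0" followed by the raw contents.
    header = f"blob {len(data)}\0".encode()
    sha1 = hashlib.sha1(header + data).hexdigest()
    # The first two hex digits select one of 256 fanout directories.
    return f".git/objects/{sha1[:2]}/{sha1[2:]}"

print(loose_object_path(b"hello\n"))
# prints .git/objects/ce/013625030ba8dba906f756967f9e9ca394464a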

> packing this may help (depending on how much the files change), but with 
> this many files the work of doing the packing would be expensive.

We can probably do that even if it takes several hours.
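
For what it's worth, a sketch of how the packing could be scripted
with stock Git commands; the repository path is hypothetical:

import subprocess

REPO = "/srv/data-repo"  # hypothetical path, for illustration only

def run(*args):
    # Run a git command inside the repository.
    subprocess.run(["git", *args], cwd=REPO, check=True)

run("count-objects", "-v")   # loose/packed object counts before
run("repack", "-a", "-d")    # pack all reachable objects (-a) and
                             # drop the now-redundant loose ones (-d)
run("count-objects", "-v")   # counts after packing

"git repack -a -d" is the thorough (and expensive) variant; a plain
"git repack -d" only packs the new loose objects, so it should be much
cheaper to run frequently in between full repacks.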

-Samuel
