Re: large files and low memory

On Tue, 5 Oct 2010, Nguyen Thai Ngoc Duy wrote:

> On Tue, Oct 5, 2010 at 2:41 PM, Enrico Weigelt <weigelt@xxxxxxxx> wrote:
> > * Enrico Weigelt <weigelt@xxxxxxxx> wrote:
> >
> > <snip>
> >
> > Found another possible bottleneck: git-commit seems to scan through
> > a lot of files. Shouldn't it just create a commit object from the
> > current index and update the head?
> 
> You mean a lot of stat()? There is no way to avoid that unless you set
> assume-unchanged bits. Or you could use
> write-tree/commit-tree/update-ref directly.
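For reference, the plumbing route suggested above might look like the
sketch below. It assumes a repository with an existing HEAD and changes
already staged in the index; the commit message is fed on stdin, which
also works on older gits that predate commit-tree's -m option:

```shell
# Sketch: commit the staged index using plumbing commands only,
# bypassing git-commit's working-tree scan.
tree=$(git write-tree)                                   # serialize the index as a tree object
commit=$(echo "commit message" | git commit-tree "$tree" -p HEAD)  # wrap the tree in a commit
git update-ref HEAD "$commit"                            # advance the current branch
```

Note that, unlike git-commit, this does no safety checks and leaves the
reflog message generic, so it is mainly useful for scripting or for
working around pathological repositories like the one in $subject.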

Avoiding memory exhaustion is also going to help a lot, as the stat()
information will then remain cached in memory instead of requiring disk
access.  Just a guess given $subject.


Nicolas
