Re: large files and low memory


On Tue, Oct 5, 2010 at 2:41 PM, Enrico Weigelt <weigelt@xxxxxxxx> wrote:
> * Enrico Weigelt <weigelt@xxxxxxxx> wrote:
>
> <snip>
>
> Found another possible bottleneck: git-commit seems to scan through
> a lot of files. Shouldn't it just create a commit object from the
> current index and update the head?

You mean a lot of stat() calls? There is no way to avoid that unless
you set the assume-unchanged bits. Or you could use
write-tree/commit-tree/update-ref directly.
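
For example, a rough sketch of both options (the file name, branch
name and commit message below are only placeholders, and the
commit-tree step assumes HEAD already exists):

  # tell git to skip stat() for paths you promise not to change
  # behind its back
  git update-index --assume-unchanged big-file.bin

  # or bypass git-commit and commit straight from the current index
  tree=$(git write-tree)
  parent=$(git rev-parse HEAD)
  commit=$(echo "snapshot" | git commit-tree $tree -p $parent)
  git update-ref refs/heads/master $commit

The plumbing route skips the index refresh (and hooks) that
git-commit normally performs.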
-- 
Duy

