Re: Performance impact of a large number of commits

On Fri, 2008-10-24 at 22:29 -0700, david@xxxxxxx wrote:
> when git stores the copies of the files it does a sha1 hash of the file 
> contents and then stores the file in the directory
> .git/objects/<first two digits of the hash>/<hash>

> it would be a pretty minor change to git to have it use more directories 

Ah, I see how this works. Well, I'll think of a way to cope with this (I
might patch my Git installation, or see how well it performs on an
indexed file system). If all else fails we'll have to slash the number
of commits even if this means that some files are not added to the
history.
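(For context, the loose-object layout David describes can be sketched in a few lines. This is only a sketch of the path computation, not git's actual implementation: git also zlib-compresses the object before writing it, which is omitted here.)

```python
import hashlib

def loose_object_path(content: bytes) -> str:
    """Compute where git would store a blob as a loose object.

    Git hashes the header "blob <size>\0" followed by the raw file
    contents, then splits the 40-character hex digest into a
    2-character directory name and a 38-character file name under
    .git/objects/.
    """
    store = b"blob %d\x00" % len(content) + content
    sha1 = hashlib.sha1(store).hexdigest()
    return ".git/objects/%s/%s" % (sha1[:2], sha1[2:])

# The classic example: `echo hello | git hash-object --stdin`
print(loose_object_path(b"hello\n"))
# -> .git/objects/ce/013625030ba8dba906f756967f9e9ca394464a
```

With only the first two hex digits used as a directory, 256 directories share all loose objects; the "more directories" patch you suggest would amount to taking a longer prefix (e.g. `sha1[:4]`) for the directory component.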

> my concern is that spending time creating the pack files will mean that 
> you don't have time to insert the new files.
> 
> that being said, there may be other ways of dealing with this data rather 
> than putting it into files and then adding it to the git repository.
> 
> Git has a fast-import streaming format that is designed for programs to 
> use that are converting repositories from other SCM systems.

I'm pretty sure that the streaming format won't do us much good, as the
files are re-created from scratch between commits.
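(For anyone following along, the stream fast-import consumes is a simple text protocol where each `data <n>` header gives the exact byte length of the payload that follows. A minimal sketch of a generator for one inline-file commit — the name, email, and paths are hypothetical, and real use would pipe the result into `git fast-import`:)

```python
import time

def fast_import_commit(path: str, content: bytes,
                       ref: str = "refs/heads/master") -> bytes:
    """Emit one commit in git fast-import's stream format.

    The commit adds a single file at `path` with the given contents,
    supplied inline via an `M 100644 inline` filemodify command.
    """
    committer = b"Samuel <samuel@example.com> %d +0000" % int(time.time())
    msg = b"snapshot\n"
    return (
        b"commit %s\n" % ref.encode()
        + b"committer %s\n" % committer
        + b"data %d\n%s" % (len(msg), msg)          # commit message
        + b"M 100644 inline %s\n" % path.encode()   # add/replace file
        + b"data %d\n%s\n" % (len(content), content)
    )

stream = fast_import_commit("data/sample.txt", b"hello\n")
print(stream.decode())
```

Since fast-import takes whole file contents per commit anyway, it would indeed not save anything when the files are regenerated from scratch each time, as you say.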

Thanks a lot for the information, this was very helpful.

-Samuel

--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
