Re: Performance impact of a large number of commits

On Fri, 24 Oct 2008, Samuel Abels wrote:

> Hi,
>
> I am considering Git to maintain a repository of approximately 300,000
> files totaling 1 GB, with roughly 100,000 commits per day, all on a
> single branch. The only operations performed are "git commit", "git
> show", and "git checkout", all on one local machine. Does this sound
> like a reasonable thing to do with Git?

100,000 commits per day??

That's roughly 1.2 commits per second, or one commit every ~0.86 seconds, sustained around the clock. What is updating files that quickly?

I suspect you will run into some issues here, but it's going to depend on how many files get updated in each of those commits.
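For reference, the arithmetic behind that rate (a back-of-the-envelope sketch, assuming the commits are spread evenly over a full 24-hour day):

```python
# Sanity-check the quoted commit rate. Assumption: 100,000 commits
# spread uniformly over 24 hours; bursty workloads would peak higher.
commits_per_day = 100_000
seconds_per_day = 24 * 60 * 60  # 86,400

rate = commits_per_day / seconds_per_day
interval = seconds_per_day / commits_per_day

print(f"{rate:.2f} commits/second")        # ~1.16
print(f"one commit every {interval:.2f}s")  # ~0.86
```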

David Lang
