Re: Performance impact of a large number of commits

On Fri, 2008-10-24 at 12:43 -0700, david@xxxxxxx wrote:
> 100,000 commits per day??
> 
> that's 1.5 commits/second. what is updating files that quickly?

This is an automated process taking snapshots of rapidly changing files
containing statistical data. Unfortunately, our needs go beyond what a
versioning file system has to offer, and the data is an unstructured
text file (in other words, using a relational database is not a good
option).

> I suspect that you will have some issues here, but it's going to depend on 
> how many files get updated each 3/4 of a second.

That would be 5 to 10 changed files per commit, and those paths are passed
to git commit explicitly (i.e., git does not need to walk the tree and
stat every file to discover what changed).
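A minimal sketch of what such a snapshot commit might look like (the repository layout and file names here are hypothetical, not from the actual setup). Passing pathspecs to `git commit` records just those files, avoiding an index-wide status walk:

```shell
set -e

# Hypothetical scratch repository standing in for the real snapshot repo.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email snapshot@example.com
git config user.name "Snapshot Bot"

# Two rapidly changing statistics files (names are illustrative).
echo "sample=1" > stats-a.txt
echo "sample=2" > stats-b.txt

# Stage and commit only the known-changed paths; with explicit
# pathspecs, git limits its work to these files instead of
# statting the whole working tree.
git add stats-a.txt stats-b.txt
git commit -q -m "snapshot $(date +%s)" -- stats-a.txt stats-b.txt

git rev-list --count HEAD
```

At 1.5 commits per second, the per-commit cost is dominated by object creation and ref updates rather than change detection, which is why limiting git to explicit paths matters here.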

-Samuel


--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
