On Fri, 24 Oct 2008, Samuel Abels wrote:
> On Fri, 2008-10-24 at 12:43 -0700, david@xxxxxxx wrote:
>> 100,000 commits per day??
>> that's 1.5 commits/second. what is updating files that quickly?
> This is an automated process taking snapshots of rapidly changing files
> containing statistical data. Unfortunately, our needs go beyond what a
> versioning file system has to offer, and the data is an unstructured
> text file (in other words, using a relational database is not a good
> option).
>> I suspect that you will have some issues here, but it's going to depend
>> on how many files get updated each 3/4 of a second.
> That would be 5 to 10 changed files per commit, and those are passed to
> git commit explicitly (i.e., walking the tree to stat files for finding
> changes is not necessary).
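
For illustration, a snapshot commit along those lines might look roughly
like the sketch below (the repository path and file names are made up, and
the list of changed paths would really come from the snapshotting process):

import subprocess

REPO = "/path/to/snapshot-repo"                # hypothetical location
changed = ["stats/load.txt", "stats/io.txt"]   # hypothetical 5-10 files per snapshot

def commit_snapshot(paths):
    # The changed paths are passed to git explicitly, as described above,
    # instead of being discovered by scanning the work tree.
    subprocess.run(["git", "add", "--"] + paths, cwd=REPO, check=True)
    subprocess.run(["git", "commit", "-q", "-m", "snapshot", "--"] + paths,
                   cwd=REPO, check=True)

commit_snapshot(changed)

Each such commit still writes new blob, tree, and commit objects, which is
where the numbers below come from.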
I suspect that your limits will be filesystem/OS limits more than git
limits.
At 5-10 files/commit you are going to be creating 0.5-1M new loose object
files per day; even spread across 256 directories, this is going to be a
_lot_ of files.
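
A quick back-of-the-envelope check, using only the figures from this
thread:

commits_per_day = 100_000
blobs_per_commit = (5, 10)                       # changed files per commit

commits_per_sec = commits_per_day / 86_400       # ~1.16, spread over a full 24h day
loose_blobs_per_day = tuple(n * commits_per_day for n in blobs_per_commit)   # (500000, 1000000)

# Loose objects are fanned out over 256 subdirectories of .git/objects
# (00/ through ff/), named after the first byte of the object hash.
per_fanout_dir_per_day = tuple(n // 256 for n in loose_blobs_per_day)        # (1953, 3906)

print(commits_per_sec, loose_blobs_per_day, per_fanout_dir_per_day)

(and that counts only the blobs; each commit also adds at least one tree
object and one commit object on top of them)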
Packing may help (depending on how much the files change), but with this
many objects the work of doing the packing would be expensive.
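
One way to do that packing periodically, as a rough sketch (the command and
the idea of running it on a schedule are illustrative, not something from
this thread):

import subprocess

REPO = "/path/to/snapshot-repo"    # hypothetical, as in the earlier sketch

def repack_all():
    # "git repack -a -d" rewrites all reachable objects into a single pack
    # and removes the loose objects and old packs that become redundant.
    subprocess.run(["git", "repack", "-a", "-d"], cwd=REPO, check=True)

if __name__ == "__main__":
    repack_all()    # e.g. run hourly from cron; -a repacks the whole
                    # history each time, which is where the cost comes from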
David Lang