Re: Git Scaling: What factors most affect Git performance for a large repo?

On Thu, Feb 19, 2015 at 5:21 PM, Stefan Beller <sbeller@xxxxxxxxxx> wrote:
> On Thu, Feb 19, 2015 at 1:26 PM, Stephen Morton
> <stephen.c.morton@xxxxxxxxx> wrote:
>> I posted this to comp.version-control.git.user and didn't get any response. I
>> think the question is plumbing-related enough that I can ask it here.
>>
>> I'm evaluating the feasibility of moving my team from SVN to git. We have a very
>> large repo. [1]
>>
>> [1] (Yes, I'm investigating ways to make our repo not so large etc. That's
>>     beyond the scope of the discussion I'd like to have with this
>>     question. Thanks.)
>
> What do you mean by large?
> * lots of files
> * large files
> * or even large binary files (bad to diff/merge)
> * long history (i.e. lots of small changes)
> * impactful history (changes which rewrite nearly everything from scratch)
>
> For reference, the linux kernel repo
> * has 48414 files, in 3128 directories
> * the largest file is 1.1M, the whole repo is 600M
> * no really large binary files
> * more than 500051 changes/commits including merges
> * started in 2004 (essentially when git was invented)
> * the .git folder is 1.4G compared to the 600M files,
>    indicating it may have been rewritten 3 times (though this
>    metric is bogus, since there is lots of compression
>    going on in .git)
>
> and linux seems to be doing ok with git.
>
> So unless you can pinpoint exactly what you are concerned about,
> there will be no helpful answer, I guess.
>
> linux is by no means a really large project; there are other projects
> way larger than that (I am thinking of the KDE project, for example),
> and they do fine as well.
>
> Thanks,
> Stefan

Hi Stefan,

I think I addressed most of this in my original post with this paragraph:

 "Assume ridiculous numbers. Let me exaggerate: say 1 million commits,
15 GB repo,
  50k tags, 1,000 branches. (Due to historical code fixups, another
5,000 "fix-up
  branches" which are just one little dangling commit required to
change the code
  a little bit between a commit and a tag that was not quite made from it.)"

To that I'd add 25k files, no major rewrites, and no huge binary
files, but lots of binary files of a few MB each, with many revisions.
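
(For comparison, figures like these can be gathered from any clone
with standard git commands; a rough sketch, where the exact counts
will vary with which refs you have fetched:)

    # tracked files in the current checkout
    git ls-files | wc -l
    # commits reachable from any ref, merges included
    git rev-list --all --count
    # tag and branch counts
    git tag | wc -l
    git for-each-ref refs/heads refs/remotes | wc -l
    # on-disk size of the object store, plus packed-object stats
    du -sh .git
    git count-objects -v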

But even without the details of my specific concerns, I thought that
perhaps the git developers know what limits git's performance, even if
large projects like the kernel are not hitting those limits.
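
(For what it's worth, one way to probe those limits directly is to
generate a synthetic repo at the exaggerated scale and time common
operations; a rough sketch, with the counts scaled well down here and
the file/tag names purely illustrative:)

    # build a throwaway repo with many commits and one tag per commit
    git init perf-test && cd perf-test
    for i in $(seq 1 10000); do
        echo "$i" > file.txt
        git add file.txt
        git commit -q -m "commit $i"
        git tag "tag-$i"
    done
    # time operations that tend to degrade with history and ref count
    time git status
    time git log --oneline | wc -l
    time git tag --contains HEAD~100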

Steve