Re: Git blame performance on files with a lot of history

On Fri, Dec 14, 2018 at 10:29 AM Clement Moyroud
<clement.moyroud@xxxxxxxxx> wrote:
>
> Hello,
>
> My group at work is migrating a CVS repo to Git. The biggest issue we
> face so far is the performance of git blame, especially compared to
> CVS on the same file. One file especially causes us trouble: it's a
> 30k lines file with 25 years of history in 3k+ commits. The complete
> repo has 200k+ commits over that same period of time.

After you converted the repository from CVS to Git, did you run a manual repack?

The process of converting a repository from another SCM often results
in poor delta chain selections, which can leave the repository
unnecessarily large on disk and/or quite slow to operate on.

Something like `git repack -Adf --depth=50 --window=200` discards the
existing delta chains and chooses new ones, and may result in
significantly improved performance. A smaller depth, like --depth=20,
might improve performance even further, but may also make the
repository larger on disk; you'll need to find the balance that works
for you.
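
For instance, a rough way to measure the effect (big_file.c here is
just a placeholder for your 30k-line file; time whichever file is
giving you trouble):

    # Baseline: time a blame on the problem file.
    time git blame big_file.c >/dev/null

    # Discard the existing deltas and recompute them with a wider
    # search window; -A keeps unreachable objects, -d drops redundant
    # packs, -f forces redeltification.
    git repack -Adf --depth=50 --window=200

    # Re-run the same blame and compare against the baseline.
    time git blame big_file.c >/dev/null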

Might be something worth testing, if you haven't?

Bryan


