Re: [git-users] world's slowest git repo - what to do?

On Fri, May 16, 2014 at 2:06 AM, Philip Oakley <philipoakley@xxxxxxx> wrote:
> From: "John Fisher" <fishook2033@xxxxxxxxx>
>>
>> I assert, based on one piece of evidence (a post from a Facebook dev), that
>> I now have the world's biggest and slowest git
>> repository, and I am not a happy guy. I used to have the world's biggest
>> CVS repository, but CVS can't handle multi-GB
>> files. So I moved the repo to git, because we are using that for our
>> new projects.
>>
>> goal:
>> keep 150 GB of files (mostly binary), ranging from tiny to over 8 GB, in a
>> version-control system.

I think your best bet so far is git-annex (or maybe bup) for dealing
with huge files. I plan on resurrecting Junio's split-blob series to
make core git handle huge files better, but there's no ETA on that.
The problem here is about file size, not the number of files or
history depth, right?
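
A rough sketch of what the git-annex route could look like (assuming
git-annex is installed; the file and remote names below are just
placeholders):

    # make the existing repo annex-aware
    git annex init "build server"

    # large binaries go through the annex instead of regular blobs;
    # the working tree gets a symlink, the content lives under .git/annex
    git annex add big-image.bin
    git commit -m "Add big-image.bin via git-annex"

    # content is copied between repos on demand rather than cloned wholesale
    git annex copy big-image.bin --to backup-remote
    git annex get big-image.bin

That keeps the git history itself small: git only tracks the symlinks,
while the annex moves the large content around separately.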

>> problem:
>> git is absurdly slow (think hours) on fast hardware.

Probably known issues, but some elaboration would be nice (e.g. which
operations are slow, how slow, and some more detailed characteristics
of the repo) in case new problems pop up.
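
For example, something along these lines (just a rough starting point;
pick whichever operations are actually slow for you, and /path/to/repo
is a placeholder) would give us concrete numbers to work with:

    # overall shape of the object database
    git count-objects -v

    # wall-clock times for the slow operations
    time git status
    time git gc
    time git clone /path/to/repo /tmp/clone-test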
-- 
Duy