Re: Organizing (large) test data in git

On Tuesday, February 27, 2007 at 20:52:38 (+0100) Johannes Schindelin writes:
>Hi,
>
>On Tue, 27 Feb 2007, Bill Lear wrote:
>
>> We are contemplating files on the order of 500 megabytes a piece.
>
>I recommend splitting the files so that no file is that large (but the sum 
>of them can be). But I think that you really wanted to say that.
>
>I think the problem of large packs is tackled right now by Troy, Shawn and 
>Nico. Troy had exactly the same problem AFAIU, and Nico and Shawn are 
>working on a new pack file format, which would lift the 4GB limit on packs 
>while at it.
>
>This should solve your problems.

Well... it's not really a matter of capacity, though I do agree that
lifting that limit will help.  We are more concerned with time to
clone the repos over the (often very slow) corporate network, for
example.  With future ratios of about 1% code to 99% test data, we
really would like to have a light-weight code repo that we can throw
hither and yon with little care, and a monster data repo that is
(somehow) sanely managed with git as well.  I was just curious whether
others have run into the management problems I mentioned when
separating test data from code, and what they did to surmount them.
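
To make that concrete, here is a rough sketch of the kind of split we
have in mind (the repository names and the test-data.rev pointer file
are made-up examples, not something we have actually set up):

    # Hypothetical layout: code and test data in sibling repositories.
    # project-code is tiny (~1% of the content) and quick to clone even
    # over the slow corporate network; project-testdata is the monster
    # (~99%) and is cloned only on machines that actually run the tests.
    git clone git://server/project-code.git
    git clone git://server/project-testdata.git

    # The code repo could carry a small pointer file (test-data.rev)
    # naming the commit of project-testdata it was validated against:
    cd project-testdata
    git checkout "$(cat ../project-code/test-data.rev)"

The obvious bookkeeping questions with a scheme like this are how to
keep that pointer up to date and how to tag matched pairs of code and
data, which is the sort of management headache I was asking about.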


Bill