16 gig, 350,000 file repository

I'm starting a new, large project and would like a quick bit of advice.

Bringing in a set of test cases and other files from a ClearCase
repository resulted in a 350,000 file git repo of about 16 gigabytes.

The time to clone over a fast network was about 250 minutes.  I could
not verify whether the repo had been packed properly.
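For what it's worth, the pack state of a clone can be inspected and fixed with stock git commands; this is a generic sketch, not specific to the repo above:

```shell
# Show object/pack statistics; a large "count:" (loose objects)
# or many packs suggests the repo was never fully repacked.
git count-objects -v

# Repack everything into a single pack with a more thorough
# delta search (can take a long time on a 16 GB repository).
git gc --aggressive --prune=now
```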

To address this, we are thinking of using submodules or subtrees so
that a person can selectively clone only the part of the repo they
need for their work.  Is there a way to do this without
submodules/subtrees?
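One partial answer that doesn't involve submodules is a sparse checkout: the full history is still fetched, but only the listed paths are populated in the working tree. A sketch, with a placeholder URL and hypothetical paths:

```shell
# Clone without populating the working tree.
git clone --no-checkout git://example.com/big-repo.git repo
cd repo

# Enable sparse checkout and list the directories wanted
# (paths here are made up for illustration).
git config core.sparseCheckout true
echo "tests/unit/" >  .git/info/sparse-checkout
echo "tools/"      >> .git/info/sparse-checkout

# Populate only those paths.
git checkout master
```

Note this only trims the working tree; the 16 GB of history is still transferred, so it helps disk/checkout cost rather than clone time.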

We also need to be able to branch the entire repo, which I think would
make submodules kind of a pain, but I don't know...
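For context on why that's a pain: with submodules, a "whole repo" branch is not one operation, since the superproject and each submodule are separate repositories. A sketch (branch name is illustrative):

```shell
# Branch the superproject.
git checkout -b topic

# Branch every submodule as well; each one must also be
# pushed and tracked separately afterwards.
git submodule foreach 'git checkout -b topic'
```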

What is the current thinking on these issues in the git community?


Bill
--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html
