Re: git-p4 out of memory for very large repository

On 23/08/13 02:12, Corey Thompson wrote:
> Hello,
>
> Has anyone actually gotten git-p4 to clone a large Perforce repository?

Yes. I've cloned repos with a couple of gigabytes of files.

> I have one codebase in particular that gets to about 67%, then
> consistently gets git fast-import (and often a few other processes)
> killed by the OOM killer.

What size is this codebase? Which version and platform of git are you using?

Maybe it's a regression, or perhaps you've hit some new, previously unknown size limit?
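
If it does turn out to be sheer size rather than a bug, one possible workaround is to import the history in bounded chunks instead of a single pass. A rough sketch, untested against your depot; the depot path, change numbers and batch size below are only placeholders:

    # import an initial, bounded slice of the history
    git p4 clone --destination proj //depot/proj@1,20000
    cd proj

    # repeat until the import is up to date; each run brings in at most
    # 5000 further changes, so fast-import's working set stays bounded
    git p4 sync --max-changes 5000

    # once p4/master is fully imported, update the local branch from it
    git p4 rebase

    # optionally cap the memory used by any later repack/gc
    git config pack.windowMemory 256m

Each sync picks up from the last imported change, and keeping pack.windowMemory modest may help if it's the repack after the import that is actually tripping the OOM killer.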

Thanks
Luke



> I've found some patches out there that claim to resolve this, but
> they're all for versions of git-p4.py from several years ago.  Not only
> will they not apply cleanly, but as far as I can tell the issues that
> these patches are meant to address aren't in the current version,
> anyway.
>
> Any suggestions would be greatly appreciated.
>
> Thanks,
> Corey

--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html



