Brian Swetland wrote:
> One observation on git-p4 -- it's a little memory hungry when processing
> large syncs. I haven't tried incremental syncs on top of the initial
> one though -- if it's only the initial that's expensive it's not that
> big a deal.
>
> It seemed to top out around 988MB resident. The branch I was importing
> is about 562MB when checked out and the resulting git repository is
> about 175MB.
While importing each change, I think git-p4 holds two copies of the
contents of every changed file in memory: one in p4CmdList and one in
readP4Files. (That's the raw contents, not just the delta.) I don't
think there's any fundamental reason it couldn't stream them instead;
see the rough sketch below.
So incremental syncs may or may not take less memory. If the first
change imports a huge project and no subsequent change ever touches all
those files at once, then the initial sync is the worst case. But if,
say, you periodically change the copyright dates in every file in the
repository, you'll hit this memory usage whenever you sync such a change.
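Here's roughly what I mean by streaming. This is only a sketch, not
git-p4's actual code -- the helper name and parameters are made up, and
the fast-import note in the comments is just my understanding of how
the size header could be handled:

    import subprocess

    def stream_p4_print(depot_path, rev, out, chunk_size=65536):
        # Hypothetical helper, not git-p4's real code: copy one file
        # revision from "p4 print" to a writer in fixed-size chunks,
        # so resident memory stays bounded no matter how big the file
        # is. (fast-import's "data <count>" header needs the byte
        # count up front, but "p4 fstat -Ol" reports fileSize, so even
        # that shouldn't require buffering the contents.)
        p4 = subprocess.Popen(["p4", "print", "-q",
                               "%s#%s" % (depot_path, rev)],
                              stdout=subprocess.PIPE)
        while True:
            chunk = p4.stdout.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)  # e.g. the pipe to git fast-import
        p4.stdout.close()
        if p4.wait() != 0:
            raise RuntimeError("p4 print failed: %s#%s"
                               % (depot_path, rev))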
As long as we're listing git-p4 complaints, here are a couple of mine:
1) coding style. *self-nag* Simon Hausmann mentioned he was happy to
accept patches...and I made one up a while ago; I just need to do a
merge and final check that I haven't broken anything before sending it off.
2) it breaks on tempfile purges. My previous employer has these in their
repository, and I think for the moment they're working around it by
treating a "purge" as a "delete". If I read the Perforce documentation
right, though, only the latest revision of a tempfile's contents is kept
in the repository anyway. Since tempfiles' history can't be captured
accurately, the proper thing is probably to omit them entirely, deleting
a file when it becomes a tempfile and re-creating it when it changes
back to a normal type. (Rough sketch below.)
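To make that concrete, here's the decision I have in mind, written
against the per-file fields that "p4 describe" reports. Again a sketch,
not git-p4's real structure; the dict keys are my guesses at the
tagged-output field names:

    def import_action(file_entry):
        # Hypothetical sketch: decide what to do with one per-file
        # entry from a changelist. "type" and "action" are guesses at
        # the tagged-output field names, not git-p4's actual parsing.
        if "+S" in file_entry["type"] or file_entry["type"] == "tempobj":
            # Only the most recent revision(s) of a +S/tempobj file
            # are stored server-side, so its history can't be
            # reproduced faithfully; omit it entirely.
            return "skip"
        if file_entry["action"] == "purge":
            # The current workaround: a purged revision has no
            # contents left to import, so treat it like a delete.
            return "delete"
        return file_entry["action"]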
Best regards,
Scott
--
Scott Lamb <http://www.slamb.org/>