I have a CVS repository which is mostly sane, but has an approximately 20MB RTF file that has two hundred revisions or so. (Thank you, Windows help.) Now, since this is old history, I want to make it as small as possible.

The only problem is that when I use high --window values for repack, it goes along swimmingly until it gets to this file, at which point memory usage quickly rises to the point where I'm well into my swap file.

I think what I'd like is an extra option to repack to limit window memory usage. This would dynamically scale the window size down if it can't fit within the limit, then scale it back up once you're past the nasty file. This would let me repack my repository with --window=100 and have it actually finish someday on the machines I have access to. The big file may not be as efficiently packed as possible, but I can live with that.

My question is: is this sane? Does the repack algorithm depend on having a fixed window size to work? I'd rather not look into implementing this if it's silly on the face of it.

Thanks,
-bcd
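For what it's worth, here is a minimal sketch of the idea, assuming the delta window holds the raw object data of its candidates in memory. The function name `window_admit` and the interface are purely hypothetical, not git's actual code: given the sizes of the objects seen so far, it counts how many prior candidates can stay in the window before either the normal --window count or a memory budget is exceeded, so the effective window shrinks around a huge object and grows back afterwards.

```c
#include <stddef.h>

/* Hypothetical sketch, NOT git's implementation: decide how many
 * delta-window candidates to keep when admitting object i.
 *
 * sizes[0..n-1] : in-memory size of each object, in arbitrary units
 * i             : index of the object being admitted (i < n)
 * max_window    : the normal --window limit on candidate count
 * mem_limit     : total memory budget for objects held in the window
 *
 * Returns the effective window size at object i. */
size_t window_admit(const size_t *sizes, size_t n, size_t i,
                    size_t max_window, size_t mem_limit)
{
    size_t used = sizes[i];  /* the new object itself always counts */
    size_t kept = 1;

    (void)n;  /* n kept in the signature for clarity; i < n assumed */

    /* Walk backwards over earlier objects, stopping at whichever
     * limit is hit first: the candidate count or the memory budget. */
    while (kept < max_window && kept <= i) {
        size_t prev = sizes[i - kept];
        if (used + prev > mem_limit)
            break;           /* memory cap: window shrinks here */
        used += prev;
        kept++;
    }
    return kept;
}
```

With five 10-unit objects and a 25-unit budget, only two fit in the window at a time; with tiny objects the same budget lets the window grow back out to the full --window count. Whether the real delta search tolerates a per-object window size like this is exactly the question above.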