Dana How wrote:
> There's actually an even more extreme example from my day job.
> The software team has a project whose files/revisions would be
> similar to those in the linux kernel (larger commits, I'm sure).
> But they have *ONE* 500MB file they check in because it takes
> 2 or 3 days to generate and different people use different versions of it.
> I'm sure it has 50+ revisions now. If they converted to git and included
> these blobs in their packfile, that's a 25GB uncompressed increase!
> *Every* git operation must wade through 10X -- 100X more packfile.
> Or it could be kept in 50+ loose objects in objects/xx ,
> requiring a few extra syscalls by each user to get a new version.

Or those large objects could be kept in a separate, _kept_ packfile containing only those objects (which can delta well against each other, even if they are large).

-- 
Jakub Narebski
Warsaw, Poland
ShadeHawk on #git
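To illustrate the "kept packfile" idea: git skips repacking any pack that has a matching `.keep` file next to it in `objects/pack/`, so the large blobs can be packed once and left alone by `git repack -a -d`. A minimal sketch (the repo path, file name, and 1MB stand-in blob are illustrative, not from the original mail):

```shell
#!/bin/sh
set -e
# Scratch repository with one "large" generated file (1MB stand-in
# for the 500MB file in the example).
repo=$(mktemp -d)
cd "$repo"
git init -q .
head -c 1048576 /dev/zero > big.bin
git add big.bin
git -c user.name=a -c user.email=a@example.com commit -q -m 'add big.bin'

# Pack just the big blob into its own packfile; pack-objects prints
# the pack's hash, which names the resulting pack-<hash>.pack/.idx.
blob=$(git rev-parse HEAD:big.bin)
pack=$(echo "$blob" | git pack-objects .git/objects/pack/pack)

# The .keep marker tells later repacks to leave this pack untouched.
touch ".git/objects/pack/pack-$pack.keep"

# Repack everything else; the kept pack survives as-is.
git repack -a -d -q
ls .git/objects/pack/
```

New revisions of the big file can be appended to the kept pack's replacement (or a second kept pack) the same way, so ordinary operations never rewrite those gigabytes.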