"Dana How" <danahow@xxxxxxxxx> writes: > The packed X too big combination is the problem. As the > commit message says, this could happen if the packs > came from fast-import,... > We have three options in this case: > (1) Drop the object (do not put it in the new pack(s)). > (2) Pass the object into the new pack(s). > (3) Write out the object as a new loose object. > > Option (1) is unacceptable. When you call git-repack -a, > it blindly deletes all the non-kept packs at the end. So > the megablobs would be lost. Ok, I can buy that -- (1) nor (2) are unacceptable and (3) is the only sane thing to do for a previously packed objects that exceed the size limit. Since you have to handle that case _anyway_, I think it makes sense to always say "Ok, we will write it out if there is no loose representation already available". That is, unless somebody smarter than me, like Nico or Shawn, come up with better ideas to do this ;-). > ... why did I implement --max-blob-size instead > of --max-object-size? I take this to mean that I should use > the blob size if undeltified, and the delta size if previously deltified? No, I think the only sensible way for the end user to specify the size is uncompressed size of the object. For a blob, that is the size of checked-out file. IOW: $ git cat-file $type $sha | wc -c Nothing else would make any sense. - To unsubscribe from this list: send the line "unsubscribe git" in the body of a message to majordomo@xxxxxxxxxxxxxxx More majordomo info at http://vger.kernel.org/majordomo-info.html