On Mon, Mar 1, 2021 at 8:34 PM Junio C Hamano <gitster@xxxxxxxxx> wrote:
>
> It is just that the output stream of fast-export is designed to be
> "filtered" and the expected use case is to modify the stream somehow
> before feeding it to fast-import. And because every object name and
> commit & tag signature depends on everything that they can reach,
> even a single bit change in an earlier part of the history will
> invalidate any and all signatures on objects that can reach it. So
> instead of originally-signed objects whose signatures are now
> invalid, "fast-export | fast-import" pipeline would give you
> originally-signed objects whose signatures are stripped.

I need to merge two unrelated repos, and I am using `reposurgeon`
(http://www.catb.org/~esr/reposurgeon/repository-editing.html) to do
this while preserving timestamps and commit order. Its model of
operation is to read revisions into memory from git using fast-export,
operate on them, and then rebuild the stream back into a git repo with
fast-import.

The problem is that in the exported dump the signature information is
already lost, and the resulting commits are "not mergeable". Basically,
every GitHub repository where people edited `README.md` online becomes
"not mergeable" past that point, because all commits made through the
GitHub web UI are signed. For my use case, where I just need to attach
another branch in time without altering the original commits in any
way, `reposurgeon` cannot be used.

> Admittedly, there is a narrow use case where such a signature
> invalidation is not an issue. If you run fast-export and feed that
> straight into fast-import without doing any modification to the
> stream, then you are getting a bit-for-bit identical copy.

I did just that, and the signatures still got stripped, altering the
history:

git -C protonfixes fast-export --all --reencode=no | (cd protoimported && git fast-import)

--
anatoly t.
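
P.S. A rough way to see the loss, as a sketch; it assumes that HEAD in
the source repo is one of the GitHub-web-edited (i.e. signed) commits
and that it resolves to the corresponding ref in the imported copy:

# The raw commit object of a commit signed by the GitHub web UI
# carries a "gpgsig" header; this should print 1 for the original.
git -C protonfixes cat-file commit HEAD | grep -c gpgsig

# The same check on the imported copy should print 0, because the
# embedded signature is dropped on export, which is also why the
# object names no longer match the originals.
git -C protoimported cat-file commit HEAD | grep -c gpgsig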