anatoly techtonik <techtonik@xxxxxxxxx> writes:

> Is fast-export/import the only way to filter information in `git`? Maybe there
> is a slow json-export/import tool that gives a complete representation of all
> events in a repository? Or API that can be used to serialize and import that
> stream?

I do not think the representation is the problem.  It is just that the
output stream of fast-export is designed to be "filtered", and the
expected use case is to modify the stream somehow before feeding it to
fast-import.

And because every object name and every commit & tag signature depends
on everything it can reach, even a single-bit change in an earlier part
of the history invalidates any and all signatures on objects that can
reach it.  So instead of originally-signed objects whose signatures are
now invalid, the "fast-export | fast-import" pipeline gives you
originally-signed objects whose signatures are stripped.

Admittedly, there is a narrow use case where such signature
invalidation is not an issue.  If you run fast-export and feed that
straight into fast-import without making any modification to the
stream, you get a bit-for-bit identical copy.  But "git clone --mirror"
is a much better way to get such a bit-for-bit identical copy of the
history and objects.  And if you want to do so via sneakernet, you can
create a bundle file, sneakernet it to your destination, and then clone
from the bundle.

So...
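
To make the intended "filter" use concrete, a typical pipeline looks
something like the sketch below; the repository paths and the sed
expression are made-up placeholders for whatever rewrite you actually
want, not a recommendation:

    # Rewrite an author address while copying source.git into a
    # fresh repository.  Every rewritten commit gets a new object
    # name, so any signature covering it can no longer be valid.
    git init --bare filtered.git
    git -C source.git fast-export --all |
    sed -e 's/old@example\.com/new@example\.com/g' |
    git -C filtered.git fast-import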
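
And for the copy-without-modification case, the mirror clone and the
bundle-based sneakernet transfer mentioned above would look something
like this (repository and file names are illustrative):

    # Direct, bit-for-bit copy of all refs and objects.
    git clone --mirror source.git copy.git

    # The same thing via sneakernet: pack everything into one file,
    # carry it over on physical media, and clone from it there.
    git -C source.git bundle create repo.bundle --all
    git clone repo.bundle copy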