I am working on a project using git where we have many repositories on machines that can never be directly connected, but which need to have the same objects and development history. Existing git protocols offer limited support: we can either a) publish and apply patch files branch by branch, or b) copy an entire repository from one machine to another and then do a local push or fetch. While both are workable, neither is a completely satisfactory solution, so I wrote the attached scripts, which support a "bundle" transfer mechanism.

A bundle is a zip archive containing two files: a list of references as given by git-show-ref, and a pack file of objects from git-pack-objects. git-bundle creates the bundle; git-unbundle unpacks and applies it at the receiving end. The means of transporting the bundle file between the machines is arbitrary (sneakernet, email, etc. all work).

This transfer protocol leaves it to the user to assure that the objects in the bundle are sufficient to update the target machine. This is a direct consequence of the prohibition on direct communication between the machines. The approach supported here is to use normal git-rev-list format to specify what to include, e.g. master~10..master, or ^master pu next, etc. Having too many objects in the pack file is fine: git-unpack-objects at the receiving end happily ignores anything it does not need.

git-unbundle normally checks that the updated references are fast-forwards (--force to override) and that all required objects exist (--shallow to override). The latter option supports a disconnected shallow clone.

I offer this for inclusion in the main distribution; comments and suggestions for improvement are welcome regardless. The scripts are working for me today and I find them very useful. A rough sketch of what the two scripts boil down to, in terms of existing plumbing, is appended below the sign-off.

Mark Levedahl
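
For the curious, here is roughly what the creation side amounts to. This is only a sketch to illustrate the idea, not the attached git-bundle script; the file names refs.txt, objects.pack and mybundle.zip are placeholders.

    # On the sending machine: record the refs being shipped and pack
    # up the objects the receiver is presumed to be missing (here,
    # the last 10 commits on master).
    git show-ref master >refs.txt
    git rev-list --objects master~10..master |
        git pack-objects --stdout >objects.pack
    zip mybundle.zip refs.txt objects.pack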
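
The receiving side is similar in spirit. Again, this is only a sketch, not the attached git-unbundle script, which also implements the --force and --shallow overrides described above.

    # Transport mybundle.zip by whatever means, then in the target
    # repository:
    unzip mybundle.zip
    git unpack-objects <objects.pack   # surplus objects are harmless
    while read sha ref
    do
        old=$(git rev-parse --verify "$ref" 2>/dev/null)
        # only allow fast-forward ref updates (what --force relaxes)
        if test -z "$old" || test "$(git merge-base "$old" "$sha")" = "$old"
        then
            git update-ref "$ref" "$sha"
        else
            echo "not a fast-forward, skipping $ref" >&2
        fi
    done <refs.txt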