Sverre Rabbelier:
> Heya,
>
> On Fri, Feb 19, 2010 at 13:52, Gabriel <g2p.code@xxxxxxxxx> wrote:
> > FWIW, I'm interested in that remote-vcs code, but never figured out
> > where it was published. TIA for pushing it.
>
> The remote-helpers code is already in git.git, and has been since git
> 1.6.6, with some improvements now in 1.7.0.

Yes, good; I meant I haven't found example uses of that code. I'm aware
of work on a CVS helper and an hg helper. Example users of the Python
library would be extra convenient.

> > I'll be using it to better integrate fast-import based backup
> > scripts.
>
> Do you mean that you'll write a remote helper to import your backups?
> If so then my code could be useful as a demonstration on how to hook
> up a fast-import stream.

I think so. I'm using fast-import to break a big JSON file down into a
tree and slurp that tree into git, read-only for now. This gives me a
history of the file that is compact, human-readable with git log, and
incrementally mirrored to a remote.

I expect that making it work with remote-helpers will give me
convenient push/pull semantics, so I'll be able to sync that file
across machines, taking advantage of git's merging abilities. (To also
handle push I'll need to rebuild the JSON from the git side; I don't
know what else is required.)

I imagine this kind of sync use case would also work with something
like CouchDB or MongoDB, but git has a low barrier to entry for me, and
I'm not sure how well the others handle tree merging, which sometimes
requires human intervention, in a multi-master replication setup.

Speaking of remote-helpers requirements: besides the fast-import
stream, what else do I need? I imagine I'll do some book-keeping to
note when the JSON file was last modified (to avoid reimporting it when
unchanged) and the last hash that was pushed (to avoid re-exporting it
needlessly).

When pushing, how do I tell that the current push is a
non-fast-forward?
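To make the import side concrete, here is roughly the kind of stream I generate; a minimal sketch with made-up names (the function, branch name, and committer identity are all illustrative, not from any existing helper), which flattens a JSON object's top-level keys into blobs and commits them as one tree:

```python
import json

def json_to_fast_import(doc, branch="refs/heads/json-mirror"):
    """Build a git fast-import stream: one blob per top-level key of
    a JSON object, committed together as a flat tree on `branch`."""
    parts = []
    paths = []
    for mark, (key, value) in enumerate(sorted(doc.items()), start=1):
        data = json.dumps(value, indent=2).encode()
        # blob command: declare the content and give it a mark to
        # refer back to from the commit's filemodify lines
        parts.append(b"blob\nmark :%d\ndata %d\n%s\n"
                     % (mark, len(data), data))
        paths.append((key, mark))
    msg = b"sync json snapshot\n"
    parts.append(b"commit %s\n" % branch.encode())
    # raw date format: <seconds-since-epoch> <tz offset>
    parts.append(b"committer JSON Mirror <mirror@example.invalid> 0 +0000\n")
    parts.append(b"data %d\n%s" % (len(msg), msg))
    for path, mark in paths:
        parts.append(b"M 100644 :%d %s\n" % (mark, path.encode()))
    parts.append(b"\n")
    return b"".join(parts)
```

The output is meant to be piped into `git fast-import` inside an initialized repository, e.g. `python gen.py | git -C mirror fast-import`.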
I can look for the hash of the JSON file in the history of the pushed
branch; if it isn't there, I'll refuse the push. Do I keep JSON hashes
in log messages, like git-svn does? Or do I store a commit_id -> json
hash mapping next to the JSON file? Do I use git notes for that
mapping? And how do I index back, from a JSON hash to the matching
commit, and on to the commits that have that commit in their ancestry?

When pulling, how do I tell there's nothing further to import? Same
approach as for push: look for the JSON hash in the history or in a
local mapping, and import nothing if it's found?

Thinking about it some more, those two operations need a json_hash ->
commit_id lookup followed by a (commit_id, commit_id) -> bool DAG
ancestry test. Is that something the support library provides, or
something it would welcome?

--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at http://vger.kernel.org/majordomo-info.html
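P.S. For what it's worth, once the json_hash -> commit_id lookup has resolved a commit, the DAG ancestry test can already be done with plain plumbing: A is an ancestor of B exactly when `git merge-base A B` resolves to A. A minimal sketch (function names are mine; it assumes the repository path and two resolvable revisions, and doesn't handle the no-common-ancestor case, where merge-base exits non-zero):

```python
import subprocess

def rev_parse(repo, rev):
    # Resolve a revision name to its full commit id.
    return subprocess.check_output(
        ["git", "-C", repo, "rev-parse", rev]).strip()

def is_ancestor(repo, ancestor, descendant):
    # merge-base(A, B) == A  <=>  A is reachable from B,
    # i.e. updating A's ref to B would be a fast-forward.
    base = subprocess.check_output(
        ["git", "-C", repo, "merge-base", ancestor, descendant]).strip()
    return base == rev_parse(repo, ancestor)
```

So the push check becomes: map the JSON hash to a commit, then refuse unless that commit is an ancestor of the branch being pushed.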