I'm planning on using Git for a deployment process where the steps are basically:

 1. You log into a deployment host, cd into software.git, do "git pull"
 2. A tool runs "make" for you and creates a deployment-YYYYMMDD-HHMMSS tag
 3. That make step will create a bunch of generated (text) files
 4. Get a list of these with: git clean -dxfn
 5. Copy those to software-generated.git, removing any that we didn't
    just create and adding any that are new
 6. Commit that and tag it with generated-deployment-YYYYMMDD-HHMMSS
 7. Push both the software.git and software-generated.git tags out to
    our servers
 8. git reset --hard both of those to our newly pushed tags
 9. Do git clean -dxf on software.git to remove old generated files
10. Copy the new generated files from software-generated.git to software.git
11. Restart our application to pick up the new software

For this I'll need to write some "git snapshot-commit" tool for #5 and #6
to commit whatever the current state of the directory is (with
removed/added files), and hack up something for #9-#10.

This should all be relatively easy; I was just wondering whether there's
any prior art on this that I could use instead of hacking it up myself.

For what it's worth, the reason I'm using Git like this for deployment is
that I'm converting away from an rsync-based sync process that's becoming
increasingly slow: rsync with -c needs to recursively checksum everything
we're syncing out (which is quite a lot), whereas since it's all text
files Git can be really efficient, just transferring deltas and quickly
doing a hard reset to a new commit.
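For step #4, a rough sketch of turning the dry-run output into a plain
list of paths (the "Would remove " prefix is git's human-readable output
rather than a documented machine interface, so I pin LC_ALL=C to avoid
translated messages; the function name is just illustrative):

```shell
# List untracked/ignored files git clean would delete, one path per line.
# Relies on the "Would remove " prefix of `git clean -n` output.
list_generated() {
  LC_ALL=C git -C "$1" clean -dxn | sed -n 's/^Would remove //p'
}
```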
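The snapshot-commit tool for #5-#6 might be little more than "git add -A"
plus a commit and tag, since -A stages additions, modifications, and
deletions in one pass. A minimal sketch, with the function name and
commit message being my own inventions:

```shell
# Commit the working tree of $1 exactly as it stands and tag it as $2.
# `git add -A` stages new, changed, and removed files in one pass, so
# the resulting commit mirrors the directory's current state.
snapshot_commit() {
  repo=$1 tag=$2
  ( cd "$repo" &&
    git add -A &&
    git commit -q --allow-empty -m "snapshot for $tag" &&
    git tag "$tag" )
}
```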
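And the server-side half (#8-#10) could look something like the sketch
below, assuming software.git and software-generated.git are checked out
side by side on each server. Using `git ls-files` to drive the copy is
just one way to skip the .git directory; the function and argument names
are hypothetical:

```shell
# Reset both checkouts to the newly pushed tags, drop the previous run's
# generated files, then copy the fresh ones over from the generated repo.
deploy_update() {
  src=$1 gen=$2 tag=$3 gtag=$4
  git -C "$src" reset -q --hard "$tag"
  git -C "$gen" reset -q --hard "$gtag"
  git -C "$src" clean -dxfq      # remove old generated files (step #9)
  # copy every tracked file of the generated repo into the source tree
  ( cd "$gen" && git ls-files ) | while read -r f; do
    mkdir -p "$src/$(dirname "$f")"
    cp -a "$gen/$f" "$src/$f"
  done
}
```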