* Andrew Benton (b3nton@xxxxxxxxx) [100220 05:37]:
> Hello world
> I have a project that I store in a git repository. It's a bunch of source
> tarballs and some bash scripts to compile it all. Git makes it easy to
> distribute any changes I make across the computers I run. The problem I
> have is that over time the repository gets ever larger. When I update to
> a newer version of something I git rm the old tarball, but git still
> keeps a copy and the folder grows ever larger. At the moment the only
> solution I have is to periodically rm -rf .git and start again. This
> works but is less than ideal because I lose all the history for my build
> scripts.
> What I would like is to be able to tell git not to keep a copy of
> anything that has been removed with git rm. The build scripts never get
> removed, only altered, so their history would be preserved. Is it
> possible to make git delete its backup copies of removed files?

This reminds me of a scenario I wish git had some way of supporting: I
have a large collection of mp3s that I have duplicated across several
computers. I would love to be able to use git to sync changes between the
copies, but there are several problems:

1) git is really slow when dealing with thousands of multi-megabyte blobs.

2) committing it to git is going to double the size of the directory, and
   I don't really have space for that on one of the computers that the
   directory lives on.

3) there's no way to discard old history without breaking push and pull.

I'm not sure exactly what it would take to address 1, but 2 could be
addressed pretty easily using btrfs file clones (once btrfs is stable),
and 3 could be dealt with by improving support for shallow clones. Rough
command sketches for Andrew's problem and for point 3 follow my sig.

--larry
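
P.S. For Andrew's problem there is a workable, if heavy-handed, answer
today: rewrite history so the deleted tarballs were never there, then let
gc reclaim the space. A rough sketch (the tarball path is made up, and
note this rewrites every commit, so all clones have to be re-fetched
afterwards):

    # Rewrite each commit, dropping the old tarball from every tree.
    git filter-branch --index-filter \
        'git rm --cached --ignore-unmatch tarballs/foo-1.0.tar.gz' \
        --prune-empty -- --all

    # Remove the backup refs and reflog entries that still pin the old
    # blobs, then repack so the objects are actually deleted.
    rm -rf .git/refs/original/
    git reflog expire --expire=now --all
    git gc --prune=now --aggressive

That preserves the full history of the build scripts while purging the
removed tarballs, which is exactly the split Andrew was asking for.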
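
P.P.S. On point 3: shallow clones already exist, they just aren't a full
solution yet. A sketch (the URL is made up); this fetches only the most
recent commit, so old history never piles up on the small machine:

    git clone --depth 1 git://example.org/music.git

The catch, and why I said "improving support", is that a shallow
repository currently has limits on fetching from it and pushing between
repositories, so it doesn't yet cover the sync-between-computers case.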