I maintain a corporate MediaWiki installation. Currently I have a cron job that runs daily, tars up the contents of the installation directory, and runs a mysqldump. I keep backups for the past 45 days. Each backup is about 200M, so in total I always have about 9G of backups.

Most of the day-to-day change is in the database, so the mysqldump file differs every day. Beyond that, new files can be uploaded, but existing files never change; they only get added. All configuration files stay the same.

As an experiment, I wrote a script that untarred the contents of each backup, gunzipped the mysqldump, and made a git commit. The resulting .git directory came out to 837M, but after a long (8 minute) "git gc" it went down to 204M.

== Questions ==

1. What mysqldump options would be good for storage in git? Right now I'm not passing any options, so mysqldump puts all the INSERTs for each table on a single huge line. Would git delta-compress the dump better if each INSERT were on its own line?

2. Let's say the repo gets too big and I want to throw away history. I'd have a linear history with one commit per day. Is there a way to keep just the last 30 commits and throw away everything older?

3. Am I insane? Are there other tools better suited to this? I only thought of git because I looked at the 9G of data in my backup directory, almost all of it identical from day to day, and said "git could handle this well".

4. Are any of you using git as a backup system? Any tips or words of wisdom?

I've appended sketches of the scripts and commands I have in mind.

Thanks,
~Eric
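P.S. Some sketches to make the above concrete. First, the nightly cron job is roughly this (the paths, database name, and credentials here are made up):

    #!/bin/sh
    # Nightly MediaWiki backup: tar the install tree, dump the DB,
    # and prune anything older than 45 days. Paths are illustrative.
    STAMP=$(date +%Y%m%d)
    BACKUP_DIR=/var/backups/wiki
    WIKI_DIR=/var/www/mediawiki

    mkdir -p "$BACKUP_DIR"
    tar czf "$BACKUP_DIR/wiki-$STAMP.tar.gz" "$WIKI_DIR"
    mysqldump -u wikiuser -pSECRET wikidb | gzip \
        > "$BACKUP_DIR/wikidb-$STAMP.sql.gz"

    # Drop backups older than 45 days.
    find "$BACKUP_DIR" -name '*.gz' -mtime +45 -delete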
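The import script I described is essentially this (again a sketch with made-up paths; it replays each day's backup as one commit into an already-initialized repo):

    #!/bin/sh
    # Replay each daily backup as a git commit.
    REPO=/var/backups/wiki-git
    BACKUP_DIR=/var/backups/wiki

    cd "$REPO" || exit 1
    for tarball in "$BACKUP_DIR"/wiki-*.tar.gz; do
        stamp=${tarball##*wiki-}        # e.g. 20090415.tar.gz
        stamp=${stamp%.tar.gz}          # e.g. 20090415
        rm -rf files && mkdir files
        tar xzf "$tarball" -C files
        gunzip -c "$BACKUP_DIR/wikidb-$stamp.sql.gz" > wikidb.sql
        git add -A
        git commit -m "Backup $stamp"
    done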
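On question 1, the specific option I've been eyeing is --skip-extended-insert, which emits one INSERT statement per row instead of one giant line per table; my hope is that a one-row-per-line dump gives git's delta compression much more to work with:

    mysqldump --skip-extended-insert -u wikiuser -pSECRET wikidb \
        > wikidb.sql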
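On question 2, the only recipe I've found for cutting a linear history down to the last 30 commits uses a graft plus filter-branch, something like the following (untested; I'd love to hear if there's a cleaner way):

    # Make the 30th-from-last commit a new root by grafting away its
    # parents, then rewrite history to make the graft permanent.
    git rev-parse HEAD~29 > .git/info/grafts
    git filter-branch -- --all
    rm .git/info/grafts

    # Drop the backup refs and old reflog entries so the pre-rewrite
    # objects can actually be collected.
    rm -rf .git/refs/original
    git reflog expire --expire=now --all
    git gc --aggressive --prune=now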