Hi Joey,

On Thu, Jun 2, 2011 at 5:46 AM, Joey Hess <joey@xxxxxxxxxxx> wrote:
> Qingning Huo wrote:
>> I tried to use git to manage my digital photos but encountered some
>> problems. The typical file sizes are hundreds of KB or a few MB. In
>> total, about 15GB data in about 10,000 files. My intention is to get
>> them into a git repository and cloned into a few computers. Probably I
>> will make some occasionally changes like editing and deleting. But I
>> think most of the files would stay at version one.
>
> I try not to mention git-annex too much here, but this is a perfect
> use-case for it. http://git-annex.branchable.com/
>
> Well, it would be more perfect if you had enough data in your repo that
> you didn't necessarily want to clone it all to every computer. Like so:
>
> # git annex status
> local annex size: 58 megabytes
> total annex keys: 38158
> total annex size: 6 terabytes
>
> :)

Thanks a lot for the pointer. I'd love to use git-annex if I can get my
hands on it. I had a look at the web site and searched a bit on the web,
but there does not seem to be an easy way to install it on Windows/Cygwin.
I might try the bigFileThreshold setting first (rough notes below), and
maybe git-bigfiles.

>
>> I wonder whether anyone has tried using git in a similar scenario. Is
>> git capable of handling this kind of data? And, are there any settings
>> and/or command line options that I should use? I had a quick look of
>> git help push (and pull/fetch) but cannot see anything obvious.
>
> There is a tunable you can use to improve things, see core.bigFileThreshold
>
> That originally came from this project.
> http://caca.zoy.org/wiki/git-bigfiles -- it may have some other
> improvements that have not landed in git, I'm not sure.
>
> --
> see shy jo
>

Thanks
Qingning
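
P.S. In case it helps anyone reading the archives later: from what I can
tell from the git-annex web site, the basic workflow for a photo library
would look roughly like the sketch below. This is untested on my side, and
the paths and repository descriptions ("desktop", "laptop", the photos
directory) are only placeholders, not anything from this thread:

    # on the first machine: turn the photo directory into an annex
    # ("desktop" is just a human-readable description of this repo)
    git init ~/photos
    cd ~/photos
    git annex init "desktop"
    git annex add .
    git commit -m "add photo library"

    # on another machine: clone the repository, then fetch only the
    # content you actually want to have locally
    git clone ssh://desktop.example.org/~/photos
    cd photos
    git annex init "laptop"
    git annex get 2011/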
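
And since core.bigFileThreshold is an ordinary config variable, if I go
that route I would presumably start with something like this (the 1m value
is only an example I picked for photos, not a recommendation from anyone
on this thread):

    # store files above ~1MB whole, without attempting delta compression
    git config core.bigFileThreshold 1m

    # repack so already-stored objects are repacked under the new setting
    git repack -a -d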