Re: out of memory error with git push and pull

Qingning Huo wrote:
> I tried to use git to manage my digital photos but encountered some
> problems. The typical file sizes are hundreds of KB or a few MB; in
> total, about 15GB of data in about 10,000 files. My intention is to get
> them into a git repository and clone it to a few computers. Probably I
> will make some occasional changes like editing and deleting, but I
> think most of the files would stay at version one.

I try not to mention git-annex too much here, but this is a perfect
use-case for it. http://git-annex.branchable.com/ 

Well, it would be more perfect if you had enough data in your repo that
you didn't necessarily want to clone it all to every computer. Like so:

# git annex status
local annex size: 58 megabytes
total annex keys: 38158
total annex size: 6 terabytes

:)
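
In case it's useful, here's a rough sketch of how a git-annex workflow
for a photo collection might look (the repository paths, remote names
and directories below are made up for illustration):

# on the first machine, annex the photos instead of committing them to git
git init ~/photos && cd ~/photos
git annex init "desktop"
git annex add .
git commit -m "add photos"

# on another machine, clone it and fetch only the files you want locally
git clone ssh://desktop/~/photos photos
cd photos
git annex init "laptop"
git annex get 2011/holiday/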

> I wonder whether anyone has tried using git in a similar scenario. Is
> git capable of handling this kind of data? And, are there any settings
> and/or command line options that I should use? I had a quick look at
> git help push (and pull/fetch) but could not see anything obvious.

There is a tunable you can use to improve things; see core.bigFileThreshold.

That tunable originally came from the git-bigfiles project:
http://caca.zoy.org/wiki/git-bigfiles -- it may have some other
improvements that have not landed in git, I'm not sure.
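
If you do want to keep everything directly in git, something along these
lines should stop git from trying to delta-compress the large photos when
it builds packs, which is often where the memory goes during push/pull
(the 1m value is just an example threshold; pick whatever fits your files):

git config core.bigFileThreshold 1m
git repack -a -d   # repack so the new threshold applies to existing objects too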

-- 
see shy jo
