Re: Roadmap to better handle big files?

Nick Triantos <nick@xxxxxxxxxxxxxxxxxxx> writes:

> Is there any planned functionality to better support large files in
> git?  (> 100MB / file)
> 
> We've been happily using git, but we now have some files which we'd
> very much like to have under the same version control as our source
> code, and some of those files have been as large as 450MB each.  We
> are looking at chunking each file up before committing it to git, but
> is there any plan to better support chunking of these files during
> repacks or other operations?  Right now, it appears that either the
> whole file or the whole collection of files in a commit (we're not
> sure which) may need to be resident in memory up to twice over,
> judging from various reports on the web.  Our poor 32-bit server is
> barfing on this.  We are going to put more RAM and a 64-bit OS on the
> machine, but this still seems like an unnecessary design decision.

Git has a roadmap???

More seriously, take a look at the git-bigfiles project (a fork of git):
http://caca.zoy.org/wiki/git-bigfiles
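If you do go the chunking route in the meantime, here is a minimal sketch
of that workaround (chunk size, file names, and the manifest format are
made up for illustration; Python assumed):

#!/usr/bin/env python
# Sketch of the "chunk before commit" workaround (not git-bigfiles itself).
# Splits a big file into fixed-size pieces plus a manifest, so each blob
# git has to hold in memory stays small.
import os
import hashlib

CHUNK_SIZE = 32 * 1024 * 1024  # 32MB per piece; tune to what your server can repack

def chunk_file(path, out_dir):
    os.makedirs(out_dir, exist_ok=True)
    manifest = []
    with open(path, 'rb') as src:
        index = 0
        while True:
            data = src.read(CHUNK_SIZE)
            if not data:
                break
            name = '%s.%04d' % (os.path.basename(path), index)
            with open(os.path.join(out_dir, name), 'wb') as dst:
                dst.write(data)
            # Record order and a checksum so the original can be reassembled.
            manifest.append('%s %s' % (hashlib.sha1(data).hexdigest(), name))
            index += 1
    with open(os.path.join(out_dir, 'manifest.txt'), 'w') as m:
        m.write('\n'.join(manifest) + '\n')

if __name__ == '__main__':
    chunk_file('bigasset.bin', 'bigasset.chunks')
    # then: git add bigasset.chunks && git commit

The point is only to keep every blob git sees well below the sizes that
blow up a 32-bit repack; reassembly just concatenates the pieces in the
order listed in manifest.txt.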

HTH
-- 
Jakub Narebski
Poland
ShadeHawk on #git
