Re: Roadmap to better handle big files?

Thanks.  I had looked at that project, but the logo being a piece of poop sort of scared me away from it (and it looked to be very early in its design work)...

thanks!
-Nick

On Feb 24, 2010, at 3:51 PM, Jakub Narebski wrote:

> Nick Triantos <nick@xxxxxxxxxxxxxxxxxxx> writes:
> 
>> Is there any planned functionality to better support large files in
>> git?  (> 100MB / file)
>> 
>> We've been happily using git, but we now have some files which we'd
>> very much like to have under the same version control as our source
>> code, and some of those files have been as large as 450MB each.  We
>> are looking at chunking the files up before committing them to git,
>> but is there any plan to better support chunking of these files
>> during repacks or other operations?  From reading various places on
>> the web, it appears that either the whole file, or the whole
>> collection of files in a commit (we're not sure which), may need to
>> be resident in memory up to twice over.  Our poor 32-bit server is
>> barfing on this.  We are going to put more RAM and a 64-bit OS on
>> the machine, but this still seems like an unnecessary design
>> decision.
> 
> Git has a roadmap???
> 
> More seriously, take a look at the git-bigfiles project (a fork):
> http://caca.zoy.org/wiki/git-bigfiles
> 
> HTH
> --
> Jakub Narebski
> Poland
> ShadeHawk on #git
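
For the archives, here's roughly what the chunking we're considering
would look like.  This is only a sketch; the file name, 50MB chunk
size, and part-name prefix are placeholders:

    # split a large binary into 50MB pieces before committing
    # (file name and chunk size are illustrative, not our real setup)
    split -b 50M huge-asset.bin huge-asset.bin.part-
    git add huge-asset.bin.part-*
    git commit -m "Add huge-asset.bin as 50MB chunks"

    # reassemble the original file after checkout; the shell expands
    # the glob in lexical order, matching split's aa, ab, ac suffixes
    cat huge-asset.bin.part-* > huge-asset.bin

The idea is just to keep each blob small enough that a 32-bit process
can comfortably hold it in memory, even twice over.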

