Re: blobs (once more)

On Wed, Apr 6, 2011 at 5:25 AM, Johannes Schindelin
<Johannes.Schindelin@xxxxxx> wrote:
> I understand. The problem in your case might not be too bad, after all.
> The problem only arises when you have big files that are compressed. If
> you check in multiple versions of an uncompressed .dll file, Git will
> usually do a very good job at compressing them.

Except when they are very large; in that case git tends to OOM. But
just yesterday Junio posted a proposed patch to honour a max file size
for compression (search the archive for 'Git exhausts memory' and
'core.bigFileThreshold').
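
With that patch applied, you'd set something like this in .git/config
(I'm guessing at the final knob name and a sensible value here; check
Junio's patch for the exact spelling):

[core]
	bigFileThreshold = 512m

Anything above the threshold should then be stored whole, with no
attempt to deltify (and blow up on) it.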

So Pau might be in luck with current git + Junio's patch + enough RAM
on the workstations. Pau, I definitely suggest you try it out.

If it still consumes too much memory with the largest files (i.e. the
VM images you mention), the fedpkg approach (discussed in this thread)
is good too. It's a Python wrapper around git, which tracks a text
file listing the hashes of the large files, and fetches them (or
uploads them) via SCP. You only need to use it when dealing with the
large files -- most of the time you're just using git.
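
To give a flavour of the scheme, here is a minimal sketch of the
fetch side -- not fedpkg's actual code, and the 'sources' file name
and the scp host/path are made up for illustration:

#!/usr/bin/env python
# A git-tracked 'sources' file lists "md5sum filename" pairs, one per
# line; fetch whatever is missing via scp and verify the checksum.
import hashlib, os, subprocess

LOOKASIDE = 'you@bigfiles.example.org:/srv/lookaside'  # hypothetical

def md5sum(path):
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return h.hexdigest()

for line in open('sources'):
    checksum, name = line.split()
    if os.path.exists(name) and md5sum(name) == checksum:
        continue  # already have this version locally
    # files live on the server under their checksum, so different
    # versions of the same filename never collide
    subprocess.check_call(
        ['scp', '%s/%s/%s' % (LOOKASIDE, checksum, name), name])
    assert md5sum(name) == checksum, 'corrupt download: %s' % name

Upload is the mirror image: scp the file up and add a checksum line
to 'sources'; that sources file is the only thing git itself tracks.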

The fedpkg code is quite readable, and I've already "stolen" some of
its code for my local needs. Recommended.

cheers,


m
-- 
 martin.langhoff@xxxxxxxxx
 martin@xxxxxxxxxx -- Software Architect - OLPC
 - ask interesting questions
 - don't get distracted with shiny stuff
 - working code first
 - http://wiki.laptop.org/go/User:Martinlanghoff