Re: Large pack causes git clone failures ... what to do?

Thanks Shawn,

On Wed, Sep 1, 2010 at 3:32 AM, Shawn O. Pearce <spearce@xxxxxxxxxxx> wrote:
> Geoff Russell <geoffrey.russell@xxxxxxxxx> wrote:
>> I did a "git gc" on a repository and ended up with a 4GB pack ... now I
>> can't clone the repository and get the following:
>> ...
>
> Are you on a 32 bit Linux system?  Or 64 bit?  Git should be auto
> selecting a unit that would allow it to mmap slices of that 4GB pack.

32bit
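
For what it's worth, I haven't touched the mmap-related settings, so I
assume I'm on the 32-bit defaults. If I read the config docs right,
checking them would look something like:

  git config core.packedGitWindowSize   # unset here; ~32m default on 32-bit, I believe
  git config core.packedGitLimit        # unset here; ~256m default on 32-bit, I believe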

>
>> I've looked at "git repack --max-pack-size", but while that
>> created new packs, it didn't delete the old monster.
>
> You really needed to run:
>
>  git repack --max-pack-size=.. -a -d
>
> The -d flag tells it to remove the old packs once the new packs
> are ready, and the -a flag tells it to reconsider every object
> in the repository, rather than just those that are loose.

Ok, will try.
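
If it helps to be concrete, I was planning something along these lines
(the 1g split size is just a guess on my part for this box, not anything
anyone recommended):

  git repack --max-pack-size=1g -a -d
  git count-objects -v    # then sanity-check the resulting pack count and sizes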

>
> But if you can't clone it, you probably can't repack it.  Clone works
> by creating a pack file on the server, just like repack does.
> Except it sends the pack out to the network stream instead of to
> local disk.

The cloning fails at different points in the process, and the server is
normally under some load, so perhaps load is a factor.

Does a clone initiated from a client take note of pack.packSizeLimit if I
set it on the server, or does it use the client's value?
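
(What I had in mind was setting it on the bare repository on the server,
something like:

  git config pack.packSizeLimit 1g

and then hoping that a fresh clone respects the limit.)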

Cheers and many thanks; annoying problems like this always happen at really
inconvenient times :)

Geoff.

>
> --
> Shawn.
>