Re: large (25G) repository in git

On Mon, 23 Mar 2009, Adam Heath wrote:

> Last Friday, I was doing a check-in on the production server and found
> 1.6G of new files.  git was quite able to commit that.  However,
> pushing was problematic.  I was pushing over ssh, so a new ssh
> connection was opened to the preview server.  After doing so, git tried
> to create a new pack file.  This took *ages*, and the ssh connection
> died.  So did git when it finally got done with the new pack and
> discovered the ssh connection was gone.

Strange.  You could instruct ssh to keep the connection up with the 
ServerAliveInterval option (see the ssh_config man page).
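
For example, something like this in ~/.ssh/config on the pushing side
should keep the session alive (the host name and the values are only
illustrative, adjust to taste):

    Host preview
        # send a keepalive probe every 60s of silence; give up after 5 misses
        ServerAliveInterval 60
        ServerAliveCountMax 5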

> So, to work around that, I ran git gc.  When done, I discovered that
> git repacked the *entire* repository.  While not something I care for,
> I can understand that, and live with it.  It just took *hours* to do so.
> 
> Then, what really annoys me is that when I finally did the push, it
> tried sending the single 27G pack file when the remote already had
> 25G of the repository in several different packs (the site was an
> hg->git conversion).  This part is just unacceptable.

This shouldn't happen either.  When pushing, git reconstructs a pack
containing only the objects the remote side is missing.  Are you sure it
was really trying to send a 27G pack?
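
As a sanity check, assuming the remote's old tip is still reachable
locally through a remote-tracking ref (origin/master here is only an
example), this lists the objects a push would actually have to
transfer:

    git rev-list --objects origin/master..master | wc -l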

> So, here are my questions/observations:
> 
> 1: Handle the case of the ssh connection dying during git push (seems
> simple).

See above.

> 2: Is there an option to tell git to *not* be so thorough when trying
> to find similar files?  Videos/docs/PDFs/etc. aren't always very
> delta-friendly, so I'd be happy to just do full content compares.

Look at the gitattributes documentation.  One thing the doc appears to
be missing is information about the "delta" attribute.  You can disable
delta compression for a file pattern that way.
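
Something along these lines in a .gitattributes file should do it (the
patterns are only examples; use whatever matches your large binaries):

    *.mp4 -delta
    *.avi -delta
    *.pdf -delta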

> 3: Delta packs seem to be poorly done.  It seems that if one repo gets
> repacked completely, the entire new pack gets sent, even when the
> target already has most of the objects.

This is not supposed to happen.  Please provide more details if you can.
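
If you can reproduce it, something like the following on both
repositories, together with the transfer figures git prints during the
push, would be useful data points (the commands are only a suggestion):

    git count-objects -v
    git push -v <remote> <branch>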


Nicolas
