Re: large(25G) repository in git

Nicolas Pitre wrote:

> Strange.  You could instruct ssh to keep the connection up with the 
> ServerAliveInterval option (see the ssh_config man page).

Sure, could do that.  Already have a separate ssh config entry for
this host.  But why should a connection be kept open for that long?
Why not close and re-open?

Consider the case of access over the other protocols (http/git/ssh).
Should they *all* be changed to allow for this?  Wouldn't it be simpler
to just make git smarter?
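
That said, adding the keepalive is easy enough.  The entry I have is
roughly like this in ~/.ssh/config (host name made up), so it would
just be one more line:

    Host bigrepo-host
        HostName bigrepo-host.example.org
        # probe the server every 60 seconds so an otherwise idle
        # connection doesn't get dropped while git is busy packing
        ServerAliveInterval 60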

>> So, to work around that, I ran git gc.  When done, I discovered that
>> git repacked the *entire* repository.  While not something I care for,
>> I can understand that, and live with it.  It just took *hours* to do so.
>>
>> Then, what really annoys me, is that when I finally did the push, it
>> tried sending the single 27G pack file, when the remote already had
>> 25G of the repository in several different packs (the site was an
>> hg->git conversion).  This part is just unacceptable.
> 
> This shouldn't happen either.  When pushing, git reconstructs a pack with 
> only the necessary objects to transmit.  Are you sure it was really 
> trying to send a 27G pack?

Of course I'm sure.  I wouldn't have sent the email if it didn't
happen.  And I have the bandwidthd graph and lost time to prove it.

After I ran git push, the ssh connection timed out; the temp pack that
had been created was then removed, with git complaining about the
connection being gone.

I then decided to do a 'git gc', which collapsed all the separate
packs into one.  This allowed git push to proceed quickly, but at that
point, it started sending the entire pack.

It's entirely possible that the temp pack created by git push was
incremental; it just took too long to create it, so it got aborted.

But doing a git gc shouldn't cause things to be resent.

The machines in question have done pushes before, even small ones that
sent just the set of newer objects.  It's just that this time, when the
1.6G of new data was added, git ended up creating a new pack file that
contained the entire repo, and then tried sending that.
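
To get a handle on what a push is actually going to transmit, a rough
check (assuming the remote is called origin and the branch is master;
adjust as needed) is to build the same kind of pack by hand and look at
its size:

    git rev-list --objects master --not origin/master |
        git pack-objects --stdout > /tmp/push-test.pack
    ls -lh /tmp/push-test.pack

In the normal case that should come out roughly the size of the new
data, not the size of the whole repo.  (Push actually builds a thin
pack, so this can slightly overestimate.)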

I forgot to mention previously that the source machine was running
git 1.5.6.5 and was pushing to 1.5.6.3.

I've tried duplicating this problem on a machine with 1.6.1.3, but
either I don't fully understand the issue enough to replicate it, or
the newer git doesn't have the problem.

>> 2: Is there an option to tell git to *not* be so thorough when trying
>> to find similar files.  videos/doc/pdf/etc aren't always very
>> deltafiable, so I'd be happy to just do full content compares.
> 
> Look at the gitattributes documentation.  One thing that the doc appears 
> to be missing is information about the "delta" attribute.  You can 
> disable delta compression on a file pattern that way.

Um, if it's missing from the documentation, then how am I supposed to
know about it?  Google does give me info, though.  Thanks for the
pointer.
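
For the record, what google turned up amounts to putting something like
this in .gitattributes (the patterns are just examples for the kind of
files in my repo):

    # don't bother trying to delta-compress these
    *.avi  -delta
    *.mpg  -delta
    *.pdf  -delta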

> 
>> 3: delta packs seem to be poorly done.  it seems that if one repo gets
>> repacked completely, that the entire new pack gets sent, when the
>> target has most of the objects already.
> 
> This is not supposed to happen.  Please provide more details if you can.

Well, I haven't been able to replicate it with a script.  I might have
to actually clone this huge repo, do history removal, and reapply the
changes, just to see if I can get it to fail.  But that will take time.
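
The sort of thing I've been trying looks roughly like this (paths and
sizes made up, random data so nothing deltas); so far the second push
only sends the new objects, as it should:

    # fake "remote" plus a working repo pointing at it
    git init --bare /tmp/remote.git
    git init /tmp/work
    cd /tmp/work
    git remote add origin /tmp/remote.git

    # initial chunk of data, pushed normally
    dd if=/dev/urandom of=big.bin bs=1M count=100
    git add big.bin
    git commit -m 'initial data'
    git push origin master

    # add more data, repack everything into one pack, then push again
    dd if=/dev/urandom of=more.bin bs=1M count=50
    git add more.bin
    git commit -m 'more data'
    git gc
    git push origin master    # should send ~50M, not the whole ~150M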
