incremental push/pull for large repositories

Hi folks,


I often run into situations where I've rebased branches containing large
files (tens of megabytes per file) and then push them to the remote.
Usually the large files themselves stay untouched, but the history
changes (e.g. commits reordered, small changes in other files, etc.).

It seems that on each push the whole branch is transferred, including
all the large files, which already exist on the remote side. Is there
any way to prevent this?

IMHO it would be enough to negotiate which objects really need to be
transferred before creating and transmitting the actual pack file. That
would save _much_ traffic (and transfer time) in these situations.

This could also be used to repair broken repositories remotely: just
repack, leaving out the broken objects, and fetch the missing ones from
remotes.
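The detection half of that repair idea already works today with git-fsck; the
sketch below (again, just an illustration) corrupts a throwaway repo by
deleting a loose object and lets fsck report what is missing. Note that, as
far as I know, a plain fetch will not currently resupply those objects, since
the remote assumes we still have everything our refs reach — which is exactly
where the proposed negotiation would help.

```shell
#!/bin/sh
# Sketch: delete a loose object from a throwaway repo, then use
# git-fsck to find out which objects are missing.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
git config user.email you@example.com   # throwaway identity for the demo
git config user.name you
echo hello > file.txt
git add file.txt
git commit -qm "initial"
# Simulate corruption: remove the loose object holding HEAD's tree.
# Loose objects live at .git/objects/<first 2 hex chars>/<rest>.
obj=$(git rev-parse "HEAD^{tree}")
dir=$(printf '%s' "$obj" | cut -c1-2)
rest=$(printf '%s' "$obj" | cut -c3-)
rm ".git/objects/$dir/$rest"
# fsck now reports the missing object:
git fsck 2>&1 | grep missing
```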


cu
-- 
----------------------------------------------------------------------
 Enrico Weigelt, metux IT service -- http://www.metux.de/

 phone:  +49 36207 519931  email: weigelt@xxxxxxxx
 mobile: +49 151 27565287  icq:   210169427         skype: nekrad666
----------------------------------------------------------------------
 Embedded-Linux / Portierung / Opensource-QM / Verteilte Systeme
----------------------------------------------------------------------

