Fwd: Local clones aka forks disk size optimization

Hi,

I came up with this while making some local forks for work.
Currently, when you clone a repo using a plain path (not the file://
protocol), all the common objects get hardlinked.
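Something along these lines (the paths are made up, and I am only
using Python here to poke at the files) shows what I mean:

#!/usr/bin/env python3
"""Rough sketch: a local-path clone shares pack files via hardlinks.
The repository paths below are hypothetical."""

import os
import subprocess

ORIGIN = "/srv/git/project.git"   # hypothetical existing repository
FORK = "/srv/git/fork1.git"       # hypothetical clone destination

# Clone with a plain path (not file://), so git hardlinks the objects.
subprocess.run(["git", "clone", "--bare", ORIGIN, FORK], check=True)

pack_dir = os.path.join(FORK, "objects", "pack")
for name in os.listdir(pack_dir):
    st = os.stat(os.path.join(pack_dir, name))
    # st_nlink > 1 means this file is the same inode as in the origin repo.
    print(f"{name}: {st.st_nlink} links, inode {st.st_ino}")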

But as you work, each clone keeps growing on its own, even though they
may still have many objects in common.

Is there any way to avoid this? I mean, could git be made to check the
other forks for the same objects when pulling?
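The closest thing I know of is the existing alternates mechanism
(git clone --shared writes objects/info/alternates so one repo borrows
objects from another), but that has to be set up by hand at clone time
rather than being checked automatically on pull. A made-up sketch, with
hypothetical paths:

#!/usr/bin/env python3
"""Rough sketch of git's existing alternates mechanism, for comparison.
The repository paths below are hypothetical."""

import os
import subprocess

ORIGIN = "/srv/git/project.git"   # hypothetical shared repository
FORK = "/srv/git/fork2.git"       # hypothetical fork

# --shared records ORIGIN's object directory in objects/info/alternates,
# so the fork borrows objects instead of copying or hardlinking them.
subprocess.run(["git", "clone", "--bare", "--shared", ORIGIN, FORK], check=True)

alternates = os.path.join(FORK, "objects", "info", "alternates")
with open(alternates) as f:
    print("Fork borrows objects from:", f.read().strip())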

Though this doesn't make much sense for clients, when you have to
maintain 20 forks of very big projects on the server side, it eats
precious disk space.

I don't know whether this should have [RFC] in the subject or not, but
here is my idea.

Since git already does the hardlinking, if it checked how many links
its files have, it could find the other directories to search in. The
easiest way would be to check the most ancient pack.
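Very roughly, and only as a sketch of the idea (none of this exists in
git, and the paths are invented):

#!/usr/bin/env python3
"""Sketch of the proposal above, not an existing git feature: check the
link count of the oldest pack and, if it is shared, look for sibling
repositories whose packs point at the same inode."""

import glob
import os

REPO = "/srv/git/fork1.git"   # hypothetical fork to inspect
SEARCH_ROOT = "/srv/git"      # hypothetical directory holding all forks

packs = glob.glob(os.path.join(REPO, "objects", "pack", "*.pack"))
# "Most ancient" pack approximated here by oldest modification time.
oldest = min(packs, key=os.path.getmtime)
st = os.stat(oldest)

if st.st_nlink > 1:
    # The pack is hardlinked from somewhere else: scan the other forks
    # for a pack with the same device/inode to learn where shared
    # objects could be searched for.
    pattern = os.path.join(SEARCH_ROOT, "*", "objects", "pack", "*.pack")
    for other in glob.glob(pattern):
        o = os.stat(other)
        if (o.st_dev, o.st_ino) == (st.st_dev, st.st_ino) and other != oldest:
            print("Objects also reachable under:", os.path.dirname(other))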

Hope you like this idea,

Javier Domingo