On Sun, 5 Apr 2009, Nicolas Sebrecht wrote:
> On Sun, Apr 05, 2009 at 12:04:12AM -0700, Robin H. Johnson wrote:
>> Before I answer the rest of your post, I'd like to note that the
>> choice between single-repo, repo-per-package, and repo-per-category
>> has been flogged to death within Gentoo. I did not come to the Git
>> mailing list to rehash those choices. I came here to find a solution
>> to the performance problem.

> I understand. I know two ways to resolve this:
>
> - by resolving the performance problem itself,
> - by changing the workflow to something better suited to the situation.
>
> My point is that going from a centralized to a decentralized SCM means
> fundamentally changing how developers and maintainers work. What
> you're currently suggesting is a way to work with Git in a centralized
> way. This sucks. To get things right with Git I would avoid shared
> and global repositories. Gnome is doing it this way:
> http://gitorious.org/projects/gnome-svn-hooks/repos/mainline/trees/master

Guys, back off a little on telling the Gentoo people to change. The
kernel developers don't split the kernel into 'core', 'drivers', etc.
pieces just because some people only work on one area. I see the Gentoo
desire to keep things in one repo as being something very similar.

The problem here is a real one: if you have a large repo, git send-pack
will always generate a new pack, even if it doesn't need to (with the
extreme case being that the repo is fully packed).
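
For what it's worth, this is easy to observe (a sketch; the git:// URL
and repo name are placeholders): fully pack the repository, then clone
it from another machine and watch the server-side progress messages.
pack-objects still enumerates every object and writes a fresh pack
stream even though a single up-to-date pack already exists:

    # server side: collapse everything into a single pack
    git gc

    # client side:
    git clone git://server.example.org/gentoo.git
    # remote: Counting objects: ..., done.
    # remote: Compressing objects: ...
    # the pack is rebuilt on the fly despite the fully-packed repo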

>> The GSoC 2009 ideas contain a potential project for caching the
>> generated packs, which, while valuable in itself, could partly be
>> made unnecessary by sending suitable pre-built packs (if they exist)
>> without any repacking.

> Right. One option could be to wait and see whether the GSoC produces
> something.

The GSoC project is not the same thing. In this case the packs are
already 'cached' (they are stored on disk); what is needed is some
option to let git send existing pack(s) if they exist, rather than
taking the time to try and generate an 'optimal' pack.
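
(To see the 'cache' in question, assuming a bare repository on the
server, the pack files are simply sitting under objects/pack/:

    ls objects/pack/
    # pack-<sha1>.idx  pack-<sha1>.pack
    git count-objects -v
    # in-pack: <number of packed objects>
    # packs: 1

The <sha1> and counts are placeholders; the point is that the
fully-packed data is already on disk.)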

I'm actually surprised that this is happening. I thought the
recommendation was that the public repository should do a very
aggressive pack (one that takes a lot of resources) of the old content,
so that people cloning from it get the advantage of the tight packing
without having to do it themselves.
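
That recommendation usually looks something like this (a sketch,
assuming a bare repo; the window/depth values are just the commonly
suggested aggressive ones, not gospel):

    # one-time, CPU-hungry repack for maximum delta compression
    git repack -a -d -f --window=250 --depth=250

    # optionally mark the resulting pack so later repacks leave it alone
    touch objects/pack/pack-<sha1>.keep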

If the server _always_ re-generates the pack from scratch, then this is
a waste of time (except for people who clone via the dumb, unsafe
mechanisms).
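
(The dumb transports are also the one place where that pre-built pack
is shipped verbatim: a dumb HTTP client just downloads the existing
files listed by update-server-info, roughly:

    GET /gentoo.git/info/refs
    GET /gentoo.git/objects/info/packs
    GET /gentoo.git/objects/pack/pack-<sha1>.pack

so only the smart transports pay the regeneration cost.)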
David Lang