Junio C Hamano <junkio@xxxxxxx> wrote:
> I suspect that it is deeper than that. Think about why having
> "everything at once" is better than "one at a time".
>
> Potentially you could have a rule that says "these should be
> updated together" (or the other way around). If you split the
> set of refs at an arbitrary limit, like xargs does, you would
> lose that advantage.

Yes, I think the documentation says something about that... ;-)

> We could take stdin to solve that, and shell scripts should be
> able to handle that, as refnames do not contain shell
> metacharacters.

Never even occurred to me, because I was trying to keep the hook
interface "simple".

> But this is only true if you want to make it really nice. I
> personally feel that nobody would scream if pushing 1300 refs at
> once (4K pages and MAX_ARG_PAGES at 32 would give 128K for
> **argv and its strings, and one ref's worth of data is two
> 40-digit hex values plus the refname, roughly 100 bytes per ref)
> is not supported and always fails.

Agree completely. I'm not too worried about it. A 1300-ref push is
just not going to occur in practice; that would be insane. 30 refs,
maybe.

--
Shawn.
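
For what it's worth, a stdin-based hook along the lines Junio suggests
could be a plain shell loop: the server would write one
"old-sha1 new-sha1 refname" line per updated ref to the hook's stdin,
so the whole set of refs arrives together and never hits the argv size
limit. This is only a sketch of such an interface, not the current hook
API; the here-document below stands in for what the server would feed
the hook, and the SHA-1 values and refname are made up for
illustration.

```shell
#!/bin/sh
# Sketch of a stdin-based receive hook: read one ref update per line.
# Since refnames contain no shell metacharacters, a bare `read` with
# word splitting is safe here.
while read oldrev newrev refname
do
    echo "update: $refname ($oldrev -> $newrev)"
done <<EOF
0000000000000000000000000000000000000000 1111111111111111111111111111111111111111 refs/heads/master
EOF
```

Because the hook sees every ref before deciding anything, it could
enforce a "these refs must be updated together" rule, which is exactly
what an xargs-style split into multiple invocations would break.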