> On Jun 13, 2019, at 11:43 AM, Jeff King <peff@xxxxxxxx> wrote:
> 
> On Thu, Jun 13, 2019 at 11:33:40AM -0600, Nasser Grainawi wrote:
> 
>> I have a situation where I need to delete 100k+ refs on 15+ separate
>> hosts/disks. This setup is using Gerrit replication, so I can trigger
>> it all on one host and it will push the deletes to the rest (all
>> running git-daemon v2.18.0 with receive-pack enabled). All the refs
>> being deleted on the receiving ends are packed.
>> 
>> What I see is the packed-refs file getting locked/updated over and
>> over for each ref. I had assumed it would do something more like
>> 'update-ref --stdin' and do a bulk removal of refs. Am I seeing the
>> correct behavior? If yes, is there a specific reason it works this way
>> or is "bulk delete through push" just a feature that hasn't been
>> implemented yet?
> 
> The underlying ref code is smart enough to coalesce all of the deletions
> in a single transaction into a single write of the packed-refs file.
> 
> But historically, pushes do not do a single ref transaction because we
> would allow the push for one ref to succeed while others failed. Later,
> we added an "atomic" mode that does it all in a single transaction.
> 
> Try with "git push --atomic", which should be able to do it in a single
> write.

Thanks! Is there a way to get the bulk behavior without the all-or-nothing
behavior?

-- 
Qualcomm Innovation Center, Inc.
The Qualcomm Innovation Center, Inc. is a member of the Code Aurora Forum,
a Linux Foundation Collaborative Project
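
A rough sketch of the two approaches mentioned above, for reference. The
ref namespace (refs/heads/old/) and the specific ref names are placeholders;
the flags are the ones discussed in the thread:

    # On the pushing host: delete many refs in one atomic push, so the
    # receiving end handles all the deletions as a single ref transaction
    # (one packed-refs rewrite) instead of once per ref. All-or-nothing:
    # if any one deletion is rejected, none of them happen.
    git push --atomic origin --delete \
        refs/heads/old/topic-1 refs/heads/old/topic-2

    # Directly on a receiving host: bulk-delete in a single update-ref
    # transaction, without going through a push at all.
    git for-each-ref --format='delete %(refname)' refs/heads/old/ |
        git update-ref --stdin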