On Mon, Dec 02, 2019 at 06:40:35PM +0100, SZEDER Gábor wrote:

> > When loading packfiles on start-up, we traverse the internal packfile
> > list once per file to avoid reloading packfiles that have already
> > been loaded. This check runs in quadratic time, so for poorly
> > maintained repos with a large number of packfiles, it can be pretty
> > slow.
> >
> > Add a hashmap containing the packfile names as we load them so that
> > the average runtime cost of checking for already-loaded packs becomes
> > constant.
> [...]
> This patch breaks test 'gc --keep-largest-pack' in 't6500-gc.sh' when
> run with GIT_TEST_MULTI_PACK_INDEX=1, because there is a duplicate
> entry in '.git/objects/info/packs':

Good catch. The issue is that we only add entries to the hashmap in
prepare_packed_git(), but they may be added to the pack list by other
callers of install_packed_git(). It probably makes sense to just push
the hashmap maintenance down into that function, like below. That
requires an extra strhash() when inserting a new pack, but I don't
think that's a big deal.

diff --git a/packfile.c b/packfile.c
index 253559fa87..f0dc63e92f 100644
--- a/packfile.c
+++ b/packfile.c
@@ -757,6 +757,9 @@ void install_packed_git(struct repository *r, struct packed_git *pack)
 {
 	pack->next = r->objects->packed_git;
 	r->objects->packed_git = pack;
+
+	hashmap_entry_init(&pack->packmap_ent, strhash(pack->pack_name));
+	hashmap_add(&r->objects->pack_map, &pack->packmap_ent);
 }
 
 void (*report_garbage)(unsigned seen_bits, const char *path);
@@ -864,11 +867,8 @@ static void prepare_pack(const char *full_name, size_t full_name_len,
 	/* Don't reopen a pack we already have. */
 	if (!hashmap_get(&data->r->objects->pack_map, &hent, pack_name)) {
 		p = add_packed_git(full_name, full_name_len, data->local);
-		if (p) {
-			hashmap_entry_init(&p->packmap_ent, hash);
-			hashmap_add(&data->r->objects->pack_map, &p->packmap_ent);
+		if (p)
 			install_packed_git(data->r, p);
-		}
 	}
 	free(pack_name);
 }

-Peff