Kousik Sanagavarapu <five231003@xxxxxxxxx> writes:

> This global was added as a temporary measure to suppress the fetching
> of missing objects and can be removed once the commands have been taught
> to handle these cases.

Two requests.

 * Could you substantiate this claim for future readers of "git log"?
   A reference to an old mailing list discussion, or to the log
   message of the commit that added the temporary measure and spells
   out the above plan, would be perfect.

 * What exactly does "once the commands have been taught" mean?
   Which commands?  Could you clarify?

> Hence, use has_object() to check for the existence of an object, which
> has the default behavior of not lazy-fetching in a partial clone. It is
> worth mentioning that this is the only place where there is potential for
> lazy-fetching and all other cases are properly handled, making it safe to
> remove this global here.

This paragraph is very well explained (see also the illustrative
sketch further down).

> @@ -1728,14 +1727,6 @@ int cmd_index_pack(int argc, const char **argv, const char *prefix)
>  	int report_end_of_input = 0;
>  	int hash_algo = 0;
>
> -	/*
> -	 * index-pack never needs to fetch missing objects except when
> -	 * REF_DELTA bases are missing (which are explicitly handled). It only
> -	 * accesses the repo to do hash collision checks and to check which
> -	 * REF_DELTA bases need to be fetched.
> -	 */

OK.  The comment describes the design choice we made to flip the
fetch_if_missing flag off.  The old world-view was that, by disabling
auto-fetching, we would notice a breakage (a non-functioning
index-pack) whenever a lazy clone was missing objects that we need,
and we would instead explicitly handle any missing but necessary
objects by lazy fetching them ourselves (like "when we lack REF_DELTA
bases").

It does sound like a conservative thing to do, compared to the
opposite approach we are taking with this patch, i.e. we would not
fail if we tried to access objects we do not need, because we have
lazy fetching enabled, and we would just end up with a bloated object
store that nobody may notice.

To protect us from future breakage that can come from the new
approach, it is a very good thing that you added new tests to ensure
no unnecessary lazy fetching is done (I am not offhand sure if that
test is sufficient, though).

> -	fetch_if_missing = 0;

Looking good to me.  Jonathan, who reviewed the previous round, do
you have any comments?

Thanks, all.  Will queue.
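Purely as an illustration (not something this patch needs to change
further): with has_object(), the collision check in
builtin/index-pack.c conceptually becomes something like the sketch
below.  I am writing the surrounding names (read_lock(),
collision_test_needed) and the has_object() signature from memory,
so take this as a sketch of the idea rather than the literal hunk in
the patch:

	if (startup_info->have_repository) {
		read_lock();
		/*
		 * has_object() never goes to the promisor remote, so
		 * this existence check cannot lazily fetch (and
		 * thereby bloat) the object store of a partial clone.
		 */
		collision_test_needed =
			has_object(the_repository, oid, 0);
		read_unlock();
	}

The nice property is that the no-lazy-fetch behaviour becomes a
property of the call we make, instead of a property of the global
state the whole command runs under.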
> diff --git a/t/t5616-partial-clone.sh b/t/t5616-partial-clone.sh
> index f519d2a87a..46af8698ce 100755
> --- a/t/t5616-partial-clone.sh
> +++ b/t/t5616-partial-clone.sh
> @@ -644,6 +644,41 @@ test_expect_success 'repack does not loosen promisor objects' '
>  	grep "loosen_unused_packed_objects/loosened:0" trace
>  '
>
> +test_expect_success 'index-pack does not lazy-fetch when checking for sha1 collsions' '
> +	rm -rf server promisor-remote client repo trace &&
> +
> +	# setup
> +	git init server &&
> +	for i in 1 2 3 4
> +	do
> +		echo $i >server/file$i &&
> +		git -C server add file$i &&
> +		git -C server commit -am "Commit $i" || return 1
> +	done &&
> +	git -C server config --local uploadpack.allowFilter 1 &&
> +	git -C server config --local uploadpack.allowAnySha1InWant 1 &&
> +	HASH=$(git -C server hash-object file3) &&
> +
> +	git init promisor-remote &&
> +	git -C promisor-remote fetch --keep "file://$(pwd)/server" &&
> +
> +	git clone --no-checkout --filter=blob:none "file://$(pwd)/server" client &&
> +	git -C client remote set-url origin "file://$(pwd)/promisor-remote" &&
> +	git -C client config extensions.partialClone 1 &&
> +	git -C client config remote.origin.promisor 1 &&
> +
> +	git init repo &&
> +	echo "5" >repo/file5 &&
> +	git -C repo config --local uploadpack.allowFilter 1 &&
> +	git -C repo config --local uploadpack.allowAnySha1InWant 1 &&
> +
> +	# verify that no lazy-fetching is done when fetching from another repo
> +	GIT_TRACE_PACKET="$(pwd)/trace" git -C client \
> +		fetch --keep "file://$(pwd)/repo" main &&
> +
> +	! grep "want $HASH" trace
> +'
> +
>  test_expect_success 'lazy-fetch in submodule succeeds' '
>  	# setup
>  	test_config_global protocol.file.allow always &&
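On my "I am not offhand sure if that test is sufficient" comment
above, and purely as an untested illustration of the kind of extra
assertion I had in mind: after the fetch from "repo", the test could
also check that the blob is still absent from the client's object
store, for example along these lines (reusing $HASH from the test;
"--missing=print" lists missing objects prefixed with "?" without
trying to fetch them):

	# assumption: the blob of file3 must still be missing locally
	git -C client rev-list --objects --all --missing=print >objects &&
	grep "^?$HASH" objects

That would verify the end state directly, i.e. that the object never
entered the local object store, independently of what does or does
not show up in the packet trace.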