01.04.2020, 01:10, "Konstantin Tokarev" <annulen@xxxxxxxxx>:
> 28.03.2020, 19:58, "Derrick Stolee" <stolee@xxxxxxxxx>:
>> On 3/28/2020 10:40 AM, Jeff King wrote:
>>> On Sat, Mar 28, 2020 at 12:08:17AM +0300, Konstantin Tokarev wrote:
>>>
>>>> Is it a known thing that adding --filter=blob:none to a workflow
>>>> with a shallow clone (e.g. --depth=1) and a following sparse checkout
>>>> may significantly slow down the process and result in a much larger
>>>> .git repository?
>>
>> In general, I would recommend not using shallow clones in conjunction
>> with partial clone. The blob:none filter will get you what you really
>> want from shallow clone without any of the downsides of shallow clone.
>
> Is it really so?
>
> As you can see from my measurements [1], in my case a simple shallow clone (1)
> runs faster than a simple partial clone (2) and produces a slightly smaller
> .git, from which I can infer that (2) downloads some data which is not
> downloaded in (1).

Actually, as I have full git logs for all these cases, there is no need to guess:

(1) downloads 295085 git objects with a total size of 1.00 GiB
(2) downloads 1949129 git objects with a total size of 1.01 GiB

Total sizes are very close, but (2) downloads far more objects, and it also
uses 3 passes to download them, which leads to less efficient use of network
bandwidth.

> To be clear, the use case I'm interested in right now is checking out sources
> in a cloud CI system like GitHub Actions for a one-shot build. Right now
> checkout usually takes 1-2 minutes, and my hope was that someday in the
> future it would be possible to make it faster.
>
> [1] https://gist.github.com/annulen/835ac561e22bedd7138d13392a7a53be
>
> --
> Regards,
> Konstantin

--
Regards,
Konstantin
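
P.S. For anyone wanting to reproduce the comparison, the two clone modes
discussed above can be sketched roughly like this against a throwaway local
repository (the repository path and commits are invented for illustration;
the flags are standard git):

```shell
#!/bin/sh
# Illustrative sketch: shallow clone (1) vs. blob:none partial clone (2).
set -e
tmp=$(mktemp -d)

# Build a tiny "origin" repository with two commits.
git init -q "$tmp/origin"
git -C "$tmp/origin" -c user.name=test -c user.email=test@example.com \
    commit -q --allow-empty -m "first"
git -C "$tmp/origin" -c user.name=test -c user.email=test@example.com \
    commit -q --allow-empty -m "second"
# Allow partial-clone filters on the "server" side.
git -C "$tmp/origin" config uploadpack.allowfilter true

# (1) shallow clone: truncated history, all blobs fetched up front
git clone -q --depth=1 "file://$tmp/origin" "$tmp/shallow"

# (2) partial clone: full history, blobs fetched on demand
git clone -q --filter=blob:none "file://$tmp/origin" "$tmp/partial"

git -C "$tmp/shallow" rev-list --count HEAD   # 1 commit (history truncated)
git -C "$tmp/partial" rev-list --count HEAD   # 2 commits (full history)
```

The point of the sketch is only to show that the two modes cut the download
differently: (1) truncates history, (2) keeps all commits and trees but
defers blobs.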