Re: [PATCH v3 8/8] clean/smudge: allow clean filters to process extremely large files

Hi Junio,

On Fri, 29 Oct 2021, Junio C Hamano wrote:

> "Matt Cooper via GitGitGadget" <gitgitgadget@xxxxxxxxx> writes:
>
> > +# This clean filter writes down the size of input it receives. By checking against
> > +# the actual size, we ensure that cleaning doesn't mangle large files on 64-bit Windows.
> > +test_expect_success EXPENSIVE,SIZE_T_IS_64BIT,!LONG_IS_64BIT \
> > +		'files over 4GB convert on input' '
> > +	test-tool genzeros $((5*1024*1024*1024)) >big &&
> > +	test_config filter.checklarge.clean "wc -c >big.size" &&
> > +	echo "big filter=checklarge" >.gitattributes &&
> > +	git add big &&
> > +	test $(test_file_size big) -eq $(cat big.size)
> > +'
>
> I would have expected the clean filter to send the count to its
> standard output (to be hashed and made into a blob object), and the
> test would then do "git cat-file blob :big" to read the contents of
> the raw blob, bypassing the filter system.

That was exactly what Matt had in his first iteration. But spawned
processes are not "free" on Windows, and I dislike spawning them
unnecessarily, so I simplified the design to take this shortcut.
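For reference, the stdout-based variant described above could look
roughly like the sketch below. This is a hypothetical, small-scale
illustration (1 MiB instead of the 5 GiB file the real EXPENSIVE test
generates with "test-tool genzeros", and plain shell instead of the
test framework's test_expect_success/test_config helpers):

```shell
set -e

# Scratch repository for the demonstration.
dir=$(mktemp -d)
cd "$dir"
git init -q repo
cd repo

# The clean filter emits the byte count of its input as the blob
# content, instead of writing it to a side file as in the patch.
git config filter.checklarge.clean "wc -c"
echo "big filter=checklarge" >.gitattributes

# Stand-in for "test-tool genzeros": 1 MiB of zero bytes.
head -c 1048576 /dev/zero >big

# "git add" runs the clean filter, so the staged blob holds the count.
git add big

# Read the raw blob from the index, bypassing the filter system.
git cat-file blob :big
```

The trade-off Junio's variant makes is one extra "git cat-file"
invocation in exchange for not leaving a "big.size" file in the
worktree; the patch as posted avoids the extra process instead.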

> But we are testing with only a single path anyway, use of this single
> extra file is OK.

Precisely.

Ciao,
Dscho
