Re: [PATCH v3 4/8] t1051: introduce a smudge filter test for extremely large files

Junio C Hamano <gitster@xxxxxxxxx> writes:

>> +# This smudge filter prepends 5GB of zeros to the file it checks out. This
>> +# ensures that smudging doesn't mangle large files on 64-bit Windows.
>> +test_expect_failure EXPENSIVE,SIZE_T_IS_64BIT,!LONG_IS_64BIT \
>> +		'files over 4GB convert on output' '
>> +	test_commit test small "a small file" &&
>> +	test_config filter.makelarge.smudge \
>> +		"test-tool genzeros $((5*1024*1024*1024)) && cat" &&
>> +	echo "small filter=makelarge" >.gitattributes &&
>> +	rm small &&
>> +	git checkout -- small &&
>> +	size=$(test_file_size small) &&
>> +	test "$size" -ge $((5 * 1024 * 1024 * 1024))
>> +'
>
> Why not exactly 5G, but anything that is at least 5G is OK?

I know it is more than 5G, thanks to the "&& cat".  The question was
why aren't we measuring the size of "a small file" so that we can
check against an exact size to be expected.
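
Something along these lines (just a rough sketch; the "small_size"
variable is mine, not part of the patch) would let the test assert
the exact expected size instead of a lower bound:

	# sketch only: capture the pre-smudge size so the final
	# check can be exact rather than ">= 5GB"
	test_commit test small "a small file" &&
	small_size=$(test_file_size small) &&
	test_config filter.makelarge.smudge \
		"test-tool genzeros $((5*1024*1024*1024)) && cat" &&
	echo "small filter=makelarge" >.gitattributes &&
	rm small &&
	git checkout -- small &&
	size=$(test_file_size small) &&
	# expect exactly 5GB of zeros plus the original content
	test "$size" -eq $((5 * 1024 * 1024 * 1024 + small_size))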

Thanks.
