On 19.08.2013 10:25, Stefan Beller wrote:
> On 08/19/2013 10:20 AM, Johannes Sixt wrote:
>> On 19.08.2013 08:38, Steffen Prohaska wrote:
>>> +test_expect_success EXPENSIVE 'filter large file' '
>>> +	git config filter.largefile.smudge cat &&
>>> +	git config filter.largefile.clean cat &&
>>> +	for i in $(test_seq 1 2048); do printf "%1048576d" 1; done >2GB &&
>> Shouldn't you count to 2049 to get a file that is over 2GB?
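
A minimal sketch of the loop with that correction applied, assuming
1 MiB blocks as in the patch (test_seq is Git's test-suite helper):

    # 2049 blocks of 1048576 bytes each: 2148532224 bytes, just over 2 GiB
    for i in $(test_seq 1 2049); do printf "%1048576d" 1; done >2GB &&
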
> Would it be possible to offload the looping from shell to a real
> program? For example
>
>     truncate -s 2049M <filename>
>
> should do the job. That would create a file that reads as all zero
> bytes and is larger than 2G. If truncate is not available, what
> about dd?
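
Concretely, both routes might look like this; a sketch assuming the
file name 2GB from the test above, with dd's block size given in plain
bytes for portability:

    # sparse file of 2049 MiB, reading as all zero bytes (GNU coreutils truncate)
    truncate -s 2049M 2GB

    # dd alternative: write one 1 MiB block at an offset of 2048 MiB,
    # yielding a 2049 MiB file (sparse where the filesystem supports it)
    dd if=/dev/zero of=2GB bs=1048576 count=1 seek=2048
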
The point is exactly to avoid external dependencies. Our dd on Windows
doesn't do the right thing with seek=2GB (it makes the file twice as large
as expected).
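
Whichever method a test settles on, it could guard against such
platform quirks by checking the resulting size explicitly; a sketch
using only wc and test, not taken from the thread:

    # strip the padding some wc implementations print, then require > 2 GiB
    size=$(wc -c <2GB | tr -d '[:space:]') &&
    test "$size" -gt 2147483648
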
--
Hannes