Re: [PATCH v6 10/13] convert: generate large test files only once

Lars Schneider <larsxschneider@xxxxxxxxx> writes:

>> On 25 Aug 2016, at 21:17, Stefan Beller <sbeller@xxxxxxxxxx> wrote:
>> 
>>> On Thu, Aug 25, 2016 at 4:07 AM,  <larsxschneider@xxxxxxxxx> wrote:
>>> From: Lars Schneider <larsxschneider@xxxxxxxxx>
>>> 
>>> Generate more interesting large test files
>> 
>> How are the large test files more interesting?
>> (interesting in the notion of covering more potential bugs?
>> easier to debug? better to maintain, or just a pleasant read?)
>
> The old large test file was 1 MB of zeros followed by a single byte with the value one, repeated 2048 times.
>
> Since the filter uses 64k packets, we would end up testing a large number of identical-looking packets.
>
> That's why I thought pseudo-random content would be more interesting.
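>
> Roughly, the old layout amounts to something like this (file name
> and loop shape are only illustrative, not the exact test code):
>
> 	i=0
> 	while test $i -lt 2048
> 	do
> 		# 1 MB of zero bytes, then a single byte with the value one
> 		dd if=/dev/zero bs=1048576 count=1 2>/dev/null
> 		printf '\001'
> 		i=$((i+1))
> 	done >old-large.bin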

I guess my real question is why it is not just a single invocation
of test-genrandom that gives you the whole test file; if you are
using 20MB, the simplest would be to grab 20MB out of test-genrandom.
With that, hopefully you won't see a large number of identical-looking
packets, no?
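
Something like this single call would cover the whole file (the seed
string and output name below are just illustrative):

	test-genrandom "some-seed" $((20 * 1024 * 1024)) >large-random.bin

Every byte then comes from one continuous pseudo-random stream, so no
two 64k packets should look alike.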


