On Fri, Nov 25, 2022 at 12:54:27PM +0800, Herbert Xu wrote:
> On Wed, Nov 23, 2022 at 12:10:32PM +0000, Giovanni Cabiddu wrote:
> >
> > +#define MAX_NULL_DST_RETRIES 5
>
> Why do we need this limit?
I wanted a cap on the number of retries. What if the input is a zip
bomb [1]? The alternative would be to retry up to the point where the
allocation of the destination scatterlist fails.
What do you suggest?

> Doesn't that mean that anything that
> compresses by a factor of more than 2^5 or 2^6 will fail?
Anything that decompresses to more than (src size * 2, rounded up to
4K) * 2^6 will fail.

> Perhaps it would also be wise to enforce a minimum for dlen in
> case of a NULL dst, perhaps PAGE_SIZE just in case slen is very
> small and it would take many doublings to get to the right size.
This is done already. See this snippet from
qat_comp_alg_compress_decompress():

	/* Handle acomp requests that require the allocation of a destination
	 * buffer. The size of the destination buffer is double the source
	 * buffer (rounded up to the size of a page) to fit the
	 * decompressed output or an expansion on the data for compression.
	 */
	if (!areq->dst) {
		qat_req->dst.is_null = true;

		dlen = round_up(2 * slen, PAGE_SIZE);
		areq->dst = sgl_alloc(dlen, f, NULL);

[1] https://en.wikipedia.org/wiki/Zip_bomb

Regards,

--
Giovanni
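
For illustration, a standalone sketch (not the driver code) of how dlen
would grow under the retry cap; the example slen and the doubling on each
retry are assumptions based on the discussion above:

/* Standalone sketch: shows the destination length starting at
 * round_up(2 * slen, PAGE_SIZE) and doubling on each retry, capped at
 * MAX_NULL_DST_RETRIES. Not taken from the actual driver.
 */
#include <stdio.h>

#define PAGE_SIZE		4096UL
#define MAX_NULL_DST_RETRIES	5

/* round up to a power-of-two alignment, as round_up() does in the kernel */
static unsigned long round_up_pow2(unsigned long x, unsigned long a)
{
	return (x + a - 1) & ~(a - 1);
}

int main(void)
{
	unsigned long slen = 512;	/* example source length (assumption) */
	unsigned long dlen = round_up_pow2(2 * slen, PAGE_SIZE);
	int retry;

	printf("initial dlen: %lu\n", dlen);
	for (retry = 1; retry <= MAX_NULL_DST_RETRIES; retry++) {
		dlen *= 2;		/* doubled on each resubmission */
		printf("retry %d: dlen = %lu\n", retry, dlen);
	}
	/* output larger than the final dlen would still fail */
	return 0;
}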