Round up allocations with kmalloc_size_roundup() so that mempool's use
of ksize() is always accurate and no special handling of the memory is
needed by KASAN, UBSAN_BOUNDS, or FORTIFY_SOURCE.

Cc: Andrew Morton <akpm@xxxxxxxxxxxxxxxxxxxx>
Cc: linux-mm@xxxxxxxxx
Signed-off-by: Kees Cook <keescook@xxxxxxxxxxxx>
---
 mm/mempool.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/mm/mempool.c b/mm/mempool.c
index 96488b13a1ef..0f3107b28e6b 100644
--- a/mm/mempool.c
+++ b/mm/mempool.c
@@ -526,7 +526,7 @@ EXPORT_SYMBOL(mempool_free_slab);
  */
 void *mempool_kmalloc(gfp_t gfp_mask, void *pool_data)
 {
-	size_t size = (size_t)pool_data;
+	size_t size = kmalloc_size_roundup((size_t)pool_data);
 	return kmalloc(size, gfp_mask);
 }
 EXPORT_SYMBOL(mempool_kmalloc);
-- 
2.34.1
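
Not part of the patch, but for illustration: a minimal sketch of a typical
kmalloc-backed mempool user, assuming the patched mempool_kmalloc(). The pool
name, element size, and init function below are hypothetical. The requested
size is stored as pool_data by mempool_create_kmalloc_pool(), so with this
change each element is allocated at the size kmalloc() actually provides,
keeping ksize() consistent for KASAN, UBSAN_BOUNDS, and FORTIFY_SOURCE.

	#include <linux/init.h>
	#include <linux/errno.h>
	#include <linux/mempool.h>
	#include <linux/slab.h>

	static mempool_t *demo_pool;	/* hypothetical example pool */

	static int __init demo_pool_init(void)
	{
		/*
		 * Reserve 4 elements of at least 200 bytes each.  With the
		 * patched mempool_kmalloc(), the 200 passed as pool_data is
		 * rounded up to the real kmalloc bucket size (e.g. 256 on
		 * common configs), so ksize() on an element matches what the
		 * allocator and the sanitizers track.
		 */
		demo_pool = mempool_create_kmalloc_pool(4, 200);
		if (!demo_pool)
			return -ENOMEM;
		return 0;
	}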