On 03/01/2019 18:45, Andrey Konovalov wrote:
> Instead of changing cache->align to be aligned to KASAN_SHADOW_SCALE_SIZE
> in kasan_cache_create() we can reuse the ARCH_SLAB_MINALIGN macro.
>
> Suggested-by: Vincenzo Frascino <vincenzo.frascino@xxxxxxx>
> Signed-off-by: Andrey Konovalov <andreyknvl@xxxxxxxxxx>
> ---
>  arch/arm64/include/asm/cache.h | 6 ++++++
>  mm/kasan/common.c              | 2 --
>  2 files changed, 6 insertions(+), 2 deletions(-)
>
> diff --git a/arch/arm64/include/asm/cache.h b/arch/arm64/include/asm/cache.h
> index 13dd42c3ad4e..eb43e09c1980 100644
> --- a/arch/arm64/include/asm/cache.h
> +++ b/arch/arm64/include/asm/cache.h
> @@ -58,6 +58,12 @@
>   */
>  #define ARCH_DMA_MINALIGN	(128)
>
> +#ifdef CONFIG_KASAN_SW_TAGS
> +#define ARCH_SLAB_MINALIGN	(1ULL << KASAN_SHADOW_SCALE_SHIFT)
> +#else
> +#define ARCH_SLAB_MINALIGN	__alignof__(unsigned long long)
> +#endif
> +

Could you please remove the "#else" case here? It is redundant, since
linux/slab.h already provides the same definition under an #ifndef
guard [1], and keeping a second copy could be misleading in the future.

>  #ifndef __ASSEMBLY__
>
>  #include <linux/bitops.h>
> diff --git a/mm/kasan/common.c b/mm/kasan/common.c
> index 03d5d1374ca7..44390392d4c9 100644
> --- a/mm/kasan/common.c
> +++ b/mm/kasan/common.c
> @@ -298,8 +298,6 @@ void kasan_cache_create(struct kmem_cache *cache, unsigned int *size,
>  		return;
>  	}
>
> -	cache->align = round_up(cache->align, KASAN_SHADOW_SCALE_SIZE);
> -
>  	*flags |= SLAB_KASAN;
>  }
>

-- 
Regards, Vincenzo
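
[1] For reference, the generic fallback I mean is the #ifndef guard in
include/linux/slab.h; as I remember the tree at the time it looks like
this (quoting from memory, so the surrounding context may differ
slightly):

	/* Falls back to the natural alignment of unsigned long long
	 * whenever the architecture does not define its own value. */
	#ifndef ARCH_SLAB_MINALIGN
	#define ARCH_SLAB_MINALIGN __alignof__(unsigned long long)
	#endif

So with the arm64 "#else" branch dropped, a !CONFIG_KASAN_SW_TAGS build
still ends up with exactly the same alignment via this fallback.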