The patch titled
     Subject: arm64: allow kmalloc() caches aligned to the smaller cache_line_size()
has been added to the -mm mm-unstable branch.  Its filename is
     arm64-allow-kmalloc-caches-aligned-to-the-smaller-cache_line_size.patch

This patch will shortly appear at
     https://git.kernel.org/pub/scm/linux/kernel/git/akpm/25-new.git/tree/patches/arm64-allow-kmalloc-caches-aligned-to-the-smaller-cache_line_size.patch

This patch will later appear in the mm-unstable branch at
    git://git.kernel.org/pub/scm/linux/kernel/git/akpm/mm

Before you just go and hit "reply", please:
   a) Consider who else should be cc'ed
   b) Prefer to cc a suitable mailing list as well
   c) Ideally: find the original patch on the mailing list and do a
      reply-to-all to that, adding suitable additional cc's

*** Remember to use Documentation/process/submit-checklist.rst when testing your code ***

The -mm tree is included into linux-next via the mm-everything
branch at git://git.kernel.org/pub/scm/linux/kernel/git/akpm/mm
and is updated there every 2-3 working days

------------------------------------------------------
From: Catalin Marinas <catalin.marinas@xxxxxxx>
Subject: arm64: allow kmalloc() caches aligned to the smaller cache_line_size()
Date: Mon, 12 Jun 2023 16:31:55 +0100

On arm64, ARCH_DMA_MINALIGN is 128, larger than the cache line size on
most of the current platforms (typically 64).  Define
ARCH_KMALLOC_MINALIGN to 8 (the default for architectures without their
own ARCH_DMA_MINALIGN) and override dma_get_cache_alignment() to return
cache_line_size(), probed at run-time.  The kmalloc() caches will be
limited to the cache line size.  This will allow the additional
kmalloc-{64,192} caches on most arm64 platforms.

Link: https://lkml.kernel.org/r/20230612153201.554742-12-catalin.marinas@xxxxxxx
Signed-off-by: Catalin Marinas <catalin.marinas@xxxxxxx>
Tested-by: Isaac J. Manjarres <isaacmanjarres@xxxxxxxxxx>
Cc: Will Deacon <will@xxxxxxxxxx>
Cc: Alasdair Kergon <agk@xxxxxxxxxx>
Cc: Ard Biesheuvel <ardb@xxxxxxxxxx>
Cc: Arnd Bergmann <arnd@xxxxxxxx>
Cc: Christoph Hellwig <hch@xxxxxx>
Cc: Daniel Vetter <daniel@xxxxxxxx>
Cc: Greg Kroah-Hartman <gregkh@xxxxxxxxxxxxxxxxxxx>
Cc: Herbert Xu <herbert@xxxxxxxxxxxxxxxxxxx>
Cc: Jerry Snitselaar <jsnitsel@xxxxxxxxxx>
Cc: Joerg Roedel <joro@xxxxxxxxxx>
Cc: Jonathan Cameron <jic23@xxxxxxxxxx>
Cc: Jonathan Cameron <Jonathan.Cameron@xxxxxxxxxx>
Cc: Lars-Peter Clausen <lars@xxxxxxxxxx>
Cc: Logan Gunthorpe <logang@xxxxxxxxxxxx>
Cc: Marc Zyngier <maz@xxxxxxxxxx>
Cc: Mark Brown <broonie@xxxxxxxxxx>
Cc: Mike Snitzer <snitzer@xxxxxxxxxx>
Cc: "Rafael J. Wysocki" <rafael@xxxxxxxxxx>
Cc: Robin Murphy <robin.murphy@xxxxxxx>
Cc: Saravana Kannan <saravanak@xxxxxxxxxx>
Cc: Vlastimil Babka <vbabka@xxxxxxx>
Signed-off-by: Andrew Morton <akpm@xxxxxxxxxxxxxxxxxxxx>
---

 arch/arm64/include/asm/cache.h |    3 +++
 1 file changed, 3 insertions(+)

--- a/arch/arm64/include/asm/cache.h~arm64-allow-kmalloc-caches-aligned-to-the-smaller-cache_line_size
+++ a/arch/arm64/include/asm/cache.h
@@ -33,6 +33,7 @@
  * the CPU.
  */
 #define ARCH_DMA_MINALIGN	(128)
+#define ARCH_KMALLOC_MINALIGN	(8)
 
 #ifndef __ASSEMBLY__
 
@@ -90,6 +91,8 @@ static inline int cache_line_size_of_cpu
 
 int cache_line_size(void);
 
+#define dma_get_cache_alignment	cache_line_size
+
 /*
  * Read the effective value of CTR_EL0.
 *
_

Patches currently in -mm which might be from catalin.marinas@xxxxxxx are

mm-slab-decouple-arch_kmalloc_minalign-from-arch_dma_minalign.patch
dma-allow-dma_get_cache_alignment-to-be-overridden-by-the-arch-code.patch
mm-slab-simplify-create_kmalloc_cache-args-and-make-it-static.patch
mm-slab-limit-kmalloc-minimum-alignment-to-dma_get_cache_alignment.patch
drivers-base-use-arch_dma_minalign-instead-of-arch_kmalloc_minalign.patch
drivers-gpu-use-arch_dma_minalign-instead-of-arch_kmalloc_minalign.patch
drivers-usb-use-arch_dma_minalign-instead-of-arch_kmalloc_minalign.patch
drivers-spi-use-arch_dma_minalign-instead-of-arch_kmalloc_minalign.patch
dm-crypt-use-arch_dma_minalign-instead-of-arch_kmalloc_minalign.patch
iio-core-use-arch_dma_minalign-instead-of-arch_kmalloc_minalign.patch
arm64-allow-kmalloc-caches-aligned-to-the-smaller-cache_line_size.patch
dma-mapping-force-bouncing-if-the-kmalloc-size-is-not-cache-line-aligned.patch
iommu-dma-force-bouncing-if-the-size-is-not-cacheline-aligned.patch
mm-slab-reduce-the-kmalloc-minimum-alignment-if-dma-bouncing-possible.patch
arm64-enable-arch_want_kmalloc_dma_bounce-for-arm64.patch