[PATCH 10/10] arm64: Enable dynamic kmalloc() minimum alignment

Define ARCH_KMALLOC_MINALIGN as 64, the minimum alignment requirement
on most arm64 SoCs. Define arch_kmalloc_minalign() to return
cache_line_size() so that the run-time kmalloc() minimum alignment is
raised on those SoCs with larger cache lines.

Signed-off-by: Catalin Marinas <catalin.marinas@xxxxxxx>
Cc: Will Deacon <will@xxxxxxxxxx>
---
 arch/arm64/include/asm/cache.h | 1 +
 arch/arm64/kernel/cacheinfo.c  | 7 +++++++
 2 files changed, 8 insertions(+)
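
For reference, a rough sketch of the generic fallback that this arm64
override is intended to replace is shown below. The __weak default and
its placement in mm/slab_common.c are an assumption about how the
earlier slab patches in this series wire it up; the actual code may
differ:

	#include <linux/cache.h>
	#include <linux/slab.h>

	/* Assumed generic default when no architecture override exists. */
	unsigned int __weak arch_kmalloc_minalign(void)
	{
		return ARCH_KMALLOC_MINALIGN;
	}

On arm64, cache_line_size() is derived from the CTR_EL0 CWG field (or
the firmware-described coherency size, if larger), so the override
below only raises the run-time kmalloc() minimum alignment on SoCs
whose cache lines are larger than 64 bytes.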

diff --git a/arch/arm64/include/asm/cache.h b/arch/arm64/include/asm/cache.h
index a074459f8f2f..0bec986c9d51 100644
--- a/arch/arm64/include/asm/cache.h
+++ b/arch/arm64/include/asm/cache.h
@@ -48,6 +48,7 @@
  * the CPU.
  */
 #define ARCH_DMA_MINALIGN	(128)
+#define ARCH_KMALLOC_MINALIGN	(64)
 
 #ifdef CONFIG_KASAN_SW_TAGS
 #define ARCH_SLAB_MINALIGN	(1ULL << KASAN_SHADOW_SCALE_SHIFT)
diff --git a/arch/arm64/kernel/cacheinfo.c b/arch/arm64/kernel/cacheinfo.c
index 587543c6c51c..61211cd597f7 100644
--- a/arch/arm64/kernel/cacheinfo.c
+++ b/arch/arm64/kernel/cacheinfo.c
@@ -97,3 +97,10 @@ int populate_cache_leaves(unsigned int cpu)
 	}
 	return 0;
 }
+
+#ifndef CONFIG_SLOB
+unsigned int arch_kmalloc_minalign(void)
+{
+	return cache_line_size();
+}
+#endif
