Message-Id: <20220405135758.774016-11-catalin.marinas@arm.com>
Date: Tue, 5 Apr 2022 14:57:58 +0100
From: Catalin Marinas <catalin.marinas@....com>
To: Will Deacon <will@...nel.org>, Marc Zyngier <maz@...nel.org>,
Arnd Bergmann <arnd@...db.de>,
Greg Kroah-Hartman <gregkh@...uxfoundation.org>,
Andrew Morton <akpm@...ux-foundation.org>,
Linus Torvalds <torvalds@...ux-foundation.org>
Cc: linux-mm@...ck.org, linux-arm-kernel@...ts.infradead.org,
linux-kernel@...r.kernel.org
Subject: [PATCH 10/10] arm64: Enable dynamic kmalloc() minimum alignment

Define ARCH_KMALLOC_MINALIGN as 64, since this is the minimum
requirement across most arm64 SoCs. Define arch_kmalloc_minalign() to
return cache_line_size(), setting the run-time kmalloc() minimum
alignment for those SoCs with larger cache lines.

Signed-off-by: Catalin Marinas <catalin.marinas@....com>
Cc: Will Deacon <will@...nel.org>
---
 arch/arm64/include/asm/cache.h | 1 +
 arch/arm64/kernel/cacheinfo.c  | 7 +++++++
 2 files changed, 8 insertions(+)
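
Note (illustrative only, not part of this patch): the generic slab-side
consumer of arch_kmalloc_minalign() is introduced earlier in the series.
Assuming it provides a __weak default, a minimal sketch of such a default
could look like the following; everything here beyond the names
arch_kmalloc_minalign() and ARCH_KMALLOC_MINALIGN is an assumption:

#include <linux/compiler.h>	/* __weak */
#include <linux/slab.h>		/* ARCH_KMALLOC_MINALIGN */

/* Generic fallback: no run-time cache information, use the build-time value. */
unsigned int __weak arch_kmalloc_minalign(void)
{
	return ARCH_KMALLOC_MINALIGN;
}

The arm64 definition added below would override such a default and return
cache_line_size(), so the kmalloc() minimum alignment can track the largest
cache line reported by the hardware at boot.
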
diff --git a/arch/arm64/include/asm/cache.h b/arch/arm64/include/asm/cache.h
index a074459f8f2f..0bec986c9d51 100644
--- a/arch/arm64/include/asm/cache.h
+++ b/arch/arm64/include/asm/cache.h
@@ -48,6 +48,7 @@
  * the CPU.
  */
 #define ARCH_DMA_MINALIGN	(128)
+#define ARCH_KMALLOC_MINALIGN	(64)
 
 #ifdef CONFIG_KASAN_SW_TAGS
 #define ARCH_SLAB_MINALIGN	(1ULL << KASAN_SHADOW_SCALE_SHIFT)
diff --git a/arch/arm64/kernel/cacheinfo.c b/arch/arm64/kernel/cacheinfo.c
index 587543c6c51c..61211cd597f7 100644
--- a/arch/arm64/kernel/cacheinfo.c
+++ b/arch/arm64/kernel/cacheinfo.c
@@ -97,3 +97,10 @@ int populate_cache_leaves(unsigned int cpu)
 	}
 	return 0;
 }
+
+#ifndef CONFIG_SLOB
+unsigned int arch_kmalloc_minalign(void)
+{
+	return cache_line_size();
+}
+#endif
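
A quick sanity check (illustrative only; it assumes the earlier patches in
the series make the slab allocator honour arch_kmalloc_minalign()) could be
dropped into a test module along these lines:

#include <linux/align.h>
#include <linux/bug.h>
#include <linux/cache.h>
#include <linux/slab.h>

static void kmalloc_minalign_check(void)
{
	void *p = kmalloc(8, GFP_KERNEL);

	/* Small allocations should be aligned to the run-time minimum. */
	WARN_ON(p && !IS_ALIGNED((unsigned long)p, cache_line_size()));
	kfree(p);
}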