Message-ID: <20240402014706.3969151-2-harisokn@amazon.com>
Date: Mon, 1 Apr 2024 20:47:05 -0500
From: Haris Okanovic <harisokn@...zon.com>
To: <linux-kernel@...r.kernel.org>, <linux-pm@...r.kernel.org>,
<linux-assembly@...r.kernel.org>
CC: <peterz@...radead.org>, Haris Okanovic <harisokn@...zon.com>
Subject: [PATCH 2/3] arm64: add __READ_ONCE_EX()

Perform an exclusive load, which atomically loads a word and arms the
exclusive monitor to enable wfe() polling of an address.

Add this macro in preparation for an arm64 cpuidle driver that
supports a wfe() based polling state.
https://developer.arm.com/documentation/dht0008/a/arm-synchronization-primitives/exclusive-accesses/exclusive-monitors
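
As a rough illustration only (not part of this series; the helper name
and loop shape below are hypothetical), a caller could pair the
exclusive load with wfe() to wait for a store to an address:

	/*
	 * Sketch of a wfe()-based poll: the exclusive load arms the
	 * monitor, so a store to *addr by another CPU (or a spurious
	 * wakeup event) ends the wfe() and the value is re-checked.
	 */
	static inline u64 poll_until_changed(u64 *addr, u64 old)
	{
		u64 val;

		while ((val = __READ_ONCE_EX(*addr)) == old)
			wfe();

		return val;
	}
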
Signed-off-by: Haris Okanovic <harisokn@...zon.com>
---
arch/arm64/include/asm/readex.h | 46 +++++++++++++++++++++++++++++++++
1 file changed, 46 insertions(+)
create mode 100644 arch/arm64/include/asm/readex.h

diff --git a/arch/arm64/include/asm/readex.h b/arch/arm64/include/asm/readex.h
new file mode 100644
index 000000000000..51963c3107e1
--- /dev/null
+++ b/arch/arm64/include/asm/readex.h
@@ -0,0 +1,46 @@
+/* SPDX-License-Identifier: GPL-2.0 */
+/*
+ * Based on arch/arm64/include/asm/rwonce.h
+ *
+ * Copyright (C) 2020 Google LLC.
+ * Copyright (C) 2024 Amazon.com, Inc. or its affiliates.
+ */
+
+#ifndef __ASM_READEX_H
+#define __ASM_READEX_H
+
+#define __LOAD_EX(sfx, regs...) "ldaxr" #sfx "\t" #regs
+
+#define __READ_ONCE_EX(x) \
+({ \
+ typeof(&(x)) __x = &(x); \
+ int atomic = 1; \
+ union { __unqual_scalar_typeof(*__x) __val; char __c[1]; } __u; \
+ switch (sizeof(x)) { \
+ case 1: \
+ asm volatile(__LOAD_EX(b, %w0, %1) \
+ : "=r" (*(__u8 *)__u.__c) \
+ : "Q" (*__x) : "memory"); \
+ break; \
+ case 2: \
+ asm volatile(__LOAD_EX(h, %w0, %1) \
+ : "=r" (*(__u16 *)__u.__c) \
+ : "Q" (*__x) : "memory"); \
+ break; \
+ case 4: \
+ asm volatile(__LOAD_EX(, %w0, %1) \
+ : "=r" (*(__u32 *)__u.__c) \
+ : "Q" (*__x) : "memory"); \
+ break; \
+ case 8: \
+ asm volatile(__LOAD_EX(, %0, %1) \
+ : "=r" (*(__u64 *)__u.__c) \
+ : "Q" (*__x) : "memory"); \
+ break; \
+ default: \
+ atomic = 0; \
+ } \
+ atomic ? (typeof(*__x))__u.__val : (*(volatile typeof(__x))__x);\
+})
+
+#endif
--
2.34.1