Message-Id: <20200527141553.1768675-1-arnd@arndb.de>
Date: Wed, 27 May 2020 16:15:39 +0200
From: Arnd Bergmann <arnd@...db.de>
To: Thomas Gleixner <tglx@...utronix.de>,
Ingo Molnar <mingo@...hat.com>, Borislav Petkov <bp@...en8.de>,
x86@...nel.org
Cc: Arnd Bergmann <arnd@...db.de>, stable@...r.kernel.org,
"H. Peter Anvin" <hpa@...or.com>, Jiri Slaby <jslaby@...e.cz>,
Juergen Gross <jgross@...e.com>,
Herbert Xu <herbert@...dor.apana.org.au>,
Tony Luck <tony.luck@...el.com>, linux-kernel@...r.kernel.org,
clang-built-linux@...glegroups.com
Subject: [PATCH] x86: fix clang integrated assembler build

clang's integrated assembler and gas seem to interpret the symbols in
memmove_64.S and memset_64.S differently, such that clang does not make
them 'weak' as expected, which leads to a linker error with both ld.bfd
and ld.lld:

ld.lld: error: duplicate symbol: memmove
>>> defined at common.c
>>>            kasan/common.o:(memmove) in archive mm/built-in.a
>>> defined at memmove.o:(__memmove) in archive arch/x86/lib/lib.a

ld.lld: error: duplicate symbol: memset
>>> defined at common.c
>>>            kasan/common.o:(memset) in archive mm/built-in.a
>>> defined at memset.o:(__memset) in archive arch/x86/lib/lib.a

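(For illustration only, not part of the patch: a minimal stand-alone
pair of assembly files showing why the weak binding matters here; the
memmove_demo name is made up.)

# strong.s: stands in for the C definition in mm/kasan/common.c
        .text
        .globl  memmove_demo
memmove_demo:
        ret

# weak.s: stands in for the assembly version in arch/x86/lib
        .text
        .weak   memmove_demo
memmove_demo:
        ret

As long as the second definition really ends up weak in the object
file, linking the two objects together works and the strong definition
wins; if the assembler leaves it as a regular global, the link fails
with a duplicate symbol error like the ones above.
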
Copy the exact way these are written in memcpy_64.S, which does
not have the same problem.
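
For reference, the corresponding lines in memcpy_64.S look roughly like
this:

        .weak memcpy

        SYM_FUNC_START_ALIAS(__memcpy)
        SYM_FUNC_START_LOCAL(memcpy)

i.e. __memcpy carries the regular global binding, and the exported
memcpy name only gets its weak binding from the .weak directive above
it.
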
I don't know why this makes a difference, and it would be good to have
someone with a better understanding of assembler internals review it.

It might be a bug in the kernel or a bug in the assembler; I have no
idea which. Either way, my patch makes it work with all versions of
clang and gcc, which is probably helpful even if it is only a
workaround for a clang bug.
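
One way to narrow that down (a stand-alone reduction sketch, not part
of this patch; file and symbol names are made up) would be to give both
assemblers the directive sequence the macros roughly expand to and
compare the resulting symbol binding:

# weak-order.s: hypothetical reduction, not from the kernel tree
        .weak   memmove_demo
        .globl  memmove_demo    # roughly what SYM_FUNC_START_ALIAS() adds
memmove_demo:
        ret

Assembling this once with gas and once with clang (which uses its
integrated assembler for .s files by default) and comparing the binding
of memmove_demo in the 'readelf -s' output should show whether the two
disagree about a .globl that follows a .weak for the same symbol.
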
Cc: stable@...r.kernel.org
Signed-off-by: Arnd Bergmann <arnd@...db.de>
---
arch/x86/lib/memmove_64.S | 4 ++--
arch/x86/lib/memset_64.S | 4 ++--
2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/arch/x86/lib/memmove_64.S b/arch/x86/lib/memmove_64.S
index 7ff00ea64e4f..dcca01434be8 100644
--- a/arch/x86/lib/memmove_64.S
+++ b/arch/x86/lib/memmove_64.S
@@ -26,8 +26,8 @@
*/
.weak memmove
-SYM_FUNC_START_ALIAS(memmove)
-SYM_FUNC_START(__memmove)
+SYM_FUNC_START_ALIAS(__memmove)
+SYM_FUNC_START_LOCAL(memmove)
mov %rdi, %rax
diff --git a/arch/x86/lib/memset_64.S b/arch/x86/lib/memset_64.S
index 9ff15ee404a4..a97f2ea4e0b2 100644
--- a/arch/x86/lib/memset_64.S
+++ b/arch/x86/lib/memset_64.S
@@ -19,8 +19,8 @@
*
* rax original destination
*/
-SYM_FUNC_START_ALIAS(memset)
-SYM_FUNC_START(__memset)
+SYM_FUNC_START_ALIAS(__memset)
+SYM_FUNC_START_LOCAL(memset)
/*
* Some CPUs support enhanced REP MOVSB/STOSB feature. It is recommended
* to use it when possible. If not available, use fast string instructions.
--
2.26.2