Message ID | 20201029181951.1866093-1-maskray@google.com (mailing list archive)
---|---
State | New, archived
Series | arm64: Change .weak to SYM_FUNC_START_WEAK_PI for arch/arm64/lib/mem*.S
On Thu, Oct 29, 2020 at 11:20 AM Fangrui Song <maskray@google.com> wrote:
>
> Commit 39d114ddc682 ("arm64: add KASAN support") added .weak directives to
> arch/arm64/lib/mem*.S instead of changing the existing SYM_FUNC_START_PI
> macros. This can lead to the assembly snippet `.weak memcpy ... .globl
> memcpy` which will produce a STB_WEAK memcpy with GNU as but STB_GLOBAL
> memcpy with LLVM's integrated assembler before LLVM 12. LLVM 12 (since
> https://reviews.llvm.org/D90108) will error on such an overridden symbol
> binding.
>
> Use the appropriate SYM_FUNC_START_WEAK_PI instead.
>
> Fixes: 39d114ddc682 ("arm64: add KASAN support")
> Reported-by: Sami Tolvanen <samitolvanen@google.com>
> Signed-off-by: Fangrui Song <maskray@google.com>
> Cc: <stable@vger.kernel.org>
> ---
>  arch/arm64/lib/memcpy.S  | 3 +--
>  arch/arm64/lib/memmove.S | 3 +--
>  arch/arm64/lib/memset.S  | 3 +--
>  3 files changed, 3 insertions(+), 6 deletions(-)

Thanks for the patch! This fixes the build with Clang 12 + LLVM_IAS=1, and
also builds and boots with gcc 9.3 and Clang 12 without IAS.

Tested-by: Sami Tolvanen <samitolvanen@google.com>

Sami
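The divergence the commit message describes can be sketched as a rough Python model (not real assembler code; the dispatch on assembler name is an illustrative simplification built only from the behaviors stated in the commit message): GNU as keeps a symbol weak once `.weak` has been seen, LLVM's integrated assembler before LLVM 12 let the last binding directive win, and LLVM 12 (after D90108) rejects the changed binding outright.

```python
# Simplified model of how three assemblers resolve the ELF binding of a
# symbol that is subject to both `.weak` and `.globl` directives, as in the
# `.weak memcpy ... .globl memcpy` sequence from the commit message.

def resolve_binding(directives, assembler):
    """Return the final binding ('STB_WEAK'/'STB_GLOBAL') or raise on error."""
    binding = None
    for d in directives:
        new = 'STB_WEAK' if d == '.weak' else 'STB_GLOBAL'
        if binding is None or binding == new:
            binding = new
        elif assembler == 'gnu-as':
            binding = 'STB_WEAK'          # GNU as: weakness is sticky
        elif assembler == 'llvm-ias-11':
            binding = new                  # pre-12 IAS: last directive wins
        elif assembler == 'llvm-ias-12':
            raise ValueError('error: symbol binding changed')  # D90108 behavior
    return binding

seq = ['.weak', '.globl']  # what `.weak memcpy` + SYM_FUNC_START_PI expanded to
print(resolve_binding(seq, 'gnu-as'))       # STB_WEAK
print(resolve_binding(seq, 'llvm-ias-11'))  # STB_GLOBAL
```

With SYM_FUNC_START_WEAK_PI, only one binding directive is emitted per symbol, so all three assemblers agree and the LLVM 12 error disappears.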
On Thu, Oct 29, 2020 at 11:20 AM 'Fangrui Song' via Clang Built Linux
<clang-built-linux@googlegroups.com> wrote:
>
> Commit 39d114ddc682 ("arm64: add KASAN support") added .weak directives to
> arch/arm64/lib/mem*.S instead of changing the existing SYM_FUNC_START_PI
> macros. This can lead to the assembly snippet `.weak memcpy ... .globl
> memcpy` which will produce a STB_WEAK memcpy with GNU as but STB_GLOBAL
> memcpy with LLVM's integrated assembler before LLVM 12. LLVM 12 (since
> https://reviews.llvm.org/D90108) will error on such an overridden symbol
> binding.
>
> Use the appropriate SYM_FUNC_START_WEAK_PI instead.
>
> Fixes: 39d114ddc682 ("arm64: add KASAN support")
> Reported-by: Sami Tolvanen <samitolvanen@google.com>
> Signed-off-by: Fangrui Song <maskray@google.com>
> Cc: <stable@vger.kernel.org>

Before your patch, with GNU `as`:

$ llvm-readelf -s arch/arm64/lib/memcpy.o | grep memcpy
    19: 0000000000000000   340 FUNC    WEAK   DEFAULT     1 memcpy
    20: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __memcpy
    21: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __pi_memcpy

$ llvm-readelf -s arch/arm64/lib/memmove.o | grep memmove
    19: 0000000000000000   340 FUNC    WEAK   DEFAULT     1 memmove
    20: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __memmove
    21: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __pi_memmove

$ llvm-readelf -s arch/arm64/lib/memset.o | grep memset
    19: 0000000000000000   392 FUNC    WEAK   DEFAULT     1 memset
    20: 0000000000000000   392 FUNC    GLOBAL DEFAULT     1 __memset
    21: 0000000000000000   392 FUNC    GLOBAL DEFAULT     1 __pi_memset

After your patch, with GNU `as`:

$ llvm-readelf -s arch/arm64/lib/memcpy.o | grep memcpy
    19: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __memcpy
    20: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __pi_memcpy
    21: 0000000000000000   340 FUNC    WEAK   DEFAULT     1 memcpy

$ llvm-readelf -s arch/arm64/lib/memmove.o | grep memmove
    19: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __memmove
    20: 0000000000000000   340 FUNC    GLOBAL DEFAULT     1 __pi_memmove
    21: 0000000000000000   340 FUNC    WEAK   DEFAULT     1 memmove

$ llvm-readelf -s arch/arm64/lib/memset.o | grep memset
    19: 0000000000000000   392 FUNC    GLOBAL DEFAULT     1 __memset
    20: 0000000000000000   392 FUNC    GLOBAL DEFAULT     1 __pi_memset
    21: 0000000000000000   392 FUNC    WEAK   DEFAULT     1 memset

So it didn't change any bindings (good), but it did reorder symbols. I
suspect if you flip the ordering of SYM_FUNC_START_WEAK_PI relative to
SYM_FUNC_START_ALIAS you can get the previous ordering back, but I also
suspect that the ordering of these symbols is not important.

Reviewed-by: Nick Desaulniers <ndesaulniers@google.com>
Tested-by: Nick Desaulniers <ndesaulniers@google.com>

> ---
>  arch/arm64/lib/memcpy.S  | 3 +--
>  arch/arm64/lib/memmove.S | 3 +--
>  arch/arm64/lib/memset.S  | 3 +--
>  3 files changed, 3 insertions(+), 6 deletions(-)
>
> diff --git a/arch/arm64/lib/memcpy.S b/arch/arm64/lib/memcpy.S
> index e0bf83d556f2..dc8d2a216a6e 100644
> --- a/arch/arm64/lib/memcpy.S
> +++ b/arch/arm64/lib/memcpy.S
> @@ -56,9 +56,8 @@
>  	stp \reg1, \reg2, [\ptr], \val
>  	.endm
>
> -	.weak memcpy
>  SYM_FUNC_START_ALIAS(__memcpy)
> -SYM_FUNC_START_PI(memcpy)
> +SYM_FUNC_START_WEAK_PI(memcpy)
>  #include "copy_template.S"
>  	ret
>  SYM_FUNC_END_PI(memcpy)
> diff --git a/arch/arm64/lib/memmove.S b/arch/arm64/lib/memmove.S
> index 02cda2e33bde..1035dce4bdaf 100644
> --- a/arch/arm64/lib/memmove.S
> +++ b/arch/arm64/lib/memmove.S
> @@ -45,9 +45,8 @@ C_h .req x12
>  D_l	.req	x13
>  D_h	.req	x14
>
> -	.weak memmove
>  SYM_FUNC_START_ALIAS(__memmove)
> -SYM_FUNC_START_PI(memmove)
> +SYM_FUNC_START_WEAK_PI(memmove)
>  	cmp	dstin, src
>  	b.lo	__memcpy
>  	add	tmp1, src, count
> diff --git a/arch/arm64/lib/memset.S b/arch/arm64/lib/memset.S
> index 77c3c7ba0084..a9c1c9a01ea9 100644
> --- a/arch/arm64/lib/memset.S
> +++ b/arch/arm64/lib/memset.S
> @@ -42,9 +42,8 @@ dst .req x8
>  tmp3w	.req	w9
>  tmp3	.req	x9
>
> -	.weak memset
>  SYM_FUNC_START_ALIAS(__memset)
> -SYM_FUNC_START_PI(memset)
> +SYM_FUNC_START_WEAK_PI(memset)
>  	mov	dst, dstin	/* Preserve return value. */
>  	and	A_lw, val, #255
>  	orr	A_lw, A_lw, A_lw, lsl #8
> --
> 2.29.1.341.ge80a0c044ae-goog
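Nick's observation — same bindings, different symbol-table order — can be checked programmatically. The sketch below (a hypothetical helper, assuming the whitespace-separated `llvm-readelf -s` column layout shown in his reply) parses the before/after listings into name-to-binding maps and compares them, ignoring order:

```python
# Sketch: verify that the patch reorders symbol table entries but leaves
# every (name, binding) pair unchanged, using the llvm-readelf -s output
# format quoted in the thread.

def bindings(readelf_lines):
    """Map symbol name -> binding from `llvm-readelf -s`-style lines."""
    result = {}
    for line in readelf_lines:
        fields = line.split()
        # e.g. ['19:', '0000000000000000', '340', 'FUNC', 'WEAK', 'DEFAULT', '1', 'memcpy']
        if len(fields) == 8 and fields[3] == 'FUNC':
            result[fields[7]] = fields[4]
    return result

before = [
    "19: 0000000000000000 340 FUNC WEAK   DEFAULT 1 memcpy",
    "20: 0000000000000000 340 FUNC GLOBAL DEFAULT 1 __memcpy",
    "21: 0000000000000000 340 FUNC GLOBAL DEFAULT 1 __pi_memcpy",
]
after = [
    "19: 0000000000000000 340 FUNC GLOBAL DEFAULT 1 __memcpy",
    "20: 0000000000000000 340 FUNC GLOBAL DEFAULT 1 __pi_memcpy",
    "21: 0000000000000000 340 FUNC WEAK   DEFAULT 1 memcpy",
]
print(bindings(before) == bindings(after))  # True: only the order changed
```

The dict comparison is order-insensitive by construction, which is exactly the property being claimed: linkers resolve symbols by name and binding, not by symbol-table position.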
On Thu, 29 Oct 2020 11:19:51 -0700, Fangrui Song wrote:
> Commit 39d114ddc682 ("arm64: add KASAN support") added .weak directives to
> arch/arm64/lib/mem*.S instead of changing the existing SYM_FUNC_START_PI
> macros. This can lead to the assembly snippet `.weak memcpy ... .globl
> memcpy` which will produce a STB_WEAK memcpy with GNU as but STB_GLOBAL
> memcpy with LLVM's integrated assembler before LLVM 12. LLVM 12 (since
> https://reviews.llvm.org/D90108) will error on such an overridden symbol
> binding.
>
> [...]

Applied to arm64 (for-next/fixes), thanks!

[1/1] arm64: Change .weak to SYM_FUNC_START_WEAK_PI for arch/arm64/lib/mem*.S
      https://git.kernel.org/arm64/c/ec9d78070de9

Cheers,
diff --git a/arch/arm64/lib/memcpy.S b/arch/arm64/lib/memcpy.S
index e0bf83d556f2..dc8d2a216a6e 100644
--- a/arch/arm64/lib/memcpy.S
+++ b/arch/arm64/lib/memcpy.S
@@ -56,9 +56,8 @@
 	stp \reg1, \reg2, [\ptr], \val
 	.endm

-	.weak memcpy
 SYM_FUNC_START_ALIAS(__memcpy)
-SYM_FUNC_START_PI(memcpy)
+SYM_FUNC_START_WEAK_PI(memcpy)
 #include "copy_template.S"
 	ret
 SYM_FUNC_END_PI(memcpy)
diff --git a/arch/arm64/lib/memmove.S b/arch/arm64/lib/memmove.S
index 02cda2e33bde..1035dce4bdaf 100644
--- a/arch/arm64/lib/memmove.S
+++ b/arch/arm64/lib/memmove.S
@@ -45,9 +45,8 @@ C_h .req x12
 D_l	.req	x13
 D_h	.req	x14

-	.weak memmove
 SYM_FUNC_START_ALIAS(__memmove)
-SYM_FUNC_START_PI(memmove)
+SYM_FUNC_START_WEAK_PI(memmove)
 	cmp	dstin, src
 	b.lo	__memcpy
 	add	tmp1, src, count
diff --git a/arch/arm64/lib/memset.S b/arch/arm64/lib/memset.S
index 77c3c7ba0084..a9c1c9a01ea9 100644
--- a/arch/arm64/lib/memset.S
+++ b/arch/arm64/lib/memset.S
@@ -42,9 +42,8 @@ dst .req x8
 tmp3w	.req	w9
 tmp3	.req	x9

-	.weak memset
 SYM_FUNC_START_ALIAS(__memset)
-SYM_FUNC_START_PI(memset)
+SYM_FUNC_START_WEAK_PI(memset)
 	mov	dst, dstin	/* Preserve return value. */
 	and	A_lw, val, #255
 	orr	A_lw, A_lw, A_lw, lsl #8
Commit 39d114ddc682 ("arm64: add KASAN support") added .weak directives to
arch/arm64/lib/mem*.S instead of changing the existing SYM_FUNC_START_PI
macros. This can lead to the assembly snippet `.weak memcpy ... .globl
memcpy` which will produce a STB_WEAK memcpy with GNU as but STB_GLOBAL
memcpy with LLVM's integrated assembler before LLVM 12. LLVM 12 (since
https://reviews.llvm.org/D90108) will error on such an overridden symbol
binding.

Use the appropriate SYM_FUNC_START_WEAK_PI instead.

Fixes: 39d114ddc682 ("arm64: add KASAN support")
Reported-by: Sami Tolvanen <samitolvanen@google.com>
Signed-off-by: Fangrui Song <maskray@google.com>
Cc: <stable@vger.kernel.org>
---
 arch/arm64/lib/memcpy.S  | 3 +--
 arch/arm64/lib/memmove.S | 3 +--
 arch/arm64/lib/memset.S  | 3 +--
 3 files changed, 3 insertions(+), 6 deletions(-)