
[2/5] arm64: add __READ_ONCE_EX()

Message ID 20241105183041.1531976-3-harisokn@amazon.com (mailing list archive)
State New
Series [1/5] asm-generic: add smp_vcond_load_relaxed()

Commit Message

Okanovic, Haris Nov. 5, 2024, 6:30 p.m. UTC
Perform an exclusive load, which atomically loads a word and arms the
exclusive monitor to enable wfet()/wfe()-accelerated polling.

https://developer.arm.com/documentation/dht0008/a/arm-synchronization-primitives/exclusive-accesses/exclusive-monitors

Signed-off-by: Haris Okanovic <harisokn@amazon.com>
---
 arch/arm64/include/asm/readex.h | 46 +++++++++++++++++++++++++++++++++
 1 file changed, 46 insertions(+)
 create mode 100644 arch/arm64/include/asm/readex.h
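
For context, the point of arming the exclusive monitor is that wfe() can
then sleep until another CPU stores to the monitored cacheline (or the
event stream fires). A minimal sketch of that polling pattern, assuming
the macro from this patch; the wait_for_flag() wrapper and the loop shape
are illustrative, not from the series:

/*
 * Illustrative sketch only: LDAXR loads the value and arms the
 * exclusive monitor; a remote store to the cacheline clears the
 * monitor, generating the event that wakes the wfe().
 */
#include <asm/barrier.h>	/* wfe() */
#include <asm/readex.h>		/* __READ_ONCE_EX(), added by this patch */

static void wait_for_flag(unsigned long *flag)
{
	while (!__READ_ONCE_EX(*flag))
		wfe();
}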

Comments

Christoph Lameter (Ampere) Nov. 5, 2024, 7:39 p.m. UTC | #1
On Tue, 5 Nov 2024, Haris Okanovic wrote:

> +#define __READ_ONCE_EX(x)						\

This is derived from READ_ONCE() and named similarly, so maybe it would
be better placed in rwonce.h?
Will Deacon Nov. 6, 2024, 11:43 a.m. UTC | #2
On Tue, Nov 05, 2024 at 12:30:38PM -0600, Haris Okanovic wrote:
> Perform an exclusive load, which atomically loads a word and arms the
> exclusive monitor to enable wfet()/wfe()-accelerated polling.
> 
> https://developer.arm.com/documentation/dht0008/a/arm-synchronization-primitives/exclusive-accesses/exclusive-monitors
> 
> Signed-off-by: Haris Okanovic <harisokn@amazon.com>
> ---
>  arch/arm64/include/asm/readex.h | 46 +++++++++++++++++++++++++++++++++
>  1 file changed, 46 insertions(+)
>  create mode 100644 arch/arm64/include/asm/readex.h
> 
> diff --git a/arch/arm64/include/asm/readex.h b/arch/arm64/include/asm/readex.h
> new file mode 100644
> index 000000000000..51963c3107e1
> --- /dev/null
> +++ b/arch/arm64/include/asm/readex.h
> @@ -0,0 +1,46 @@
> +/* SPDX-License-Identifier: GPL-2.0 */
> +/*
> + * Based on arch/arm64/include/asm/rwonce.h
> + *
> + * Copyright (C) 2020 Google LLC.
> + * Copyright (C) 2024 Amazon.com, Inc. or its affiliates.
> + */
> +
> +#ifndef __ASM_READEX_H
> +#define __ASM_READEX_H
> +
> +#define __LOAD_EX(sfx, regs...) "ldaxr" #sfx "\t" #regs
> +
> +#define __READ_ONCE_EX(x)						\
> +({									\
> +	typeof(&(x)) __x = &(x);					\
> +	int atomic = 1;							\
> +	union { __unqual_scalar_typeof(*__x) __val; char __c[1]; } __u;	\
> +	switch (sizeof(x)) {						\
> +	case 1:								\
> +		asm volatile(__LOAD_EX(b, %w0, %1)			\
> +			: "=r" (*(__u8 *)__u.__c)			\
> +			: "Q" (*__x) : "memory");			\
> +		break;							\
> +	case 2:								\
> +		asm volatile(__LOAD_EX(h, %w0, %1)			\
> +			: "=r" (*(__u16 *)__u.__c)			\
> +			: "Q" (*__x) : "memory");			\
> +		break;							\
> +	case 4:								\
> +		asm volatile(__LOAD_EX(, %w0, %1)			\
> +			: "=r" (*(__u32 *)__u.__c)			\
> +			: "Q" (*__x) : "memory");			\
> +		break;							\
> +	case 8:								\
> +		asm volatile(__LOAD_EX(, %0, %1)			\
> +			: "=r" (*(__u64 *)__u.__c)			\
> +			: "Q" (*__x) : "memory");			\
> +		break;							\
> +	default:							\
> +		atomic = 0;						\
> +	}								\
> +	atomic ? (typeof(*__x))__u.__val : (*(volatile typeof(__x))__x);\
> +})

I think this is a bad idea. Load-exclusive needs to be used very carefully,
preferably when you're able to see exactly what instructions it's
interacting with. By making this into a macro, we're at the mercy of the
compiler and we give the wrong impression that you could e.g. build atomic
critical sections out of this macro.

So I'm fairly strongly against this interface.

Will
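
To make the concern concrete, a hypothetical misuse (not from the series)
could look like the following. It reads as if it were atomic, but there
is no matching store-exclusive (stxr) and no retry loop, and the compiler
is free to emit spills or other memory accesses that clear the monitor
between the load and the store:

/* Hypothetical misuse, for illustration only: this is NOT atomic. */
static void broken_lock(unsigned long *lock)
{
	unsigned long val = __READ_ONCE_EX(*lock);	/* arms the monitor */

	*lock = val | 1UL;	/* plain store, not stxr: no atomicity */
}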
Okanovic, Haris Nov. 6, 2024, 5:09 p.m. UTC | #3
On Wed, 2024-11-06 at 11:43 +0000, Will Deacon wrote:
> On Tue, Nov 05, 2024 at 12:30:38PM -0600, Haris Okanovic wrote:
> > [ commit message and patch trimmed; quoted in full above ]
> 
> I think this is a bad idea. Load-exclusive needs to be used very carefully,
> preferably when you're able to see exactly what instructions it's
> interacting with. By making this into a macro, we're at the mercy of the
> compiler and we give the wrong impression that you could e.g. build atomic
> critical sections out of this macro.
> 
> So I'm fairly strongly against this interface.

Fair point. I'll post an alternate delay() implementation in asm. It's
a simple routine.

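A monitor-armed wait loop written directly in assembly might look roughly
like the sketch below; the name, arguments, and structure are assumptions
for illustration, not the routine eventually posted:

/*
 * Sketch only (assumed shape): wait until *addr differs from @val or
 * CNTVCT_EL0 reaches @deadline.  LDXR re-arms the exclusive monitor
 * on each iteration, so a store to the word (or the event stream)
 * wakes the WFE.
 */
static inline unsigned long wait_until_change(unsigned long *addr,
					      unsigned long val,
					      unsigned long deadline)
{
	unsigned long cur, now;

	asm volatile(
	"1:	ldxr	%0, %2\n"	/* load and arm the monitor */
	"	cmp	%0, %3\n"
	"	b.ne	2f\n"		/* value changed: done */
	"	wfe\n"			/* sleep until store or event */
	"	mrs	%1, cntvct_el0\n"
	"	cmp	%1, %4\n"
	"	b.lo	1b\n"		/* deadline not reached: retry */
	"2:\n"
	: "=&r" (cur), "=&r" (now)
	: "Q" (*addr), "r" (val), "r" (deadline)
	: "memory", "cc");

	return cur;
}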
Okanovic, Haris Nov. 6, 2024, 5:37 p.m. UTC | #4
On Tue, 2024-11-05 at 11:39 -0800, Christoph Lameter (Ampere) wrote:
> On Tue, 5 Nov 2024, Haris Okanovic wrote:
> 
> > +#define __READ_ONCE_EX(x)                                            \
> 
> This is derived from READ_ONCE() and named similarly, so maybe it would
> be better placed in rwonce.h?

I plan to remove this macro per other feedback in this thread.

Patch

diff --git a/arch/arm64/include/asm/readex.h b/arch/arm64/include/asm/readex.h
new file mode 100644
index 000000000000..51963c3107e1
--- /dev/null
+++ b/arch/arm64/include/asm/readex.h
@@ -0,0 +1,46 @@ 
+/* SPDX-License-Identifier: GPL-2.0 */
+/*
+ * Based on arch/arm64/include/asm/rwonce.h
+ *
+ * Copyright (C) 2020 Google LLC.
+ * Copyright (C) 2024 Amazon.com, Inc. or its affiliates.
+ */
+
+#ifndef __ASM_READEX_H
+#define __ASM_READEX_H
+
+#define __LOAD_EX(sfx, regs...) "ldaxr" #sfx "\t" #regs
+
+#define __READ_ONCE_EX(x)						\
+({									\
+	typeof(&(x)) __x = &(x);					\
+	int atomic = 1;							\
+	union { __unqual_scalar_typeof(*__x) __val; char __c[1]; } __u;	\
+	switch (sizeof(x)) {						\
+	case 1:								\
+		asm volatile(__LOAD_EX(b, %w0, %1)			\
+			: "=r" (*(__u8 *)__u.__c)			\
+			: "Q" (*__x) : "memory");			\
+		break;							\
+	case 2:								\
+		asm volatile(__LOAD_EX(h, %w0, %1)			\
+			: "=r" (*(__u16 *)__u.__c)			\
+			: "Q" (*__x) : "memory");			\
+		break;							\
+	case 4:								\
+		asm volatile(__LOAD_EX(, %w0, %1)			\
+			: "=r" (*(__u32 *)__u.__c)			\
+			: "Q" (*__x) : "memory");			\
+		break;							\
+	case 8:								\
+		asm volatile(__LOAD_EX(, %0, %1)			\
+			: "=r" (*(__u64 *)__u.__c)			\
+			: "Q" (*__x) : "memory");			\
+		break;							\
+	default:							\
+		atomic = 0;						\
+	}								\
+	atomic ? (typeof(*__x))__u.__val : (*(volatile typeof(__x))__x);\
+})
+
+#endif