Message-ID: <176830138984.510.12031634682208427016.tip-bot2@tip-bot2>
Date: Tue, 13 Jan 2026 10:49:49 -0000
From: "tip-bot2 for FUJITA Tomonori" <tip-bot2@...utronix.de>
To: linux-tip-commits@...r.kernel.org
Cc: FUJITA Tomonori <fujita.tomonori@...il.com>,
Alice Ryhl <aliceryhl@...gle.com>, Boqun Feng <boqun.feng@...il.com>,
x86@...nel.org, linux-kernel@...r.kernel.org
Subject: [tip: locking/core] rust: list: Switch to kernel::sync atomic primitives

The following commit has been merged into the locking/core branch of tip:
Commit-ID: 323e4bfcbe2dc6c6cac6e007dded0ba4f89a6458
Gitweb: https://git.kernel.org/tip/323e4bfcbe2dc6c6cac6e007dded0ba4f89a6458
Author: FUJITA Tomonori <fujita.tomonori@...il.com>
AuthorDate: Tue, 30 Dec 2025 18:37:17 +09:00
Committer: Boqun Feng <boqun.feng@...il.com>
CommitterDate: Fri, 09 Jan 2026 19:01:41 +08:00

rust: list: Switch to kernel::sync atomic primitives

Convert uses of `AtomicBool` to `Atomic<bool>`.

Note that the compare_exchange() call simplifies to cmpxchg(): the
kernel's cmpxchg() (a wrapper around try_cmpxchg()) treats a failed
exchange as relaxed, so the explicit failure ordering argument is no
longer needed. A short illustrative sketch of the resulting pattern
follows the diff below.

Signed-off-by: FUJITA Tomonori <fujita.tomonori@...il.com>
Reviewed-by: Alice Ryhl <aliceryhl@...gle.com>
Signed-off-by: Boqun Feng <boqun.feng@...il.com>
Link: https://patch.msgid.link/20251230093718.1852322-3-fujita.tomonori@gmail.com
---
rust/kernel/list/arc.rs | 14 ++++++--------
1 file changed, 6 insertions(+), 8 deletions(-)
diff --git a/rust/kernel/list/arc.rs b/rust/kernel/list/arc.rs
index d92bcf6..2282f33 100644
--- a/rust/kernel/list/arc.rs
+++ b/rust/kernel/list/arc.rs
@@ -6,11 +6,11 @@
 
 use crate::alloc::{AllocError, Flags};
 use crate::prelude::*;
+use crate::sync::atomic::{ordering, Atomic};
 use crate::sync::{Arc, ArcBorrow, UniqueArc};
 use core::marker::PhantomPinned;
 use core::ops::Deref;
 use core::pin::Pin;
-use core::sync::atomic::{AtomicBool, Ordering};
 
 /// Declares that this type has some way to ensure that there is exactly one `ListArc` instance for
 /// this id.
@@ -469,7 +469,7 @@ where
 /// If the boolean is `false`, then there is no [`ListArc`] for this value.
 #[repr(transparent)]
 pub struct AtomicTracker<const ID: u64 = 0> {
-    inner: AtomicBool,
+    inner: Atomic<bool>,
     // This value needs to be pinned to justify the INVARIANT: comment in `AtomicTracker::new`.
     _pin: PhantomPinned,
 }
@@ -480,12 +480,12 @@ impl<const ID: u64> AtomicTracker<ID> {
         // INVARIANT: Pin-init initializers can't be used on an existing `Arc`, so this value will
         // not be constructed in an `Arc` that already has a `ListArc`.
         Self {
-            inner: AtomicBool::new(false),
+            inner: Atomic::new(false),
             _pin: PhantomPinned,
         }
     }
 
-    fn project_inner(self: Pin<&mut Self>) -> &mut AtomicBool {
+    fn project_inner(self: Pin<&mut Self>) -> &mut Atomic<bool> {
         // SAFETY: The `inner` field is not structurally pinned, so we may obtain a mutable
         // reference to it even if we only have a pinned reference to `self`.
         unsafe { &mut Pin::into_inner_unchecked(self).inner }
@@ -500,7 +500,7 @@ impl<const ID: u64> ListArcSafe<ID> for AtomicTracker<ID> {
 
     unsafe fn on_drop_list_arc(&self) {
         // INVARIANT: We just dropped a ListArc, so the boolean should be false.
-        self.inner.store(false, Ordering::Release);
+        self.inner.store(false, ordering::Release);
     }
 }
 
@@ -514,8 +514,6 @@ unsafe impl<const ID: u64> TryNewListArc<ID> for AtomicTracker<ID> {
     fn try_new_list_arc(&self) -> bool {
         // INVARIANT: If this method returns true, then the boolean used to be false, and is no
         // longer false, so it is okay for the caller to create a new [`ListArc`].
-        self.inner
-            .compare_exchange(false, true, Ordering::Acquire, Ordering::Relaxed)
-            .is_ok()
+        self.inner.cmpxchg(false, true, ordering::Acquire).is_ok()
     }
 }
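
For reference, here is a minimal standalone sketch of the resulting pattern,
built only from the kernel::sync::atomic calls used in the diff above (the
path is crate::sync::atomic from inside the kernel crate); the ClaimOnce type
and its methods are illustrative and not part of this patch:

use kernel::sync::atomic::{ordering, Atomic};

/// Illustrative claim-once flag mirroring the `AtomicTracker` pattern above.
struct ClaimOnce {
    claimed: Atomic<bool>,
}

impl ClaimOnce {
    fn new() -> Self {
        Self { claimed: Atomic::new(false) }
    }

    /// Returns `true` only for the caller that flips the flag from `false` to
    /// `true`. The failure case of cmpxchg() is relaxed, so only the success
    /// ordering is passed.
    fn try_claim(&self) -> bool {
        self.claimed.cmpxchg(false, true, ordering::Acquire).is_ok()
    }

    /// Clears the flag; the Release pairs with the Acquire in try_claim().
    fn release(&self) {
        self.claimed.store(false, ordering::Release);
    }
}

The Acquire-on-success / Release-on-clear pairing mirrors the orderings that
AtomicTracker uses in try_new_list_arc() and on_drop_list_arc() above.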