Message-ID: <32ddd2a6-22c6-49d2-aebb-da5a2e99748d@arm.com>
Date: Wed, 10 Apr 2024 13:27:39 +0530
From: Anshuman Khandual <anshuman.khandual@....com>
To: Gavin Shan <gshan@...hat.com>, linux-arm-kernel@...ts.infradead.org,
linux-kernel@...r.kernel.org
Cc: catalin.marinas@....com, will@...nel.org, akpm@...ux-foundation.org,
maz@...nel.org, oliver.upton@...ux.dev, ryan.roberts@....com,
apopple@...dia.com, rananta@...gle.com, mark.rutland@....com,
v-songbaohua@...o.com, yangyicong@...ilicon.com, shahuang@...hat.com,
yihyu@...hat.com, shan.gavin@...il.com
Subject: Re: [PATCH v3 2/3] arm64: tlb: Improve __TLBI_VADDR_RANGE()
On 4/5/24 09:28, Gavin Shan wrote:
> The macro returns the operand of the TLBI RANGE instruction. A mask
> needs to be applied to each individual field when producing the operand,
> so that adjacent fields cannot interfere with each other when invalid
> arguments are provided. The code also looks tidier with a mask and
> FIELD_PREP().
>
> Suggested-by: Marc Zyngier <maz@...nel.org>
> Signed-off-by: Gavin Shan <gshan@...hat.com>
This looks much better.
Reviewed-by: Anshuman Khandual <anshuman.khandual@....com>
> ---
> arch/arm64/include/asm/tlbflush.h | 29 ++++++++++++++++++-----------
> 1 file changed, 18 insertions(+), 11 deletions(-)
>
> diff --git a/arch/arm64/include/asm/tlbflush.h b/arch/arm64/include/asm/tlbflush.h
> index a75de2665d84..243d71f7bc1f 100644
> --- a/arch/arm64/include/asm/tlbflush.h
> +++ b/arch/arm64/include/asm/tlbflush.h
> @@ -142,17 +142,24 @@ static inline unsigned long get_trans_granule(void)
> * EL1, Inner Shareable".
> *
> */
> -#define __TLBI_VADDR_RANGE(baddr, asid, scale, num, ttl) \
> - ({ \
> - unsigned long __ta = (baddr); \
> - unsigned long __ttl = (ttl >= 1 && ttl <= 3) ? ttl : 0; \
> - __ta &= GENMASK_ULL(36, 0); \
> - __ta |= __ttl << 37; \
> - __ta |= (unsigned long)(num) << 39; \
> - __ta |= (unsigned long)(scale) << 44; \
> - __ta |= get_trans_granule() << 46; \
> - __ta |= (unsigned long)(asid) << 48; \
> - __ta; \
> +#define TLBIR_ASID_MASK GENMASK_ULL(63, 48)
> +#define TLBIR_TG_MASK GENMASK_ULL(47, 46)
> +#define TLBIR_SCALE_MASK GENMASK_ULL(45, 44)
> +#define TLBIR_NUM_MASK GENMASK_ULL(43, 39)
> +#define TLBIR_TTL_MASK GENMASK_ULL(38, 37)
> +#define TLBIR_BADDR_MASK GENMASK_ULL(36, 0)
> +
> +#define __TLBI_VADDR_RANGE(baddr, asid, scale, num, ttl) \
> + ({ \
> + unsigned long __ta = 0; \
> + unsigned long __ttl = (ttl >= 1 && ttl <= 3) ? ttl : 0; \
> + __ta |= FIELD_PREP(TLBIR_BADDR_MASK, baddr); \
> + __ta |= FIELD_PREP(TLBIR_TTL_MASK, __ttl); \
> + __ta |= FIELD_PREP(TLBIR_NUM_MASK, num); \
> + __ta |= FIELD_PREP(TLBIR_SCALE_MASK, scale); \
> + __ta |= FIELD_PREP(TLBIR_TG_MASK, get_trans_granule()); \
> + __ta |= FIELD_PREP(TLBIR_ASID_MASK, asid); \
> + __ta; \
> })
>
> /* These macros are used by the TLBI RANGE feature. */
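
As an aside, here is a small user-space sketch (mine, not part of the
patch; field_prep_ull() and the local GENMASK_ULL() are simplified
stand-ins for the kernel's <linux/bitfield.h> and <linux/bits.h> helpers)
showing why the masked FIELD_PREP() form is safer: an out-of-range 'num'
can no longer spill into the adjacent SCALE field.

    /*
     * Approximation only: each value is shifted into its field and then
     * clamped to that field's mask, so stray high bits of an invalid
     * argument cannot leak into neighbouring fields of the operand.
     */
    #include <stdint.h>
    #include <stdio.h>

    #define GENMASK_ULL(h, l) \
    	(((~0ULL) << (l)) & (~0ULL >> (63 - (h))))

    /* Same field layout as in the patch. */
    #define TLBIR_ASID_MASK		GENMASK_ULL(63, 48)
    #define TLBIR_TG_MASK		GENMASK_ULL(47, 46)
    #define TLBIR_SCALE_MASK	GENMASK_ULL(45, 44)
    #define TLBIR_NUM_MASK		GENMASK_ULL(43, 39)
    #define TLBIR_TTL_MASK		GENMASK_ULL(38, 37)
    #define TLBIR_BADDR_MASK	GENMASK_ULL(36, 0)

    /* Stand-in for FIELD_PREP(): shift into the field, then mask. */
    static uint64_t field_prep_ull(uint64_t mask, uint64_t val)
    {
    	unsigned int shift = __builtin_ctzll(mask);

    	return (val << shift) & mask;
    }

    int main(void)
    {
    	/* num deliberately out of range (the field is only 5 bits). */
    	uint64_t bad_num = 0x7f;
    	uint64_t ta = 0;

    	ta |= field_prep_ull(TLBIR_NUM_MASK, bad_num);
    	ta |= field_prep_ull(TLBIR_SCALE_MASK, 2);

    	/* SCALE still reads back as 2; NUM's excess bits were dropped. */
    	printf("operand = 0x%016llx\n", (unsigned long long)ta);
    	printf("scale   = %llu\n",
    	       (unsigned long long)((ta & TLBIR_SCALE_MASK) >> 44));
    	return 0;
    }

With the previous shift-only version, the stray high bits of 'num' would
have landed in the SCALE field instead of being dropped.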