Message-ID: <20220119093525.GB42546@C02TD0UTHF1T.local>
Date: Wed, 19 Jan 2022 09:35:25 +0000
From: Mark Rutland <mark.rutland@....com>
To: He Ying <heying24@...wei.com>
Cc: catalin.marinas@....com, will@...nel.org, marcan@...can.st,
maz@...nel.org, joey.gouly@....com, pcc@...gle.com,
linux-arm-kernel@...ts.infradead.org, linux-kernel@...r.kernel.org
Subject: Re: [PATCH] arm64: entry: Save some nops when CONFIG_ARM64_PSEUDO_NMI is not set
On Wed, Jan 19, 2022 at 02:40:58PM +0800, He Ying wrote:
> Hi all,
>
> Ping. Any comments?
The patch looks fine, but as it's the middle of the merge window people
are busy and unlikely to look at this for the next few days.
Generally it's a good idea to wait until rc1 or rc2, rebase atop that,
and post the updated patch. Stuff like this usually gets queued around
rc3/rc4 time.
> On 2022/1/12 11:24, He Ying wrote:
> > The arm64 pseudo-NMI feature code emits some additional nops
> > even when CONFIG_ARM64_PSEUDO_NMI is not set, which is
> > unnecessary. Add the required ifdeffery to avoid them.
> >
> > Signed-off-by: He Ying <heying24@...wei.com>
FWIW:
Acked-by: Mark Rutland <mark.rutland@....com>
Mark.
> > ---
> > arch/arm64/kernel/entry.S | 4 ++++
> > 1 file changed, 4 insertions(+)
> >
> > diff --git a/arch/arm64/kernel/entry.S b/arch/arm64/kernel/entry.S
> > index 2f69ae43941d..ffc32d3d909a 100644
> > --- a/arch/arm64/kernel/entry.S
> > +++ b/arch/arm64/kernel/entry.S
> > @@ -300,6 +300,7 @@ alternative_else_nop_endif
> >  	str	w21, [sp, #S_SYSCALLNO]
> >  	.endif
> >  
> > +#ifdef CONFIG_ARM64_PSEUDO_NMI
> >  	/* Save pmr */
> >  alternative_if ARM64_HAS_IRQ_PRIO_MASKING
> >  	mrs_s	x20, SYS_ICC_PMR_EL1
> > @@ -307,6 +308,7 @@ alternative_if ARM64_HAS_IRQ_PRIO_MASKING
> >  	mov	x20, #GIC_PRIO_IRQON | GIC_PRIO_PSR_I_SET
> >  	msr_s	SYS_ICC_PMR_EL1, x20
> >  alternative_else_nop_endif
> > +#endif
> >  
> >  	/* Re-enable tag checking (TCO set on exception entry) */
> >  #ifdef CONFIG_ARM64_MTE
> > @@ -330,6 +332,7 @@ alternative_else_nop_endif
> >  	disable_daif
> >  	.endif
> >  
> > +#ifdef CONFIG_ARM64_PSEUDO_NMI
> >  	/* Restore pmr */
> >  alternative_if ARM64_HAS_IRQ_PRIO_MASKING
> >  	ldr	x20, [sp, #S_PMR_SAVE]
> > @@ -339,6 +342,7 @@ alternative_if ARM64_HAS_IRQ_PRIO_MASKING
> >  	dsb	sy				// Ensure priority change is seen by redistributor
> >  .L__skip_pmr_sync\@:
> >  alternative_else_nop_endif
> > +#endif
> >  
> >  	ldp	x21, x22, [sp, #S_PC]		// load ELR, SPSR
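
For anyone following along, here's roughly what this saves. With
CONFIG_ARM64_PSEUDO_NMI=n the ARM64_HAS_IRQ_PRIO_MASKING capability can
never be detected, so the alternatives above are never applied and the
inline region keeps the NOP padding that alternative_else_nop_endif
emits. A hand-written sketch (not assembler output) of what the
"Save pmr" block in kernel_entry degenerates into:

	/*
	 * Approximate inline expansion when the alternative is never
	 * applied: alternative_else_nop_endif pads the default path
	 * with one NOP per replacement instruction, executed on every
	 * kernel entry.
	 */
	nop			// would be: mrs_s  x20, SYS_ICC_PMR_EL1
	nop			// would be: str    x20, [sp, #S_PMR_SAVE]
	nop			// would be: mov    x20, #GIC_PRIO_IRQON | GIC_PRIO_PSR_I_SET
	nop			// would be: msr_s  SYS_ICC_PMR_EL1, x20

With the new #ifdef CONFIG_ARM64_PSEUDO_NMI guards, those instructions
(and the corresponding ones in kernel_exit) are simply not assembled at
all.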