Message-Id: <20160214222223.794158803@linuxfoundation.org>
Date: Sun, 14 Feb 2016 14:23:06 -0800
From: Greg Kroah-Hartman <gregkh@...uxfoundation.org>
To: linux-kernel@...r.kernel.org
Cc: Greg Kroah-Hartman <gregkh@...uxfoundation.org>,
stable@...r.kernel.org, Eli Cooper <elicooper@....com>,
Martin Willi <martin@...ongswan.org>,
Herbert Xu <herbert@...dor.apana.org.au>
Subject: [PATCH 4.3 179/200] crypto: chacha20-ssse3 - Align stack pointer to 64 bytes
4.3-stable review patch. If anyone has any objections, please let me know.
------------------
From: Eli Cooper <elicooper@....com>
commit cbe09bd51bf23b42c3a94c5fb6815e1397c5fc3f upstream.
This aligns the stack pointer in chacha20_4block_xor_ssse3 to 64 bytes.
Fixes general protection faults and potential kernel panics.
Signed-off-by: Eli Cooper <elicooper@....com>
Acked-by: Martin Willi <martin@...ongswan.org>
Signed-off-by: Herbert Xu <herbert@...dor.apana.org.au>
Signed-off-by: Greg Kroah-Hartman <gregkh@...uxfoundation.org>
---
arch/x86/crypto/chacha20-ssse3-x86_64.S | 6 ++++--
1 file changed, 4 insertions(+), 2 deletions(-)
--- a/arch/x86/crypto/chacha20-ssse3-x86_64.S
+++ b/arch/x86/crypto/chacha20-ssse3-x86_64.S
@@ -157,7 +157,9 @@ ENTRY(chacha20_4block_xor_ssse3)
# done with the slightly better performing SSSE3 byte shuffling,
# 7/12-bit word rotation uses traditional shift+OR.
- sub $0x40,%rsp
+ mov %rsp,%r11
+ sub $0x80,%rsp
+ and $~63,%rsp
# x0..15[0-3] = s0..3[0..3]
movq 0x00(%rdi),%xmm1
@@ -620,6 +622,6 @@ ENTRY(chacha20_4block_xor_ssse3)
pxor %xmm1,%xmm15
movdqu %xmm15,0xf0(%rsi)
- add $0x40,%rsp
+ mov %r11,%rsp
ret
ENDPROC(chacha20_4block_xor_ssse3)
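
For readers unfamiliar with the idiom: the change saves the incoming %rsp in %r11, reserves 0x80 bytes instead of 0x40, and masks off the low six bits so the scratch area starts on a 64-byte boundary; the epilogue then restores the saved pointer rather than adding a constant back, since the masking makes the actual reservation size variable. The routine spills vector registers into this scratch area, and aligned SSE stores fault with a general protection error when the destination is not suitably aligned, which is presumably the crash the commit message refers to. Below is a minimal user-space C sketch of the same manual-alignment idiom; the macro names and buffer size are illustrative choices, not code taken from the kernel, and it rounds the pointer up rather than down (the assembly rounds %rsp down because the stack grows toward lower addresses).

#include <stdint.h>
#include <stdio.h>

#define ALIGNMENT 64    /* same boundary the patch enforces with "and $~63,%rsp" */
#define NEEDED    0x40  /* scratch bytes the routine uses, per the original "sub $0x40,%rsp" */

int main(void)
{
	/*
	 * Over-allocate so an ALIGNMENT-aligned block of NEEDED bytes is
	 * guaranteed to fit, mirroring the patch's larger "sub $0x80,%rsp".
	 */
	unsigned char raw[NEEDED + ALIGNMENT - 1];
	uintptr_t p = (uintptr_t)raw;

	/*
	 * Round up to the next 64-byte boundary.  The assembly rounds %rsp
	 * down instead, which is safe there because the stack grows toward
	 * lower addresses, so the rounded pointer still lies in usable stack.
	 */
	unsigned char *scratch =
		(unsigned char *)((p + ALIGNMENT - 1) & ~(uintptr_t)(ALIGNMENT - 1));

	printf("raw=%p  aligned scratch=%p\n", (void *)raw, (void *)scratch);
	return 0;
}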