Message-Id: <20180731191102.2434-3-Jason@zx2c4.com>
Date: Tue, 31 Jul 2018 21:11:01 +0200
From: "Jason A. Donenfeld" <Jason@...c4.com>
To: linux-kernel@...r.kernel.org, netdev@...r.kernel.org,
davem@...emloft.net
Cc: "Jason A. Donenfeld" <Jason@...c4.com>,
Andy Lutomirski <luto@...nel.org>,
Greg KH <gregkh@...uxfoundation.org>,
Samuel Neves <sneves@....uc.pt>,
"D . J . Bernstein" <djb@...yp.to>,
Tanja Lange <tanja@...erelliptic.org>,
Jean-Philippe Aumasson <jeanphilippe.aumasson@...il.com>,
Karthikeyan Bhargavan <karthik.bhargavan@...il.com>
Subject: [PATCH v1 2/3] zinc: Introduce minimal cryptography library
Zinc stands for "Zinc Is Not crypto/". It's also short, easy to type,
and plays nicely with the recent trend of naming crypto libraries after
elements. The guiding principle is "don't overdo it". It's less of a
library and more of a directory tree for organizing well-curated direct
implementations of cryptography primitives.
Zinc is a new cryptography API that is much more minimal and lower-level
than the current one. It intends to complement it and provide a basis
upon which the current crypto API might build, and perhaps someday Zinc
may altogether supplant the current crypto API. It is motivated by
three primary observations in crypto API design:
* Highly composable "cipher modes" and related abstractions from
90s cryptographers did not turn out to be as terrific an idea as
hoped, leading to a host of API misuse problems.
* Most programmers are afraid of crypto code, and so prefer to
integrate it into libraries in a highly abstracted manner, so as to
shield themselves from implementation details. Cryptographers, on
the other hand, prefer simple direct implementations, which they're
able to verify for high assurance and optimize in accordance with
their expertise.
* Overly abstracted and flexible cryptography APIs lead to a host of
dangerous problems and performance issues. The kernel is usually not
in the business of coming up with new uses of crypto, but rather of
implementing various constructions, which means it essentially needs a
library of primitives, not a highly abstracted, enterprise-ready
pluggable system, with a few particular exceptions.
This last observation has played out several times within the kernel:
* The perennial move of actual primitives away from crypto/ and into
lib/, so that users can actually call these functions directly with
no overhead and without lots of allocations, function pointers,
string specifier parsing, and general clunkiness. For example:
sha256, chacha20, siphash, sha1, and so forth live in lib/ rather
than in crypto/. Zinc intends to stop the cluttering of lib/ and
introduce these direct primitives into their proper place, lib/zinc/.
* An abundance of misuse bugs with the present crypto API that have
been very unpleasant to clean up.
* A hesitance to even use cryptography, because of the overhead and
headaches involved in accessing the routines.
Zinc goes in a rather different direction. Rather than providing a
thoroughly designed and abstracted API, Zinc gives you simple functions,
which implement a primitive, or a particular construction of
primitives. It is not dynamic in the least, though one
could imagine implementing a complex dynamic dispatch mechanism (such as
the current crypto API) on top of these basic functions. After all,
dynamic dispatch is usually needed for applications with cipher agility,
such as IPsec, dm-crypt, AF_ALG, and so forth, and the existing crypto
API will continue to play that role. However, Zinc will provide a non-
haphazard way of directly utilizing crypto routines in applications
that have neither need nor desire for abstraction and dynamic
dispatch.
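To make this concrete: encrypting a buffer with the ChaCha20Poly1305
AEAD is a single direct call. The following is a minimal sketch using
the signatures introduced by this patch; the wrapper function and its
buffer management are illustrative only:

  static void seal_packet(u8 *dst, const u8 *msg, const size_t msg_len,
                          const u64 nonce,
                          const u8 key[CHACHA20POLY1305_KEYLEN])
  {
          /* dst must be able to hold msg_len +
           * CHACHA20POLY1305_AUTHTAGLEN bytes; no additional
           * authenticated data is passed here. */
          chacha20poly1305_encrypt(dst, msg, msg_len, NULL, 0, nonce, key);
  }

No allocations, no function pointers, no tfm objects; just a function
call.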
It also organizes the implementations in a simple, straightforward,
and direct manner, making it enjoyable and intuitive to work on.
Rather than moving optimized assembly implementations into arch/, it
keeps them all together in lib/zinc/, making it simple and obvious to
compare and contrast what's happening. This is, notably, exactly what
the lib/raid6/ tree does, and that seems to work out rather well. It's
also the pattern of most successful crypto libraries. Likewise, the
cascade of architecture-specific functions is done as ifdefs within one
file, so that it's easy and obvious and clear what's happening, how each
architecture differs, and how to optimize for shared patterns. This is
very much preferable to architecture-specific file splitting.
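To illustrate the shape of that cascade, here is a simplified sketch,
not a verbatim excerpt; the chacha20_use_avx2, chacha20_avx2,
chacha20_neon, and chacha20_generic names stand in for the actual glue
in this patch:

  void chacha20(struct chacha20_ctx *state, u8 *dst, const u8 *src,
                u32 len, bool have_simd)
  {
  #if defined(CONFIG_X86_64)
          if (have_simd && chacha20_use_avx2) {
                  chacha20_avx2(dst, src, len, state->key, state->counter);
                  return;
          }
  #elif defined(CONFIG_ARM) || defined(CONFIG_ARM64)
          if (have_simd) {
                  chacha20_neon(dst, src, len, state->key, state->counter);
                  return;
          }
  #endif
          chacha20_generic(state, dst, src, len);
  }

Everything an implementer needs to see is in one place, per primitive.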
All implementations have been extensively tested and fuzzed, and are
selected for their quality, trustworthiness, and performance. Wherever
possible and performant, formally verified implementations are used,
such as those from HACL* [1] and Fiat-Crypto [2]. The routines also take
special care to zero out secrets using memzero_explicit (and future work
is planned to have gcc do this more reliably and performantly with
compiler plugins). The performance of the selected implementations is
state-of-the-art and unrivaled. Each implementation also comes with
extensive self-tests and crafted test vectors, pulled from various
places such as Wycheproof [9].
Regularity of function signatures is important, so that users can easily
"guess" the name of the function they want. Though, individual
primitives are oftentimes not trivially interchangeable, having been
designed for different things and requiring different parameters and
semantics, and so the function signatures they provide will directly
reflect the realities of the primitives' usages, rather than hiding it
behind (inevitably leaky) abstractions. Also, in contrast to the current
crypto API, Zinc functions can work on stack buffers, and can be called
with different keys, without requiring allocations or locking.
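For instance, hashing a message that lives on the stack requires
nothing beyond the call itself (a sketch using the blake2s() helper
declared in this patch; the message contents are assumed to have been
filled in by the caller):

  u8 hash[BLAKE2S_OUTBYTES];
  u8 message[64];

  /* Both buffers are plain stack memory; passing a NULL key with a
   * keylen of 0 selects the unkeyed mode. */
  blake2s(hash, message, NULL, BLAKE2S_OUTBYTES, sizeof(message), 0);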
SIMD is used automatically when available, though some routines may
benefit either from having their SIMD disabled for particular
invocations, or from having the SIMD initialization calls amortized
over several invocations of the function, and so Zinc provides helpers
and function signatures enabling that.
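A caller working through a long run of buffers might amortize the
context like so (a sketch built from the simd_get()/simd_relax()/
simd_put() helpers and the chacha20() signature in this patch; the
packet list is illustrative):

  bool have_simd = simd_get();

  list_for_each_entry(pkt, &queue, list) {
          chacha20(&ctx, pkt->data, pkt->data, pkt->len, have_simd);
          /* Under CONFIG_PREEMPT, briefly drop and re-acquire the
           * SIMD context if a reschedule is pending. */
          have_simd = simd_relax(have_simd);
  }
  simd_put(have_simd);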
More generally, Zinc provides function signatures that allow just what
is required by the various callers. This isn't to say that users of the
functions will be permitted to pollute the function semantics with weird
particular needs, but we are trying very hard not to overdo it, and that
means looking carefully at what's actually necessary, and doing just that,
and not much more than that. Remember: practicality and cleanliness rather
than over-zealous infrastructure.
Zinc also provides an opening for the best implementers in academia to
contribute their time and effort to the kernel, by being sufficiently
simple and inviting. In discussing this commit with some of the best and
brightest over the last few years, I have found many who are eager to
devote their rare talent and energy to this effort.
This initial commit adds implementations of the primitives used by WireGuard:
* Curve25519 [3]: formally verified 64-bit C, formally verified 32-bit C,
x86_64 BMI2, x86_64 ADX, ARM NEON.
* ChaCha20 [4]: generic C, x86_64 SSSE3, x86_64 AVX-2, x86_64 AVX-512F,
x86_64 AVX-512VL, ARM NEON, ARM64 NEON, MIPS.
* HChaCha20 [5]: generic C, x86_64 SSSE3.
* Poly1305 [6]: generic C, x86_64, x86_64 AVX, x86_64 AVX-2, x86_64
AVX-512F, ARM NEON, ARM64 NEON, MIPS, MIPS64.
* BLAKE2s [7]: generic C, x86_64 AVX, x86_64 AVX-512VL.
* ChaCha20Poly1305 [8]: generic C construction for both full buffers and
scatter gather.
* XChaCha20Poly1305 [5]: generic C construction.
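As a taste of what these look like in use, a complete Curve25519 key
exchange is just the following (a sketch based on the declarations in
include/zinc/curve25519.h; peer_pub is assumed to have arrived from the
other party):

  u8 secret[CURVE25519_POINT_SIZE], pub[CURVE25519_POINT_SIZE];
  u8 shared[CURVE25519_POINT_SIZE];

  curve25519_generate_secret(secret);
  if (!curve25519_generate_public(pub, secret))
          return -EINVAL;
  if (!curve25519(shared, secret, peer_pub))
          return -EINVAL; /* the all-zero shared point is rejected */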
Following the merging of this, I expect the primitives that currently
exist in lib/ to first work their way into lib/zinc/, after intense
scrutiny of each implementation, potentially replacing them with either
formally-verified implementations, or better studied and faster
state-of-the-art implementations. In a phase after that, I envision that
certain instances from crypto/ will want to rebase themselves to simply
be abstracted crypto API wrappers using the lower level Zinc functions.
This is already what various crypto/ implementations do with the
existing code in lib/.
Currently Zinc exists as a single un-menued option, CONFIG_ZINC, but as
this grows we will inevitably want to make that more granular. This will
happen at the appropriate time, rather than doing so prematurely. There
is also a CONFIG_ZINC_DEBUG menued option, which performs several
intensive self-tests at startup and enables various BUG_ONs.
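A consumer then simply selects the library from its own Kconfig entry,
along these lines (a hypothetical example of how a Zinc user would wire
itself up):

  config WIREGUARD
          tristate "WireGuard secure network tunnel"
          depends on NET
          select ZINC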
[1] https://github.com/project-everest/hacl-star
[2] https://github.com/mit-plv/fiat-crypto
[3] https://cr.yp.to/ecdh.html
[4] https://cr.yp.to/chacha.html
[5] https://cr.yp.to/snuffle/xsalsa-20081128.pdf
[6] https://cr.yp.to/mac.html
[7] https://blake2.net/
[8] https://tools.ietf.org/html/rfc8439
[9] https://github.com/google/wycheproof
Signed-off-by: Jason A. Donenfeld <Jason@...c4.com>
Cc: Andy Lutomirski <luto@...nel.org>
Cc: Greg KH <gregkh@...uxfoundation.org>
Cc: Samuel Neves <sneves@....uc.pt>
Cc: D. J. Bernstein <djb@...yp.to>
Cc: Tanja Lange <tanja@...erelliptic.org>
Cc: Jean-Philippe Aumasson <jeanphilippe.aumasson@...il.com>
Cc: Karthikeyan Bhargavan <karthik.bhargavan@...il.com>
---
MAINTAINERS | 7 +
include/zinc/blake2s.h | 94 +
include/zinc/chacha20.h | 46 +
include/zinc/chacha20poly1305.h | 51 +
include/zinc/curve25519.h | 25 +
include/zinc/poly1305.h | 34 +
include/zinc/simd.h | 60 +
lib/Kconfig | 21 +
lib/Makefile | 2 +
lib/zinc/Makefile | 28 +
lib/zinc/blake2s/blake2s-x86_64.S | 685 ++++++
lib/zinc/blake2s/blake2s.c | 292 +++
lib/zinc/chacha20/chacha20-arm.S | 1471 ++++++++++++
lib/zinc/chacha20/chacha20-arm64.S | 1940 ++++++++++++++++
lib/zinc/chacha20/chacha20-mips.S | 474 ++++
lib/zinc/chacha20/chacha20-x86_64.S | 2630 +++++++++++++++++++++
lib/zinc/chacha20/chacha20.c | 242 ++
lib/zinc/chacha20poly1305.c | 286 +++
lib/zinc/curve25519/curve25519-arm.S | 2110 +++++++++++++++++
lib/zinc/curve25519/curve25519-arm.h | 14 +
lib/zinc/curve25519/curve25519-fiat32.h | 838 +++++++
lib/zinc/curve25519/curve25519-hacl64.h | 751 ++++++
lib/zinc/curve25519/curve25519-x86_64.h | 2060 +++++++++++++++++
lib/zinc/curve25519/curve25519.c | 86 +
lib/zinc/main.c | 36 +
lib/zinc/poly1305/poly1305-arm.S | 1115 +++++++++
lib/zinc/poly1305/poly1305-arm64.S | 820 +++++++
lib/zinc/poly1305/poly1305-mips.S | 417 ++++
lib/zinc/poly1305/poly1305-mips64.S | 357 +++
lib/zinc/poly1305/poly1305-x86_64.S | 2790 +++++++++++++++++++++++
lib/zinc/poly1305/poly1305.c | 377 +++
lib/zinc/selftest/blake2s.h | 559 +++++
lib/zinc/selftest/chacha20poly1305.h | 1559 +++++++++++++
lib/zinc/selftest/curve25519.h | 607 +++++
lib/zinc/selftest/poly1305.h | 1568 +++++++++++++
35 files changed, 24452 insertions(+)
create mode 100644 include/zinc/blake2s.h
create mode 100644 include/zinc/chacha20.h
create mode 100644 include/zinc/chacha20poly1305.h
create mode 100644 include/zinc/curve25519.h
create mode 100644 include/zinc/poly1305.h
create mode 100644 include/zinc/simd.h
create mode 100644 lib/zinc/Makefile
create mode 100644 lib/zinc/blake2s/blake2s-x86_64.S
create mode 100644 lib/zinc/blake2s/blake2s.c
create mode 100644 lib/zinc/chacha20/chacha20-arm.S
create mode 100644 lib/zinc/chacha20/chacha20-arm64.S
create mode 100644 lib/zinc/chacha20/chacha20-mips.S
create mode 100644 lib/zinc/chacha20/chacha20-x86_64.S
create mode 100644 lib/zinc/chacha20/chacha20.c
create mode 100644 lib/zinc/chacha20poly1305.c
create mode 100644 lib/zinc/curve25519/curve25519-arm.S
create mode 100644 lib/zinc/curve25519/curve25519-arm.h
create mode 100644 lib/zinc/curve25519/curve25519-fiat32.h
create mode 100644 lib/zinc/curve25519/curve25519-hacl64.h
create mode 100644 lib/zinc/curve25519/curve25519-x86_64.h
create mode 100644 lib/zinc/curve25519/curve25519.c
create mode 100644 lib/zinc/main.c
create mode 100644 lib/zinc/poly1305/poly1305-arm.S
create mode 100644 lib/zinc/poly1305/poly1305-arm64.S
create mode 100644 lib/zinc/poly1305/poly1305-mips.S
create mode 100644 lib/zinc/poly1305/poly1305-mips64.S
create mode 100644 lib/zinc/poly1305/poly1305-x86_64.S
create mode 100644 lib/zinc/poly1305/poly1305.c
create mode 100644 lib/zinc/selftest/blake2s.h
create mode 100644 lib/zinc/selftest/chacha20poly1305.h
create mode 100644 lib/zinc/selftest/curve25519.h
create mode 100644 lib/zinc/selftest/poly1305.h
diff --git a/MAINTAINERS b/MAINTAINERS
index 38f44389ad99..0ffc9fd0ebb4 100644
--- a/MAINTAINERS
+++ b/MAINTAINERS
@@ -15859,6 +15859,13 @@ Q: https://patchwork.linuxtv.org/project/linux-media/list/
S: Maintained
F: drivers/media/dvb-frontends/zd1301_demod*
+ZINC CRYPTOGRAPHY LIBRARY
+M: Jason A. Donenfeld <Jason@...c4.com>
+S: Maintained
+F: lib/zinc/
+F: include/zinc/
+L: linux-crypto@...r.kernel.org
+
ZPOOL COMPRESSED PAGE STORAGE API
M: Dan Streetman <ddstreet@...e.org>
L: linux-mm@...ck.org
diff --git a/include/zinc/blake2s.h b/include/zinc/blake2s.h
new file mode 100644
index 000000000000..812c4c68bbe9
--- /dev/null
+++ b/include/zinc/blake2s.h
@@ -0,0 +1,94 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifndef _ZINC_BLAKE2S_H
+#define _ZINC_BLAKE2S_H
+
+#include <linux/types.h>
+#include <linux/kernel.h>
+#include <crypto/algapi.h>
+
+enum blake2s_lengths {
+ BLAKE2S_BLOCKBYTES = 64,
+ BLAKE2S_OUTBYTES = 32,
+ BLAKE2S_KEYBYTES = 32
+};
+
+struct blake2s_state {
+ u32 h[8];
+ u32 t[2];
+ u32 f[2];
+ u8 buf[BLAKE2S_BLOCKBYTES];
+ size_t buflen;
+ u8 last_node;
+};
+
+void blake2s_init(struct blake2s_state *state, const size_t outlen);
+void blake2s_init_key(struct blake2s_state *state, const size_t outlen, const void *key, const size_t keylen);
+void blake2s_update(struct blake2s_state *state, const u8 *in, size_t inlen);
+void __blake2s_final(struct blake2s_state *state);
+static inline void blake2s_final(struct blake2s_state *state, u8 *out, const size_t outlen)
+{
+ int i;
+
+#ifdef CONFIG_ZINC_DEBUG
+ BUG_ON(!out || !outlen || outlen > BLAKE2S_OUTBYTES);
+#endif
+ __blake2s_final(state);
+
+ if (__builtin_constant_p(outlen) && !(outlen % sizeof(u32))) {
+ if (IS_ENABLED(CONFIG_HAVE_EFFICIENT_UNALIGNED_ACCESS) || IS_ALIGNED((unsigned long)out, __alignof__(u32))) {
+ __le32 *outwords = (__le32 *)out;
+
+ for (i = 0; i < outlen / sizeof(u32); ++i)
+ outwords[i] = cpu_to_le32(state->h[i]);
+ } else {
+ __le32 buffer[BLAKE2S_OUTBYTES];
+
+ for (i = 0; i < outlen / sizeof(u32); ++i)
+ buffer[i] = cpu_to_le32(state->h[i]);
+ memcpy(out, buffer, outlen);
+ memzero_explicit(buffer, sizeof(buffer));
+ }
+ } else {
+ u8 buffer[BLAKE2S_OUTBYTES] __aligned(__alignof__(u32));
+ __le32 *outwords = (__le32 *)buffer;
+
+ for (i = 0; i < 8; ++i)
+ outwords[i] = cpu_to_le32(state->h[i]);
+ memcpy(out, buffer, outlen);
+ memzero_explicit(buffer, sizeof(buffer));
+ }
+
+ memzero_explicit(state, sizeof(struct blake2s_state));
+}
+
+static inline void blake2s(u8 *out, const u8 *in, const u8 *key, const size_t outlen, const size_t inlen, const size_t keylen)
+{
+ struct blake2s_state state;
+
+#ifdef CONFIG_ZINC_DEBUG
+ BUG_ON((!in && inlen > 0) || !out || !outlen || outlen > BLAKE2S_OUTBYTES || keylen > BLAKE2S_KEYBYTES || (!key && keylen));
+#endif
+
+ if (keylen)
+ blake2s_init_key(&state, outlen, key, keylen);
+ else
+ blake2s_init(&state, outlen);
+
+ blake2s_update(&state, in, inlen);
+ blake2s_final(&state, out, outlen);
+}
+
+void blake2s_hmac(u8 *out, const u8 *in, const u8 *key, const size_t outlen, const size_t inlen, const size_t keylen);
+
+void blake2s_fpu_init(void);
+
+#ifdef CONFIG_ZINC_DEBUG
+bool blake2s_selftest(void);
+#endif
+
+#endif /* _ZINC_BLAKE2S_H */
diff --git a/include/zinc/chacha20.h b/include/zinc/chacha20.h
new file mode 100644
index 000000000000..a7c392454c74
--- /dev/null
+++ b/include/zinc/chacha20.h
@@ -0,0 +1,46 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifndef _ZINC_CHACHA20_H
+#define _ZINC_CHACHA20_H
+
+#include <linux/kernel.h>
+#include <linux/types.h>
+
+enum {
+ CHACHA20_IV_SIZE = 16,
+ CHACHA20_KEY_SIZE = 32,
+ CHACHA20_BLOCK_SIZE = 64,
+ HCHACHA20_KEY_SIZE = 32,
+ HCHACHA20_NONCE_SIZE = 16
+};
+
+struct chacha20_ctx {
+ u32 key[8];
+ u32 counter[4];
+} __aligned(32);
+
+void chacha20_fpu_init(void);
+
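+/* The four IV words consist of a 64-bit block counter (initially zero)
+ * in the low two words, followed by the caller's 64-bit nonce in the
+ * high two words. */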
+static inline void chacha20_init(struct chacha20_ctx *state, const u8 key[CHACHA20_KEY_SIZE], const u64 nonce)
+{
+ __le32 *le_key = (__le32 *)key;
+ state->key[0] = le32_to_cpu(le_key[0]);
+ state->key[1] = le32_to_cpu(le_key[1]);
+ state->key[2] = le32_to_cpu(le_key[2]);
+ state->key[3] = le32_to_cpu(le_key[3]);
+ state->key[4] = le32_to_cpu(le_key[4]);
+ state->key[5] = le32_to_cpu(le_key[5]);
+ state->key[6] = le32_to_cpu(le_key[6]);
+ state->key[7] = le32_to_cpu(le_key[7]);
+ state->counter[0] = state->counter[1] = 0;
+ state->counter[2] = nonce & U32_MAX;
+ state->counter[3] = nonce >> 32;
+}
+void chacha20(struct chacha20_ctx *state, u8 *dst, const u8 *src, u32 len, bool have_simd);
+
+void hchacha20(u8 derived_key[CHACHA20_KEY_SIZE], const u8 nonce[HCHACHA20_NONCE_SIZE], const u8 key[HCHACHA20_KEY_SIZE], bool have_simd);
+
+#endif /* _ZINC_CHACHA20_H */
diff --git a/include/zinc/chacha20poly1305.h b/include/zinc/chacha20poly1305.h
new file mode 100644
index 000000000000..f6c5e9314859
--- /dev/null
+++ b/include/zinc/chacha20poly1305.h
@@ -0,0 +1,51 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifndef _ZINC_CHACHA20POLY1305_H
+#define _ZINC_CHACHA20POLY1305_H
+
+#include <linux/types.h>
+
+struct scatterlist;
+
+enum chacha20poly1305_lengths {
+ XCHACHA20POLY1305_NONCELEN = 24,
+ CHACHA20POLY1305_KEYLEN = 32,
+ CHACHA20POLY1305_AUTHTAGLEN = 16
+};
+
+void chacha20poly1305_encrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN]);
+
+bool __must_check chacha20poly1305_encrypt_sg(struct scatterlist *dst, struct scatterlist *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN],
+ bool have_simd);
+
+bool __must_check chacha20poly1305_decrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN]);
+
+bool __must_check chacha20poly1305_decrypt_sg(struct scatterlist *dst, struct scatterlist *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN],
+ bool have_simd);
+
+void xchacha20poly1305_encrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u8 nonce[XCHACHA20POLY1305_NONCELEN],
+ const u8 key[CHACHA20POLY1305_KEYLEN]);
+
+bool __must_check xchacha20poly1305_decrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u8 nonce[XCHACHA20POLY1305_NONCELEN],
+ const u8 key[CHACHA20POLY1305_KEYLEN]);
+
+#ifdef CONFIG_ZINC_DEBUG
+bool chacha20poly1305_selftest(void);
+#endif
+
+#endif /* _ZINC_CHACHA20POLY1305_H */
diff --git a/include/zinc/curve25519.h b/include/zinc/curve25519.h
new file mode 100644
index 000000000000..b6bd387e3b8a
--- /dev/null
+++ b/include/zinc/curve25519.h
@@ -0,0 +1,25 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifndef _ZINC_CURVE25519_H
+#define _ZINC_CURVE25519_H
+
+#include <linux/types.h>
+
+enum curve25519_lengths {
+ CURVE25519_POINT_SIZE = 32
+};
+
+bool __must_check curve25519(u8 mypublic[CURVE25519_POINT_SIZE], const u8 secret[CURVE25519_POINT_SIZE], const u8 basepoint[CURVE25519_POINT_SIZE]);
+void curve25519_generate_secret(u8 secret[CURVE25519_POINT_SIZE]);
+bool __must_check curve25519_generate_public(u8 pub[CURVE25519_POINT_SIZE], const u8 secret[CURVE25519_POINT_SIZE]);
+
+void curve25519_fpu_init(void);
+
+#ifdef CONFIG_ZINC_DEBUG
+bool curve25519_selftest(void);
+#endif
+
+#endif /* _ZINC_CURVE25519_H */
diff --git a/include/zinc/poly1305.h b/include/zinc/poly1305.h
new file mode 100644
index 000000000000..b8827493e6d7
--- /dev/null
+++ b/include/zinc/poly1305.h
@@ -0,0 +1,34 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifndef _ZINC_POLY1305_H
+#define _ZINC_POLY1305_H
+
+#include <linux/types.h>
+
+enum poly1305_lengths {
+ POLY1305_BLOCK_SIZE = 16,
+ POLY1305_KEY_SIZE = 32,
+ POLY1305_MAC_SIZE = 16
+};
+
+struct poly1305_ctx {
+ u8 opaque[24 * sizeof(u64)];
+ u32 nonce[4];
+ u8 data[POLY1305_BLOCK_SIZE];
+ size_t num;
+} __aligned(8);
+
+void poly1305_fpu_init(void);
+
+void poly1305_init(struct poly1305_ctx *ctx, const u8 key[POLY1305_KEY_SIZE], bool have_simd);
+void poly1305_update(struct poly1305_ctx *ctx, const u8 *inp, const size_t len, bool have_simd);
+void poly1305_finish(struct poly1305_ctx *ctx, u8 mac[POLY1305_MAC_SIZE], bool have_simd);
+
+#ifdef CONFIG_ZINC_DEBUG
+bool poly1305_selftest(void);
+#endif
+
+#endif /* _ZINC_POLY1305_H */
diff --git a/include/zinc/simd.h b/include/zinc/simd.h
new file mode 100644
index 000000000000..0011257e3de2
--- /dev/null
+++ b/include/zinc/simd.h
@@ -0,0 +1,60 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifndef _ZINC_SIMD_H
+#define _ZINC_SIMD_H
+
+#include <linux/sched.h>
+#if defined(CONFIG_X86_64)
+#include <linux/version.h>
+#include <asm/fpu/api.h>
+#include <asm/simd.h>
+#elif IS_ENABLED(CONFIG_KERNEL_MODE_NEON)
+#include <asm/neon.h>
+#include <asm/simd.h>
+#endif
+
+static inline bool simd_get(void)
+{
+ bool have_simd = false;
+#if defined(CONFIG_X86_64) && !defined(CONFIG_UML) && !defined(CONFIG_PREEMPT_RT_BASE)
+ have_simd = irq_fpu_usable();
+ if (have_simd)
+ kernel_fpu_begin();
+#elif IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && !defined(CONFIG_PREEMPT_RT_BASE)
+#if defined(CONFIG_ARM64)
+ have_simd = true; /* ARM64 supports NEON in any context. */
+#elif defined(CONFIG_ARM)
+ have_simd = may_use_simd(); /* ARM doesn't support NEON in interrupt context. */
+#endif
+ if (have_simd)
+ kernel_neon_begin();
+#endif
+ return have_simd;
+}
+
+static inline void simd_put(bool was_on)
+{
+#if defined(CONFIG_X86_64) && !defined(CONFIG_UML) && !defined(CONFIG_PREEMPT_RT_BASE)
+ if (was_on)
+ kernel_fpu_end();
+#elif IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && !defined(CONFIG_PREEMPT_RT_BASE)
+ if (was_on)
+ kernel_neon_end();
+#endif
+}
+
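+/* For long-running loops: momentarily relinquishes the SIMD context if
+ * a reschedule is pending, so that kernel_fpu_begin()/kernel_neon_begin()
+ * sections do not add scheduling latency under CONFIG_PREEMPT. */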
+static inline bool simd_relax(bool was_on)
+{
+#ifdef CONFIG_PREEMPT
+ if (was_on && need_resched()) {
+ simd_put(true);
+ return simd_get();
+ }
+#endif
+ return was_on;
+}
+
+#endif /* _ZINC_SIMD_H */
diff --git a/lib/Kconfig b/lib/Kconfig
index 706836ec314d..6f5361d0f797 100644
--- a/lib/Kconfig
+++ b/lib/Kconfig
@@ -478,6 +478,27 @@ config GLOB_SELFTEST
module load) by a small amount, so you're welcome to play with
it, but you probably don't need it.
+config ZINC
+ tristate
+ select CRYPTO_BLKCIPHER
+ select VFP
+ select VFPv3
+ select NEON
+ select KERNEL_MODE_NEON
+
+config ZINC_DEBUG
+ bool "Zinc cryptography library debugging and self-tests"
+ depends on ZINC
+ help
+ This builds a series of self-tests for the Zinc crypto library, which
+ help diagnose any cryptographic algorithm implementation issues that
+ might be at the root cause of potential bugs. It also adds various
+ debugging traps.
+
+ Unless you're developing and testing cryptographic routines, or are
+ especially paranoid about correctness on your hardware, you may say
+ N here.
+
#
# Netlink attribute parsing support is select'ed if needed
#
diff --git a/lib/Makefile b/lib/Makefile
index 60d0d5f90946..df8bba9775cb 100644
--- a/lib/Makefile
+++ b/lib/Makefile
@@ -211,6 +211,8 @@ obj-$(CONFIG_PERCPU_TEST) += percpu_test.o
obj-$(CONFIG_ASN1) += asn1_decoder.o
+obj-$(CONFIG_ZINC) += zinc/
+
obj-$(CONFIG_FONT_SUPPORT) += fonts/
obj-$(CONFIG_PRIME_NUMBERS) += prime_numbers.o
diff --git a/lib/zinc/Makefile b/lib/zinc/Makefile
new file mode 100644
index 000000000000..80f6f25cbc82
--- /dev/null
+++ b/lib/zinc/Makefile
@@ -0,0 +1,28 @@
+ccflags-y := -O3
+ccflags-y += -Wframe-larger-than=8192
+ccflags-y += -D'pr_fmt(fmt)=KBUILD_MODNAME ": " fmt'
+
+zinc-y += chacha20/chacha20.o
+zinc-$(CONFIG_X86_64) += chacha20/chacha20-x86_64.o
+zinc-$(CONFIG_ARM) += chacha20/chacha20-arm.o
+zinc-$(CONFIG_ARM64) += chacha20/chacha20-arm64.o
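+# $(filter yy,...) expands to y only when both config symbols below are y,
+# so the MIPS32r2 implementation is built in exactly that case.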
+zinc-$(if $(filter yy,$(CONFIG_MIPS)$(CONFIG_CPU_MIPS32_R2)),y,n) += chacha20/chacha20-mips.o
+
+zinc-y += poly1305/poly1305.o
+zinc-$(CONFIG_X86_64) += poly1305/poly1305-x86_64.o
+zinc-$(CONFIG_ARM) += poly1305/poly1305-arm.o
+zinc-$(CONFIG_ARM64) += poly1305/poly1305-arm64.o
+zinc-$(if $(filter yy,$(CONFIG_MIPS)$(CONFIG_CPU_MIPS32_R2)),y,n) += poly1305/poly1305-mips.o
+zinc-$(if $(filter yy,$(CONFIG_MIPS)$(CONFIG_64BIT)),y,n) += poly1305/poly1305-mips64.o
+
+zinc-y += chacha20poly1305.o
+
+zinc-y += curve25519/curve25519.o
+zinc-$(CONFIG_ARM) += curve25519/curve25519-arm.o
+
+zinc-y += blake2s/blake2s.o
+zinc-$(CONFIG_X86_64) += blake2s/blake2s-x86_64.o
+
+zinc-y += main.o
+
+obj-$(CONFIG_ZINC) := zinc.o
diff --git a/lib/zinc/blake2s/blake2s-x86_64.S b/lib/zinc/blake2s/blake2s-x86_64.S
new file mode 100644
index 000000000000..ee683e45a6c4
--- /dev/null
+++ b/lib/zinc/blake2s/blake2s-x86_64.S
@@ -0,0 +1,685 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright (C) 2017 Samuel Neves <sneves@....uc.pt>. All Rights Reserved.
+ */
+
+#include <linux/linkage.h>
+
+.section .rodata.cst32.BLAKE2S_IV, "aM", @progbits, 32
+.align 32
+IV: .octa 0xA54FF53A3C6EF372BB67AE856A09E667
+ .octa 0x5BE0CD191F83D9AB9B05688C510E527F
+.section .rodata.cst16.ROT16, "aM", @progbits, 16
+.align 16
+ROT16: .octa 0x0D0C0F0E09080B0A0504070601000302
+.section .rodata.cst16.ROR328, "aM", @progbits, 16
+.align 16
+ROR328: .octa 0x0C0F0E0D080B0A090407060500030201
+#ifdef CONFIG_AS_AVX512
+.section .rodata.cst64.BLAKE2S_SIGMA, "aM", @progbits, 640
+.align 64
+SIGMA:
+.long 0, 2, 4, 6, 1, 3, 5, 7, 8, 10, 12, 14, 9, 11, 13, 15
+.long 11, 2, 12, 14, 9, 8, 15, 3, 4, 0, 13, 6, 10, 1, 7, 5
+.long 10, 12, 11, 6, 5, 9, 13, 3, 4, 15, 14, 2, 0, 7, 8, 1
+.long 10, 9, 7, 0, 11, 14, 1, 12, 6, 2, 15, 3, 13, 8, 5, 4
+.long 4, 9, 8, 13, 14, 0, 10, 11, 7, 3, 12, 1, 5, 6, 15, 2
+.long 2, 10, 4, 14, 13, 3, 9, 11, 6, 5, 7, 12, 15, 1, 8, 0
+.long 4, 11, 14, 8, 13, 10, 12, 5, 2, 1, 15, 3, 9, 7, 0, 6
+.long 6, 12, 0, 13, 15, 2, 1, 10, 4, 5, 11, 14, 8, 3, 9, 7
+.long 14, 5, 4, 12, 9, 7, 3, 10, 2, 0, 6, 15, 11, 1, 13, 8
+.long 11, 7, 13, 10, 12, 14, 0, 15, 4, 5, 6, 9, 2, 1, 8, 3
+#endif /* CONFIG_AS_AVX512 */
+
+.text
+#ifdef CONFIG_AS_AVX
+ENTRY(blake2s_compress_avx)
+ movl %ecx, %ecx
+ testq %rdx, %rdx
+ je .Lendofloop
+ .align 32
+.Lbeginofloop:
+ addq %rcx, 32(%rdi)
+ vmovdqu IV+16(%rip), %xmm1
+ vmovdqu (%rsi), %xmm4
+ vpxor 32(%rdi), %xmm1, %xmm1
+ vmovdqu 16(%rsi), %xmm3
+ vshufps $136, %xmm3, %xmm4, %xmm6
+ vmovdqa ROT16(%rip), %xmm7
+ vpaddd (%rdi), %xmm6, %xmm6
+ vpaddd 16(%rdi), %xmm6, %xmm6
+ vpxor %xmm6, %xmm1, %xmm1
+ vmovdqu IV(%rip), %xmm8
+ vpshufb %xmm7, %xmm1, %xmm1
+ vmovdqu 48(%rsi), %xmm5
+ vpaddd %xmm1, %xmm8, %xmm8
+ vpxor 16(%rdi), %xmm8, %xmm9
+ vmovdqu 32(%rsi), %xmm2
+ vpblendw $12, %xmm3, %xmm5, %xmm13
+ vshufps $221, %xmm5, %xmm2, %xmm12
+ vpunpckhqdq %xmm2, %xmm4, %xmm14
+ vpslld $20, %xmm9, %xmm0
+ vpsrld $12, %xmm9, %xmm9
+ vpxor %xmm0, %xmm9, %xmm0
+ vshufps $221, %xmm3, %xmm4, %xmm9
+ vpaddd %xmm9, %xmm6, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vmovdqa ROR328(%rip), %xmm6
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm8, %xmm8
+ vpxor %xmm8, %xmm0, %xmm0
+ vpshufd $147, %xmm1, %xmm1
+ vpshufd $78, %xmm8, %xmm8
+ vpslld $25, %xmm0, %xmm10
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm10, %xmm0, %xmm0
+ vshufps $136, %xmm5, %xmm2, %xmm10
+ vpshufd $57, %xmm0, %xmm0
+ vpaddd %xmm10, %xmm9, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpaddd %xmm12, %xmm9, %xmm9
+ vpblendw $12, %xmm2, %xmm3, %xmm12
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm8, %xmm8
+ vpxor %xmm8, %xmm0, %xmm10
+ vpslld $20, %xmm10, %xmm0
+ vpsrld $12, %xmm10, %xmm10
+ vpxor %xmm0, %xmm10, %xmm0
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm8, %xmm8
+ vpxor %xmm8, %xmm0, %xmm0
+ vpshufd $57, %xmm1, %xmm1
+ vpshufd $78, %xmm8, %xmm8
+ vpslld $25, %xmm0, %xmm10
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm10, %xmm0, %xmm0
+ vpslldq $4, %xmm5, %xmm10
+ vpblendw $240, %xmm10, %xmm12, %xmm12
+ vpshufd $147, %xmm0, %xmm0
+ vpshufd $147, %xmm12, %xmm12
+ vpaddd %xmm9, %xmm12, %xmm12
+ vpaddd %xmm0, %xmm12, %xmm12
+ vpxor %xmm12, %xmm1, %xmm1
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm8, %xmm8
+ vpxor %xmm8, %xmm0, %xmm11
+ vpslld $20, %xmm11, %xmm9
+ vpsrld $12, %xmm11, %xmm11
+ vpxor %xmm9, %xmm11, %xmm0
+ vpshufd $8, %xmm2, %xmm9
+ vpblendw $192, %xmm5, %xmm3, %xmm11
+ vpblendw $240, %xmm11, %xmm9, %xmm9
+ vpshufd $177, %xmm9, %xmm9
+ vpaddd %xmm12, %xmm9, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm11
+ vpxor %xmm11, %xmm1, %xmm1
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm8, %xmm8
+ vpxor %xmm8, %xmm0, %xmm9
+ vpshufd $147, %xmm1, %xmm1
+ vpshufd $78, %xmm8, %xmm8
+ vpslld $25, %xmm9, %xmm0
+ vpsrld $7, %xmm9, %xmm9
+ vpxor %xmm0, %xmm9, %xmm0
+ vpslldq $4, %xmm3, %xmm9
+ vpblendw $48, %xmm9, %xmm2, %xmm9
+ vpblendw $240, %xmm9, %xmm4, %xmm9
+ vpshufd $57, %xmm0, %xmm0
+ vpshufd $177, %xmm9, %xmm9
+ vpaddd %xmm11, %xmm9, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm8, %xmm11
+ vpxor %xmm11, %xmm0, %xmm0
+ vpslld $20, %xmm0, %xmm8
+ vpsrld $12, %xmm0, %xmm0
+ vpxor %xmm8, %xmm0, %xmm0
+ vpunpckhdq %xmm3, %xmm4, %xmm8
+ vpblendw $12, %xmm10, %xmm8, %xmm12
+ vpshufd $177, %xmm12, %xmm12
+ vpaddd %xmm9, %xmm12, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm11, %xmm11
+ vpxor %xmm11, %xmm0, %xmm0
+ vpshufd $57, %xmm1, %xmm1
+ vpshufd $78, %xmm11, %xmm11
+ vpslld $25, %xmm0, %xmm12
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm12, %xmm0, %xmm0
+ vpunpckhdq %xmm5, %xmm2, %xmm12
+ vpshufd $147, %xmm0, %xmm0
+ vpblendw $15, %xmm13, %xmm12, %xmm12
+ vpslldq $8, %xmm5, %xmm13
+ vpshufd $210, %xmm12, %xmm12
+ vpaddd %xmm9, %xmm12, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm11, %xmm11
+ vpxor %xmm11, %xmm0, %xmm0
+ vpslld $20, %xmm0, %xmm12
+ vpsrld $12, %xmm0, %xmm0
+ vpxor %xmm12, %xmm0, %xmm0
+ vpunpckldq %xmm4, %xmm2, %xmm12
+ vpblendw $240, %xmm4, %xmm12, %xmm12
+ vpblendw $192, %xmm13, %xmm12, %xmm12
+ vpsrldq $12, %xmm3, %xmm13
+ vpaddd %xmm12, %xmm9, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm11, %xmm11
+ vpxor %xmm11, %xmm0, %xmm0
+ vpshufd $147, %xmm1, %xmm1
+ vpshufd $78, %xmm11, %xmm11
+ vpslld $25, %xmm0, %xmm12
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm12, %xmm0, %xmm0
+ vpblendw $60, %xmm2, %xmm4, %xmm12
+ vpblendw $3, %xmm13, %xmm12, %xmm12
+ vpshufd $57, %xmm0, %xmm0
+ vpshufd $78, %xmm12, %xmm12
+ vpaddd %xmm9, %xmm12, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm11, %xmm11
+ vpxor %xmm11, %xmm0, %xmm12
+ vpslld $20, %xmm12, %xmm13
+ vpsrld $12, %xmm12, %xmm0
+ vpblendw $51, %xmm3, %xmm4, %xmm12
+ vpxor %xmm13, %xmm0, %xmm0
+ vpblendw $192, %xmm10, %xmm12, %xmm10
+ vpslldq $8, %xmm2, %xmm12
+ vpshufd $27, %xmm10, %xmm10
+ vpaddd %xmm9, %xmm10, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm11, %xmm11
+ vpxor %xmm11, %xmm0, %xmm0
+ vpshufd $57, %xmm1, %xmm1
+ vpshufd $78, %xmm11, %xmm11
+ vpslld $25, %xmm0, %xmm10
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm10, %xmm0, %xmm0
+ vpunpckhdq %xmm2, %xmm8, %xmm10
+ vpshufd $147, %xmm0, %xmm0
+ vpblendw $12, %xmm5, %xmm10, %xmm10
+ vpshufd $210, %xmm10, %xmm10
+ vpaddd %xmm9, %xmm10, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm11, %xmm11
+ vpxor %xmm11, %xmm0, %xmm10
+ vpslld $20, %xmm10, %xmm0
+ vpsrld $12, %xmm10, %xmm10
+ vpxor %xmm0, %xmm10, %xmm0
+ vpblendw $12, %xmm4, %xmm5, %xmm10
+ vpblendw $192, %xmm12, %xmm10, %xmm10
+ vpunpckldq %xmm2, %xmm4, %xmm12
+ vpshufd $135, %xmm10, %xmm10
+ vpaddd %xmm9, %xmm10, %xmm9
+ vpaddd %xmm0, %xmm9, %xmm9
+ vpxor %xmm9, %xmm1, %xmm1
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm11, %xmm13
+ vpxor %xmm13, %xmm0, %xmm0
+ vpshufd $147, %xmm1, %xmm1
+ vpshufd $78, %xmm13, %xmm13
+ vpslld $25, %xmm0, %xmm10
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm10, %xmm0, %xmm0
+ vpblendw $15, %xmm3, %xmm4, %xmm10
+ vpblendw $192, %xmm5, %xmm10, %xmm10
+ vpshufd $57, %xmm0, %xmm0
+ vpshufd $198, %xmm10, %xmm10
+ vpaddd %xmm9, %xmm10, %xmm10
+ vpaddd %xmm0, %xmm10, %xmm10
+ vpxor %xmm10, %xmm1, %xmm1
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm13, %xmm13
+ vpxor %xmm13, %xmm0, %xmm9
+ vpslld $20, %xmm9, %xmm0
+ vpsrld $12, %xmm9, %xmm9
+ vpxor %xmm0, %xmm9, %xmm0
+ vpunpckhdq %xmm2, %xmm3, %xmm9
+ vpunpcklqdq %xmm12, %xmm9, %xmm15
+ vpunpcklqdq %xmm12, %xmm8, %xmm12
+ vpblendw $15, %xmm5, %xmm8, %xmm8
+ vpaddd %xmm15, %xmm10, %xmm15
+ vpaddd %xmm0, %xmm15, %xmm15
+ vpxor %xmm15, %xmm1, %xmm1
+ vpshufd $141, %xmm8, %xmm8
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm13, %xmm13
+ vpxor %xmm13, %xmm0, %xmm0
+ vpshufd $57, %xmm1, %xmm1
+ vpshufd $78, %xmm13, %xmm13
+ vpslld $25, %xmm0, %xmm10
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm10, %xmm0, %xmm0
+ vpunpcklqdq %xmm2, %xmm3, %xmm10
+ vpshufd $147, %xmm0, %xmm0
+ vpblendw $51, %xmm14, %xmm10, %xmm14
+ vpshufd $135, %xmm14, %xmm14
+ vpaddd %xmm15, %xmm14, %xmm14
+ vpaddd %xmm0, %xmm14, %xmm14
+ vpxor %xmm14, %xmm1, %xmm1
+ vpunpcklqdq %xmm3, %xmm4, %xmm15
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm13, %xmm13
+ vpxor %xmm13, %xmm0, %xmm0
+ vpslld $20, %xmm0, %xmm11
+ vpsrld $12, %xmm0, %xmm0
+ vpxor %xmm11, %xmm0, %xmm0
+ vpunpckhqdq %xmm5, %xmm3, %xmm11
+ vpblendw $51, %xmm15, %xmm11, %xmm11
+ vpunpckhqdq %xmm3, %xmm5, %xmm15
+ vpaddd %xmm11, %xmm14, %xmm11
+ vpaddd %xmm0, %xmm11, %xmm11
+ vpxor %xmm11, %xmm1, %xmm1
+ vpshufb %xmm6, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm13, %xmm13
+ vpxor %xmm13, %xmm0, %xmm0
+ vpshufd $147, %xmm1, %xmm1
+ vpshufd $78, %xmm13, %xmm13
+ vpslld $25, %xmm0, %xmm14
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm14, %xmm0, %xmm14
+ vpunpckhqdq %xmm4, %xmm2, %xmm0
+ vpshufd $57, %xmm14, %xmm14
+ vpblendw $51, %xmm15, %xmm0, %xmm15
+ vpaddd %xmm15, %xmm11, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm1, %xmm1
+ vpshufb %xmm7, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm13, %xmm13
+ vpxor %xmm13, %xmm14, %xmm14
+ vpslld $20, %xmm14, %xmm11
+ vpsrld $12, %xmm14, %xmm14
+ vpxor %xmm11, %xmm14, %xmm14
+ vpblendw $3, %xmm2, %xmm4, %xmm11
+ vpslldq $8, %xmm11, %xmm0
+ vpblendw $15, %xmm5, %xmm0, %xmm0
+ vpshufd $99, %xmm0, %xmm0
+ vpaddd %xmm15, %xmm0, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm1, %xmm0
+ vpaddd %xmm12, %xmm15, %xmm15
+ vpshufb %xmm6, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm13, %xmm13
+ vpxor %xmm13, %xmm14, %xmm14
+ vpshufd $57, %xmm0, %xmm0
+ vpshufd $78, %xmm13, %xmm13
+ vpslld $25, %xmm14, %xmm1
+ vpsrld $7, %xmm14, %xmm14
+ vpxor %xmm1, %xmm14, %xmm14
+ vpblendw $3, %xmm5, %xmm4, %xmm1
+ vpshufd $147, %xmm14, %xmm14
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpshufb %xmm7, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm13, %xmm13
+ vpxor %xmm13, %xmm14, %xmm14
+ vpslld $20, %xmm14, %xmm12
+ vpsrld $12, %xmm14, %xmm14
+ vpxor %xmm12, %xmm14, %xmm14
+ vpsrldq $4, %xmm2, %xmm12
+ vpblendw $60, %xmm12, %xmm1, %xmm1
+ vpaddd %xmm1, %xmm15, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpblendw $12, %xmm4, %xmm3, %xmm1
+ vpshufb %xmm6, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm13, %xmm13
+ vpxor %xmm13, %xmm14, %xmm14
+ vpshufd $147, %xmm0, %xmm0
+ vpshufd $78, %xmm13, %xmm13
+ vpslld $25, %xmm14, %xmm12
+ vpsrld $7, %xmm14, %xmm14
+ vpxor %xmm12, %xmm14, %xmm14
+ vpsrldq $4, %xmm5, %xmm12
+ vpblendw $48, %xmm12, %xmm1, %xmm1
+ vpshufd $33, %xmm5, %xmm12
+ vpshufd $57, %xmm14, %xmm14
+ vpshufd $108, %xmm1, %xmm1
+ vpblendw $51, %xmm12, %xmm10, %xmm12
+ vpaddd %xmm15, %xmm1, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpaddd %xmm12, %xmm15, %xmm15
+ vpshufb %xmm7, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm13, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpslld $20, %xmm14, %xmm13
+ vpsrld $12, %xmm14, %xmm14
+ vpxor %xmm13, %xmm14, %xmm14
+ vpslldq $12, %xmm3, %xmm13
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpshufb %xmm6, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpshufd $57, %xmm0, %xmm0
+ vpshufd $78, %xmm1, %xmm1
+ vpslld $25, %xmm14, %xmm12
+ vpsrld $7, %xmm14, %xmm14
+ vpxor %xmm12, %xmm14, %xmm14
+ vpblendw $51, %xmm5, %xmm4, %xmm12
+ vpshufd $147, %xmm14, %xmm14
+ vpblendw $192, %xmm13, %xmm12, %xmm12
+ vpaddd %xmm12, %xmm15, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpsrldq $4, %xmm3, %xmm12
+ vpshufb %xmm7, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpslld $20, %xmm14, %xmm13
+ vpsrld $12, %xmm14, %xmm14
+ vpxor %xmm13, %xmm14, %xmm14
+ vpblendw $48, %xmm2, %xmm5, %xmm13
+ vpblendw $3, %xmm12, %xmm13, %xmm13
+ vpshufd $156, %xmm13, %xmm13
+ vpaddd %xmm15, %xmm13, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpshufb %xmm6, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpshufd $147, %xmm0, %xmm0
+ vpshufd $78, %xmm1, %xmm1
+ vpslld $25, %xmm14, %xmm13
+ vpsrld $7, %xmm14, %xmm14
+ vpxor %xmm13, %xmm14, %xmm14
+ vpunpcklqdq %xmm2, %xmm4, %xmm13
+ vpshufd $57, %xmm14, %xmm14
+ vpblendw $12, %xmm12, %xmm13, %xmm12
+ vpshufd $180, %xmm12, %xmm12
+ vpaddd %xmm15, %xmm12, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpshufb %xmm7, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpslld $20, %xmm14, %xmm12
+ vpsrld $12, %xmm14, %xmm14
+ vpxor %xmm12, %xmm14, %xmm14
+ vpunpckhqdq %xmm9, %xmm4, %xmm12
+ vpshufd $198, %xmm12, %xmm12
+ vpaddd %xmm15, %xmm12, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpaddd %xmm15, %xmm8, %xmm15
+ vpshufb %xmm6, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpshufd $57, %xmm0, %xmm0
+ vpshufd $78, %xmm1, %xmm1
+ vpslld $25, %xmm14, %xmm12
+ vpsrld $7, %xmm14, %xmm14
+ vpxor %xmm12, %xmm14, %xmm14
+ vpsrldq $4, %xmm4, %xmm12
+ vpshufd $147, %xmm14, %xmm14
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm15, %xmm0, %xmm0
+ vpshufb %xmm7, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpslld $20, %xmm14, %xmm8
+ vpsrld $12, %xmm14, %xmm14
+ vpxor %xmm14, %xmm8, %xmm14
+ vpblendw $48, %xmm5, %xmm2, %xmm8
+ vpblendw $3, %xmm12, %xmm8, %xmm8
+ vpunpckhqdq %xmm5, %xmm4, %xmm12
+ vpshufd $75, %xmm8, %xmm8
+ vpblendw $60, %xmm10, %xmm12, %xmm10
+ vpaddd %xmm15, %xmm8, %xmm15
+ vpaddd %xmm14, %xmm15, %xmm15
+ vpxor %xmm0, %xmm15, %xmm0
+ vpshufd $45, %xmm10, %xmm10
+ vpshufb %xmm6, %xmm0, %xmm0
+ vpaddd %xmm15, %xmm10, %xmm15
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm1, %xmm14, %xmm14
+ vpshufd $147, %xmm0, %xmm0
+ vpshufd $78, %xmm1, %xmm1
+ vpslld $25, %xmm14, %xmm8
+ vpsrld $7, %xmm14, %xmm14
+ vpxor %xmm14, %xmm8, %xmm8
+ vpshufd $57, %xmm8, %xmm8
+ vpaddd %xmm8, %xmm15, %xmm15
+ vpxor %xmm0, %xmm15, %xmm0
+ vpshufb %xmm7, %xmm0, %xmm0
+ vpaddd %xmm0, %xmm1, %xmm1
+ vpxor %xmm8, %xmm1, %xmm8
+ vpslld $20, %xmm8, %xmm10
+ vpsrld $12, %xmm8, %xmm8
+ vpxor %xmm8, %xmm10, %xmm10
+ vpunpckldq %xmm3, %xmm4, %xmm8
+ vpunpcklqdq %xmm9, %xmm8, %xmm9
+ vpaddd %xmm9, %xmm15, %xmm9
+ vpaddd %xmm10, %xmm9, %xmm9
+ vpxor %xmm0, %xmm9, %xmm8
+ vpshufb %xmm6, %xmm8, %xmm8
+ vpaddd %xmm8, %xmm1, %xmm1
+ vpxor %xmm1, %xmm10, %xmm10
+ vpshufd $57, %xmm8, %xmm8
+ vpshufd $78, %xmm1, %xmm1
+ vpslld $25, %xmm10, %xmm12
+ vpsrld $7, %xmm10, %xmm10
+ vpxor %xmm10, %xmm12, %xmm10
+ vpblendw $48, %xmm4, %xmm3, %xmm12
+ vpshufd $147, %xmm10, %xmm0
+ vpunpckhdq %xmm5, %xmm3, %xmm10
+ vpshufd $78, %xmm12, %xmm12
+ vpunpcklqdq %xmm4, %xmm10, %xmm10
+ vpblendw $192, %xmm2, %xmm10, %xmm10
+ vpshufhw $78, %xmm10, %xmm10
+ vpaddd %xmm10, %xmm9, %xmm10
+ vpaddd %xmm0, %xmm10, %xmm10
+ vpxor %xmm8, %xmm10, %xmm8
+ vpshufb %xmm7, %xmm8, %xmm8
+ vpaddd %xmm8, %xmm1, %xmm1
+ vpxor %xmm0, %xmm1, %xmm9
+ vpslld $20, %xmm9, %xmm0
+ vpsrld $12, %xmm9, %xmm9
+ vpxor %xmm9, %xmm0, %xmm0
+ vpunpckhdq %xmm5, %xmm4, %xmm9
+ vpblendw $240, %xmm9, %xmm2, %xmm13
+ vpshufd $39, %xmm13, %xmm13
+ vpaddd %xmm10, %xmm13, %xmm10
+ vpaddd %xmm0, %xmm10, %xmm10
+ vpxor %xmm8, %xmm10, %xmm8
+ vpblendw $12, %xmm4, %xmm2, %xmm13
+ vpshufb %xmm6, %xmm8, %xmm8
+ vpslldq $4, %xmm13, %xmm13
+ vpblendw $15, %xmm5, %xmm13, %xmm13
+ vpaddd %xmm8, %xmm1, %xmm1
+ vpxor %xmm1, %xmm0, %xmm0
+ vpaddd %xmm13, %xmm10, %xmm13
+ vpshufd $147, %xmm8, %xmm8
+ vpshufd $78, %xmm1, %xmm1
+ vpslld $25, %xmm0, %xmm14
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm0, %xmm14, %xmm14
+ vpshufd $57, %xmm14, %xmm14
+ vpaddd %xmm14, %xmm13, %xmm13
+ vpxor %xmm8, %xmm13, %xmm8
+ vpaddd %xmm13, %xmm12, %xmm12
+ vpshufb %xmm7, %xmm8, %xmm8
+ vpaddd %xmm8, %xmm1, %xmm1
+ vpxor %xmm14, %xmm1, %xmm14
+ vpslld $20, %xmm14, %xmm10
+ vpsrld $12, %xmm14, %xmm14
+ vpxor %xmm14, %xmm10, %xmm10
+ vpaddd %xmm10, %xmm12, %xmm12
+ vpxor %xmm8, %xmm12, %xmm8
+ vpshufb %xmm6, %xmm8, %xmm8
+ vpaddd %xmm8, %xmm1, %xmm1
+ vpxor %xmm1, %xmm10, %xmm0
+ vpshufd $57, %xmm8, %xmm8
+ vpshufd $78, %xmm1, %xmm1
+ vpslld $25, %xmm0, %xmm10
+ vpsrld $7, %xmm0, %xmm0
+ vpxor %xmm0, %xmm10, %xmm10
+ vpblendw $48, %xmm2, %xmm3, %xmm0
+ vpblendw $15, %xmm11, %xmm0, %xmm0
+ vpshufd $147, %xmm10, %xmm10
+ vpshufd $114, %xmm0, %xmm0
+ vpaddd %xmm12, %xmm0, %xmm0
+ vpaddd %xmm10, %xmm0, %xmm0
+ vpxor %xmm8, %xmm0, %xmm8
+ vpshufb %xmm7, %xmm8, %xmm8
+ vpaddd %xmm8, %xmm1, %xmm1
+ vpxor %xmm10, %xmm1, %xmm10
+ vpslld $20, %xmm10, %xmm11
+ vpsrld $12, %xmm10, %xmm10
+ vpxor %xmm10, %xmm11, %xmm10
+ vpslldq $4, %xmm4, %xmm11
+ vpblendw $192, %xmm11, %xmm3, %xmm3
+ vpunpckldq %xmm5, %xmm4, %xmm4
+ vpshufd $99, %xmm3, %xmm3
+ vpaddd %xmm0, %xmm3, %xmm3
+ vpaddd %xmm10, %xmm3, %xmm3
+ vpxor %xmm8, %xmm3, %xmm11
+ vpunpckldq %xmm5, %xmm2, %xmm0
+ vpblendw $192, %xmm2, %xmm5, %xmm2
+ vpshufb %xmm6, %xmm11, %xmm11
+ vpunpckhqdq %xmm0, %xmm9, %xmm0
+ vpblendw $15, %xmm4, %xmm2, %xmm4
+ vpaddd %xmm11, %xmm1, %xmm1
+ vpxor %xmm1, %xmm10, %xmm10
+ vpshufd $147, %xmm11, %xmm11
+ vpshufd $201, %xmm0, %xmm0
+ vpslld $25, %xmm10, %xmm8
+ vpsrld $7, %xmm10, %xmm10
+ vpxor %xmm10, %xmm8, %xmm10
+ vpshufd $78, %xmm1, %xmm1
+ vpaddd %xmm3, %xmm0, %xmm0
+ vpshufd $27, %xmm4, %xmm4
+ vpshufd $57, %xmm10, %xmm10
+ vpaddd %xmm10, %xmm0, %xmm0
+ vpxor %xmm11, %xmm0, %xmm11
+ vpaddd %xmm0, %xmm4, %xmm0
+ vpshufb %xmm7, %xmm11, %xmm7
+ vpaddd %xmm7, %xmm1, %xmm1
+ vpxor %xmm10, %xmm1, %xmm10
+ vpslld $20, %xmm10, %xmm8
+ vpsrld $12, %xmm10, %xmm10
+ vpxor %xmm10, %xmm8, %xmm8
+ vpaddd %xmm8, %xmm0, %xmm0
+ vpxor %xmm7, %xmm0, %xmm7
+ vpshufb %xmm6, %xmm7, %xmm6
+ vpaddd %xmm6, %xmm1, %xmm1
+ vpxor %xmm1, %xmm8, %xmm8
+ vpshufd $78, %xmm1, %xmm1
+ vpshufd $57, %xmm6, %xmm6
+ vpslld $25, %xmm8, %xmm2
+ vpsrld $7, %xmm8, %xmm8
+ vpxor %xmm8, %xmm2, %xmm8
+ vpxor (%rdi), %xmm1, %xmm1
+ vpshufd $147, %xmm8, %xmm8
+ vpxor %xmm0, %xmm1, %xmm0
+ vmovups %xmm0, (%rdi)
+ vpxor 16(%rdi), %xmm8, %xmm0
+ vpxor %xmm6, %xmm0, %xmm6
+ vmovups %xmm6, 16(%rdi)
+ addq $64, %rsi
+ decq %rdx
+ jnz .Lbeginofloop
+.Lendofloop:
+ ret
+ENDPROC(blake2s_compress_avx)
+#endif /* CONFIG_AS_AVX */
+
+#ifdef CONFIG_AS_AVX512
+ENTRY(blake2s_compress_avx512)
+ vmovdqu (%rdi),%xmm0
+ vmovdqu 0x10(%rdi),%xmm1
+ vmovdqu 0x20(%rdi),%xmm4
+ vmovq %rcx,%xmm5
+ vmovdqa IV(%rip),%xmm14
+ vmovdqa IV+16(%rip),%xmm15
+ jmp .Lblake2s_compress_avx512_mainloop
+.align 32
+.Lblake2s_compress_avx512_mainloop:
+ vmovdqa %xmm0,%xmm10
+ vmovdqa %xmm1,%xmm11
+ vpaddq %xmm5,%xmm4,%xmm4
+ vmovdqa %xmm14,%xmm2
+ vpxor %xmm15,%xmm4,%xmm3
+ vmovdqu (%rsi),%ymm6
+ vmovdqu 0x20(%rsi),%ymm7
+ addq $0x40,%rsi
+ leaq SIGMA(%rip),%rax
+ movb $0xa,%cl
+.Lblake2s_compress_avx512_roundloop:
+ addq $0x40,%rax
+ vmovdqa -0x40(%rax),%ymm8
+ vmovdqa -0x20(%rax),%ymm9
+ vpermi2d %ymm7,%ymm6,%ymm8
+ vpermi2d %ymm7,%ymm6,%ymm9
+ vmovdqa %ymm8,%ymm6
+ vmovdqa %ymm9,%ymm7
+ vpaddd %xmm8,%xmm0,%xmm0
+ vpaddd %xmm1,%xmm0,%xmm0
+ vpxor %xmm0,%xmm3,%xmm3
+ vprord $0x10,%xmm3,%xmm3
+ vpaddd %xmm3,%xmm2,%xmm2
+ vpxor %xmm2,%xmm1,%xmm1
+ vprord $0xc,%xmm1,%xmm1
+ vextracti128 $0x1,%ymm8,%xmm8
+ vpaddd %xmm8,%xmm0,%xmm0
+ vpaddd %xmm1,%xmm0,%xmm0
+ vpxor %xmm0,%xmm3,%xmm3
+ vprord $0x8,%xmm3,%xmm3
+ vpaddd %xmm3,%xmm2,%xmm2
+ vpxor %xmm2,%xmm1,%xmm1
+ vprord $0x7,%xmm1,%xmm1
+ vpshufd $0x39,%xmm1,%xmm1
+ vpshufd $0x4e,%xmm2,%xmm2
+ vpshufd $0x93,%xmm3,%xmm3
+ vpaddd %xmm9,%xmm0,%xmm0
+ vpaddd %xmm1,%xmm0,%xmm0
+ vpxor %xmm0,%xmm3,%xmm3
+ vprord $0x10,%xmm3,%xmm3
+ vpaddd %xmm3,%xmm2,%xmm2
+ vpxor %xmm2,%xmm1,%xmm1
+ vprord $0xc,%xmm1,%xmm1
+ vextracti128 $0x1,%ymm9,%xmm9
+ vpaddd %xmm9,%xmm0,%xmm0
+ vpaddd %xmm1,%xmm0,%xmm0
+ vpxor %xmm0,%xmm3,%xmm3
+ vprord $0x8,%xmm3,%xmm3
+ vpaddd %xmm3,%xmm2,%xmm2
+ vpxor %xmm2,%xmm1,%xmm1
+ vprord $0x7,%xmm1,%xmm1
+ vpshufd $0x93,%xmm1,%xmm1
+ vpshufd $0x4e,%xmm2,%xmm2
+ vpshufd $0x39,%xmm3,%xmm3
+ decb %cl
+ jne .Lblake2s_compress_avx512_roundloop
+ vpxor %xmm10,%xmm0,%xmm0
+ vpxor %xmm11,%xmm1,%xmm1
+ vpxor %xmm2,%xmm0,%xmm0
+ vpxor %xmm3,%xmm1,%xmm1
+ decq %rdx
+ jne .Lblake2s_compress_avx512_mainloop
+ vmovdqu %xmm0,(%rdi)
+ vmovdqu %xmm1,0x10(%rdi)
+ vmovdqu %xmm4,0x20(%rdi)
+ vzeroupper
+ retq
+ENDPROC(blake2s_compress_avx512)
+#endif /* CONFIG_AS_AVX512 */
diff --git a/lib/zinc/blake2s/blake2s.c b/lib/zinc/blake2s/blake2s.c
new file mode 100644
index 000000000000..7f6e8c2cab3c
--- /dev/null
+++ b/lib/zinc/blake2s/blake2s.c
@@ -0,0 +1,292 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ *
+ * Original author: Samuel Neves <sneves@....uc.pt>
+ */
+
+#include <zinc/blake2s.h>
+
+#include <linux/types.h>
+#include <linux/string.h>
+#include <linux/kernel.h>
+#include <linux/bug.h>
+#include <asm/unaligned.h>
+
+typedef union {
+ struct {
+ u8 digest_length;
+ u8 key_length;
+ u8 fanout;
+ u8 depth;
+ u32 leaf_length;
+ u32 node_offset;
+ u16 xof_length;
+ u8 node_depth;
+ u8 inner_length;
+ u8 salt[8];
+ u8 personal[8];
+ };
+ __le32 words[8];
+} __packed blake2s_param;
+
+static const u32 blake2s_iv[8] = {
+ 0x6A09E667UL, 0xBB67AE85UL, 0x3C6EF372UL, 0xA54FF53AUL,
+ 0x510E527FUL, 0x9B05688CUL, 0x1F83D9ABUL, 0x5BE0CD19UL
+};
+
+static const u8 blake2s_sigma[10][16] = {
+ {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15},
+ {14, 10, 4, 8, 9, 15, 13, 6, 1, 12, 0, 2, 11, 7, 5, 3},
+ {11, 8, 12, 0, 5, 2, 15, 13, 10, 14, 3, 6, 7, 1, 9, 4},
+ {7, 9, 3, 1, 13, 12, 11, 14, 2, 6, 5, 10, 4, 0, 15, 8},
+ {9, 0, 5, 7, 2, 4, 10, 15, 14, 1, 11, 12, 6, 8, 3, 13},
+ {2, 12, 6, 10, 0, 11, 8, 3, 4, 13, 7, 5, 15, 14, 1, 9},
+ {12, 5, 1, 15, 14, 13, 4, 10, 0, 7, 6, 3, 9, 2, 8, 11},
+ {13, 11, 7, 14, 12, 1, 3, 9, 5, 0, 15, 4, 8, 6, 2, 10},
+ {6, 15, 14, 9, 11, 3, 0, 8, 12, 2, 13, 7, 1, 4, 10, 5},
+ {10, 2, 8, 4, 7, 6, 1, 5, 15, 11, 9, 14, 3, 12, 13, 0},
+};
+
+static inline void blake2s_set_lastblock(struct blake2s_state *state)
+{
+ if (state->last_node)
+ state->f[1] = -1;
+ state->f[0] = -1;
+}
+
+static inline void blake2s_increment_counter(struct blake2s_state *state, const u32 inc)
+{
+ state->t[0] += inc;
+ state->t[1] += (state->t[0] < inc);
+}
+
+static inline void blake2s_init_param(struct blake2s_state *state, const blake2s_param *param)
+{
+ int i;
+
+ memset(state, 0, sizeof(struct blake2s_state));
+ for (i = 0; i < 8; ++i)
+ state->h[i] = blake2s_iv[i] ^ le32_to_cpu(param->words[i]);
+}
+
+void blake2s_init(struct blake2s_state *state, const size_t outlen)
+{
+ blake2s_param param __aligned(__alignof__(u32)) = {
+ .digest_length = outlen,
+ .fanout = 1,
+ .depth = 1
+ };
+
+#ifdef CONFIG_ZINC_DEBUG
+ BUG_ON(!outlen || outlen > BLAKE2S_OUTBYTES);
+#endif
+ blake2s_init_param(state, &param);
+}
+EXPORT_SYMBOL(blake2s_init);
+
+void blake2s_init_key(struct blake2s_state *state, const size_t outlen, const void *key, const size_t keylen)
+{
+ blake2s_param param = {
+ .digest_length = outlen,
+ .key_length = keylen,
+ .fanout = 1,
+ .depth = 1
+ };
+ u8 block[BLAKE2S_BLOCKBYTES] = { 0 };
+
+#ifdef CONFIG_ZINC_DEBUG
+ BUG_ON(!outlen || outlen > BLAKE2S_OUTBYTES || !key || !keylen || keylen > BLAKE2S_KEYBYTES);
+#endif
+ blake2s_init_param(state, &param);
+ memcpy(block, key, keylen);
+ blake2s_update(state, block, BLAKE2S_BLOCKBYTES);
+ memzero_explicit(block, BLAKE2S_BLOCKBYTES);
+}
+EXPORT_SYMBOL(blake2s_init_key);
+
+#ifdef CONFIG_X86_64
+#include <asm/cpufeature.h>
+#include <asm/processor.h>
+#include <asm/fpu/api.h>
+#include <asm/simd.h>
+static bool blake2s_use_avx __ro_after_init;
+static bool blake2s_use_avx512 __ro_after_init;
+void __init blake2s_fpu_init(void)
+{
+#ifndef CONFIG_UML
+ blake2s_use_avx = boot_cpu_has(X86_FEATURE_AVX) && cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM, NULL);
+ blake2s_use_avx512 = boot_cpu_has(X86_FEATURE_AVX) && boot_cpu_has(X86_FEATURE_AVX2) && boot_cpu_has(X86_FEATURE_AVX512F) && boot_cpu_has(X86_FEATURE_AVX512VL) && cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM | XFEATURE_MASK_AVX512, NULL);
+#endif
+}
+#ifdef CONFIG_AS_AVX
+asmlinkage void blake2s_compress_avx(struct blake2s_state *state, const u8 *block, const size_t nblocks, const u32 inc);
+#endif
+#ifdef CONFIG_AS_AVX512
+asmlinkage void blake2s_compress_avx512(struct blake2s_state *state, const u8 *block, const size_t nblocks, const u32 inc);
+#endif
+#else
+void __init blake2s_fpu_init(void) { }
+#endif
+
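+/* The inc argument is the number of message bytes consumed per
+ * compression: a full BLAKE2S_BLOCKBYTES for intermediate blocks, or the
+ * residual buflen for the final call from __blake2s_final(). */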
+static inline void blake2s_compress(struct blake2s_state *state, const u8 *block, size_t nblocks, const u32 inc)
+{
+ u32 m[16];
+ u32 v[16];
+ int i;
+
+#ifdef CONFIG_ZINC_DEBUG
+ BUG_ON(nblocks > 1 && inc != BLAKE2S_BLOCKBYTES);
+#endif
+
+#ifdef CONFIG_X86_64
+#ifdef CONFIG_AS_AVX512
+ if (blake2s_use_avx512 && irq_fpu_usable()) {
+ kernel_fpu_begin();
+ blake2s_compress_avx512(state, block, nblocks, inc);
+ kernel_fpu_end();
+ return;
+ }
+#endif
+#ifdef CONFIG_AS_AVX
+ if (blake2s_use_avx && irq_fpu_usable()) {
+ kernel_fpu_begin();
+ blake2s_compress_avx(state, block, nblocks, inc);
+ kernel_fpu_end();
+ return;
+ }
+#endif
+#endif
+
+ while (nblocks > 0) {
+ blake2s_increment_counter(state, inc);
+
+#ifdef __LITTLE_ENDIAN
+ memcpy(m, block, BLAKE2S_BLOCKBYTES);
+#else
+ for (i = 0; i < 16; ++i)
+ m[i] = get_unaligned_le32(block + i * sizeof(m[i]));
+#endif
+ memcpy(v, state->h, 32);
+ v[ 8] = blake2s_iv[0];
+ v[ 9] = blake2s_iv[1];
+ v[10] = blake2s_iv[2];
+ v[11] = blake2s_iv[3];
+ v[12] = blake2s_iv[4] ^ state->t[0];
+ v[13] = blake2s_iv[5] ^ state->t[1];
+ v[14] = blake2s_iv[6] ^ state->f[0];
+ v[15] = blake2s_iv[7] ^ state->f[1];
+
+#define G(r, i, a, b, c, d) do { \
+ a += b + m[blake2s_sigma[r][2 * i + 0]]; \
+ d = ror32(d ^ a, 16); \
+ c += d; \
+ b = ror32(b ^ c, 12); \
+ a += b + m[blake2s_sigma[r][2 * i + 1]]; \
+ d = ror32(d ^ a, 8); \
+ c += d; \
+ b = ror32(b ^ c, 7); \
+} while (0)
+
+#define ROUND(r) do { \
+ G(r, 0, v[0], v[ 4], v[ 8], v[12]); \
+ G(r, 1, v[1], v[ 5], v[ 9], v[13]); \
+ G(r, 2, v[2], v[ 6], v[10], v[14]); \
+ G(r, 3, v[3], v[ 7], v[11], v[15]); \
+ G(r, 4, v[0], v[ 5], v[10], v[15]); \
+ G(r, 5, v[1], v[ 6], v[11], v[12]); \
+ G(r, 6, v[2], v[ 7], v[ 8], v[13]); \
+ G(r, 7, v[3], v[ 4], v[ 9], v[14]); \
+} while (0)
+ ROUND(0);
+ ROUND(1);
+ ROUND(2);
+ ROUND(3);
+ ROUND(4);
+ ROUND(5);
+ ROUND(6);
+ ROUND(7);
+ ROUND(8);
+ ROUND(9);
+
+#undef G
+#undef ROUND
+
+ for (i = 0; i < 8; ++i)
+ state->h[i] ^= v[i] ^ v[i + 8];
+
+ block += BLAKE2S_BLOCKBYTES;
+ --nblocks;
+ }
+}
+
+void blake2s_update(struct blake2s_state *state, const u8 *in, size_t inlen)
+{
+ const size_t fill = BLAKE2S_BLOCKBYTES - state->buflen;
+
+ if (unlikely(!inlen))
+ return;
+ if (inlen > fill) {
+ memcpy(state->buf + state->buflen, in, fill);
+ blake2s_compress(state, state->buf, 1, BLAKE2S_BLOCKBYTES);
+ state->buflen = 0;
+ in += fill;
+ inlen -= fill;
+ }
+ if (inlen > BLAKE2S_BLOCKBYTES) {
+ const size_t nblocks = (inlen + BLAKE2S_BLOCKBYTES - 1) / BLAKE2S_BLOCKBYTES;
+ /* Hash one less (full) block than strictly possible */
+ blake2s_compress(state, in, nblocks - 1, BLAKE2S_BLOCKBYTES);
+ in += BLAKE2S_BLOCKBYTES * (nblocks - 1);
+ inlen -= BLAKE2S_BLOCKBYTES * (nblocks - 1);
+ }
+ memcpy(state->buf + state->buflen, in, inlen);
+ state->buflen += inlen;
+}
+EXPORT_SYMBOL(blake2s_update);
+
+void __blake2s_final(struct blake2s_state *state)
+{
+ blake2s_set_lastblock(state);
+ memset(state->buf + state->buflen, 0, BLAKE2S_BLOCKBYTES - state->buflen); /* Padding */
+ blake2s_compress(state, state->buf, 1, state->buflen);
+}
+EXPORT_SYMBOL(__blake2s_final);
+
+void blake2s_hmac(u8 *out, const u8 *in, const u8 *key, const size_t outlen, const size_t inlen, const size_t keylen)
+{
+ struct blake2s_state state;
+ u8 x_key[BLAKE2S_BLOCKBYTES] __aligned(__alignof__(u32)) = { 0 };
+ u8 i_hash[BLAKE2S_OUTBYTES] __aligned(__alignof__(u32));
+ int i;
+
+ if (keylen > BLAKE2S_BLOCKBYTES) {
+ blake2s_init(&state, BLAKE2S_OUTBYTES);
+ blake2s_update(&state, key, keylen);
+ blake2s_final(&state, x_key, BLAKE2S_OUTBYTES);
+ } else
+ memcpy(x_key, key, keylen);
+
+ for (i = 0; i < BLAKE2S_BLOCKBYTES; ++i)
+ x_key[i] ^= 0x36;
+
+ blake2s_init(&state, BLAKE2S_OUTBYTES);
+ blake2s_update(&state, x_key, BLAKE2S_BLOCKBYTES);
+ blake2s_update(&state, in, inlen);
+ blake2s_final(&state, i_hash, BLAKE2S_OUTBYTES);
+
+ for (i = 0; i < BLAKE2S_BLOCKBYTES; ++i)
+ x_key[i] ^= 0x5c ^ 0x36;
+
+ blake2s_init(&state, BLAKE2S_OUTBYTES);
+ blake2s_update(&state, x_key, BLAKE2S_BLOCKBYTES);
+ blake2s_update(&state, i_hash, BLAKE2S_OUTBYTES);
+ blake2s_final(&state, i_hash, BLAKE2S_OUTBYTES);
+
+ memcpy(out, i_hash, outlen);
+ memzero_explicit(x_key, BLAKE2S_BLOCKBYTES);
+ memzero_explicit(i_hash, BLAKE2S_OUTBYTES);
+}
+EXPORT_SYMBOL(blake2s_hmac);
+
+#include "selftest/blake2s.h"
diff --git a/lib/zinc/chacha20/chacha20-arm.S b/lib/zinc/chacha20/chacha20-arm.S
new file mode 100644
index 000000000000..601b4e34283d
--- /dev/null
+++ b/lib/zinc/chacha20/chacha20-arm.S
@@ -0,0 +1,1471 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#include <linux/linkage.h>
+
+.text
+#if defined(__thumb2__) || defined(__clang__)
+.syntax unified
+#endif
+#if defined(__thumb2__)
+.thumb
+#else
+.code 32
+#endif
+
+#if defined(__thumb2__) || defined(__clang__)
+#define ldrhsb ldrbhs
+#endif
+
+.align 5
+.Lsigma:
+.long 0x61707865,0x3320646e,0x79622d32,0x6b206574 @ endian-neutral
+.Lone:
+.long 1,0,0,0
+.word -1
+
+#if __LINUX_ARM_ARCH__ >= 7 && IS_ENABLED(CONFIG_KERNEL_MODE_NEON)
+.arch armv7-a
+.fpu neon
+
+.align 5
+ENTRY(chacha20_neon)
+ ldr r12,[sp,#0] @ pull pointer to counter and nonce
+ stmdb sp!,{r0-r2,r4-r11,lr}
+ cmp r2,#0 @ len==0?
+#ifdef __thumb2__
+ itt eq
+#endif
+ addeq sp,sp,#4*3
+ beq .Lno_data_neon
+ cmp r2,#192 @ test len
+ bls .Lshort
+.Lchacha20_neon_begin:
+ adr r14,.Lsigma
+ vstmdb sp!,{d8-d15} @ ABI spec says so
+ stmdb sp!,{r0-r3}
+
+ vld1.32 {q1-q2},[r3] @ load key
+ ldmia r3,{r4-r11} @ load key
+
+ sub sp,sp,#4*(16+16)
+ vld1.32 {q3},[r12] @ load counter and nonce
+ add r12,sp,#4*8
+ ldmia r14,{r0-r3} @ load sigma
+ vld1.32 {q0},[r14]! @ load sigma
+ vld1.32 {q12},[r14] @ one
+ vst1.32 {q2-q3},[r12] @ copy 1/2key|counter|nonce
+ vst1.32 {q0-q1},[sp] @ copy sigma|1/2key
+
+ str r10,[sp,#4*(16+10)] @ off-load "rx"
+ str r11,[sp,#4*(16+11)] @ off-load "rx"
+ vshl.i32 d26,d24,#1 @ two
+ vstr d24,[sp,#4*(16+0)]
+ vshl.i32 d28,d24,#2 @ four
+ vstr d26,[sp,#4*(16+2)]
+ vmov q4,q0
+ vstr d28,[sp,#4*(16+4)]
+ vmov q8,q0
+ vmov q5,q1
+ vmov q9,q1
+ b .Loop_neon_enter
+
+.align 4
+.Loop_neon_outer:
+ ldmia sp,{r0-r9} @ load key material
+ cmp r11,#64*2 @ if len<=64*2
+ bls .Lbreak_neon @ switch to integer-only
+ vmov q4,q0
+ str r11,[sp,#4*(32+2)] @ save len
+ vmov q8,q0
+ str r12, [sp,#4*(32+1)] @ save inp
+ vmov q5,q1
+ str r14, [sp,#4*(32+0)] @ save out
+ vmov q9,q1
+.Loop_neon_enter:
+ ldr r11, [sp,#4*(15)]
+ vadd.i32 q7,q3,q12 @ counter+1
+ ldr r12,[sp,#4*(12)] @ modulo-scheduled load
+ vmov q6,q2
+ ldr r10, [sp,#4*(13)]
+ vmov q10,q2
+ ldr r14,[sp,#4*(14)]
+ vadd.i32 q11,q7,q12 @ counter+2
+ str r11, [sp,#4*(16+15)]
+ mov r11,#10
+ add r12,r12,#3 @ counter+3
+ b .Loop_neon
+
+.align 4
+.Loop_neon:
+ subs r11,r11,#1
+ vadd.i32 q0,q0,q1
+ add r0,r0,r4
+ vadd.i32 q4,q4,q5
+ mov r12,r12,ror#16
+ vadd.i32 q8,q8,q9
+ add r1,r1,r5
+ veor q3,q3,q0
+ mov r10,r10,ror#16
+ veor q7,q7,q4
+ eor r12,r12,r0,ror#16
+ veor q11,q11,q8
+ eor r10,r10,r1,ror#16
+ vrev32.16 q3,q3
+ add r8,r8,r12
+ vrev32.16 q7,q7
+ mov r4,r4,ror#20
+ vrev32.16 q11,q11
+ add r9,r9,r10
+ vadd.i32 q2,q2,q3
+ mov r5,r5,ror#20
+ vadd.i32 q6,q6,q7
+ eor r4,r4,r8,ror#20
+ vadd.i32 q10,q10,q11
+ eor r5,r5,r9,ror#20
+ veor q12,q1,q2
+ add r0,r0,r4
+ veor q13,q5,q6
+ mov r12,r12,ror#24
+ veor q14,q9,q10
+ add r1,r1,r5
+ vshr.u32 q1,q12,#20
+ mov r10,r10,ror#24
+ vshr.u32 q5,q13,#20
+ eor r12,r12,r0,ror#24
+ vshr.u32 q9,q14,#20
+ eor r10,r10,r1,ror#24
+ vsli.32 q1,q12,#12
+ add r8,r8,r12
+ vsli.32 q5,q13,#12
+ mov r4,r4,ror#25
+ vsli.32 q9,q14,#12
+ add r9,r9,r10
+ vadd.i32 q0,q0,q1
+ mov r5,r5,ror#25
+ vadd.i32 q4,q4,q5
+ str r10,[sp,#4*(16+13)]
+ vadd.i32 q8,q8,q9
+ ldr r10,[sp,#4*(16+15)]
+ veor q12,q3,q0
+ eor r4,r4,r8,ror#25
+ veor q13,q7,q4
+ eor r5,r5,r9,ror#25
+ veor q14,q11,q8
+ str r8,[sp,#4*(16+8)]
+ vshr.u32 q3,q12,#24
+ ldr r8,[sp,#4*(16+10)]
+ vshr.u32 q7,q13,#24
+ add r2,r2,r6
+ vshr.u32 q11,q14,#24
+ mov r14,r14,ror#16
+ vsli.32 q3,q12,#8
+ str r9,[sp,#4*(16+9)]
+ vsli.32 q7,q13,#8
+ ldr r9,[sp,#4*(16+11)]
+ vsli.32 q11,q14,#8
+ add r3,r3,r7
+ vadd.i32 q2,q2,q3
+ mov r10,r10,ror#16
+ vadd.i32 q6,q6,q7
+ eor r14,r14,r2,ror#16
+ vadd.i32 q10,q10,q11
+ eor r10,r10,r3,ror#16
+ veor q12,q1,q2
+ add r8,r8,r14
+ veor q13,q5,q6
+ mov r6,r6,ror#20
+ veor q14,q9,q10
+ add r9,r9,r10
+ vshr.u32 q1,q12,#25
+ mov r7,r7,ror#20
+ vshr.u32 q5,q13,#25
+ eor r6,r6,r8,ror#20
+ vshr.u32 q9,q14,#25
+ eor r7,r7,r9,ror#20
+ vsli.32 q1,q12,#7
+ add r2,r2,r6
+ vsli.32 q5,q13,#7
+ mov r14,r14,ror#24
+ vsli.32 q9,q14,#7
+ add r3,r3,r7
+ vext.8 q2,q2,q2,#8
+ mov r10,r10,ror#24
+ vext.8 q6,q6,q6,#8
+ eor r14,r14,r2,ror#24
+ vext.8 q10,q10,q10,#8
+ eor r10,r10,r3,ror#24
+ vext.8 q1,q1,q1,#4
+ add r8,r8,r14
+ vext.8 q5,q5,q5,#4
+ mov r6,r6,ror#25
+ vext.8 q9,q9,q9,#4
+ add r9,r9,r10
+ vext.8 q3,q3,q3,#12
+ mov r7,r7,ror#25
+ vext.8 q7,q7,q7,#12
+ eor r6,r6,r8,ror#25
+ vext.8 q11,q11,q11,#12
+ eor r7,r7,r9,ror#25
+ vadd.i32 q0,q0,q1
+ add r0,r0,r5
+ vadd.i32 q4,q4,q5
+ mov r10,r10,ror#16
+ vadd.i32 q8,q8,q9
+ add r1,r1,r6
+ veor q3,q3,q0
+ mov r12,r12,ror#16
+ veor q7,q7,q4
+ eor r10,r10,r0,ror#16
+ veor q11,q11,q8
+ eor r12,r12,r1,ror#16
+ vrev32.16 q3,q3
+ add r8,r8,r10
+ vrev32.16 q7,q7
+ mov r5,r5,ror#20
+ vrev32.16 q11,q11
+ add r9,r9,r12
+ vadd.i32 q2,q2,q3
+ mov r6,r6,ror#20
+ vadd.i32 q6,q6,q7
+ eor r5,r5,r8,ror#20
+ vadd.i32 q10,q10,q11
+ eor r6,r6,r9,ror#20
+ veor q12,q1,q2
+ add r0,r0,r5
+ veor q13,q5,q6
+ mov r10,r10,ror#24
+ veor q14,q9,q10
+ add r1,r1,r6
+ vshr.u32 q1,q12,#20
+ mov r12,r12,ror#24
+ vshr.u32 q5,q13,#20
+ eor r10,r10,r0,ror#24
+ vshr.u32 q9,q14,#20
+ eor r12,r12,r1,ror#24
+ vsli.32 q1,q12,#12
+ add r8,r8,r10
+ vsli.32 q5,q13,#12
+ mov r5,r5,ror#25
+ vsli.32 q9,q14,#12
+ str r10,[sp,#4*(16+15)]
+ vadd.i32 q0,q0,q1
+ ldr r10,[sp,#4*(16+13)]
+ vadd.i32 q4,q4,q5
+ add r9,r9,r12
+ vadd.i32 q8,q8,q9
+ mov r6,r6,ror#25
+ veor q12,q3,q0
+ eor r5,r5,r8,ror#25
+ veor q13,q7,q4
+ eor r6,r6,r9,ror#25
+ veor q14,q11,q8
+ str r8,[sp,#4*(16+10)]
+ vshr.u32 q3,q12,#24
+ ldr r8,[sp,#4*(16+8)]
+ vshr.u32 q7,q13,#24
+ add r2,r2,r7
+ vshr.u32 q11,q14,#24
+ mov r10,r10,ror#16
+ vsli.32 q3,q12,#8
+ str r9,[sp,#4*(16+11)]
+ vsli.32 q7,q13,#8
+ ldr r9,[sp,#4*(16+9)]
+ vsli.32 q11,q14,#8
+ add r3,r3,r4
+ vadd.i32 q2,q2,q3
+ mov r14,r14,ror#16
+ vadd.i32 q6,q6,q7
+ eor r10,r10,r2,ror#16
+ vadd.i32 q10,q10,q11
+ eor r14,r14,r3,ror#16
+ veor q12,q1,q2
+ add r8,r8,r10
+ veor q13,q5,q6
+ mov r7,r7,ror#20
+ veor q14,q9,q10
+ add r9,r9,r14
+ vshr.u32 q1,q12,#25
+ mov r4,r4,ror#20
+ vshr.u32 q5,q13,#25
+ eor r7,r7,r8,ror#20
+ vshr.u32 q9,q14,#25
+ eor r4,r4,r9,ror#20
+ vsli.32 q1,q12,#7
+ add r2,r2,r7
+ vsli.32 q5,q13,#7
+ mov r10,r10,ror#24
+ vsli.32 q9,q14,#7
+ add r3,r3,r4
+ vext.8 q2,q2,q2,#8
+ mov r14,r14,ror#24
+ vext.8 q6,q6,q6,#8
+ eor r10,r10,r2,ror#24
+ vext.8 q10,q10,q10,#8
+ eor r14,r14,r3,ror#24
+ vext.8 q1,q1,q1,#12
+ add r8,r8,r10
+ vext.8 q5,q5,q5,#12
+ mov r7,r7,ror#25
+ vext.8 q9,q9,q9,#12
+ add r9,r9,r14
+ vext.8 q3,q3,q3,#4
+ mov r4,r4,ror#25
+ vext.8 q7,q7,q7,#4
+ eor r7,r7,r8,ror#25
+ vext.8 q11,q11,q11,#4
+ eor r4,r4,r9,ror#25
+ bne .Loop_neon
+
+ add r11,sp,#32
+ vld1.32 {q12-q13},[sp] @ load key material
+ vld1.32 {q14-q15},[r11]
+
+ ldr r11,[sp,#4*(32+2)] @ load len
+
+ str r8, [sp,#4*(16+8)] @ modulo-scheduled store
+ str r9, [sp,#4*(16+9)]
+ str r12,[sp,#4*(16+12)]
+ str r10, [sp,#4*(16+13)]
+ str r14,[sp,#4*(16+14)]
+
+ @ at this point we have first half of 512-bit result in
+ @ rx and second half at sp+4*(16+8)
+
+ ldr r12,[sp,#4*(32+1)] @ load inp
+ ldr r14,[sp,#4*(32+0)] @ load out
+
+ vadd.i32 q0,q0,q12 @ accumulate key material
+ vadd.i32 q4,q4,q12
+ vadd.i32 q8,q8,q12
+ vldr d24,[sp,#4*(16+0)] @ one
+
+ vadd.i32 q1,q1,q13
+ vadd.i32 q5,q5,q13
+ vadd.i32 q9,q9,q13
+ vldr d26,[sp,#4*(16+2)] @ two
+
+ vadd.i32 q2,q2,q14
+ vadd.i32 q6,q6,q14
+ vadd.i32 q10,q10,q14
+ vadd.i32 d14,d14,d24 @ counter+1
+ vadd.i32 d22,d22,d26 @ counter+2
+
+ vadd.i32 q3,q3,q15
+ vadd.i32 q7,q7,q15
+ vadd.i32 q11,q11,q15
+
+ cmp r11,#64*4
+ blo .Ltail_neon
+
+ vld1.8 {q12-q13},[r12]! @ load input
+ mov r11,sp
+ vld1.8 {q14-q15},[r12]!
+ veor q0,q0,q12 @ xor with input
+ veor q1,q1,q13
+ vld1.8 {q12-q13},[r12]!
+ veor q2,q2,q14
+ veor q3,q3,q15
+ vld1.8 {q14-q15},[r12]!
+
+ veor q4,q4,q12
+ vst1.8 {q0-q1},[r14]! @ store output
+ veor q5,q5,q13
+ vld1.8 {q12-q13},[r12]!
+ veor q6,q6,q14
+ vst1.8 {q2-q3},[r14]!
+ veor q7,q7,q15
+ vld1.8 {q14-q15},[r12]!
+
+ veor q8,q8,q12
+ vld1.32 {q0-q1},[r11]! @ load for next iteration
+ veor d25,d25,d25
+ vldr d24,[sp,#4*(16+4)] @ four
+ veor q9,q9,q13
+ vld1.32 {q2-q3},[r11]
+ veor q10,q10,q14
+ vst1.8 {q4-q5},[r14]!
+ veor q11,q11,q15
+ vst1.8 {q6-q7},[r14]!
+
+ vadd.i32 d6,d6,d24 @ next counter value
+ vldr d24,[sp,#4*(16+0)] @ one
+
+ ldmia sp,{r8-r11} @ load key material
+ add r0,r0,r8 @ accumulate key material
+ ldr r8,[r12],#16 @ load input
+ vst1.8 {q8-q9},[r14]!
+ add r1,r1,r9
+ ldr r9,[r12,#-12]
+ vst1.8 {q10-q11},[r14]!
+ add r2,r2,r10
+ ldr r10,[r12,#-8]
+ add r3,r3,r11
+ ldr r11,[r12,#-4]
+#ifdef __ARMEB__
+ rev r0,r0
+ rev r1,r1
+ rev r2,r2
+ rev r3,r3
+#endif
+ eor r0,r0,r8 @ xor with input
+ add r8,sp,#4*(4)
+ eor r1,r1,r9
+ str r0,[r14],#16 @ store output
+ eor r2,r2,r10
+ str r1,[r14,#-12]
+ eor r3,r3,r11
+ ldmia r8,{r8-r11} @ load key material
+ str r2,[r14,#-8]
+ str r3,[r14,#-4]
+
+ add r4,r4,r8 @ accumulate key material
+ ldr r8,[r12],#16 @ load input
+ add r5,r5,r9
+ ldr r9,[r12,#-12]
+ add r6,r6,r10
+ ldr r10,[r12,#-8]
+ add r7,r7,r11
+ ldr r11,[r12,#-4]
+#ifdef __ARMEB__
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+ rev r7,r7
+#endif
+ eor r4,r4,r8
+ add r8,sp,#4*(8)
+ eor r5,r5,r9
+ str r4,[r14],#16 @ store output
+ eor r6,r6,r10
+ str r5,[r14,#-12]
+ eor r7,r7,r11
+ ldmia r8,{r8-r11} @ load key material
+ str r6,[r14,#-8]
+ add r0,sp,#4*(16+8)
+ str r7,[r14,#-4]
+
+ ldmia r0,{r0-r7} @ load second half
+
+ add r0,r0,r8 @ accumulate key material
+ ldr r8,[r12],#16 @ load input
+ add r1,r1,r9
+ ldr r9,[r12,#-12]
+#ifdef __thumb2__
+ it hi
+#endif
+ strhi r10,[sp,#4*(16+10)] @ copy "rx" while at it
+ add r2,r2,r10
+ ldr r10,[r12,#-8]
+#ifdef __thumb2__
+ it hi
+#endif
+ strhi r11,[sp,#4*(16+11)] @ copy "rx" while at it
+ add r3,r3,r11
+ ldr r11,[r12,#-4]
+#ifdef __ARMEB__
+ rev r0,r0
+ rev r1,r1
+ rev r2,r2
+ rev r3,r3
+#endif
+ eor r0,r0,r8
+ add r8,sp,#4*(12)
+ eor r1,r1,r9
+ str r0,[r14],#16 @ store output
+ eor r2,r2,r10
+ str r1,[r14,#-12]
+ eor r3,r3,r11
+ ldmia r8,{r8-r11} @ load key material
+ str r2,[r14,#-8]
+ str r3,[r14,#-4]
+
+ add r4,r4,r8 @ accumulate key material
+ add r8,r8,#4 @ next counter value
+ add r5,r5,r9
+ str r8,[sp,#4*(12)] @ save next counter value
+ ldr r8,[r12],#16 @ load input
+ add r6,r6,r10
+ add r4,r4,#3 @ counter+3
+ ldr r9,[r12,#-12]
+ add r7,r7,r11
+ ldr r10,[r12,#-8]
+ ldr r11,[r12,#-4]
+#ifdef __ARMEB__
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+ rev r7,r7
+#endif
+ eor r4,r4,r8
+#ifdef __thumb2__
+ it hi
+#endif
+ ldrhi r8,[sp,#4*(32+2)] @ re-load len
+ eor r5,r5,r9
+ eor r6,r6,r10
+ str r4,[r14],#16 @ store output
+ eor r7,r7,r11
+ str r5,[r14,#-12]
+ sub r11,r8,#64*4 @ len-=64*4
+ str r6,[r14,#-8]
+ str r7,[r14,#-4]
+ bhi .Loop_neon_outer
+
+ b .Ldone_neon
+
+.align 4
+.Lbreak_neon:
+ @ harmonize NEON and integer-only stack frames: load data
+ @ from NEON frame, but save to integer-only one; distance
+ @ between the two is 4*(32+4+16-32)=4*(20).
+
+ str r11, [sp,#4*(20+32+2)] @ save len
+ add r11,sp,#4*(32+4)
+ str r12, [sp,#4*(20+32+1)] @ save inp
+ str r14, [sp,#4*(20+32+0)] @ save out
+
+ ldr r12,[sp,#4*(16+10)]
+ ldr r14,[sp,#4*(16+11)]
+ vldmia r11,{d8-d15} @ fulfill ABI requirement
+ str r12,[sp,#4*(20+16+10)] @ copy "rx"
+ str r14,[sp,#4*(20+16+11)] @ copy "rx"
+
+ ldr r11, [sp,#4*(15)]
+ ldr r12,[sp,#4*(12)] @ modulo-scheduled load
+ ldr r10, [sp,#4*(13)]
+ ldr r14,[sp,#4*(14)]
+ str r11, [sp,#4*(20+16+15)]
+ add r11,sp,#4*(20)
+ vst1.32 {q0-q1},[r11]! @ copy key
+ add sp,sp,#4*(20) @ switch frame
+ vst1.32 {q2-q3},[r11]
+ mov r11,#10
+ b .Loop @ go integer-only
+
+.align 4
+.Ltail_neon:
+ cmp r11,#64*3
+ bhs .L192_or_more_neon
+ cmp r11,#64*2
+ bhs .L128_or_more_neon
+ cmp r11,#64*1
+ bhs .L64_or_more_neon
+
+ add r8,sp,#4*(8)
+ vst1.8 {q0-q1},[sp]
+ add r10,sp,#4*(0)
+ vst1.8 {q2-q3},[r8]
+ b .Loop_tail_neon
+
+.align 4
+.L64_or_more_neon:
+ vld1.8 {q12-q13},[r12]!
+ vld1.8 {q14-q15},[r12]!
+ veor q0,q0,q12
+ veor q1,q1,q13
+ veor q2,q2,q14
+ veor q3,q3,q15
+ vst1.8 {q0-q1},[r14]!
+ vst1.8 {q2-q3},[r14]!
+
+ beq .Ldone_neon
+
+ add r8,sp,#4*(8)
+ vst1.8 {q4-q5},[sp]
+ add r10,sp,#4*(0)
+ vst1.8 {q6-q7},[r8]
+ sub r11,r11,#64*1 @ len-=64*1
+ b .Loop_tail_neon
+
+.align 4
+.L128_or_more_neon:
+ vld1.8 {q12-q13},[r12]!
+ vld1.8 {q14-q15},[r12]!
+ veor q0,q0,q12
+ veor q1,q1,q13
+ vld1.8 {q12-q13},[r12]!
+ veor q2,q2,q14
+ veor q3,q3,q15
+ vld1.8 {q14-q15},[r12]!
+
+ veor q4,q4,q12
+ veor q5,q5,q13
+ vst1.8 {q0-q1},[r14]!
+ veor q6,q6,q14
+ vst1.8 {q2-q3},[r14]!
+ veor q7,q7,q15
+ vst1.8 {q4-q5},[r14]!
+ vst1.8 {q6-q7},[r14]!
+
+ beq .Ldone_neon
+
+ add r8,sp,#4*(8)
+ vst1.8 {q8-q9},[sp]
+ add r10,sp,#4*(0)
+ vst1.8 {q10-q11},[r8]
+ sub r11,r11,#64*2 @ len-=64*2
+ b .Loop_tail_neon
+
+.align 4
+.L192_or_more_neon:
+ vld1.8 {q12-q13},[r12]!
+ vld1.8 {q14-q15},[r12]!
+ veor q0,q0,q12
+ veor q1,q1,q13
+ vld1.8 {q12-q13},[r12]!
+ veor q2,q2,q14
+ veor q3,q3,q15
+ vld1.8 {q14-q15},[r12]!
+
+ veor q4,q4,q12
+ veor q5,q5,q13
+ vld1.8 {q12-q13},[r12]!
+ veor q6,q6,q14
+ vst1.8 {q0-q1},[r14]!
+ veor q7,q7,q15
+ vld1.8 {q14-q15},[r12]!
+
+ veor q8,q8,q12
+ vst1.8 {q2-q3},[r14]!
+ veor q9,q9,q13
+ vst1.8 {q4-q5},[r14]!
+ veor q10,q10,q14
+ vst1.8 {q6-q7},[r14]!
+ veor q11,q11,q15
+ vst1.8 {q8-q9},[r14]!
+ vst1.8 {q10-q11},[r14]!
+
+ beq .Ldone_neon
+
+ ldmia sp,{r8-r11} @ load key material
+ add r0,r0,r8 @ accumulate key material
+ add r8,sp,#4*(4)
+ add r1,r1,r9
+ add r2,r2,r10
+ add r3,r3,r11
+ ldmia r8,{r8-r11} @ load key material
+
+ add r4,r4,r8 @ accumulate key material
+ add r8,sp,#4*(8)
+ add r5,r5,r9
+ add r6,r6,r10
+ add r7,r7,r11
+ ldmia r8,{r8-r11} @ load key material
+#ifdef __ARMEB__
+ rev r0,r0
+ rev r1,r1
+ rev r2,r2
+ rev r3,r3
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+ rev r7,r7
+#endif
+ stmia sp,{r0-r7}
+ add r0,sp,#4*(16+8)
+
+ ldmia r0,{r0-r7} @ load second half
+
+ add r0,r0,r8 @ accumulate key material
+ add r8,sp,#4*(12)
+ add r1,r1,r9
+ add r2,r2,r10
+ add r3,r3,r11
+ ldmia r8,{r8-r11} @ load key material
+
+ add r4,r4,r8 @ accumulate key material
+ add r8,sp,#4*(8)
+ add r5,r5,r9
+ add r4,r4,#3 @ counter+3
+ add r6,r6,r10
+ add r7,r7,r11
+ ldr r11,[sp,#4*(32+2)] @ re-load len
+#ifdef __ARMEB__
+ rev r0,r0
+ rev r1,r1
+ rev r2,r2
+ rev r3,r3
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+ rev r7,r7
+#endif
+ stmia r8,{r0-r7}
+ add r10,sp,#4*(0)
+ sub r11,r11,#64*3 @ len-=64*3
+
+.Loop_tail_neon:
+ ldrb r8,[r10],#1 @ read buffer on stack
+ ldrb r9,[r12],#1 @ read input
+ subs r11,r11,#1
+ eor r8,r8,r9
+ strb r8,[r14],#1 @ store output
+ bne .Loop_tail_neon
+
+.Ldone_neon:
+ add sp,sp,#4*(32+4)
+ vldmia sp,{d8-d15}
+ add sp,sp,#4*(16+3)
+.Lno_data_neon:
+ ldmia sp!,{r4-r11,pc}
+ENDPROC(chacha20_neon)
+#endif
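
Both halves of the interleaved code above, the scalar "rx" registers and
the q0-q11 NEON lanes, compute the standard ChaCha20 quarter round; the
right-rotations by 16/20/24/25 are the usual left-rotations by 16/12/8/7
written the other way around, since ror(x, 32 - n) == rol(x, n). A
reference sketch in C, where rotl32 and QR are illustrative names only:

  static inline u32 rotl32(u32 v, int n)
  {
          return (v << n) | (v >> (32 - n));
  }

  /* One ChaCha20 quarter round over four 32-bit state words. */
  #define QR(a, b, c, d) do {             \
          a += b; d = rotl32(d ^ a, 16);  \
          c += d; b = rotl32(b ^ c, 12);  \
          a += b; d = rotl32(d ^ a, 8);   \
          c += d; b = rotl32(b ^ c, 7);   \
  } while (0)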
+
+.align 5
+.Lsigma2:
+.long 0x61707865,0x3320646e,0x79622d32,0x6b206574 @ endian-neutral
+.Lone2:
+.long 1,0,0,0
+.word -1
+
+.align 5
+ENTRY(chacha20_arm)
+ ldr r12,[sp,#0] @ pull pointer to counter and nonce
+ stmdb sp!,{r0-r2,r4-r11,lr}
+ cmp r2,#0 @ len==0?
+#ifdef __thumb2__
+ itt eq
+#endif
+ addeq sp,sp,#4*3
+ beq .Lno_data_arm
+.Lshort:
+ ldmia r12,{r4-r7} @ load counter and nonce
+ sub sp,sp,#4*(16) @ off-load area
+#if __LINUX_ARM_ARCH__ < 7 && !defined(__thumb2__)
+ sub r14,pc,#100 @ .Lsigma2
+#else
+ adr r14,.Lsigma2 @ .Lsigma2
+#endif
+ stmdb sp!,{r4-r7} @ copy counter and nonce
+ ldmia r3,{r4-r11} @ load key
+ ldmia r14,{r0-r3} @ load sigma
+ stmdb sp!,{r4-r11} @ copy key
+ stmdb sp!,{r0-r3} @ copy sigma
+ str r10,[sp,#4*(16+10)] @ off-load "rx"
+ str r11,[sp,#4*(16+11)] @ off-load "rx"
+ b .Loop_outer_enter
+
+.align 4
+.Loop_outer:
+ ldmia sp,{r0-r9} @ load key material
+ str r11,[sp,#4*(32+2)] @ save len
+ str r12, [sp,#4*(32+1)] @ save inp
+ str r14, [sp,#4*(32+0)] @ save out
+.Loop_outer_enter:
+ ldr r11, [sp,#4*(15)]
+ ldr r12,[sp,#4*(12)] @ modulo-scheduled load
+ ldr r10, [sp,#4*(13)]
+ ldr r14,[sp,#4*(14)]
+ str r11, [sp,#4*(16+15)]
+ mov r11,#10
+ b .Loop
+
+.align 4
+.Loop:
+ subs r11,r11,#1
+ add r0,r0,r4
+ mov r12,r12,ror#16
+ add r1,r1,r5
+ mov r10,r10,ror#16
+ eor r12,r12,r0,ror#16
+ eor r10,r10,r1,ror#16
+ add r8,r8,r12
+ mov r4,r4,ror#20
+ add r9,r9,r10
+ mov r5,r5,ror#20
+ eor r4,r4,r8,ror#20
+ eor r5,r5,r9,ror#20
+ add r0,r0,r4
+ mov r12,r12,ror#24
+ add r1,r1,r5
+ mov r10,r10,ror#24
+ eor r12,r12,r0,ror#24
+ eor r10,r10,r1,ror#24
+ add r8,r8,r12
+ mov r4,r4,ror#25
+ add r9,r9,r10
+ mov r5,r5,ror#25
+ str r10,[sp,#4*(16+13)]
+ ldr r10,[sp,#4*(16+15)]
+ eor r4,r4,r8,ror#25
+ eor r5,r5,r9,ror#25
+ str r8,[sp,#4*(16+8)]
+ ldr r8,[sp,#4*(16+10)]
+ add r2,r2,r6
+ mov r14,r14,ror#16
+ str r9,[sp,#4*(16+9)]
+ ldr r9,[sp,#4*(16+11)]
+ add r3,r3,r7
+ mov r10,r10,ror#16
+ eor r14,r14,r2,ror#16
+ eor r10,r10,r3,ror#16
+ add r8,r8,r14
+ mov r6,r6,ror#20
+ add r9,r9,r10
+ mov r7,r7,ror#20
+ eor r6,r6,r8,ror#20
+ eor r7,r7,r9,ror#20
+ add r2,r2,r6
+ mov r14,r14,ror#24
+ add r3,r3,r7
+ mov r10,r10,ror#24
+ eor r14,r14,r2,ror#24
+ eor r10,r10,r3,ror#24
+ add r8,r8,r14
+ mov r6,r6,ror#25
+ add r9,r9,r10
+ mov r7,r7,ror#25
+ eor r6,r6,r8,ror#25
+ eor r7,r7,r9,ror#25
+ add r0,r0,r5
+ mov r10,r10,ror#16
+ add r1,r1,r6
+ mov r12,r12,ror#16
+ eor r10,r10,r0,ror#16
+ eor r12,r12,r1,ror#16
+ add r8,r8,r10
+ mov r5,r5,ror#20
+ add r9,r9,r12
+ mov r6,r6,ror#20
+ eor r5,r5,r8,ror#20
+ eor r6,r6,r9,ror#20
+ add r0,r0,r5
+ mov r10,r10,ror#24
+ add r1,r1,r6
+ mov r12,r12,ror#24
+ eor r10,r10,r0,ror#24
+ eor r12,r12,r1,ror#24
+ add r8,r8,r10
+ mov r5,r5,ror#25
+ str r10,[sp,#4*(16+15)]
+ ldr r10,[sp,#4*(16+13)]
+ add r9,r9,r12
+ mov r6,r6,ror#25
+ eor r5,r5,r8,ror#25
+ eor r6,r6,r9,ror#25
+ str r8,[sp,#4*(16+10)]
+ ldr r8,[sp,#4*(16+8)]
+ add r2,r2,r7
+ mov r10,r10,ror#16
+ str r9,[sp,#4*(16+11)]
+ ldr r9,[sp,#4*(16+9)]
+ add r3,r3,r4
+ mov r14,r14,ror#16
+ eor r10,r10,r2,ror#16
+ eor r14,r14,r3,ror#16
+ add r8,r8,r10
+ mov r7,r7,ror#20
+ add r9,r9,r14
+ mov r4,r4,ror#20
+ eor r7,r7,r8,ror#20
+ eor r4,r4,r9,ror#20
+ add r2,r2,r7
+ mov r10,r10,ror#24
+ add r3,r3,r4
+ mov r14,r14,ror#24
+ eor r10,r10,r2,ror#24
+ eor r14,r14,r3,ror#24
+ add r8,r8,r10
+ mov r7,r7,ror#25
+ add r9,r9,r14
+ mov r4,r4,ror#25
+ eor r7,r7,r8,ror#25
+ eor r4,r4,r9,ror#25
+ bne .Loop
+
+ ldr r11,[sp,#4*(32+2)] @ load len
+
+ str r8, [sp,#4*(16+8)] @ modulo-scheduled store
+ str r9, [sp,#4*(16+9)]
+ str r12,[sp,#4*(16+12)]
+ str r10, [sp,#4*(16+13)]
+ str r14,[sp,#4*(16+14)]
+
+ @ at this point we have first half of 512-bit result in
+ @ rx and second half at sp+4*(16+8)
+
+ cmp r11,#64 @ done yet?
+#ifdef __thumb2__
+ itete lo
+#endif
+ addlo r12,sp,#4*(0) @ shortcut or ...
+ ldrhs r12,[sp,#4*(32+1)] @ ... load inp
+ addlo r14,sp,#4*(0) @ shortcut or ...
+ ldrhs r14,[sp,#4*(32+0)] @ ... load out
+
+ ldr r8,[sp,#4*(0)] @ load key material
+ ldr r9,[sp,#4*(1)]
+
+#if __LINUX_ARM_ARCH__ >= 6 || !defined(__ARMEB__)
+#if __LINUX_ARM_ARCH__ < 7
+ orr r10,r12,r14
+ tst r10,#3 @ are input and output aligned?
+ ldr r10,[sp,#4*(2)]
+ bne .Lunaligned
+ cmp r11,#64 @ restore flags
+#else
+ ldr r10,[sp,#4*(2)]
+#endif
+ ldr r11,[sp,#4*(3)]
+
+ add r0,r0,r8 @ accumulate key material
+ add r1,r1,r9
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r8,[r12],#16 @ load input
+ ldrhs r9,[r12,#-12]
+
+ add r2,r2,r10
+ add r3,r3,r11
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r10,[r12,#-8]
+ ldrhs r11,[r12,#-4]
+#if __LINUX_ARM_ARCH__ >= 6 && defined(__ARMEB__)
+ rev r0,r0
+ rev r1,r1
+ rev r2,r2
+ rev r3,r3
+#endif
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r0,r0,r8 @ xor with input
+ eorhs r1,r1,r9
+ add r8,sp,#4*(4)
+ str r0,[r14],#16 @ store output
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r2,r2,r10
+ eorhs r3,r3,r11
+ ldmia r8,{r8-r11} @ load key material
+ str r1,[r14,#-12]
+ str r2,[r14,#-8]
+ str r3,[r14,#-4]
+
+ add r4,r4,r8 @ accumulate key material
+ add r5,r5,r9
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r8,[r12],#16 @ load input
+ ldrhs r9,[r12,#-12]
+ add r6,r6,r10
+ add r7,r7,r11
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r10,[r12,#-8]
+ ldrhs r11,[r12,#-4]
+#if __LINUX_ARM_ARCH__ >= 6 && defined(__ARMEB__)
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+ rev r7,r7
+#endif
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r4,r4,r8
+ eorhs r5,r5,r9
+ add r8,sp,#4*(8)
+ str r4,[r14],#16 @ store output
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r6,r6,r10
+ eorhs r7,r7,r11
+ str r5,[r14,#-12]
+ ldmia r8,{r8-r11} @ load key material
+ str r6,[r14,#-8]
+ add r0,sp,#4*(16+8)
+ str r7,[r14,#-4]
+
+ ldmia r0,{r0-r7} @ load second half
+
+ add r0,r0,r8 @ accumulate key material
+ add r1,r1,r9
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r8,[r12],#16 @ load input
+ ldrhs r9,[r12,#-12]
+#ifdef __thumb2__
+ itt hi
+#endif
+ strhi r10,[sp,#4*(16+10)] @ copy "rx" while at it
+ strhi r11,[sp,#4*(16+11)] @ copy "rx" while at it
+ add r2,r2,r10
+ add r3,r3,r11
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r10,[r12,#-8]
+ ldrhs r11,[r12,#-4]
+#if __LINUX_ARM_ARCH__ >= 6 && defined(__ARMEB__)
+ rev r0,r0
+ rev r1,r1
+ rev r2,r2
+ rev r3,r3
+#endif
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r0,r0,r8
+ eorhs r1,r1,r9
+ add r8,sp,#4*(12)
+ str r0,[r14],#16 @ store output
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r2,r2,r10
+ eorhs r3,r3,r11
+ str r1,[r14,#-12]
+ ldmia r8,{r8-r11} @ load key material
+ str r2,[r14,#-8]
+ str r3,[r14,#-4]
+
+ add r4,r4,r8 @ accumulate key material
+ add r5,r5,r9
+#ifdef __thumb2__
+ itt hi
+#endif
+ addhi r8,r8,#1 @ next counter value
+ strhi r8,[sp,#4*(12)] @ save next counter value
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r8,[r12],#16 @ load input
+ ldrhs r9,[r12,#-12]
+ add r6,r6,r10
+ add r7,r7,r11
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhs r10,[r12,#-8]
+ ldrhs r11,[r12,#-4]
+#if __LINUX_ARM_ARCH__ >= 6 && defined(__ARMEB__)
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+ rev r7,r7
+#endif
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r4,r4,r8
+ eorhs r5,r5,r9
+#ifdef __thumb2__
+ it ne
+#endif
+ ldrne r8,[sp,#4*(32+2)] @ re-load len
+#ifdef __thumb2__
+ itt hs
+#endif
+ eorhs r6,r6,r10
+ eorhs r7,r7,r11
+ str r4,[r14],#16 @ store output
+ str r5,[r14,#-12]
+#ifdef __thumb2__
+ it hs
+#endif
+ subhs r11,r8,#64 @ len-=64
+ str r6,[r14,#-8]
+ str r7,[r14,#-4]
+ bhi .Loop_outer
+
+ beq .Ldone
+#if __LINUX_ARM_ARCH__ < 7
+ b .Ltail
+
+.align 4
+.Lunaligned: @ unaligned endian-neutral path
+ cmp r11,#64 @ restore flags
+#endif
+#endif
+#if __LINUX_ARM_ARCH__ < 7
+ ldr r11,[sp,#4*(3)]
+ add r0,r0,r8 @ accumulate key material
+ add r1,r1,r9
+ add r2,r2,r10
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r8,r8,r8 @ zero or ...
+ ldrhsb r8,[r12],#16 @ ... load input
+ eorlo r9,r9,r9
+ ldrhsb r9,[r12,#-12]
+
+ add r3,r3,r11
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r10,r10,r10
+ ldrhsb r10,[r12,#-8]
+ eorlo r11,r11,r11
+ ldrhsb r11,[r12,#-4]
+
+ eor r0,r8,r0 @ xor with input (or zero)
+ eor r1,r9,r1
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-15] @ load more input
+ ldrhsb r9,[r12,#-11]
+ eor r2,r10,r2
+ strb r0,[r14],#16 @ store output
+ eor r3,r11,r3
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-7]
+ ldrhsb r11,[r12,#-3]
+ strb r1,[r14,#-12]
+ eor r0,r8,r0,lsr#8
+ strb r2,[r14,#-8]
+ eor r1,r9,r1,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-14] @ load more input
+ ldrhsb r9,[r12,#-10]
+ strb r3,[r14,#-4]
+ eor r2,r10,r2,lsr#8
+ strb r0,[r14,#-15]
+ eor r3,r11,r3,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-6]
+ ldrhsb r11,[r12,#-2]
+ strb r1,[r14,#-11]
+ eor r0,r8,r0,lsr#8
+ strb r2,[r14,#-7]
+ eor r1,r9,r1,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-13] @ load more input
+ ldrhsb r9,[r12,#-9]
+ strb r3,[r14,#-3]
+ eor r2,r10,r2,lsr#8
+ strb r0,[r14,#-14]
+ eor r3,r11,r3,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-5]
+ ldrhsb r11,[r12,#-1]
+ strb r1,[r14,#-10]
+ strb r2,[r14,#-6]
+ eor r0,r8,r0,lsr#8
+ strb r3,[r14,#-2]
+ eor r1,r9,r1,lsr#8
+ strb r0,[r14,#-13]
+ eor r2,r10,r2,lsr#8
+ strb r1,[r14,#-9]
+ eor r3,r11,r3,lsr#8
+ strb r2,[r14,#-5]
+ strb r3,[r14,#-1]
+ add r8,sp,#4*(4+0)
+ ldmia r8,{r8-r11} @ load key material
+ add r0,sp,#4*(16+8)
+ add r4,r4,r8 @ accumulate key material
+ add r5,r5,r9
+ add r6,r6,r10
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r8,r8,r8 @ zero or ...
+ ldrhsb r8,[r12],#16 @ ... load input
+ eorlo r9,r9,r9
+ ldrhsb r9,[r12,#-12]
+
+ add r7,r7,r11
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r10,r10,r10
+ ldrhsb r10,[r12,#-8]
+ eorlo r11,r11,r11
+ ldrhsb r11,[r12,#-4]
+
+ eor r4,r8,r4 @ xor with input (or zero)
+ eor r5,r9,r5
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-15] @ load more input
+ ldrhsb r9,[r12,#-11]
+ eor r6,r10,r6
+ strb r4,[r14],#16 @ store output
+ eor r7,r11,r7
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-7]
+ ldrhsb r11,[r12,#-3]
+ strb r5,[r14,#-12]
+ eor r4,r8,r4,lsr#8
+ strb r6,[r14,#-8]
+ eor r5,r9,r5,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-14] @ load more input
+ ldrhsb r9,[r12,#-10]
+ strb r7,[r14,#-4]
+ eor r6,r10,r6,lsr#8
+ strb r4,[r14,#-15]
+ eor r7,r11,r7,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-6]
+ ldrhsb r11,[r12,#-2]
+ strb r5,[r14,#-11]
+ eor r4,r8,r4,lsr#8
+ strb r6,[r14,#-7]
+ eor r5,r9,r5,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-13] @ load more input
+ ldrhsb r9,[r12,#-9]
+ strb r7,[r14,#-3]
+ eor r6,r10,r6,lsr#8
+ strb r4,[r14,#-14]
+ eor r7,r11,r7,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-5]
+ ldrhsb r11,[r12,#-1]
+ strb r5,[r14,#-10]
+ strb r6,[r14,#-6]
+ eor r4,r8,r4,lsr#8
+ strb r7,[r14,#-2]
+ eor r5,r9,r5,lsr#8
+ strb r4,[r14,#-13]
+ eor r6,r10,r6,lsr#8
+ strb r5,[r14,#-9]
+ eor r7,r11,r7,lsr#8
+ strb r6,[r14,#-5]
+ strb r7,[r14,#-1]
+ add r8,sp,#4*(4+4)
+ ldmia r8,{r8-r11} @ load key material
+ ldmia r0,{r0-r7} @ load second half
+#ifdef __thumb2__
+ itt hi
+#endif
+ strhi r10,[sp,#4*(16+10)] @ copy "rx"
+ strhi r11,[sp,#4*(16+11)] @ copy "rx"
+ add r0,r0,r8 @ accumulate key material
+ add r1,r1,r9
+ add r2,r2,r10
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r8,r8,r8 @ zero or ...
+ ldrhsb r8,[r12],#16 @ ... load input
+ eorlo r9,r9,r9
+ ldrhsb r9,[r12,#-12]
+
+ add r3,r3,r11
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r10,r10,r10
+ ldrhsb r10,[r12,#-8]
+ eorlo r11,r11,r11
+ ldrhsb r11,[r12,#-4]
+
+ eor r0,r8,r0 @ xor with input (or zero)
+ eor r1,r9,r1
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-15] @ load more input
+ ldrhsb r9,[r12,#-11]
+ eor r2,r10,r2
+ strb r0,[r14],#16 @ store output
+ eor r3,r11,r3
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-7]
+ ldrhsb r11,[r12,#-3]
+ strb r1,[r14,#-12]
+ eor r0,r8,r0,lsr#8
+ strb r2,[r14,#-8]
+ eor r1,r9,r1,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-14] @ load more input
+ ldrhsb r9,[r12,#-10]
+ strb r3,[r14,#-4]
+ eor r2,r10,r2,lsr#8
+ strb r0,[r14,#-15]
+ eor r3,r11,r3,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-6]
+ ldrhsb r11,[r12,#-2]
+ strb r1,[r14,#-11]
+ eor r0,r8,r0,lsr#8
+ strb r2,[r14,#-7]
+ eor r1,r9,r1,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-13] @ load more input
+ ldrhsb r9,[r12,#-9]
+ strb r3,[r14,#-3]
+ eor r2,r10,r2,lsr#8
+ strb r0,[r14,#-14]
+ eor r3,r11,r3,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-5]
+ ldrhsb r11,[r12,#-1]
+ strb r1,[r14,#-10]
+ strb r2,[r14,#-6]
+ eor r0,r8,r0,lsr#8
+ strb r3,[r14,#-2]
+ eor r1,r9,r1,lsr#8
+ strb r0,[r14,#-13]
+ eor r2,r10,r2,lsr#8
+ strb r1,[r14,#-9]
+ eor r3,r11,r3,lsr#8
+ strb r2,[r14,#-5]
+ strb r3,[r14,#-1]
+ add r8,sp,#4*(4+8)
+ ldmia r8,{r8-r11} @ load key material
+ add r4,r4,r8 @ accumulate key material
+#ifdef __thumb2__
+ itt hi
+#endif
+ addhi r8,r8,#1 @ next counter value
+ strhi r8,[sp,#4*(12)] @ save next counter value
+ add r5,r5,r9
+ add r6,r6,r10
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r8,r8,r8 @ zero or ...
+ ldrhsb r8,[r12],#16 @ ... load input
+ eorlo r9,r9,r9
+ ldrhsb r9,[r12,#-12]
+
+ add r7,r7,r11
+#ifdef __thumb2__
+ itete lo
+#endif
+ eorlo r10,r10,r10
+ ldrhsb r10,[r12,#-8]
+ eorlo r11,r11,r11
+ ldrhsb r11,[r12,#-4]
+
+ eor r4,r8,r4 @ xor with input (or zero)
+ eor r5,r9,r5
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-15] @ load more input
+ ldrhsb r9,[r12,#-11]
+ eor r6,r10,r6
+ strb r4,[r14],#16 @ store output
+ eor r7,r11,r7
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-7]
+ ldrhsb r11,[r12,#-3]
+ strb r5,[r14,#-12]
+ eor r4,r8,r4,lsr#8
+ strb r6,[r14,#-8]
+ eor r5,r9,r5,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-14] @ load more input
+ ldrhsb r9,[r12,#-10]
+ strb r7,[r14,#-4]
+ eor r6,r10,r6,lsr#8
+ strb r4,[r14,#-15]
+ eor r7,r11,r7,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-6]
+ ldrhsb r11,[r12,#-2]
+ strb r5,[r14,#-11]
+ eor r4,r8,r4,lsr#8
+ strb r6,[r14,#-7]
+ eor r5,r9,r5,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r8,[r12,#-13] @ load more input
+ ldrhsb r9,[r12,#-9]
+ strb r7,[r14,#-3]
+ eor r6,r10,r6,lsr#8
+ strb r4,[r14,#-14]
+ eor r7,r11,r7,lsr#8
+#ifdef __thumb2__
+ itt hs
+#endif
+ ldrhsb r10,[r12,#-5]
+ ldrhsb r11,[r12,#-1]
+ strb r5,[r14,#-10]
+ strb r6,[r14,#-6]
+ eor r4,r8,r4,lsr#8
+ strb r7,[r14,#-2]
+ eor r5,r9,r5,lsr#8
+ strb r4,[r14,#-13]
+ eor r6,r10,r6,lsr#8
+ strb r5,[r14,#-9]
+ eor r7,r11,r7,lsr#8
+ strb r6,[r14,#-5]
+ strb r7,[r14,#-1]
+#ifdef __thumb2__
+ it ne
+#endif
+ ldrne r8,[sp,#4*(32+2)] @ re-load len
+#ifdef __thumb2__
+ it hs
+#endif
+ subhs r11,r8,#64 @ len-=64
+ bhi .Loop_outer
+
+ beq .Ldone
+#endif
+
+.Ltail:
+ ldr r12,[sp,#4*(32+1)] @ load inp
+ add r9,sp,#4*(0)
+ ldr r14,[sp,#4*(32+0)] @ load out
+
+.Loop_tail:
+ ldrb r10,[r9],#1 @ read buffer on stack
+ ldrb r11,[r12],#1 @ read input
+ subs r8,r8,#1
+ eor r11,r11,r10
+ strb r11,[r14],#1 @ store output
+ bne .Loop_tail
+
+.Ldone:
+ add sp,sp,#4*(32+3)
+.Lno_data_arm:
+ ldmia sp!,{r4-r11,pc}
+ENDPROC(chacha20_arm)
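
For reference, the AAPCS register use in this file (r0-r3 carrying the
first four arguments, the counter/nonce pointer pulled from [sp,#0] as
the fifth) corresponds to declarations along these lines; this is only a
sketch, with the exact qualifiers left to the C glue elsewhere in this
series:

  void chacha20_arm(u8 *out, const u8 *in, const size_t len,
                    const u32 key[8], const u32 counter[4]);
  void chacha20_neon(u8 *out, const u8 *in, const size_t len,
                     const u32 key[8], const u32 counter[4]);
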
diff --git a/lib/zinc/chacha20/chacha20-arm64.S b/lib/zinc/chacha20/chacha20-arm64.S
new file mode 100644
index 000000000000..c3d1243161a8
--- /dev/null
+++ b/lib/zinc/chacha20/chacha20-arm64.S
@@ -0,0 +1,1940 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#include <linux/linkage.h>
+
+.text
+.align 5
+.Lsigma:
+.quad 0x3320646e61707865,0x6b20657479622d32 // endian-neutral
+.Lone:
+.long 1,0,0,0
+
+.align 5
+ENTRY(chacha20_arm)
+ cbz x2,.Labort
+.Lshort:
+ stp x29,x30,[sp,#-96]!
+ add x29,sp,#0
+
+ adr x5,.Lsigma
+ stp x19,x20,[sp,#16]
+ stp x21,x22,[sp,#32]
+ stp x23,x24,[sp,#48]
+ stp x25,x26,[sp,#64]
+ stp x27,x28,[sp,#80]
+ sub sp,sp,#64
+
+ ldp x22,x23,[x5] // load sigma
+ ldp x24,x25,[x3] // load key
+ ldp x26,x27,[x3,#16]
+ ldp x28,x30,[x4] // load counter
+#ifdef __ARMEB__
+ ror x24,x24,#32
+ ror x25,x25,#32
+ ror x26,x26,#32
+ ror x27,x27,#32
+ ror x28,x28,#32
+ ror x30,x30,#32
+#endif
+
+.Loop_outer:
+ mov w5,w22 // unpack key block
+ lsr x6,x22,#32
+ mov w7,w23
+ lsr x8,x23,#32
+ mov w9,w24
+ lsr x10,x24,#32
+ mov w11,w25
+ lsr x12,x25,#32
+ mov w13,w26
+ lsr x14,x26,#32
+ mov w15,w27
+ lsr x16,x27,#32
+ mov w17,w28
+ lsr x19,x28,#32
+ mov w20,w30
+ lsr x21,x30,#32
+
+ mov x4,#10
+ subs x2,x2,#64
+.Loop:
+ sub x4,x4,#1
+ add w5,w5,w9
+ add w6,w6,w10
+ add w7,w7,w11
+ add w8,w8,w12
+ eor w17,w17,w5
+ eor w19,w19,w6
+ eor w20,w20,w7
+ eor w21,w21,w8
+ ror w17,w17,#16
+ ror w19,w19,#16
+ ror w20,w20,#16
+ ror w21,w21,#16
+ add w13,w13,w17
+ add w14,w14,w19
+ add w15,w15,w20
+ add w16,w16,w21
+ eor w9,w9,w13
+ eor w10,w10,w14
+ eor w11,w11,w15
+ eor w12,w12,w16
+ ror w9,w9,#20
+ ror w10,w10,#20
+ ror w11,w11,#20
+ ror w12,w12,#20
+ add w5,w5,w9
+ add w6,w6,w10
+ add w7,w7,w11
+ add w8,w8,w12
+ eor w17,w17,w5
+ eor w19,w19,w6
+ eor w20,w20,w7
+ eor w21,w21,w8
+ ror w17,w17,#24
+ ror w19,w19,#24
+ ror w20,w20,#24
+ ror w21,w21,#24
+ add w13,w13,w17
+ add w14,w14,w19
+ add w15,w15,w20
+ add w16,w16,w21
+ eor w9,w9,w13
+ eor w10,w10,w14
+ eor w11,w11,w15
+ eor w12,w12,w16
+ ror w9,w9,#25
+ ror w10,w10,#25
+ ror w11,w11,#25
+ ror w12,w12,#25
+ add w5,w5,w10
+ add w6,w6,w11
+ add w7,w7,w12
+ add w8,w8,w9
+ eor w21,w21,w5
+ eor w17,w17,w6
+ eor w19,w19,w7
+ eor w20,w20,w8
+ ror w21,w21,#16
+ ror w17,w17,#16
+ ror w19,w19,#16
+ ror w20,w20,#16
+ add w15,w15,w21
+ add w16,w16,w17
+ add w13,w13,w19
+ add w14,w14,w20
+ eor w10,w10,w15
+ eor w11,w11,w16
+ eor w12,w12,w13
+ eor w9,w9,w14
+ ror w10,w10,#20
+ ror w11,w11,#20
+ ror w12,w12,#20
+ ror w9,w9,#20
+ add w5,w5,w10
+ add w6,w6,w11
+ add w7,w7,w12
+ add w8,w8,w9
+ eor w21,w21,w5
+ eor w17,w17,w6
+ eor w19,w19,w7
+ eor w20,w20,w8
+ ror w21,w21,#24
+ ror w17,w17,#24
+ ror w19,w19,#24
+ ror w20,w20,#24
+ add w15,w15,w21
+ add w16,w16,w17
+ add w13,w13,w19
+ add w14,w14,w20
+ eor w10,w10,w15
+ eor w11,w11,w16
+ eor w12,w12,w13
+ eor w9,w9,w14
+ ror w10,w10,#25
+ ror w11,w11,#25
+ ror w12,w12,#25
+ ror w9,w9,#25
+ cbnz x4,.Loop
+
+ add w5,w5,w22 // accumulate key block
+ add x6,x6,x22,lsr#32
+ add w7,w7,w23
+ add x8,x8,x23,lsr#32
+ add w9,w9,w24
+ add x10,x10,x24,lsr#32
+ add w11,w11,w25
+ add x12,x12,x25,lsr#32
+ add w13,w13,w26
+ add x14,x14,x26,lsr#32
+ add w15,w15,w27
+ add x16,x16,x27,lsr#32
+ add w17,w17,w28
+ add x19,x19,x28,lsr#32
+ add w20,w20,w30
+ add x21,x21,x30,lsr#32
+
+ b.lo .Ltail
+
+ add x5,x5,x6,lsl#32 // pack
+ add x7,x7,x8,lsl#32
+ ldp x6,x8,[x1,#0] // load input
+ add x9,x9,x10,lsl#32
+ add x11,x11,x12,lsl#32
+ ldp x10,x12,[x1,#16]
+ add x13,x13,x14,lsl#32
+ add x15,x15,x16,lsl#32
+ ldp x14,x16,[x1,#32]
+ add x17,x17,x19,lsl#32
+ add x20,x20,x21,lsl#32
+ ldp x19,x21,[x1,#48]
+ add x1,x1,#64
+#ifdef __ARMEB__
+ rev x5,x5
+ rev x7,x7
+ rev x9,x9
+ rev x11,x11
+ rev x13,x13
+ rev x15,x15
+ rev x17,x17
+ rev x20,x20
+#endif
+ eor x5,x5,x6
+ eor x7,x7,x8
+ eor x9,x9,x10
+ eor x11,x11,x12
+ eor x13,x13,x14
+ eor x15,x15,x16
+ eor x17,x17,x19
+ eor x20,x20,x21
+
+ stp x5,x7,[x0,#0] // store output
+ add x28,x28,#1 // increment counter
+ stp x9,x11,[x0,#16]
+ stp x13,x15,[x0,#32]
+ stp x17,x20,[x0,#48]
+ add x0,x0,#64
+
+ b.hi .Loop_outer
+
+ ldp x19,x20,[x29,#16]
+ add sp,sp,#64
+ ldp x21,x22,[x29,#32]
+ ldp x23,x24,[x29,#48]
+ ldp x25,x26,[x29,#64]
+ ldp x27,x28,[x29,#80]
+ ldp x29,x30,[sp],#96
+.Labort:
+ ret
+
+.align 4
+.Ltail:
+ add x2,x2,#64
+.Less_than_64:
+ sub x0,x0,#1
+ add x1,x1,x2
+ add x0,x0,x2
+ add x4,sp,x2
+ neg x2,x2
+
+ add x5,x5,x6,lsl#32 // pack
+ add x7,x7,x8,lsl#32
+ add x9,x9,x10,lsl#32
+ add x11,x11,x12,lsl#32
+ add x13,x13,x14,lsl#32
+ add x15,x15,x16,lsl#32
+ add x17,x17,x19,lsl#32
+ add x20,x20,x21,lsl#32
+#ifdef __ARMEB__
+ rev x5,x5
+ rev x7,x7
+ rev x9,x9
+ rev x11,x11
+ rev x13,x13
+ rev x15,x15
+ rev x17,x17
+ rev x20,x20
+#endif
+ stp x5,x7,[sp,#0]
+ stp x9,x11,[sp,#16]
+ stp x13,x15,[sp,#32]
+ stp x17,x20,[sp,#48]
+
+.Loop_tail:
+ ldrb w10,[x1,x2]
+ ldrb w11,[x4,x2]
+ add x2,x2,#1
+ eor w10,w10,w11
+ strb w10,[x0,x2]
+ cbnz x2,.Loop_tail
+
+ stp xzr,xzr,[sp,#0]
+ stp xzr,xzr,[sp,#16]
+ stp xzr,xzr,[sp,#32]
+ stp xzr,xzr,[sp,#48]
+
+ ldp x19,x20,[x29,#16]
+ add sp,sp,#64
+ ldp x21,x22,[x29,#32]
+ ldp x23,x24,[x29,#48]
+ ldp x25,x26,[x29,#64]
+ ldp x27,x28,[x29,#80]
+ ldp x29,x30,[sp],#96
+ ret
+ENDPROC(chacha20_arm)
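
A note on the "pack" steps above: after the key block has been
accumulated in 32-bit halves, pairs of state words are merged into
64-bit registers so the input can be consumed with 64-bit ldp/eor/stp
sequences, with a rev fixup on big-endian. A purely illustrative C
sketch of the equivalent operation:

  /* Merge two little-endian 32-bit state words into one 64-bit word,
   * matching "add x5,x5,x6,lsl#32" when the upper half of x5 is zero. */
  static inline u64 pack_words(u32 lo, u32 hi)
  {
          return (u64)lo | ((u64)hi << 32);
  }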
+
+.align 5
+ENTRY(chacha20_neon)
+ cbz x2,.Labort_neon
+ cmp x2,#192
+ b.lo .Lshort
+
+ stp x29,x30,[sp,#-96]!
+ add x29,sp,#0
+
+ adr x5,.Lsigma
+ stp x19,x20,[sp,#16]
+ stp x21,x22,[sp,#32]
+ stp x23,x24,[sp,#48]
+ stp x25,x26,[sp,#64]
+ stp x27,x28,[sp,#80]
+ cmp x2,#512
+ b.hs .L512_or_more_neon
+
+ sub sp,sp,#64
+
+ ldp x22,x23,[x5] // load sigma
+ ld1 {v24.4s},[x5],#16
+ ldp x24,x25,[x3] // load key
+ ldp x26,x27,[x3,#16]
+ ld1 {v25.4s,v26.4s},[x3]
+ ldp x28,x30,[x4] // load counter
+ ld1 {v27.4s},[x4]
+ ld1 {v31.4s},[x5]
+#ifdef __ARMEB__
+ rev64 v24.4s,v24.4s
+ ror x24,x24,#32
+ ror x25,x25,#32
+ ror x26,x26,#32
+ ror x27,x27,#32
+ ror x28,x28,#32
+ ror x30,x30,#32
+#endif
+ add v27.4s,v27.4s,v31.4s // += 1
+ add v28.4s,v27.4s,v31.4s
+ add v29.4s,v28.4s,v31.4s
+ shl v31.4s,v31.4s,#2 // 1 -> 4
+
+.Loop_outer_neon:
+ mov w5,w22 // unpack key block
+ lsr x6,x22,#32
+ mov v0.16b,v24.16b
+ mov w7,w23
+ lsr x8,x23,#32
+ mov v4.16b,v24.16b
+ mov w9,w24
+ lsr x10,x24,#32
+ mov v16.16b,v24.16b
+ mov w11,w25
+ mov v1.16b,v25.16b
+ lsr x12,x25,#32
+ mov v5.16b,v25.16b
+ mov w13,w26
+ mov v17.16b,v25.16b
+ lsr x14,x26,#32
+ mov v3.16b,v27.16b
+ mov w15,w27
+ mov v7.16b,v28.16b
+ lsr x16,x27,#32
+ mov v19.16b,v29.16b
+ mov w17,w28
+ mov v2.16b,v26.16b
+ lsr x19,x28,#32
+ mov v6.16b,v26.16b
+ mov w20,w30
+ mov v18.16b,v26.16b
+ lsr x21,x30,#32
+
+ mov x4,#10
+ subs x2,x2,#256
+.Loop_neon:
+ sub x4,x4,#1
+ add v0.4s,v0.4s,v1.4s
+ add w5,w5,w9
+ add v4.4s,v4.4s,v5.4s
+ add w6,w6,w10
+ add v16.4s,v16.4s,v17.4s
+ add w7,w7,w11
+ eor v3.16b,v3.16b,v0.16b
+ add w8,w8,w12
+ eor v7.16b,v7.16b,v4.16b
+ eor w17,w17,w5
+ eor v19.16b,v19.16b,v16.16b
+ eor w19,w19,w6
+ rev32 v3.8h,v3.8h
+ eor w20,w20,w7
+ rev32 v7.8h,v7.8h
+ eor w21,w21,w8
+ rev32 v19.8h,v19.8h
+ ror w17,w17,#16
+ add v2.4s,v2.4s,v3.4s
+ ror w19,w19,#16
+ add v6.4s,v6.4s,v7.4s
+ ror w20,w20,#16
+ add v18.4s,v18.4s,v19.4s
+ ror w21,w21,#16
+ eor v20.16b,v1.16b,v2.16b
+ add w13,w13,w17
+ eor v21.16b,v5.16b,v6.16b
+ add w14,w14,w19
+ eor v22.16b,v17.16b,v18.16b
+ add w15,w15,w20
+ ushr v1.4s,v20.4s,#20
+ add w16,w16,w21
+ ushr v5.4s,v21.4s,#20
+ eor w9,w9,w13
+ ushr v17.4s,v22.4s,#20
+ eor w10,w10,w14
+ sli v1.4s,v20.4s,#12
+ eor w11,w11,w15
+ sli v5.4s,v21.4s,#12
+ eor w12,w12,w16
+ sli v17.4s,v22.4s,#12
+ ror w9,w9,#20
+ add v0.4s,v0.4s,v1.4s
+ ror w10,w10,#20
+ add v4.4s,v4.4s,v5.4s
+ ror w11,w11,#20
+ add v16.4s,v16.4s,v17.4s
+ ror w12,w12,#20
+ eor v20.16b,v3.16b,v0.16b
+ add w5,w5,w9
+ eor v21.16b,v7.16b,v4.16b
+ add w6,w6,w10
+ eor v22.16b,v19.16b,v16.16b
+ add w7,w7,w11
+ ushr v3.4s,v20.4s,#24
+ add w8,w8,w12
+ ushr v7.4s,v21.4s,#24
+ eor w17,w17,w5
+ ushr v19.4s,v22.4s,#24
+ eor w19,w19,w6
+ sli v3.4s,v20.4s,#8
+ eor w20,w20,w7
+ sli v7.4s,v21.4s,#8
+ eor w21,w21,w8
+ sli v19.4s,v22.4s,#8
+ ror w17,w17,#24
+ add v2.4s,v2.4s,v3.4s
+ ror w19,w19,#24
+ add v6.4s,v6.4s,v7.4s
+ ror w20,w20,#24
+ add v18.4s,v18.4s,v19.4s
+ ror w21,w21,#24
+ eor v20.16b,v1.16b,v2.16b
+ add w13,w13,w17
+ eor v21.16b,v5.16b,v6.16b
+ add w14,w14,w19
+ eor v22.16b,v17.16b,v18.16b
+ add w15,w15,w20
+ ushr v1.4s,v20.4s,#25
+ add w16,w16,w21
+ ushr v5.4s,v21.4s,#25
+ eor w9,w9,w13
+ ushr v17.4s,v22.4s,#25
+ eor w10,w10,w14
+ sli v1.4s,v20.4s,#7
+ eor w11,w11,w15
+ sli v5.4s,v21.4s,#7
+ eor w12,w12,w16
+ sli v17.4s,v22.4s,#7
+ ror w9,w9,#25
+ ext v2.16b,v2.16b,v2.16b,#8
+ ror w10,w10,#25
+ ext v6.16b,v6.16b,v6.16b,#8
+ ror w11,w11,#25
+ ext v18.16b,v18.16b,v18.16b,#8
+ ror w12,w12,#25
+ ext v3.16b,v3.16b,v3.16b,#12
+ ext v7.16b,v7.16b,v7.16b,#12
+ ext v19.16b,v19.16b,v19.16b,#12
+ ext v1.16b,v1.16b,v1.16b,#4
+ ext v5.16b,v5.16b,v5.16b,#4
+ ext v17.16b,v17.16b,v17.16b,#4
+ add v0.4s,v0.4s,v1.4s
+ add w5,w5,w10
+ add v4.4s,v4.4s,v5.4s
+ add w6,w6,w11
+ add v16.4s,v16.4s,v17.4s
+ add w7,w7,w12
+ eor v3.16b,v3.16b,v0.16b
+ add w8,w8,w9
+ eor v7.16b,v7.16b,v4.16b
+ eor w21,w21,w5
+ eor v19.16b,v19.16b,v16.16b
+ eor w17,w17,w6
+ rev32 v3.8h,v3.8h
+ eor w19,w19,w7
+ rev32 v7.8h,v7.8h
+ eor w20,w20,w8
+ rev32 v19.8h,v19.8h
+ ror w21,w21,#16
+ add v2.4s,v2.4s,v3.4s
+ ror w17,w17,#16
+ add v6.4s,v6.4s,v7.4s
+ ror w19,w19,#16
+ add v18.4s,v18.4s,v19.4s
+ ror w20,w20,#16
+ eor v20.16b,v1.16b,v2.16b
+ add w15,w15,w21
+ eor v21.16b,v5.16b,v6.16b
+ add w16,w16,w17
+ eor v22.16b,v17.16b,v18.16b
+ add w13,w13,w19
+ ushr v1.4s,v20.4s,#20
+ add w14,w14,w20
+ ushr v5.4s,v21.4s,#20
+ eor w10,w10,w15
+ ushr v17.4s,v22.4s,#20
+ eor w11,w11,w16
+ sli v1.4s,v20.4s,#12
+ eor w12,w12,w13
+ sli v5.4s,v21.4s,#12
+ eor w9,w9,w14
+ sli v17.4s,v22.4s,#12
+ ror w10,w10,#20
+ add v0.4s,v0.4s,v1.4s
+ ror w11,w11,#20
+ add v4.4s,v4.4s,v5.4s
+ ror w12,w12,#20
+ add v16.4s,v16.4s,v17.4s
+ ror w9,w9,#20
+ eor v20.16b,v3.16b,v0.16b
+ add w5,w5,w10
+ eor v21.16b,v7.16b,v4.16b
+ add w6,w6,w11
+ eor v22.16b,v19.16b,v16.16b
+ add w7,w7,w12
+ ushr v3.4s,v20.4s,#24
+ add w8,w8,w9
+ ushr v7.4s,v21.4s,#24
+ eor w21,w21,w5
+ ushr v19.4s,v22.4s,#24
+ eor w17,w17,w6
+ sli v3.4s,v20.4s,#8
+ eor w19,w19,w7
+ sli v7.4s,v21.4s,#8
+ eor w20,w20,w8
+ sli v19.4s,v22.4s,#8
+ ror w21,w21,#24
+ add v2.4s,v2.4s,v3.4s
+ ror w17,w17,#24
+ add v6.4s,v6.4s,v7.4s
+ ror w19,w19,#24
+ add v18.4s,v18.4s,v19.4s
+ ror w20,w20,#24
+ eor v20.16b,v1.16b,v2.16b
+ add w15,w15,w21
+ eor v21.16b,v5.16b,v6.16b
+ add w16,w16,w17
+ eor v22.16b,v17.16b,v18.16b
+ add w13,w13,w19
+ ushr v1.4s,v20.4s,#25
+ add w14,w14,w20
+ ushr v5.4s,v21.4s,#25
+ eor w10,w10,w15
+ ushr v17.4s,v22.4s,#25
+ eor w11,w11,w16
+ sli v1.4s,v20.4s,#7
+ eor w12,w12,w13
+ sli v5.4s,v21.4s,#7
+ eor w9,w9,w14
+ sli v17.4s,v22.4s,#7
+ ror w10,w10,#25
+ ext v2.16b,v2.16b,v2.16b,#8
+ ror w11,w11,#25
+ ext v6.16b,v6.16b,v6.16b,#8
+ ror w12,w12,#25
+ ext v18.16b,v18.16b,v18.16b,#8
+ ror w9,w9,#25
+ ext v3.16b,v3.16b,v3.16b,#4
+ ext v7.16b,v7.16b,v7.16b,#4
+ ext v19.16b,v19.16b,v19.16b,#4
+ ext v1.16b,v1.16b,v1.16b,#12
+ ext v5.16b,v5.16b,v5.16b,#12
+ ext v17.16b,v17.16b,v17.16b,#12
+ cbnz x4,.Loop_neon
+
+ add w5,w5,w22 // accumulate key block
+ add v0.4s,v0.4s,v24.4s
+ add x6,x6,x22,lsr#32
+ add v4.4s,v4.4s,v24.4s
+ add w7,w7,w23
+ add v16.4s,v16.4s,v24.4s
+ add x8,x8,x23,lsr#32
+ add v2.4s,v2.4s,v26.4s
+ add w9,w9,w24
+ add v6.4s,v6.4s,v26.4s
+ add x10,x10,x24,lsr#32
+ add v18.4s,v18.4s,v26.4s
+ add w11,w11,w25
+ add v3.4s,v3.4s,v27.4s
+ add x12,x12,x25,lsr#32
+ add w13,w13,w26
+ add v7.4s,v7.4s,v28.4s
+ add x14,x14,x26,lsr#32
+ add w15,w15,w27
+ add v19.4s,v19.4s,v29.4s
+ add x16,x16,x27,lsr#32
+ add w17,w17,w28
+ add v1.4s,v1.4s,v25.4s
+ add x19,x19,x28,lsr#32
+ add w20,w20,w30
+ add v5.4s,v5.4s,v25.4s
+ add x21,x21,x30,lsr#32
+ add v17.4s,v17.4s,v25.4s
+
+ b.lo .Ltail_neon
+
+ add x5,x5,x6,lsl#32 // pack
+ add x7,x7,x8,lsl#32
+ ldp x6,x8,[x1,#0] // load input
+ add x9,x9,x10,lsl#32
+ add x11,x11,x12,lsl#32
+ ldp x10,x12,[x1,#16]
+ add x13,x13,x14,lsl#32
+ add x15,x15,x16,lsl#32
+ ldp x14,x16,[x1,#32]
+ add x17,x17,x19,lsl#32
+ add x20,x20,x21,lsl#32
+ ldp x19,x21,[x1,#48]
+ add x1,x1,#64
+#ifdef __ARMEB__
+ rev x5,x5
+ rev x7,x7
+ rev x9,x9
+ rev x11,x11
+ rev x13,x13
+ rev x15,x15
+ rev x17,x17
+ rev x20,x20
+#endif
+ ld1 {v20.16b,v21.16b,v22.16b,v23.16b},[x1],#64
+ eor x5,x5,x6
+ eor x7,x7,x8
+ eor x9,x9,x10
+ eor x11,x11,x12
+ eor x13,x13,x14
+ eor v0.16b,v0.16b,v20.16b
+ eor x15,x15,x16
+ eor v1.16b,v1.16b,v21.16b
+ eor x17,x17,x19
+ eor v2.16b,v2.16b,v22.16b
+ eor x20,x20,x21
+ eor v3.16b,v3.16b,v23.16b
+ ld1 {v20.16b,v21.16b,v22.16b,v23.16b},[x1],#64
+
+ stp x5,x7,[x0,#0] // store output
+ add x28,x28,#4 // increment counter
+ stp x9,x11,[x0,#16]
+ add v27.4s,v27.4s,v31.4s // += 4
+ stp x13,x15,[x0,#32]
+ add v28.4s,v28.4s,v31.4s
+ stp x17,x20,[x0,#48]
+ add v29.4s,v29.4s,v31.4s
+ add x0,x0,#64
+
+ st1 {v0.16b,v1.16b,v2.16b,v3.16b},[x0],#64
+ ld1 {v0.16b,v1.16b,v2.16b,v3.16b},[x1],#64
+
+ eor v4.16b,v4.16b,v20.16b
+ eor v5.16b,v5.16b,v21.16b
+ eor v6.16b,v6.16b,v22.16b
+ eor v7.16b,v7.16b,v23.16b
+ st1 {v4.16b,v5.16b,v6.16b,v7.16b},[x0],#64
+
+ eor v16.16b,v16.16b,v0.16b
+ eor v17.16b,v17.16b,v1.16b
+ eor v18.16b,v18.16b,v2.16b
+ eor v19.16b,v19.16b,v3.16b
+ st1 {v16.16b,v17.16b,v18.16b,v19.16b},[x0],#64
+
+ b.hi .Loop_outer_neon
+
+ ldp x19,x20,[x29,#16]
+ add sp,sp,#64
+ ldp x21,x22,[x29,#32]
+ ldp x23,x24,[x29,#48]
+ ldp x25,x26,[x29,#64]
+ ldp x27,x28,[x29,#80]
+ ldp x29,x30,[sp],#96
+ ret
+
+.Ltail_neon:
+ add x2,x2,#256
+ cmp x2,#64
+ b.lo .Less_than_64
+
+ add x5,x5,x6,lsl#32 // pack
+ add x7,x7,x8,lsl#32
+ ldp x6,x8,[x1,#0] // load input
+ add x9,x9,x10,lsl#32
+ add x11,x11,x12,lsl#32
+ ldp x10,x12,[x1,#16]
+ add x13,x13,x14,lsl#32
+ add x15,x15,x16,lsl#32
+ ldp x14,x16,[x1,#32]
+ add x17,x17,x19,lsl#32
+ add x20,x20,x21,lsl#32
+ ldp x19,x21,[x1,#48]
+ add x1,x1,#64
+#ifdef __ARMEB__
+ rev x5,x5
+ rev x7,x7
+ rev x9,x9
+ rev x11,x11
+ rev x13,x13
+ rev x15,x15
+ rev x17,x17
+ rev x20,x20
+#endif
+ eor x5,x5,x6
+ eor x7,x7,x8
+ eor x9,x9,x10
+ eor x11,x11,x12
+ eor x13,x13,x14
+ eor x15,x15,x16
+ eor x17,x17,x19
+ eor x20,x20,x21
+
+ stp x5,x7,[x0,#0] // store output
+ add x28,x28,#4 // increment counter
+ stp x9,x11,[x0,#16]
+ stp x13,x15,[x0,#32]
+ stp x17,x20,[x0,#48]
+ add x0,x0,#64
+ b.eq .Ldone_neon
+ sub x2,x2,#64
+ cmp x2,#64
+ b.lo .Less_than_128
+
+ ld1 {v20.16b,v21.16b,v22.16b,v23.16b},[x1],#64
+ eor v0.16b,v0.16b,v20.16b
+ eor v1.16b,v1.16b,v21.16b
+ eor v2.16b,v2.16b,v22.16b
+ eor v3.16b,v3.16b,v23.16b
+ st1 {v0.16b,v1.16b,v2.16b,v3.16b},[x0],#64
+ b.eq .Ldone_neon
+ sub x2,x2,#64
+ cmp x2,#64
+ b.lo .Less_than_192
+
+ ld1 {v20.16b,v21.16b,v22.16b,v23.16b},[x1],#64
+ eor v4.16b,v4.16b,v20.16b
+ eor v5.16b,v5.16b,v21.16b
+ eor v6.16b,v6.16b,v22.16b
+ eor v7.16b,v7.16b,v23.16b
+ st1 {v4.16b,v5.16b,v6.16b,v7.16b},[x0],#64
+ b.eq .Ldone_neon
+ sub x2,x2,#64
+
+ st1 {v16.16b,v17.16b,v18.16b,v19.16b},[sp]
+ b .Last_neon
+
+.Less_than_128:
+ st1 {v0.16b,v1.16b,v2.16b,v3.16b},[sp]
+ b .Last_neon
+.Less_than_192:
+ st1 {v4.16b,v5.16b,v6.16b,v7.16b},[sp]
+ b .Last_neon
+
+.align 4
+.Last_neon:
+ sub x0,x0,#1
+ add x1,x1,x2
+ add x0,x0,x2
+ add x4,sp,x2
+ neg x2,x2
+
+.Loop_tail_neon:
+ ldrb w10,[x1,x2]
+ ldrb w11,[x4,x2]
+ add x2,x2,#1
+ eor w10,w10,w11
+ strb w10,[x0,x2]
+ cbnz x2,.Loop_tail_neon
+
+ stp xzr,xzr,[sp,#0]
+ stp xzr,xzr,[sp,#16]
+ stp xzr,xzr,[sp,#32]
+ stp xzr,xzr,[sp,#48]
+
+.Ldone_neon:
+ ldp x19,x20,[x29,#16]
+ add sp,sp,#64
+ ldp x21,x22,[x29,#32]
+ ldp x23,x24,[x29,#48]
+ ldp x25,x26,[x29,#64]
+ ldp x27,x28,[x29,#80]
+ ldp x29,x30,[sp],#96
+ ret
+
+.L512_or_more_neon:
+ sub sp,sp,#128+64
+
+ ldp x22,x23,[x5] // load sigma
+ ld1 {v24.4s},[x5],#16
+ ldp x24,x25,[x3] // load key
+ ldp x26,x27,[x3,#16]
+ ld1 {v25.4s,v26.4s},[x3]
+ ldp x28,x30,[x4] // load counter
+ ld1 {v27.4s},[x4]
+ ld1 {v31.4s},[x5]
+#ifdef __ARMEB__
+ rev64 v24.4s,v24.4s
+ ror x24,x24,#32
+ ror x25,x25,#32
+ ror x26,x26,#32
+ ror x27,x27,#32
+ ror x28,x28,#32
+ ror x30,x30,#32
+#endif
+ add v27.4s,v27.4s,v31.4s // += 1
+ stp q24,q25,[sp,#0] // off-load key block, invariant part
+ add v27.4s,v27.4s,v31.4s // not typo
+ str q26,[sp,#32]
+ add v28.4s,v27.4s,v31.4s
+ add v29.4s,v28.4s,v31.4s
+ add v30.4s,v29.4s,v31.4s
+ shl v31.4s,v31.4s,#2 // 1 -> 4
+
+ stp d8,d9,[sp,#128+0] // meet ABI requirements
+ stp d10,d11,[sp,#128+16]
+ stp d12,d13,[sp,#128+32]
+ stp d14,d15,[sp,#128+48]
+
+ sub x2,x2,#512 // not typo
+
+.Loop_outer_512_neon:
+ mov v0.16b,v24.16b
+ mov v4.16b,v24.16b
+ mov v8.16b,v24.16b
+ mov v12.16b,v24.16b
+ mov v16.16b,v24.16b
+ mov v20.16b,v24.16b
+ mov v1.16b,v25.16b
+ mov w5,w22 // unpack key block
+ mov v5.16b,v25.16b
+ lsr x6,x22,#32
+ mov v9.16b,v25.16b
+ mov w7,w23
+ mov v13.16b,v25.16b
+ lsr x8,x23,#32
+ mov v17.16b,v25.16b
+ mov w9,w24
+ mov v21.16b,v25.16b
+ lsr x10,x24,#32
+ mov v3.16b,v27.16b
+ mov w11,w25
+ mov v7.16b,v28.16b
+ lsr x12,x25,#32
+ mov v11.16b,v29.16b
+ mov w13,w26
+ mov v15.16b,v30.16b
+ lsr x14,x26,#32
+ mov v2.16b,v26.16b
+ mov w15,w27
+ mov v6.16b,v26.16b
+ lsr x16,x27,#32
+ add v19.4s,v3.4s,v31.4s // +4
+ mov w17,w28
+ add v23.4s,v7.4s,v31.4s // +4
+ lsr x19,x28,#32
+ mov v10.16b,v26.16b
+ mov w20,w30
+ mov v14.16b,v26.16b
+ lsr x21,x30,#32
+ mov v18.16b,v26.16b
+ stp q27,q28,[sp,#48] // off-load key block, variable part
+ mov v22.16b,v26.16b
+ str q29,[sp,#80]
+
+ mov x4,#5
+ subs x2,x2,#512
+.Loop_upper_neon:
+ sub x4,x4,#1
+ add v0.4s,v0.4s,v1.4s
+ add w5,w5,w9
+ add v4.4s,v4.4s,v5.4s
+ add w6,w6,w10
+ add v8.4s,v8.4s,v9.4s
+ add w7,w7,w11
+ add v12.4s,v12.4s,v13.4s
+ add w8,w8,w12
+ add v16.4s,v16.4s,v17.4s
+ eor w17,w17,w5
+ add v20.4s,v20.4s,v21.4s
+ eor w19,w19,w6
+ eor v3.16b,v3.16b,v0.16b
+ eor w20,w20,w7
+ eor v7.16b,v7.16b,v4.16b
+ eor w21,w21,w8
+ eor v11.16b,v11.16b,v8.16b
+ ror w17,w17,#16
+ eor v15.16b,v15.16b,v12.16b
+ ror w19,w19,#16
+ eor v19.16b,v19.16b,v16.16b
+ ror w20,w20,#16
+ eor v23.16b,v23.16b,v20.16b
+ ror w21,w21,#16
+ rev32 v3.8h,v3.8h
+ add w13,w13,w17
+ rev32 v7.8h,v7.8h
+ add w14,w14,w19
+ rev32 v11.8h,v11.8h
+ add w15,w15,w20
+ rev32 v15.8h,v15.8h
+ add w16,w16,w21
+ rev32 v19.8h,v19.8h
+ eor w9,w9,w13
+ rev32 v23.8h,v23.8h
+ eor w10,w10,w14
+ add v2.4s,v2.4s,v3.4s
+ eor w11,w11,w15
+ add v6.4s,v6.4s,v7.4s
+ eor w12,w12,w16
+ add v10.4s,v10.4s,v11.4s
+ ror w9,w9,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w10,w10,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w11,w11,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w12,w12,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w9
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w10
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w11
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w12
+ eor v28.16b,v17.16b,v18.16b
+ eor w17,w17,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w19,w19,w6
+ ushr v1.4s,v24.4s,#20
+ eor w20,w20,w7
+ ushr v5.4s,v25.4s,#20
+ eor w21,w21,w8
+ ushr v9.4s,v26.4s,#20
+ ror w17,w17,#24
+ ushr v13.4s,v27.4s,#20
+ ror w19,w19,#24
+ ushr v17.4s,v28.4s,#20
+ ror w20,w20,#24
+ ushr v21.4s,v29.4s,#20
+ ror w21,w21,#24
+ sli v1.4s,v24.4s,#12
+ add w13,w13,w17
+ sli v5.4s,v25.4s,#12
+ add w14,w14,w19
+ sli v9.4s,v26.4s,#12
+ add w15,w15,w20
+ sli v13.4s,v27.4s,#12
+ add w16,w16,w21
+ sli v17.4s,v28.4s,#12
+ eor w9,w9,w13
+ sli v21.4s,v29.4s,#12
+ eor w10,w10,w14
+ add v0.4s,v0.4s,v1.4s
+ eor w11,w11,w15
+ add v4.4s,v4.4s,v5.4s
+ eor w12,w12,w16
+ add v8.4s,v8.4s,v9.4s
+ ror w9,w9,#25
+ add v12.4s,v12.4s,v13.4s
+ ror w10,w10,#25
+ add v16.4s,v16.4s,v17.4s
+ ror w11,w11,#25
+ add v20.4s,v20.4s,v21.4s
+ ror w12,w12,#25
+ eor v24.16b,v3.16b,v0.16b
+ add w5,w5,w10
+ eor v25.16b,v7.16b,v4.16b
+ add w6,w6,w11
+ eor v26.16b,v11.16b,v8.16b
+ add w7,w7,w12
+ eor v27.16b,v15.16b,v12.16b
+ add w8,w8,w9
+ eor v28.16b,v19.16b,v16.16b
+ eor w21,w21,w5
+ eor v29.16b,v23.16b,v20.16b
+ eor w17,w17,w6
+ ushr v3.4s,v24.4s,#24
+ eor w19,w19,w7
+ ushr v7.4s,v25.4s,#24
+ eor w20,w20,w8
+ ushr v11.4s,v26.4s,#24
+ ror w21,w21,#16
+ ushr v15.4s,v27.4s,#24
+ ror w17,w17,#16
+ ushr v19.4s,v28.4s,#24
+ ror w19,w19,#16
+ ushr v23.4s,v29.4s,#24
+ ror w20,w20,#16
+ sli v3.4s,v24.4s,#8
+ add w15,w15,w21
+ sli v7.4s,v25.4s,#8
+ add w16,w16,w17
+ sli v11.4s,v26.4s,#8
+ add w13,w13,w19
+ sli v15.4s,v27.4s,#8
+ add w14,w14,w20
+ sli v19.4s,v28.4s,#8
+ eor w10,w10,w15
+ sli v23.4s,v29.4s,#8
+ eor w11,w11,w16
+ add v2.4s,v2.4s,v3.4s
+ eor w12,w12,w13
+ add v6.4s,v6.4s,v7.4s
+ eor w9,w9,w14
+ add v10.4s,v10.4s,v11.4s
+ ror w10,w10,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w11,w11,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w12,w12,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w9,w9,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w10
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w11
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w12
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w9
+ eor v28.16b,v17.16b,v18.16b
+ eor w21,w21,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w17,w17,w6
+ ushr v1.4s,v24.4s,#25
+ eor w19,w19,w7
+ ushr v5.4s,v25.4s,#25
+ eor w20,w20,w8
+ ushr v9.4s,v26.4s,#25
+ ror w21,w21,#24
+ ushr v13.4s,v27.4s,#25
+ ror w17,w17,#24
+ ushr v17.4s,v28.4s,#25
+ ror w19,w19,#24
+ ushr v21.4s,v29.4s,#25
+ ror w20,w20,#24
+ sli v1.4s,v24.4s,#7
+ add w15,w15,w21
+ sli v5.4s,v25.4s,#7
+ add w16,w16,w17
+ sli v9.4s,v26.4s,#7
+ add w13,w13,w19
+ sli v13.4s,v27.4s,#7
+ add w14,w14,w20
+ sli v17.4s,v28.4s,#7
+ eor w10,w10,w15
+ sli v21.4s,v29.4s,#7
+ eor w11,w11,w16
+ ext v2.16b,v2.16b,v2.16b,#8
+ eor w12,w12,w13
+ ext v6.16b,v6.16b,v6.16b,#8
+ eor w9,w9,w14
+ ext v10.16b,v10.16b,v10.16b,#8
+ ror w10,w10,#25
+ ext v14.16b,v14.16b,v14.16b,#8
+ ror w11,w11,#25
+ ext v18.16b,v18.16b,v18.16b,#8
+ ror w12,w12,#25
+ ext v22.16b,v22.16b,v22.16b,#8
+ ror w9,w9,#25
+ ext v3.16b,v3.16b,v3.16b,#12
+ ext v7.16b,v7.16b,v7.16b,#12
+ ext v11.16b,v11.16b,v11.16b,#12
+ ext v15.16b,v15.16b,v15.16b,#12
+ ext v19.16b,v19.16b,v19.16b,#12
+ ext v23.16b,v23.16b,v23.16b,#12
+ ext v1.16b,v1.16b,v1.16b,#4
+ ext v5.16b,v5.16b,v5.16b,#4
+ ext v9.16b,v9.16b,v9.16b,#4
+ ext v13.16b,v13.16b,v13.16b,#4
+ ext v17.16b,v17.16b,v17.16b,#4
+ ext v21.16b,v21.16b,v21.16b,#4
+ add v0.4s,v0.4s,v1.4s
+ add w5,w5,w9
+ add v4.4s,v4.4s,v5.4s
+ add w6,w6,w10
+ add v8.4s,v8.4s,v9.4s
+ add w7,w7,w11
+ add v12.4s,v12.4s,v13.4s
+ add w8,w8,w12
+ add v16.4s,v16.4s,v17.4s
+ eor w17,w17,w5
+ add v20.4s,v20.4s,v21.4s
+ eor w19,w19,w6
+ eor v3.16b,v3.16b,v0.16b
+ eor w20,w20,w7
+ eor v7.16b,v7.16b,v4.16b
+ eor w21,w21,w8
+ eor v11.16b,v11.16b,v8.16b
+ ror w17,w17,#16
+ eor v15.16b,v15.16b,v12.16b
+ ror w19,w19,#16
+ eor v19.16b,v19.16b,v16.16b
+ ror w20,w20,#16
+ eor v23.16b,v23.16b,v20.16b
+ ror w21,w21,#16
+ rev32 v3.8h,v3.8h
+ add w13,w13,w17
+ rev32 v7.8h,v7.8h
+ add w14,w14,w19
+ rev32 v11.8h,v11.8h
+ add w15,w15,w20
+ rev32 v15.8h,v15.8h
+ add w16,w16,w21
+ rev32 v19.8h,v19.8h
+ eor w9,w9,w13
+ rev32 v23.8h,v23.8h
+ eor w10,w10,w14
+ add v2.4s,v2.4s,v3.4s
+ eor w11,w11,w15
+ add v6.4s,v6.4s,v7.4s
+ eor w12,w12,w16
+ add v10.4s,v10.4s,v11.4s
+ ror w9,w9,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w10,w10,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w11,w11,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w12,w12,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w9
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w10
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w11
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w12
+ eor v28.16b,v17.16b,v18.16b
+ eor w17,w17,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w19,w19,w6
+ ushr v1.4s,v24.4s,#20
+ eor w20,w20,w7
+ ushr v5.4s,v25.4s,#20
+ eor w21,w21,w8
+ ushr v9.4s,v26.4s,#20
+ ror w17,w17,#24
+ ushr v13.4s,v27.4s,#20
+ ror w19,w19,#24
+ ushr v17.4s,v28.4s,#20
+ ror w20,w20,#24
+ ushr v21.4s,v29.4s,#20
+ ror w21,w21,#24
+ sli v1.4s,v24.4s,#12
+ add w13,w13,w17
+ sli v5.4s,v25.4s,#12
+ add w14,w14,w19
+ sli v9.4s,v26.4s,#12
+ add w15,w15,w20
+ sli v13.4s,v27.4s,#12
+ add w16,w16,w21
+ sli v17.4s,v28.4s,#12
+ eor w9,w9,w13
+ sli v21.4s,v29.4s,#12
+ eor w10,w10,w14
+ add v0.4s,v0.4s,v1.4s
+ eor w11,w11,w15
+ add v4.4s,v4.4s,v5.4s
+ eor w12,w12,w16
+ add v8.4s,v8.4s,v9.4s
+ ror w9,w9,#25
+ add v12.4s,v12.4s,v13.4s
+ ror w10,w10,#25
+ add v16.4s,v16.4s,v17.4s
+ ror w11,w11,#25
+ add v20.4s,v20.4s,v21.4s
+ ror w12,w12,#25
+ eor v24.16b,v3.16b,v0.16b
+ add w5,w5,w10
+ eor v25.16b,v7.16b,v4.16b
+ add w6,w6,w11
+ eor v26.16b,v11.16b,v8.16b
+ add w7,w7,w12
+ eor v27.16b,v15.16b,v12.16b
+ add w8,w8,w9
+ eor v28.16b,v19.16b,v16.16b
+ eor w21,w21,w5
+ eor v29.16b,v23.16b,v20.16b
+ eor w17,w17,w6
+ ushr v3.4s,v24.4s,#24
+ eor w19,w19,w7
+ ushr v7.4s,v25.4s,#24
+ eor w20,w20,w8
+ ushr v11.4s,v26.4s,#24
+ ror w21,w21,#16
+ ushr v15.4s,v27.4s,#24
+ ror w17,w17,#16
+ ushr v19.4s,v28.4s,#24
+ ror w19,w19,#16
+ ushr v23.4s,v29.4s,#24
+ ror w20,w20,#16
+ sli v3.4s,v24.4s,#8
+ add w15,w15,w21
+ sli v7.4s,v25.4s,#8
+ add w16,w16,w17
+ sli v11.4s,v26.4s,#8
+ add w13,w13,w19
+ sli v15.4s,v27.4s,#8
+ add w14,w14,w20
+ sli v19.4s,v28.4s,#8
+ eor w10,w10,w15
+ sli v23.4s,v29.4s,#8
+ eor w11,w11,w16
+ add v2.4s,v2.4s,v3.4s
+ eor w12,w12,w13
+ add v6.4s,v6.4s,v7.4s
+ eor w9,w9,w14
+ add v10.4s,v10.4s,v11.4s
+ ror w10,w10,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w11,w11,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w12,w12,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w9,w9,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w10
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w11
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w12
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w9
+ eor v28.16b,v17.16b,v18.16b
+ eor w21,w21,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w17,w17,w6
+ ushr v1.4s,v24.4s,#25
+ eor w19,w19,w7
+ ushr v5.4s,v25.4s,#25
+ eor w20,w20,w8
+ ushr v9.4s,v26.4s,#25
+ ror w21,w21,#24
+ ushr v13.4s,v27.4s,#25
+ ror w17,w17,#24
+ ushr v17.4s,v28.4s,#25
+ ror w19,w19,#24
+ ushr v21.4s,v29.4s,#25
+ ror w20,w20,#24
+ sli v1.4s,v24.4s,#7
+ add w15,w15,w21
+ sli v5.4s,v25.4s,#7
+ add w16,w16,w17
+ sli v9.4s,v26.4s,#7
+ add w13,w13,w19
+ sli v13.4s,v27.4s,#7
+ add w14,w14,w20
+ sli v17.4s,v28.4s,#7
+ eor w10,w10,w15
+ sli v21.4s,v29.4s,#7
+ eor w11,w11,w16
+ ext v2.16b,v2.16b,v2.16b,#8
+ eor w12,w12,w13
+ ext v6.16b,v6.16b,v6.16b,#8
+ eor w9,w9,w14
+ ext v10.16b,v10.16b,v10.16b,#8
+ ror w10,w10,#25
+ ext v14.16b,v14.16b,v14.16b,#8
+ ror w11,w11,#25
+ ext v18.16b,v18.16b,v18.16b,#8
+ ror w12,w12,#25
+ ext v22.16b,v22.16b,v22.16b,#8
+ ror w9,w9,#25
+ ext v3.16b,v3.16b,v3.16b,#4
+ ext v7.16b,v7.16b,v7.16b,#4
+ ext v11.16b,v11.16b,v11.16b,#4
+ ext v15.16b,v15.16b,v15.16b,#4
+ ext v19.16b,v19.16b,v19.16b,#4
+ ext v23.16b,v23.16b,v23.16b,#4
+ ext v1.16b,v1.16b,v1.16b,#12
+ ext v5.16b,v5.16b,v5.16b,#12
+ ext v9.16b,v9.16b,v9.16b,#12
+ ext v13.16b,v13.16b,v13.16b,#12
+ ext v17.16b,v17.16b,v17.16b,#12
+ ext v21.16b,v21.16b,v21.16b,#12
+ cbnz x4,.Loop_upper_neon
+
+ add w5,w5,w22 // accumulate key block
+ add x6,x6,x22,lsr#32
+ add w7,w7,w23
+ add x8,x8,x23,lsr#32
+ add w9,w9,w24
+ add x10,x10,x24,lsr#32
+ add w11,w11,w25
+ add x12,x12,x25,lsr#32
+ add w13,w13,w26
+ add x14,x14,x26,lsr#32
+ add w15,w15,w27
+ add x16,x16,x27,lsr#32
+ add w17,w17,w28
+ add x19,x19,x28,lsr#32
+ add w20,w20,w30
+ add x21,x21,x30,lsr#32
+
+ add x5,x5,x6,lsl#32 // pack
+ add x7,x7,x8,lsl#32
+ ldp x6,x8,[x1,#0] // load input
+ add x9,x9,x10,lsl#32
+ add x11,x11,x12,lsl#32
+ ldp x10,x12,[x1,#16]
+ add x13,x13,x14,lsl#32
+ add x15,x15,x16,lsl#32
+ ldp x14,x16,[x1,#32]
+ add x17,x17,x19,lsl#32
+ add x20,x20,x21,lsl#32
+ ldp x19,x21,[x1,#48]
+ add x1,x1,#64
+#ifdef __ARMEB__
+ rev x5,x5
+ rev x7,x7
+ rev x9,x9
+ rev x11,x11
+ rev x13,x13
+ rev x15,x15
+ rev x17,x17
+ rev x20,x20
+#endif
+ eor x5,x5,x6
+ eor x7,x7,x8
+ eor x9,x9,x10
+ eor x11,x11,x12
+ eor x13,x13,x14
+ eor x15,x15,x16
+ eor x17,x17,x19
+ eor x20,x20,x21
+
+ stp x5,x7,[x0,#0] // store output
+ add x28,x28,#1 // increment counter
+ mov w5,w22 // unpack key block
+ lsr x6,x22,#32
+ stp x9,x11,[x0,#16]
+ mov w7,w23
+ lsr x8,x23,#32
+ stp x13,x15,[x0,#32]
+ mov w9,w24
+ lsr x10,x24,#32
+ stp x17,x20,[x0,#48]
+ add x0,x0,#64
+ mov w11,w25
+ lsr x12,x25,#32
+ mov w13,w26
+ lsr x14,x26,#32
+ mov w15,w27
+ lsr x16,x27,#32
+ mov w17,w28
+ lsr x19,x28,#32
+ mov w20,w30
+ lsr x21,x30,#32
+
+ mov x4,#5
+.Loop_lower_neon:
+ sub x4,x4,#1
+ add v0.4s,v0.4s,v1.4s
+ add w5,w5,w9
+ add v4.4s,v4.4s,v5.4s
+ add w6,w6,w10
+ add v8.4s,v8.4s,v9.4s
+ add w7,w7,w11
+ add v12.4s,v12.4s,v13.4s
+ add w8,w8,w12
+ add v16.4s,v16.4s,v17.4s
+ eor w17,w17,w5
+ add v20.4s,v20.4s,v21.4s
+ eor w19,w19,w6
+ eor v3.16b,v3.16b,v0.16b
+ eor w20,w20,w7
+ eor v7.16b,v7.16b,v4.16b
+ eor w21,w21,w8
+ eor v11.16b,v11.16b,v8.16b
+ ror w17,w17,#16
+ eor v15.16b,v15.16b,v12.16b
+ ror w19,w19,#16
+ eor v19.16b,v19.16b,v16.16b
+ ror w20,w20,#16
+ eor v23.16b,v23.16b,v20.16b
+ ror w21,w21,#16
+ rev32 v3.8h,v3.8h
+ add w13,w13,w17
+ rev32 v7.8h,v7.8h
+ add w14,w14,w19
+ rev32 v11.8h,v11.8h
+ add w15,w15,w20
+ rev32 v15.8h,v15.8h
+ add w16,w16,w21
+ rev32 v19.8h,v19.8h
+ eor w9,w9,w13
+ rev32 v23.8h,v23.8h
+ eor w10,w10,w14
+ add v2.4s,v2.4s,v3.4s
+ eor w11,w11,w15
+ add v6.4s,v6.4s,v7.4s
+ eor w12,w12,w16
+ add v10.4s,v10.4s,v11.4s
+ ror w9,w9,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w10,w10,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w11,w11,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w12,w12,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w9
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w10
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w11
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w12
+ eor v28.16b,v17.16b,v18.16b
+ eor w17,w17,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w19,w19,w6
+ ushr v1.4s,v24.4s,#20
+ eor w20,w20,w7
+ ushr v5.4s,v25.4s,#20
+ eor w21,w21,w8
+ ushr v9.4s,v26.4s,#20
+ ror w17,w17,#24
+ ushr v13.4s,v27.4s,#20
+ ror w19,w19,#24
+ ushr v17.4s,v28.4s,#20
+ ror w20,w20,#24
+ ushr v21.4s,v29.4s,#20
+ ror w21,w21,#24
+ sli v1.4s,v24.4s,#12
+ add w13,w13,w17
+ sli v5.4s,v25.4s,#12
+ add w14,w14,w19
+ sli v9.4s,v26.4s,#12
+ add w15,w15,w20
+ sli v13.4s,v27.4s,#12
+ add w16,w16,w21
+ sli v17.4s,v28.4s,#12
+ eor w9,w9,w13
+ sli v21.4s,v29.4s,#12
+ eor w10,w10,w14
+ add v0.4s,v0.4s,v1.4s
+ eor w11,w11,w15
+ add v4.4s,v4.4s,v5.4s
+ eor w12,w12,w16
+ add v8.4s,v8.4s,v9.4s
+ ror w9,w9,#25
+ add v12.4s,v12.4s,v13.4s
+ ror w10,w10,#25
+ add v16.4s,v16.4s,v17.4s
+ ror w11,w11,#25
+ add v20.4s,v20.4s,v21.4s
+ ror w12,w12,#25
+ eor v24.16b,v3.16b,v0.16b
+ add w5,w5,w10
+ eor v25.16b,v7.16b,v4.16b
+ add w6,w6,w11
+ eor v26.16b,v11.16b,v8.16b
+ add w7,w7,w12
+ eor v27.16b,v15.16b,v12.16b
+ add w8,w8,w9
+ eor v28.16b,v19.16b,v16.16b
+ eor w21,w21,w5
+ eor v29.16b,v23.16b,v20.16b
+ eor w17,w17,w6
+ ushr v3.4s,v24.4s,#24
+ eor w19,w19,w7
+ ushr v7.4s,v25.4s,#24
+ eor w20,w20,w8
+ ushr v11.4s,v26.4s,#24
+ ror w21,w21,#16
+ ushr v15.4s,v27.4s,#24
+ ror w17,w17,#16
+ ushr v19.4s,v28.4s,#24
+ ror w19,w19,#16
+ ushr v23.4s,v29.4s,#24
+ ror w20,w20,#16
+ sli v3.4s,v24.4s,#8
+ add w15,w15,w21
+ sli v7.4s,v25.4s,#8
+ add w16,w16,w17
+ sli v11.4s,v26.4s,#8
+ add w13,w13,w19
+ sli v15.4s,v27.4s,#8
+ add w14,w14,w20
+ sli v19.4s,v28.4s,#8
+ eor w10,w10,w15
+ sli v23.4s,v29.4s,#8
+ eor w11,w11,w16
+ add v2.4s,v2.4s,v3.4s
+ eor w12,w12,w13
+ add v6.4s,v6.4s,v7.4s
+ eor w9,w9,w14
+ add v10.4s,v10.4s,v11.4s
+ ror w10,w10,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w11,w11,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w12,w12,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w9,w9,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w10
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w11
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w12
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w9
+ eor v28.16b,v17.16b,v18.16b
+ eor w21,w21,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w17,w17,w6
+ ushr v1.4s,v24.4s,#25
+ eor w19,w19,w7
+ ushr v5.4s,v25.4s,#25
+ eor w20,w20,w8
+ ushr v9.4s,v26.4s,#25
+ ror w21,w21,#24
+ ushr v13.4s,v27.4s,#25
+ ror w17,w17,#24
+ ushr v17.4s,v28.4s,#25
+ ror w19,w19,#24
+ ushr v21.4s,v29.4s,#25
+ ror w20,w20,#24
+ sli v1.4s,v24.4s,#7
+ add w15,w15,w21
+ sli v5.4s,v25.4s,#7
+ add w16,w16,w17
+ sli v9.4s,v26.4s,#7
+ add w13,w13,w19
+ sli v13.4s,v27.4s,#7
+ add w14,w14,w20
+ sli v17.4s,v28.4s,#7
+ eor w10,w10,w15
+ sli v21.4s,v29.4s,#7
+ eor w11,w11,w16
+ ext v2.16b,v2.16b,v2.16b,#8
+ eor w12,w12,w13
+ ext v6.16b,v6.16b,v6.16b,#8
+ eor w9,w9,w14
+ ext v10.16b,v10.16b,v10.16b,#8
+ ror w10,w10,#25
+ ext v14.16b,v14.16b,v14.16b,#8
+ ror w11,w11,#25
+ ext v18.16b,v18.16b,v18.16b,#8
+ ror w12,w12,#25
+ ext v22.16b,v22.16b,v22.16b,#8
+ ror w9,w9,#25
+ ext v3.16b,v3.16b,v3.16b,#12
+ ext v7.16b,v7.16b,v7.16b,#12
+ ext v11.16b,v11.16b,v11.16b,#12
+ ext v15.16b,v15.16b,v15.16b,#12
+ ext v19.16b,v19.16b,v19.16b,#12
+ ext v23.16b,v23.16b,v23.16b,#12
+ ext v1.16b,v1.16b,v1.16b,#4
+ ext v5.16b,v5.16b,v5.16b,#4
+ ext v9.16b,v9.16b,v9.16b,#4
+ ext v13.16b,v13.16b,v13.16b,#4
+ ext v17.16b,v17.16b,v17.16b,#4
+ ext v21.16b,v21.16b,v21.16b,#4
+ add v0.4s,v0.4s,v1.4s
+ add w5,w5,w9
+ add v4.4s,v4.4s,v5.4s
+ add w6,w6,w10
+ add v8.4s,v8.4s,v9.4s
+ add w7,w7,w11
+ add v12.4s,v12.4s,v13.4s
+ add w8,w8,w12
+ add v16.4s,v16.4s,v17.4s
+ eor w17,w17,w5
+ add v20.4s,v20.4s,v21.4s
+ eor w19,w19,w6
+ eor v3.16b,v3.16b,v0.16b
+ eor w20,w20,w7
+ eor v7.16b,v7.16b,v4.16b
+ eor w21,w21,w8
+ eor v11.16b,v11.16b,v8.16b
+ ror w17,w17,#16
+ eor v15.16b,v15.16b,v12.16b
+ ror w19,w19,#16
+ eor v19.16b,v19.16b,v16.16b
+ ror w20,w20,#16
+ eor v23.16b,v23.16b,v20.16b
+ ror w21,w21,#16
+ rev32 v3.8h,v3.8h
+ add w13,w13,w17
+ rev32 v7.8h,v7.8h
+ add w14,w14,w19
+ rev32 v11.8h,v11.8h
+ add w15,w15,w20
+ rev32 v15.8h,v15.8h
+ add w16,w16,w21
+ rev32 v19.8h,v19.8h
+ eor w9,w9,w13
+ rev32 v23.8h,v23.8h
+ eor w10,w10,w14
+ add v2.4s,v2.4s,v3.4s
+ eor w11,w11,w15
+ add v6.4s,v6.4s,v7.4s
+ eor w12,w12,w16
+ add v10.4s,v10.4s,v11.4s
+ ror w9,w9,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w10,w10,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w11,w11,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w12,w12,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w9
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w10
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w11
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w12
+ eor v28.16b,v17.16b,v18.16b
+ eor w17,w17,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w19,w19,w6
+ ushr v1.4s,v24.4s,#20
+ eor w20,w20,w7
+ ushr v5.4s,v25.4s,#20
+ eor w21,w21,w8
+ ushr v9.4s,v26.4s,#20
+ ror w17,w17,#24
+ ushr v13.4s,v27.4s,#20
+ ror w19,w19,#24
+ ushr v17.4s,v28.4s,#20
+ ror w20,w20,#24
+ ushr v21.4s,v29.4s,#20
+ ror w21,w21,#24
+ sli v1.4s,v24.4s,#12
+ add w13,w13,w17
+ sli v5.4s,v25.4s,#12
+ add w14,w14,w19
+ sli v9.4s,v26.4s,#12
+ add w15,w15,w20
+ sli v13.4s,v27.4s,#12
+ add w16,w16,w21
+ sli v17.4s,v28.4s,#12
+ eor w9,w9,w13
+ sli v21.4s,v29.4s,#12
+ eor w10,w10,w14
+ add v0.4s,v0.4s,v1.4s
+ eor w11,w11,w15
+ add v4.4s,v4.4s,v5.4s
+ eor w12,w12,w16
+ add v8.4s,v8.4s,v9.4s
+ ror w9,w9,#25
+ add v12.4s,v12.4s,v13.4s
+ ror w10,w10,#25
+ add v16.4s,v16.4s,v17.4s
+ ror w11,w11,#25
+ add v20.4s,v20.4s,v21.4s
+ ror w12,w12,#25
+ eor v24.16b,v3.16b,v0.16b
+ add w5,w5,w10
+ eor v25.16b,v7.16b,v4.16b
+ add w6,w6,w11
+ eor v26.16b,v11.16b,v8.16b
+ add w7,w7,w12
+ eor v27.16b,v15.16b,v12.16b
+ add w8,w8,w9
+ eor v28.16b,v19.16b,v16.16b
+ eor w21,w21,w5
+ eor v29.16b,v23.16b,v20.16b
+ eor w17,w17,w6
+ ushr v3.4s,v24.4s,#24
+ eor w19,w19,w7
+ ushr v7.4s,v25.4s,#24
+ eor w20,w20,w8
+ ushr v11.4s,v26.4s,#24
+ ror w21,w21,#16
+ ushr v15.4s,v27.4s,#24
+ ror w17,w17,#16
+ ushr v19.4s,v28.4s,#24
+ ror w19,w19,#16
+ ushr v23.4s,v29.4s,#24
+ ror w20,w20,#16
+ sli v3.4s,v24.4s,#8
+ add w15,w15,w21
+ sli v7.4s,v25.4s,#8
+ add w16,w16,w17
+ sli v11.4s,v26.4s,#8
+ add w13,w13,w19
+ sli v15.4s,v27.4s,#8
+ add w14,w14,w20
+ sli v19.4s,v28.4s,#8
+ eor w10,w10,w15
+ sli v23.4s,v29.4s,#8
+ eor w11,w11,w16
+ add v2.4s,v2.4s,v3.4s
+ eor w12,w12,w13
+ add v6.4s,v6.4s,v7.4s
+ eor w9,w9,w14
+ add v10.4s,v10.4s,v11.4s
+ ror w10,w10,#20
+ add v14.4s,v14.4s,v15.4s
+ ror w11,w11,#20
+ add v18.4s,v18.4s,v19.4s
+ ror w12,w12,#20
+ add v22.4s,v22.4s,v23.4s
+ ror w9,w9,#20
+ eor v24.16b,v1.16b,v2.16b
+ add w5,w5,w10
+ eor v25.16b,v5.16b,v6.16b
+ add w6,w6,w11
+ eor v26.16b,v9.16b,v10.16b
+ add w7,w7,w12
+ eor v27.16b,v13.16b,v14.16b
+ add w8,w8,w9
+ eor v28.16b,v17.16b,v18.16b
+ eor w21,w21,w5
+ eor v29.16b,v21.16b,v22.16b
+ eor w17,w17,w6
+ ushr v1.4s,v24.4s,#25
+ eor w19,w19,w7
+ ushr v5.4s,v25.4s,#25
+ eor w20,w20,w8
+ ushr v9.4s,v26.4s,#25
+ ror w21,w21,#24
+ ushr v13.4s,v27.4s,#25
+ ror w17,w17,#24
+ ushr v17.4s,v28.4s,#25
+ ror w19,w19,#24
+ ushr v21.4s,v29.4s,#25
+ ror w20,w20,#24
+ sli v1.4s,v24.4s,#7
+ add w15,w15,w21
+ sli v5.4s,v25.4s,#7
+ add w16,w16,w17
+ sli v9.4s,v26.4s,#7
+ add w13,w13,w19
+ sli v13.4s,v27.4s,#7
+ add w14,w14,w20
+ sli v17.4s,v28.4s,#7
+ eor w10,w10,w15
+ sli v21.4s,v29.4s,#7
+ eor w11,w11,w16
+ ext v2.16b,v2.16b,v2.16b,#8
+ eor w12,w12,w13
+ ext v6.16b,v6.16b,v6.16b,#8
+ eor w9,w9,w14
+ ext v10.16b,v10.16b,v10.16b,#8
+ ror w10,w10,#25
+ ext v14.16b,v14.16b,v14.16b,#8
+ ror w11,w11,#25
+ ext v18.16b,v18.16b,v18.16b,#8
+ ror w12,w12,#25
+ ext v22.16b,v22.16b,v22.16b,#8
+ ror w9,w9,#25
+ ext v3.16b,v3.16b,v3.16b,#4
+ ext v7.16b,v7.16b,v7.16b,#4
+ ext v11.16b,v11.16b,v11.16b,#4
+ ext v15.16b,v15.16b,v15.16b,#4
+ ext v19.16b,v19.16b,v19.16b,#4
+ ext v23.16b,v23.16b,v23.16b,#4
+ ext v1.16b,v1.16b,v1.16b,#12
+ ext v5.16b,v5.16b,v5.16b,#12
+ ext v9.16b,v9.16b,v9.16b,#12
+ ext v13.16b,v13.16b,v13.16b,#12
+ ext v17.16b,v17.16b,v17.16b,#12
+ ext v21.16b,v21.16b,v21.16b,#12
+ cbnz x4,.Loop_lower_neon
+
+ add w5,w5,w22 // accumulate key block
+ ldp q24,q25,[sp,#0]
+ add x6,x6,x22,lsr#32
+ ldp q26,q27,[sp,#32]
+ add w7,w7,w23
+ ldp q28,q29,[sp,#64]
+ add x8,x8,x23,lsr#32
+ add v0.4s,v0.4s,v24.4s
+ add w9,w9,w24
+ add v4.4s,v4.4s,v24.4s
+ add x10,x10,x24,lsr#32
+ add v8.4s,v8.4s,v24.4s
+ add w11,w11,w25
+ add v12.4s,v12.4s,v24.4s
+ add x12,x12,x25,lsr#32
+ add v16.4s,v16.4s,v24.4s
+ add w13,w13,w26
+ add v20.4s,v20.4s,v24.4s
+ add x14,x14,x26,lsr#32
+ add v2.4s,v2.4s,v26.4s
+ add w15,w15,w27
+ add v6.4s,v6.4s,v26.4s
+ add x16,x16,x27,lsr#32
+ add v10.4s,v10.4s,v26.4s
+ add w17,w17,w28
+ add v14.4s,v14.4s,v26.4s
+ add x19,x19,x28,lsr#32
+ add v18.4s,v18.4s,v26.4s
+ add w20,w20,w30
+ add v22.4s,v22.4s,v26.4s
+ add x21,x21,x30,lsr#32
+ add v19.4s,v19.4s,v31.4s // +4
+ add x5,x5,x6,lsl#32 // pack
+ add v23.4s,v23.4s,v31.4s // +4
+ add x7,x7,x8,lsl#32
+ add v3.4s,v3.4s,v27.4s
+ ldp x6,x8,[x1,#0] // load input
+ add v7.4s,v7.4s,v28.4s
+ add x9,x9,x10,lsl#32
+ add v11.4s,v11.4s,v29.4s
+ add x11,x11,x12,lsl#32
+ add v15.4s,v15.4s,v30.4s
+ ldp x10,x12,[x1,#16]
+ add v19.4s,v19.4s,v27.4s
+ add x13,x13,x14,lsl#32
+ add v23.4s,v23.4s,v28.4s
+ add x15,x15,x16,lsl#32
+ add v1.4s,v1.4s,v25.4s
+ ldp x14,x16,[x1,#32]
+ add v5.4s,v5.4s,v25.4s
+ add x17,x17,x19,lsl#32
+ add v9.4s,v9.4s,v25.4s
+ add x20,x20,x21,lsl#32
+ add v13.4s,v13.4s,v25.4s
+ ldp x19,x21,[x1,#48]
+ add v17.4s,v17.4s,v25.4s
+ add x1,x1,#64
+ add v21.4s,v21.4s,v25.4s
+
+#ifdef __ARMEB__
+ rev x5,x5
+ rev x7,x7
+ rev x9,x9
+ rev x11,x11
+ rev x13,x13
+ rev x15,x15
+ rev x17,x17
+ rev x20,x20
+#endif
+ ld1 {v24.16b,v25.16b,v26.16b,v27.16b},[x1],#64
+ eor x5,x5,x6
+ eor x7,x7,x8
+ eor x9,x9,x10
+ eor x11,x11,x12
+ eor x13,x13,x14
+ eor v0.16b,v0.16b,v24.16b
+ eor x15,x15,x16
+ eor v1.16b,v1.16b,v25.16b
+ eor x17,x17,x19
+ eor v2.16b,v2.16b,v26.16b
+ eor x20,x20,x21
+ eor v3.16b,v3.16b,v27.16b
+ ld1 {v24.16b,v25.16b,v26.16b,v27.16b},[x1],#64
+
+ stp x5,x7,[x0,#0] // store output
+ add x28,x28,#7 // increment counter
+ stp x9,x11,[x0,#16]
+ stp x13,x15,[x0,#32]
+ stp x17,x20,[x0,#48]
+ add x0,x0,#64
+ st1 {v0.16b,v1.16b,v2.16b,v3.16b},[x0],#64
+
+ ld1 {v0.16b,v1.16b,v2.16b,v3.16b},[x1],#64
+ eor v4.16b,v4.16b,v24.16b
+ eor v5.16b,v5.16b,v25.16b
+ eor v6.16b,v6.16b,v26.16b
+ eor v7.16b,v7.16b,v27.16b
+ st1 {v4.16b,v5.16b,v6.16b,v7.16b},[x0],#64
+
+ ld1 {v4.16b,v5.16b,v6.16b,v7.16b},[x1],#64
+ eor v8.16b,v8.16b,v0.16b
+ ldp q24,q25,[sp,#0]
+ eor v9.16b,v9.16b,v1.16b
+ ldp q26,q27,[sp,#32]
+ eor v10.16b,v10.16b,v2.16b
+ eor v11.16b,v11.16b,v3.16b
+ st1 {v8.16b,v9.16b,v10.16b,v11.16b},[x0],#64
+
+ ld1 {v8.16b,v9.16b,v10.16b,v11.16b},[x1],#64
+ eor v12.16b,v12.16b,v4.16b
+ eor v13.16b,v13.16b,v5.16b
+ eor v14.16b,v14.16b,v6.16b
+ eor v15.16b,v15.16b,v7.16b
+ st1 {v12.16b,v13.16b,v14.16b,v15.16b},[x0],#64
+
+ ld1 {v12.16b,v13.16b,v14.16b,v15.16b},[x1],#64
+ eor v16.16b,v16.16b,v8.16b
+ eor v17.16b,v17.16b,v9.16b
+ eor v18.16b,v18.16b,v10.16b
+ eor v19.16b,v19.16b,v11.16b
+ st1 {v16.16b,v17.16b,v18.16b,v19.16b},[x0],#64
+
+ shl v0.4s,v31.4s,#1 // 4 -> 8
+ eor v20.16b,v20.16b,v12.16b
+ eor v21.16b,v21.16b,v13.16b
+ eor v22.16b,v22.16b,v14.16b
+ eor v23.16b,v23.16b,v15.16b
+ st1 {v20.16b,v21.16b,v22.16b,v23.16b},[x0],#64
+
+ add v27.4s,v27.4s,v0.4s // += 8
+ add v28.4s,v28.4s,v0.4s
+ add v29.4s,v29.4s,v0.4s
+ add v30.4s,v30.4s,v0.4s
+
+ b.hs .Loop_outer_512_neon
+
+ adds x2,x2,#512
+ ushr v0.4s,v31.4s,#2 // 4 -> 1
+
+ ldp d8,d9,[sp,#128+0] // meet ABI requirements
+ ldp d10,d11,[sp,#128+16]
+ ldp d12,d13,[sp,#128+32]
+ ldp d14,d15,[sp,#128+48]
+
+ stp q24,q31,[sp,#0] // wipe off-load area
+ stp q24,q31,[sp,#32]
+ stp q24,q31,[sp,#64]
+
+ b.eq .Ldone_512_neon
+
+ cmp x2,#192
+ sub v27.4s,v27.4s,v0.4s // -= 1
+ sub v28.4s,v28.4s,v0.4s
+ sub v29.4s,v29.4s,v0.4s
+ add sp,sp,#128
+ b.hs .Loop_outer_neon
+
+ eor v25.16b,v25.16b,v25.16b
+ eor v26.16b,v26.16b,v26.16b
+ eor v27.16b,v27.16b,v27.16b
+ eor v28.16b,v28.16b,v28.16b
+ eor v29.16b,v29.16b,v29.16b
+ eor v30.16b,v30.16b,v30.16b
+ b .Loop_outer
+
+.Ldone_512_neon:
+ ldp x19,x20,[x29,#16]
+ add sp,sp,#128+64
+ ldp x21,x22,[x29,#32]
+ ldp x23,x24,[x29,#48]
+ ldp x25,x26,[x29,#64]
+ ldp x27,x28,[x29,#80]
+ ldp x29,x30,[sp],#96
+.Labort_neon:
+ ret
+ENDPROC(chacha20_neon)
diff --git a/lib/zinc/chacha20/chacha20-mips.S b/lib/zinc/chacha20/chacha20-mips.S
new file mode 100644
index 000000000000..77da2c2fb240
--- /dev/null
+++ b/lib/zinc/chacha20/chacha20-mips.S
@@ -0,0 +1,474 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2016-2018 René van Dorst <opensource@...rst.com>. All Rights Reserved.
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#define MASK_U32 0x3c
+#define MASK_BYTES 0x03
+#define CHACHA20_BLOCK_SIZE 64
+#define STACK_SIZE 4*16
+
+#define X0 $t0
+#define X1 $t1
+#define X2 $t2
+#define X3 $t3
+#define X4 $t4
+#define X5 $t5
+#define X6 $t6
+#define X7 $t7
+#define X8 $v1
+#define X9 $fp
+#define X10 $s7
+#define X11 $s6
+#define X12 $s5
+#define X13 $s4
+#define X14 $s3
+#define X15 $s2
+/* Use regs which are overwritten on exit for Tx so we don't leak clear data. */
+#define T0 $s1
+#define T1 $s0
+#define T(n) T ## n
+#define X(n) X ## n
+
+/* Input arguments */
+#define OUT $a0
+#define IN $a1
+#define BYTES $a2
+/* The KEY and NONCE arguments must be u32 aligned */
+#define KEY $a3
+/* NONCE pointer is given via stack */
+#define NONCE $t9
+
+/* Output argument */
+/* NONCE[0] is kept in a register rather than in memory,
+ * so the original value in memory is never touched.
+ * It must be incremented every loop iteration.
+ */
+#define NONCE_0 $v0
+
+/* SAVED_X and SAVED_CA are set in the jump table.
+ * Use regs which are overwritten on exit so we don't leak clear data.
+ * They are used for handling the last bytes, which are not a multiple of 4.
+ */
+#define SAVED_X X15
+#define SAVED_CA $ra
+
+#define PTR_LAST_ROUND $t8
+
+/* ChaCha20 constants and stack location */
+#define CONSTANT_OFS_SP 48
+#define UNALIGNED_OFS_SP 40
+
+#define CONSTANT_1 0x61707865
+#define CONSTANT_2 0x3320646e
+#define CONSTANT_3 0x79622d32
+#define CONSTANT_4 0x6b206574
+
+#if __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+#define MSB 0
+#define LSB 3
+#define ROTx rotl
+#define ROTR(n) rotr n, 24
+#define CPU_TO_LE32(n) \
+ wsbh n; \
+ rotr n, 16;
+#else
+#define MSB 3
+#define LSB 0
+#define ROTx rotr
+#define CPU_TO_LE32(n)
+#define ROTR(n)
+#endif
+
+#define STORE_UNALIGNED(x, a, s, o) \
+.Lchacha20_mips_xor_unaligned_ ## x ## _b: ; \
+ .if ((s != NONCE) || (o != 0)); \
+ lw T0, o(s); \
+ .endif; \
+ lwl T1, x-4+MSB ## (IN); \
+ lwr T1, x-4+LSB ## (IN); \
+ .if ((s == NONCE) && (o == 0)); \
+ addu X ## a, NONCE_0; \
+ .else; \
+ addu X ## a, T0; \
+ .endif; \
+ CPU_TO_LE32(X ## a); \
+ xor X ## a, T1; \
+ swl X ## a, x-4+MSB ## (OUT); \
+ swr X ## a, x-4+LSB ## (OUT);
+
+#define STORE_ALIGNED(x, a, s, o) \
+.Lchacha20_mips_xor_aligned_ ## x ## _b: ; \
+ .if ((s != NONCE) || (o != 0)); \
+ lw T0, o(s); \
+ .endif; \
+ lw T1, x-4 ## (IN); \
+ .if ((s == NONCE) && (o == 0)); \
+ addu X ## a, NONCE_0; \
+ .else; \
+ addu X ## a, T0; \
+ .endif; \
+ CPU_TO_LE32(X ## a); \
+ xor X ## a, T1; \
+ sw X ## a, x-4 ## (OUT);
+
+/* Jump table macro.
+ * Used for setup and for handling the last bytes, which are not a multiple of 4.
+ * X15 is free to store Xn.
+ * Every jump table entry must be equal in size.
+ */
+#define JMPTBL_ALIGNED(x, a, s, o) \
+.Lchacha20_mips_jmptbl_aligned_ ## a: ; \
+ .if ((s == NONCE) && (o == 0)); \
+ move SAVED_CA, NONCE_0; \
+ .else; \
+ lw SAVED_CA, o(s);\
+ .endif; \
+ b .Lchacha20_mips_xor_aligned_ ## x ## _b; \
+ move SAVED_X, X ## a;
+
+#define JMPTBL_UNALIGNED(x, a, s, o) \
+.Lchacha20_mips_jmptbl_unaligned_ ## a: ; \
+ .if ((s == NONCE) && (o == 0)); \
+ move SAVED_CA, NONCE_0; \
+ .else; \
+ lw SAVED_CA, o(s);\
+ .endif; \
+ b .Lchacha20_mips_xor_unaligned_ ## x ## _b; \
+ move SAVED_X, X ## a;
+
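+/* AXR performs the add/xor/rotate step of four quarter rounds at once:
+ * X(A) += X(K); X(V) ^= X(A); X(V) <<<= S; and likewise for the
+ * (B,L,W), (C,M,Y) and (D,N,Z) triples.
+ */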
+#define AXR(A, B, C, D, K, L, M, N, V, W, Y, Z, S) \
+ addu X(A), X(K); \
+ addu X(B), X(L); \
+ addu X(C), X(M); \
+ addu X(D), X(N); \
+ xor X(V), X(A); \
+ xor X(W), X(B); \
+ xor X(Y), X(C); \
+ xor X(Z), X(D); \
+ rotl X(V), S; \
+ rotl X(W), S; \
+ rotl X(Y), S; \
+ rotl X(Z), S;
+
+.text
+.set reorder
+.set noat
+.globl chacha20_mips
+.ent chacha20_mips
+chacha20_mips:
+ .frame $sp, STACK_SIZE, $ra
+	/* NONCE is the fifth argument, passed on the stack */
+ lw NONCE, 16($sp)
+
+	/* Return if BYTES = 0. */
+ .set noreorder
+ beqz BYTES, .Lchacha20_mips_end
+ addiu $sp, -STACK_SIZE
+ .set reorder
+
+ /* Calculate PTR_LAST_ROUND */
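+	/* PTR_LAST_ROUND = OUT + ((BYTES - 1) & ~63), i.e. the start of the
+	 * last (possibly partial) 64-byte block of the output.
+	 */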
+ addiu PTR_LAST_ROUND, BYTES, -1
+ ins PTR_LAST_ROUND, $zero, 0, 6
+ addu PTR_LAST_ROUND, OUT
+
+ /* Save s0-s7, fp, ra. */
+ sw $ra, 0($sp)
+ sw $fp, 4($sp)
+ sw $s0, 8($sp)
+ sw $s1, 12($sp)
+ sw $s2, 16($sp)
+ sw $s3, 20($sp)
+ sw $s4, 24($sp)
+ sw $s5, 28($sp)
+ sw $s6, 32($sp)
+ sw $s7, 36($sp)
+
+ lw NONCE_0, 0(NONCE)
+	/* Test whether IN or OUT is unaligned:
+	 * UNALIGNED (T1) = ( IN | OUT ) & 0x00000003
+	 */
+ or T1, IN, OUT
+ andi T1, 0x3
+
+ /* Load constant */
+ lui X0, %hi(CONSTANT_1)
+ lui X1, %hi(CONSTANT_2)
+ lui X2, %hi(CONSTANT_3)
+ lui X3, %hi(CONSTANT_4)
+ ori X0, %lo(CONSTANT_1)
+ ori X1, %lo(CONSTANT_2)
+ ori X2, %lo(CONSTANT_3)
+ ori X3, %lo(CONSTANT_4)
+
+ /* Store constant on stack. */
+ sw X0, 0+CONSTANT_OFS_SP($sp)
+ sw X1, 4+CONSTANT_OFS_SP($sp)
+ sw X2, 8+CONSTANT_OFS_SP($sp)
+ sw X3, 12+CONSTANT_OFS_SP($sp)
+
+ sw T1, UNALIGNED_OFS_SP($sp)
+
+ .set noreorder
+ b .Lchacha20_rounds_start
+ andi BYTES, (CHACHA20_BLOCK_SIZE-1)
+ .set reorder
+
+.align 4
+.Loop_chacha20_rounds:
+ addiu IN, CHACHA20_BLOCK_SIZE
+ addiu OUT, CHACHA20_BLOCK_SIZE
+ addiu NONCE_0, 1
+
+ lw X0, 0+CONSTANT_OFS_SP($sp)
+ lw X1, 4+CONSTANT_OFS_SP($sp)
+ lw X2, 8+CONSTANT_OFS_SP($sp)
+ lw X3, 12+CONSTANT_OFS_SP($sp)
+ lw T1, UNALIGNED_OFS_SP($sp)
+
+.Lchacha20_rounds_start:
+ lw X4, 0(KEY)
+ lw X5, 4(KEY)
+ lw X6, 8(KEY)
+ lw X7, 12(KEY)
+ lw X8, 16(KEY)
+ lw X9, 20(KEY)
+ lw X10, 24(KEY)
+ lw X11, 28(KEY)
+
+ move X12, NONCE_0
+ lw X13, 4(NONCE)
+ lw X14, 8(NONCE)
+ lw X15, 12(NONCE)
+
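+	/* $at counts 9..0, so the loop body runs 10 times; each pass is one
+	 * column round plus one diagonal round: ChaCha20's 20 rounds.
+	 */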
+ li $at, 9
+.Loop_chacha20_xor_rounds:
+ AXR( 0, 1, 2, 3, 4, 5, 6, 7, 12,13,14,15, 16);
+ AXR( 8, 9,10,11, 12,13,14,15, 4, 5, 6, 7, 12);
+ AXR( 0, 1, 2, 3, 4, 5, 6, 7, 12,13,14,15, 8);
+ AXR( 8, 9,10,11, 12,13,14,15, 4, 5, 6, 7, 7);
+ AXR( 0, 1, 2, 3, 5, 6, 7, 4, 15,12,13,14, 16);
+ AXR(10,11, 8, 9, 15,12,13,14, 5, 6, 7, 4, 12);
+ AXR( 0, 1, 2, 3, 5, 6, 7, 4, 15,12,13,14, 8);
+ AXR(10,11, 8, 9, 15,12,13,14, 5, 6, 7, 4, 7);
+ .set noreorder
+ bnez $at, .Loop_chacha20_xor_rounds
+ addiu $at, -1
+
+ /* Unaligned? Jump */
+ bnez T1, .Loop_chacha20_unaligned
+ andi $at, BYTES, MASK_U32
+
+ /* Last round? No jump */
+ bne OUT, PTR_LAST_ROUND, .Lchacha20_mips_xor_aligned_64_b
+ /* Load upper half of jump table addr */
+ lui T0, %hi(.Lchacha20_mips_jmptbl_aligned_0)
+
+ /* Full block? Jump */
+ beqz BYTES, .Lchacha20_mips_xor_aligned_64_b
+ /* Calculate lower half jump table addr and offset */
+ ins T0, $at, 2, 6
+
+ subu T0, $at
+ addiu T0, %lo(.Lchacha20_mips_jmptbl_aligned_0)
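+	/* Each jump table entry is 3 instructions (12 bytes), so with
+	 * $at = remaining bytes rounded down to a multiple of 4, the wanted
+	 * entry sits at byte offset 4*$at - $at = 12*($at/4).
+	 */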
+
+ jr T0
+ /* Delay slot */
+ nop
+
+ .set reorder
+
+.Loop_chacha20_unaligned:
+ .set noreorder
+
+	/* Last round? No jump */
+ bne OUT, PTR_LAST_ROUND, .Lchacha20_mips_xor_unaligned_64_b
+ /* Load upper half of jump table addr */
+ lui T0, %hi(.Lchacha20_mips_jmptbl_unaligned_0)
+
+ /* Full block? Jump */
+ beqz BYTES, .Lchacha20_mips_xor_unaligned_64_b
+
+ /* Calculate lower half jump table addr and offset */
+ ins T0, $at, 2, 6
+ subu T0, $at
+ addiu T0, %lo(.Lchacha20_mips_jmptbl_unaligned_0)
+
+ jr T0
+ /* Delay slot */
+ nop
+
+ .set reorder
+
+/* Aligned code path
+ */
+.align 4
+ STORE_ALIGNED(64, 15, NONCE,12)
+ STORE_ALIGNED(60, 14, NONCE, 8)
+ STORE_ALIGNED(56, 13, NONCE, 4)
+ STORE_ALIGNED(52, 12, NONCE, 0)
+ STORE_ALIGNED(48, 11, KEY, 28)
+ STORE_ALIGNED(44, 10, KEY, 24)
+ STORE_ALIGNED(40, 9, KEY, 20)
+ STORE_ALIGNED(36, 8, KEY, 16)
+ STORE_ALIGNED(32, 7, KEY, 12)
+ STORE_ALIGNED(28, 6, KEY, 8)
+ STORE_ALIGNED(24, 5, KEY, 4)
+ STORE_ALIGNED(20, 4, KEY, 0)
+ STORE_ALIGNED(16, 3, $sp, 12+CONSTANT_OFS_SP)
+ STORE_ALIGNED(12, 2, $sp, 8+CONSTANT_OFS_SP)
+ STORE_ALIGNED( 8, 1, $sp, 4+CONSTANT_OFS_SP)
+.Lchacha20_mips_xor_aligned_4_b:
+ /* STORE_ALIGNED( 4, 0, $sp, 0+CONSTANT_OFS_SP) */
+ lw T0, 0+CONSTANT_OFS_SP($sp)
+ lw T1, 0(IN)
+ addu X0, T0
+ CPU_TO_LE32(X0)
+ xor X0, T1
+ .set noreorder
+ bne OUT, PTR_LAST_ROUND, .Loop_chacha20_rounds
+ sw X0, 0(OUT)
+ .set reorder
+
+ .set noreorder
+ bne $at, BYTES, .Lchacha20_mips_xor_bytes
+	/* Empty delay slot; increase NONCE_0 and return the NONCE_0 value */
+	addiu NONCE_0, 1
+	.set reorder
+
+.Lchacha20_mips_xor_done:
+ /* Restore used registers */
+ lw $ra, 0($sp)
+ lw $fp, 4($sp)
+ lw $s0, 8($sp)
+ lw $s1, 12($sp)
+ lw $s2, 16($sp)
+ lw $s3, 20($sp)
+ lw $s4, 24($sp)
+ lw $s5, 28($sp)
+ lw $s6, 32($sp)
+ lw $s7, 36($sp)
+.Lchacha20_mips_end:
+ .set noreorder
+ jr $ra
+ addiu $sp, STACK_SIZE
+ .set reorder
+
+ .set noreorder
+ /* Start jump table */
+ JMPTBL_ALIGNED( 0, 0, $sp, 0+CONSTANT_OFS_SP)
+ JMPTBL_ALIGNED( 4, 1, $sp, 4+CONSTANT_OFS_SP)
+ JMPTBL_ALIGNED( 8, 2, $sp, 8+CONSTANT_OFS_SP)
+ JMPTBL_ALIGNED(12, 3, $sp, 12+CONSTANT_OFS_SP)
+ JMPTBL_ALIGNED(16, 4, KEY, 0)
+ JMPTBL_ALIGNED(20, 5, KEY, 4)
+ JMPTBL_ALIGNED(24, 6, KEY, 8)
+ JMPTBL_ALIGNED(28, 7, KEY, 12)
+ JMPTBL_ALIGNED(32, 8, KEY, 16)
+ JMPTBL_ALIGNED(36, 9, KEY, 20)
+ JMPTBL_ALIGNED(40, 10, KEY, 24)
+ JMPTBL_ALIGNED(44, 11, KEY, 28)
+ JMPTBL_ALIGNED(48, 12, NONCE, 0)
+ JMPTBL_ALIGNED(52, 13, NONCE, 4)
+ JMPTBL_ALIGNED(56, 14, NONCE, 8)
+ JMPTBL_ALIGNED(60, 15, NONCE,12)
+ /* End jump table */
+ .set reorder
+
+/* Unaligned code path
+ */
+ STORE_UNALIGNED(64, 15, NONCE,12)
+ STORE_UNALIGNED(60, 14, NONCE, 8)
+ STORE_UNALIGNED(56, 13, NONCE, 4)
+ STORE_UNALIGNED(52, 12, NONCE, 0)
+ STORE_UNALIGNED(48, 11, KEY, 28)
+ STORE_UNALIGNED(44, 10, KEY, 24)
+ STORE_UNALIGNED(40, 9, KEY, 20)
+ STORE_UNALIGNED(36, 8, KEY, 16)
+ STORE_UNALIGNED(32, 7, KEY, 12)
+ STORE_UNALIGNED(28, 6, KEY, 8)
+ STORE_UNALIGNED(24, 5, KEY, 4)
+ STORE_UNALIGNED(20, 4, KEY, 0)
+ STORE_UNALIGNED(16, 3, $sp, 12+CONSTANT_OFS_SP)
+ STORE_UNALIGNED(12, 2, $sp, 8+CONSTANT_OFS_SP)
+ STORE_UNALIGNED( 8, 1, $sp, 4+CONSTANT_OFS_SP)
+.Lchacha20_mips_xor_unaligned_4_b:
+ /* STORE_UNALIGNED( 4, 0, $sp, 0+CONSTANT_OFS_SP) */
+ lw T0, 0+CONSTANT_OFS_SP($sp)
+ lwl T1, 0+MSB(IN)
+ lwr T1, 0+LSB(IN)
+ addu X0, T0
+ CPU_TO_LE32(X0)
+ xor X0, T1
+ swl X0, 0+MSB(OUT)
+ .set noreorder
+ bne OUT, PTR_LAST_ROUND, .Loop_chacha20_rounds
+ swr X0, 0+LSB(OUT)
+ .set reorder
+
+ /* Fall through to byte handling */
+ .set noreorder
+ beq $at, BYTES, .Lchacha20_mips_xor_done
+	/* Empty delay slot; increase NONCE_0 and return the NONCE_0 value */
+.Lchacha20_mips_xor_unaligned_0_b:
+.Lchacha20_mips_xor_aligned_0_b:
+ addiu NONCE_0, 1
+ .set reorder
+
+.Lchacha20_mips_xor_bytes:
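+	/* Handle the final 1-3 bytes: SAVED_X + SAVED_CA reconstructs the last
+	 * keystream word, which is then rotated byte by byte for the stores.
+	 */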
+ addu OUT, $at
+ addu IN, $at
+ addu SAVED_X, SAVED_CA
+ /* First byte */
+ lbu T1, 0(IN)
+ andi $at, BYTES, 2
+ CPU_TO_LE32(SAVED_X)
+ ROTR(SAVED_X)
+ xor T1, SAVED_X
+ .set noreorder
+ beqz $at, .Lchacha20_mips_xor_done
+ sb T1, 0(OUT)
+ .set reorder
+ /* Second byte */
+ lbu T1, 1(IN)
+ andi $at, BYTES, 1
+ ROTx SAVED_X, 8
+ xor T1, SAVED_X
+ .set noreorder
+ beqz $at, .Lchacha20_mips_xor_done
+ sb T1, 1(OUT)
+ .set reorder
+ /* Third byte */
+ lbu T1, 2(IN)
+ ROTx SAVED_X, 8
+ xor T1, SAVED_X
+ .set noreorder
+ b .Lchacha20_mips_xor_done
+ sb T1, 2(OUT)
+ .set reorder
+.set noreorder
+
+.Lchacha20_mips_jmptbl_unaligned:
+ /* Start jump table */
+ JMPTBL_UNALIGNED( 0, 0, $sp, 0+CONSTANT_OFS_SP)
+ JMPTBL_UNALIGNED( 4, 1, $sp, 4+CONSTANT_OFS_SP)
+ JMPTBL_UNALIGNED( 8, 2, $sp, 8+CONSTANT_OFS_SP)
+ JMPTBL_UNALIGNED(12, 3, $sp, 12+CONSTANT_OFS_SP)
+ JMPTBL_UNALIGNED(16, 4, KEY, 0)
+ JMPTBL_UNALIGNED(20, 5, KEY, 4)
+ JMPTBL_UNALIGNED(24, 6, KEY, 8)
+ JMPTBL_UNALIGNED(28, 7, KEY, 12)
+ JMPTBL_UNALIGNED(32, 8, KEY, 16)
+ JMPTBL_UNALIGNED(36, 9, KEY, 20)
+ JMPTBL_UNALIGNED(40, 10, KEY, 24)
+ JMPTBL_UNALIGNED(44, 11, KEY, 28)
+ JMPTBL_UNALIGNED(48, 12, NONCE, 0)
+ JMPTBL_UNALIGNED(52, 13, NONCE, 4)
+ JMPTBL_UNALIGNED(56, 14, NONCE, 8)
+ JMPTBL_UNALIGNED(60, 15, NONCE,12)
+ /* End jump table */
+.set reorder
+
+.end chacha20_mips
+.set at
diff --git a/lib/zinc/chacha20/chacha20-x86_64.S b/lib/zinc/chacha20/chacha20-x86_64.S
new file mode 100644
index 000000000000..39883f3718b6
--- /dev/null
+++ b/lib/zinc/chacha20/chacha20-x86_64.S
@@ -0,0 +1,2630 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2017 Samuel Neves <sneves@....uc.pt>. All Rights Reserved.
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#include <linux/linkage.h>
+
+.section .rodata.cst16.Lzero, "aM", @progbits, 16
+.align 16
+.Lzero:
+.long 0,0,0,0
+.section .rodata.cst16.Lone, "aM", @progbits, 16
+.align 16
+.Lone:
+.long 1,0,0,0
+.section .rodata.cst16.Linc, "aM", @progbits, 16
+.align 16
+.Linc:
+.long 0,1,2,3
+.section .rodata.cst16.Lfour, "aM", @progbits, 16
+.align 16
+.Lfour:
+.long 4,4,4,4
+.section .rodata.cst32.Lincy, "aM", @progbits, 32
+.align 32
+.Lincy:
+.long 0,2,4,6,1,3,5,7
+.section .rodata.cst32.Leight, "aM", @progbits, 32
+.align 32
+.Leight:
+.long 8,8,8,8,8,8,8,8
+.section .rodata.cst16.Lrot16, "aM", @progbits, 16
+.align 16
+.Lrot16:
+.byte 0x2,0x3,0x0,0x1, 0x6,0x7,0x4,0x5, 0xa,0xb,0x8,0x9, 0xe,0xf,0xc,0xd
+.section .rodata.cst16.Lrot24, "aM", @progbits, 16
+.align 16
+.Lrot24:
+.byte 0x3,0x0,0x1,0x2, 0x7,0x4,0x5,0x6, 0xb,0x8,0x9,0xa, 0xf,0xc,0xd,0xe
+.section .rodata.cst16.Lsigma, "aM", @progbits, 16
+.align 16
+.Lsigma:
+.byte 101,120,112,97,110,100,32,51,50,45,98,121,116,101,32,107,0
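+/* .Lsigma is the ChaCha20 constant "expand 32-byte k" */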
+.section .rodata.cst64.Lzeroz, "aM", @progbits, 64
+.align 64
+.Lzeroz:
+.long 0,0,0,0, 1,0,0,0, 2,0,0,0, 3,0,0,0
+.section .rodata.cst64.Lfourz, "aM", @progbits, 64
+.align 64
+.Lfourz:
+.long 4,0,0,0, 4,0,0,0, 4,0,0,0, 4,0,0,0
+.section .rodata.cst64.Lincz, "aM", @progbits, 64
+.align 64
+.Lincz:
+.long 0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15
+.section .rodata.cst64.Lsixteen, "aM", @progbits, 64
+.align 64
+.Lsixteen:
+.long 16,16,16,16,16,16,16,16,16,16,16,16,16,16,16,16
+.section .rodata.cst32.Ltwoy, "aM", @progbits, 32
+.align 64
+.Ltwoy:
+.long 2,0,0,0, 2,0,0,0
+
+.text
+
+#ifdef CONFIG_AS_SSSE3
+.align 32
+ENTRY(hchacha20_ssse3)
+ movdqa .Lsigma(%rip),%xmm0
+ movdqu (%rdx),%xmm1
+ movdqu 16(%rdx),%xmm2
+ movdqu (%rsi),%xmm3
+ movdqa .Lrot16(%rip),%xmm6
+ movdqa .Lrot24(%rip),%xmm7
+ movq $10,%r8
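+	/* Ten iterations of one column round plus one diagonal round:
+	 * ChaCha20's 20 rounds. .Lrot16/.Lrot24 are pshufb masks rotating
+	 * each dword left by 16 and 8 bits respectively.
+	 */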
+ .align 32
+.Loop_hssse3:
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm6,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $20,%xmm1
+ pslld $12,%xmm4
+ por %xmm4,%xmm1
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm7,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $25,%xmm1
+ pslld $7,%xmm4
+ por %xmm4,%xmm1
+ pshufd $78,%xmm2,%xmm2
+ pshufd $57,%xmm1,%xmm1
+ pshufd $147,%xmm3,%xmm3
+ nop
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm6,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $20,%xmm1
+ pslld $12,%xmm4
+ por %xmm4,%xmm1
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm7,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $25,%xmm1
+ pslld $7,%xmm4
+ por %xmm4,%xmm1
+ pshufd $78,%xmm2,%xmm2
+ pshufd $147,%xmm1,%xmm1
+ pshufd $57,%xmm3,%xmm3
+ decq %r8
+ jnz .Loop_hssse3
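+	/* HChaCha20: no feed-forward addition; only the first and last rows
+	 * of the permuted state are output.
+	 */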
+ movdqu %xmm0,0(%rdi)
+ movdqu %xmm3,16(%rdi)
+ ret
+ENDPROC(hchacha20_ssse3)
+
+.align 32
+ENTRY(chacha20_ssse3)
+.Lchacha20_ssse3:
+ cmpq $0,%rdx
+ je .Lssse3_epilogue
+ leaq 8(%rsp),%r10
+
+ cmpq $128,%rdx
+ ja .Lchacha20_4x
+
+.Ldo_sse3_after_all:
+ subq $64+8,%rsp
+ andq $-32,%rsp
+ movdqa .Lsigma(%rip),%xmm0
+ movdqu (%rcx),%xmm1
+ movdqu 16(%rcx),%xmm2
+ movdqu (%r8),%xmm3
+ movdqa .Lrot16(%rip),%xmm6
+ movdqa .Lrot24(%rip),%xmm7
+
+ movdqa %xmm0,0(%rsp)
+ movdqa %xmm1,16(%rsp)
+ movdqa %xmm2,32(%rsp)
+ movdqa %xmm3,48(%rsp)
+ movq $10,%r8
+ jmp .Loop_ssse3
+
+.align 32
+.Loop_outer_ssse3:
+ movdqa .Lone(%rip),%xmm3
+ movdqa 0(%rsp),%xmm0
+ movdqa 16(%rsp),%xmm1
+ movdqa 32(%rsp),%xmm2
+ paddd 48(%rsp),%xmm3
+ movq $10,%r8
+ movdqa %xmm3,48(%rsp)
+ jmp .Loop_ssse3
+
+.align 32
+.Loop_ssse3:
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm6,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $20,%xmm1
+ pslld $12,%xmm4
+ por %xmm4,%xmm1
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm7,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $25,%xmm1
+ pslld $7,%xmm4
+ por %xmm4,%xmm1
+ pshufd $78,%xmm2,%xmm2
+ pshufd $57,%xmm1,%xmm1
+ pshufd $147,%xmm3,%xmm3
+ nop
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm6,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $20,%xmm1
+ pslld $12,%xmm4
+ por %xmm4,%xmm1
+ paddd %xmm1,%xmm0
+ pxor %xmm0,%xmm3
+ pshufb %xmm7,%xmm3
+ paddd %xmm3,%xmm2
+ pxor %xmm2,%xmm1
+ movdqa %xmm1,%xmm4
+ psrld $25,%xmm1
+ pslld $7,%xmm4
+ por %xmm4,%xmm1
+ pshufd $78,%xmm2,%xmm2
+ pshufd $147,%xmm1,%xmm1
+ pshufd $57,%xmm3,%xmm3
+ decq %r8
+ jnz .Loop_ssse3
+ paddd 0(%rsp),%xmm0
+ paddd 16(%rsp),%xmm1
+ paddd 32(%rsp),%xmm2
+ paddd 48(%rsp),%xmm3
+
+ cmpq $64,%rdx
+ jb .Ltail_ssse3
+
+ movdqu 0(%rsi),%xmm4
+ movdqu 16(%rsi),%xmm5
+ pxor %xmm4,%xmm0
+ movdqu 32(%rsi),%xmm4
+ pxor %xmm5,%xmm1
+ movdqu 48(%rsi),%xmm5
+ leaq 64(%rsi),%rsi
+ pxor %xmm4,%xmm2
+ pxor %xmm5,%xmm3
+
+ movdqu %xmm0,0(%rdi)
+ movdqu %xmm1,16(%rdi)
+ movdqu %xmm2,32(%rdi)
+ movdqu %xmm3,48(%rdi)
+ leaq 64(%rdi),%rdi
+
+ subq $64,%rdx
+ jnz .Loop_outer_ssse3
+
+ jmp .Ldone_ssse3
+
+.align 16
+.Ltail_ssse3:
+ movdqa %xmm0,0(%rsp)
+ movdqa %xmm1,16(%rsp)
+ movdqa %xmm2,32(%rsp)
+ movdqa %xmm3,48(%rsp)
+ xorq %r8,%r8
+
+.Loop_tail_ssse3:
+ movzbl (%rsi,%r8,1),%eax
+ movzbl (%rsp,%r8,1),%ecx
+ leaq 1(%r8),%r8
+ xorl %ecx,%eax
+ movb %al,-1(%rdi,%r8,1)
+ decq %rdx
+ jnz .Loop_tail_ssse3
+
+.Ldone_ssse3:
+ leaq -8(%r10),%rsp
+
+.Lssse3_epilogue:
+ ret
+
+.align 32
+.Lchacha20_4x:
+ leaq 8(%rsp),%r10
+
+.Lproceed4x:
+ subq $0x140+8,%rsp
+ andq $-32,%rsp
+ movdqa .Lsigma(%rip),%xmm11
+ movdqu (%rcx),%xmm15
+ movdqu 16(%rcx),%xmm7
+ movdqu (%r8),%xmm3
+ leaq 256(%rsp),%rcx
+ leaq .Lrot16(%rip),%r9
+ leaq .Lrot24(%rip),%r11
+
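+	/* Four blocks are processed word-sliced: each xmm register holds one
+	 * state word from all four blocks, and .Linc gives the per-block
+	 * counters 0,1,2,3.
+	 */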
+ pshufd $0x00,%xmm11,%xmm8
+ pshufd $0x55,%xmm11,%xmm9
+ movdqa %xmm8,64(%rsp)
+ pshufd $0xaa,%xmm11,%xmm10
+ movdqa %xmm9,80(%rsp)
+ pshufd $0xff,%xmm11,%xmm11
+ movdqa %xmm10,96(%rsp)
+ movdqa %xmm11,112(%rsp)
+
+ pshufd $0x00,%xmm15,%xmm12
+ pshufd $0x55,%xmm15,%xmm13
+ movdqa %xmm12,128-256(%rcx)
+ pshufd $0xaa,%xmm15,%xmm14
+ movdqa %xmm13,144-256(%rcx)
+ pshufd $0xff,%xmm15,%xmm15
+ movdqa %xmm14,160-256(%rcx)
+ movdqa %xmm15,176-256(%rcx)
+
+ pshufd $0x00,%xmm7,%xmm4
+ pshufd $0x55,%xmm7,%xmm5
+ movdqa %xmm4,192-256(%rcx)
+ pshufd $0xaa,%xmm7,%xmm6
+ movdqa %xmm5,208-256(%rcx)
+ pshufd $0xff,%xmm7,%xmm7
+ movdqa %xmm6,224-256(%rcx)
+ movdqa %xmm7,240-256(%rcx)
+
+ pshufd $0x00,%xmm3,%xmm0
+ pshufd $0x55,%xmm3,%xmm1
+ paddd .Linc(%rip),%xmm0
+ pshufd $0xaa,%xmm3,%xmm2
+ movdqa %xmm1,272-256(%rcx)
+ pshufd $0xff,%xmm3,%xmm3
+ movdqa %xmm2,288-256(%rcx)
+ movdqa %xmm3,304-256(%rcx)
+
+ jmp .Loop_enter4x
+
+.align 32
+.Loop_outer4x:
+ movdqa 64(%rsp),%xmm8
+ movdqa 80(%rsp),%xmm9
+ movdqa 96(%rsp),%xmm10
+ movdqa 112(%rsp),%xmm11
+ movdqa 128-256(%rcx),%xmm12
+ movdqa 144-256(%rcx),%xmm13
+ movdqa 160-256(%rcx),%xmm14
+ movdqa 176-256(%rcx),%xmm15
+ movdqa 192-256(%rcx),%xmm4
+ movdqa 208-256(%rcx),%xmm5
+ movdqa 224-256(%rcx),%xmm6
+ movdqa 240-256(%rcx),%xmm7
+ movdqa 256-256(%rcx),%xmm0
+ movdqa 272-256(%rcx),%xmm1
+ movdqa 288-256(%rcx),%xmm2
+ movdqa 304-256(%rcx),%xmm3
+ paddd .Lfour(%rip),%xmm0
+
+.Loop_enter4x:
+ movdqa %xmm6,32(%rsp)
+ movdqa %xmm7,48(%rsp)
+ movdqa (%r9),%xmm7
+ movl $10,%eax
+ movdqa %xmm0,256-256(%rcx)
+ jmp .Loop4x
+
+.align 32
+.Loop4x:
+ paddd %xmm12,%xmm8
+ paddd %xmm13,%xmm9
+ pxor %xmm8,%xmm0
+ pxor %xmm9,%xmm1
+ pshufb %xmm7,%xmm0
+ pshufb %xmm7,%xmm1
+ paddd %xmm0,%xmm4
+ paddd %xmm1,%xmm5
+ pxor %xmm4,%xmm12
+ pxor %xmm5,%xmm13
+ movdqa %xmm12,%xmm6
+ pslld $12,%xmm12
+ psrld $20,%xmm6
+ movdqa %xmm13,%xmm7
+ pslld $12,%xmm13
+ por %xmm6,%xmm12
+ psrld $20,%xmm7
+ movdqa (%r11),%xmm6
+ por %xmm7,%xmm13
+ paddd %xmm12,%xmm8
+ paddd %xmm13,%xmm9
+ pxor %xmm8,%xmm0
+ pxor %xmm9,%xmm1
+ pshufb %xmm6,%xmm0
+ pshufb %xmm6,%xmm1
+ paddd %xmm0,%xmm4
+ paddd %xmm1,%xmm5
+ pxor %xmm4,%xmm12
+ pxor %xmm5,%xmm13
+ movdqa %xmm12,%xmm7
+ pslld $7,%xmm12
+ psrld $25,%xmm7
+ movdqa %xmm13,%xmm6
+ pslld $7,%xmm13
+ por %xmm7,%xmm12
+ psrld $25,%xmm6
+ movdqa (%r9),%xmm7
+ por %xmm6,%xmm13
+ movdqa %xmm4,0(%rsp)
+ movdqa %xmm5,16(%rsp)
+ movdqa 32(%rsp),%xmm4
+ movdqa 48(%rsp),%xmm5
+ paddd %xmm14,%xmm10
+ paddd %xmm15,%xmm11
+ pxor %xmm10,%xmm2
+ pxor %xmm11,%xmm3
+ pshufb %xmm7,%xmm2
+ pshufb %xmm7,%xmm3
+ paddd %xmm2,%xmm4
+ paddd %xmm3,%xmm5
+ pxor %xmm4,%xmm14
+ pxor %xmm5,%xmm15
+ movdqa %xmm14,%xmm6
+ pslld $12,%xmm14
+ psrld $20,%xmm6
+ movdqa %xmm15,%xmm7
+ pslld $12,%xmm15
+ por %xmm6,%xmm14
+ psrld $20,%xmm7
+ movdqa (%r11),%xmm6
+ por %xmm7,%xmm15
+ paddd %xmm14,%xmm10
+ paddd %xmm15,%xmm11
+ pxor %xmm10,%xmm2
+ pxor %xmm11,%xmm3
+ pshufb %xmm6,%xmm2
+ pshufb %xmm6,%xmm3
+ paddd %xmm2,%xmm4
+ paddd %xmm3,%xmm5
+ pxor %xmm4,%xmm14
+ pxor %xmm5,%xmm15
+ movdqa %xmm14,%xmm7
+ pslld $7,%xmm14
+ psrld $25,%xmm7
+ movdqa %xmm15,%xmm6
+ pslld $7,%xmm15
+ por %xmm7,%xmm14
+ psrld $25,%xmm6
+ movdqa (%r9),%xmm7
+ por %xmm6,%xmm15
+ paddd %xmm13,%xmm8
+ paddd %xmm14,%xmm9
+ pxor %xmm8,%xmm3
+ pxor %xmm9,%xmm0
+ pshufb %xmm7,%xmm3
+ pshufb %xmm7,%xmm0
+ paddd %xmm3,%xmm4
+ paddd %xmm0,%xmm5
+ pxor %xmm4,%xmm13
+ pxor %xmm5,%xmm14
+ movdqa %xmm13,%xmm6
+ pslld $12,%xmm13
+ psrld $20,%xmm6
+ movdqa %xmm14,%xmm7
+ pslld $12,%xmm14
+ por %xmm6,%xmm13
+ psrld $20,%xmm7
+ movdqa (%r11),%xmm6
+ por %xmm7,%xmm14
+ paddd %xmm13,%xmm8
+ paddd %xmm14,%xmm9
+ pxor %xmm8,%xmm3
+ pxor %xmm9,%xmm0
+ pshufb %xmm6,%xmm3
+ pshufb %xmm6,%xmm0
+ paddd %xmm3,%xmm4
+ paddd %xmm0,%xmm5
+ pxor %xmm4,%xmm13
+ pxor %xmm5,%xmm14
+ movdqa %xmm13,%xmm7
+ pslld $7,%xmm13
+ psrld $25,%xmm7
+ movdqa %xmm14,%xmm6
+ pslld $7,%xmm14
+ por %xmm7,%xmm13
+ psrld $25,%xmm6
+ movdqa (%r9),%xmm7
+ por %xmm6,%xmm14
+ movdqa %xmm4,32(%rsp)
+ movdqa %xmm5,48(%rsp)
+ movdqa 0(%rsp),%xmm4
+ movdqa 16(%rsp),%xmm5
+ paddd %xmm15,%xmm10
+ paddd %xmm12,%xmm11
+ pxor %xmm10,%xmm1
+ pxor %xmm11,%xmm2
+ pshufb %xmm7,%xmm1
+ pshufb %xmm7,%xmm2
+ paddd %xmm1,%xmm4
+ paddd %xmm2,%xmm5
+ pxor %xmm4,%xmm15
+ pxor %xmm5,%xmm12
+ movdqa %xmm15,%xmm6
+ pslld $12,%xmm15
+ psrld $20,%xmm6
+ movdqa %xmm12,%xmm7
+ pslld $12,%xmm12
+ por %xmm6,%xmm15
+ psrld $20,%xmm7
+ movdqa (%r11),%xmm6
+ por %xmm7,%xmm12
+ paddd %xmm15,%xmm10
+ paddd %xmm12,%xmm11
+ pxor %xmm10,%xmm1
+ pxor %xmm11,%xmm2
+ pshufb %xmm6,%xmm1
+ pshufb %xmm6,%xmm2
+ paddd %xmm1,%xmm4
+ paddd %xmm2,%xmm5
+ pxor %xmm4,%xmm15
+ pxor %xmm5,%xmm12
+ movdqa %xmm15,%xmm7
+ pslld $7,%xmm15
+ psrld $25,%xmm7
+ movdqa %xmm12,%xmm6
+ pslld $7,%xmm12
+ por %xmm7,%xmm15
+ psrld $25,%xmm6
+ movdqa (%r9),%xmm7
+ por %xmm6,%xmm12
+ decl %eax
+ jnz .Loop4x
+
+ paddd 64(%rsp),%xmm8
+ paddd 80(%rsp),%xmm9
+ paddd 96(%rsp),%xmm10
+ paddd 112(%rsp),%xmm11
+
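+	/* Transpose the four blocks from word-sliced back to byte order. */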
+ movdqa %xmm8,%xmm6
+ punpckldq %xmm9,%xmm8
+ movdqa %xmm10,%xmm7
+ punpckldq %xmm11,%xmm10
+ punpckhdq %xmm9,%xmm6
+ punpckhdq %xmm11,%xmm7
+ movdqa %xmm8,%xmm9
+ punpcklqdq %xmm10,%xmm8
+ movdqa %xmm6,%xmm11
+ punpcklqdq %xmm7,%xmm6
+ punpckhqdq %xmm10,%xmm9
+ punpckhqdq %xmm7,%xmm11
+ paddd 128-256(%rcx),%xmm12
+ paddd 144-256(%rcx),%xmm13
+ paddd 160-256(%rcx),%xmm14
+ paddd 176-256(%rcx),%xmm15
+
+ movdqa %xmm8,0(%rsp)
+ movdqa %xmm9,16(%rsp)
+ movdqa 32(%rsp),%xmm8
+ movdqa 48(%rsp),%xmm9
+
+ movdqa %xmm12,%xmm10
+ punpckldq %xmm13,%xmm12
+ movdqa %xmm14,%xmm7
+ punpckldq %xmm15,%xmm14
+ punpckhdq %xmm13,%xmm10
+ punpckhdq %xmm15,%xmm7
+ movdqa %xmm12,%xmm13
+ punpcklqdq %xmm14,%xmm12
+ movdqa %xmm10,%xmm15
+ punpcklqdq %xmm7,%xmm10
+ punpckhqdq %xmm14,%xmm13
+ punpckhqdq %xmm7,%xmm15
+ paddd 192-256(%rcx),%xmm4
+ paddd 208-256(%rcx),%xmm5
+ paddd 224-256(%rcx),%xmm8
+ paddd 240-256(%rcx),%xmm9
+
+ movdqa %xmm6,32(%rsp)
+ movdqa %xmm11,48(%rsp)
+
+ movdqa %xmm4,%xmm14
+ punpckldq %xmm5,%xmm4
+ movdqa %xmm8,%xmm7
+ punpckldq %xmm9,%xmm8
+ punpckhdq %xmm5,%xmm14
+ punpckhdq %xmm9,%xmm7
+ movdqa %xmm4,%xmm5
+ punpcklqdq %xmm8,%xmm4
+ movdqa %xmm14,%xmm9
+ punpcklqdq %xmm7,%xmm14
+ punpckhqdq %xmm8,%xmm5
+ punpckhqdq %xmm7,%xmm9
+ paddd 256-256(%rcx),%xmm0
+ paddd 272-256(%rcx),%xmm1
+ paddd 288-256(%rcx),%xmm2
+ paddd 304-256(%rcx),%xmm3
+
+ movdqa %xmm0,%xmm8
+ punpckldq %xmm1,%xmm0
+ movdqa %xmm2,%xmm7
+ punpckldq %xmm3,%xmm2
+ punpckhdq %xmm1,%xmm8
+ punpckhdq %xmm3,%xmm7
+ movdqa %xmm0,%xmm1
+ punpcklqdq %xmm2,%xmm0
+ movdqa %xmm8,%xmm3
+ punpcklqdq %xmm7,%xmm8
+ punpckhqdq %xmm2,%xmm1
+ punpckhqdq %xmm7,%xmm3
+ cmpq $256,%rdx
+ jb .Ltail4x
+
+ movdqu 0(%rsi),%xmm6
+ movdqu 16(%rsi),%xmm11
+ movdqu 32(%rsi),%xmm2
+ movdqu 48(%rsi),%xmm7
+ pxor 0(%rsp),%xmm6
+ pxor %xmm12,%xmm11
+ pxor %xmm4,%xmm2
+ pxor %xmm0,%xmm7
+
+ movdqu %xmm6,0(%rdi)
+ movdqu 64(%rsi),%xmm6
+ movdqu %xmm11,16(%rdi)
+ movdqu 80(%rsi),%xmm11
+ movdqu %xmm2,32(%rdi)
+ movdqu 96(%rsi),%xmm2
+ movdqu %xmm7,48(%rdi)
+ movdqu 112(%rsi),%xmm7
+ leaq 128(%rsi),%rsi
+ pxor 16(%rsp),%xmm6
+ pxor %xmm13,%xmm11
+ pxor %xmm5,%xmm2
+ pxor %xmm1,%xmm7
+
+ movdqu %xmm6,64(%rdi)
+ movdqu 0(%rsi),%xmm6
+ movdqu %xmm11,80(%rdi)
+ movdqu 16(%rsi),%xmm11
+ movdqu %xmm2,96(%rdi)
+ movdqu 32(%rsi),%xmm2
+ movdqu %xmm7,112(%rdi)
+ leaq 128(%rdi),%rdi
+ movdqu 48(%rsi),%xmm7
+ pxor 32(%rsp),%xmm6
+ pxor %xmm10,%xmm11
+ pxor %xmm14,%xmm2
+ pxor %xmm8,%xmm7
+
+ movdqu %xmm6,0(%rdi)
+ movdqu 64(%rsi),%xmm6
+ movdqu %xmm11,16(%rdi)
+ movdqu 80(%rsi),%xmm11
+ movdqu %xmm2,32(%rdi)
+ movdqu 96(%rsi),%xmm2
+ movdqu %xmm7,48(%rdi)
+ movdqu 112(%rsi),%xmm7
+ leaq 128(%rsi),%rsi
+ pxor 48(%rsp),%xmm6
+ pxor %xmm15,%xmm11
+ pxor %xmm9,%xmm2
+ pxor %xmm3,%xmm7
+ movdqu %xmm6,64(%rdi)
+ movdqu %xmm11,80(%rdi)
+ movdqu %xmm2,96(%rdi)
+ movdqu %xmm7,112(%rdi)
+ leaq 128(%rdi),%rdi
+
+ subq $256,%rdx
+ jnz .Loop_outer4x
+
+ jmp .Ldone4x
+
+.Ltail4x:
+ cmpq $192,%rdx
+ jae .L192_or_more4x
+ cmpq $128,%rdx
+ jae .L128_or_more4x
+ cmpq $64,%rdx
+ jae .L64_or_more4x
+
+ xorq %r9,%r9
+
+ movdqa %xmm12,16(%rsp)
+ movdqa %xmm4,32(%rsp)
+ movdqa %xmm0,48(%rsp)
+ jmp .Loop_tail4x
+
+.align 32
+.L64_or_more4x:
+ movdqu 0(%rsi),%xmm6
+ movdqu 16(%rsi),%xmm11
+ movdqu 32(%rsi),%xmm2
+ movdqu 48(%rsi),%xmm7
+ pxor 0(%rsp),%xmm6
+ pxor %xmm12,%xmm11
+ pxor %xmm4,%xmm2
+ pxor %xmm0,%xmm7
+ movdqu %xmm6,0(%rdi)
+ movdqu %xmm11,16(%rdi)
+ movdqu %xmm2,32(%rdi)
+ movdqu %xmm7,48(%rdi)
+ je .Ldone4x
+
+ movdqa 16(%rsp),%xmm6
+ leaq 64(%rsi),%rsi
+ xorq %r9,%r9
+ movdqa %xmm6,0(%rsp)
+ movdqa %xmm13,16(%rsp)
+ leaq 64(%rdi),%rdi
+ movdqa %xmm5,32(%rsp)
+ subq $64,%rdx
+ movdqa %xmm1,48(%rsp)
+ jmp .Loop_tail4x
+
+.align 32
+.L128_or_more4x:
+ movdqu 0(%rsi),%xmm6
+ movdqu 16(%rsi),%xmm11
+ movdqu 32(%rsi),%xmm2
+ movdqu 48(%rsi),%xmm7
+ pxor 0(%rsp),%xmm6
+ pxor %xmm12,%xmm11
+ pxor %xmm4,%xmm2
+ pxor %xmm0,%xmm7
+
+ movdqu %xmm6,0(%rdi)
+ movdqu 64(%rsi),%xmm6
+ movdqu %xmm11,16(%rdi)
+ movdqu 80(%rsi),%xmm11
+ movdqu %xmm2,32(%rdi)
+ movdqu 96(%rsi),%xmm2
+ movdqu %xmm7,48(%rdi)
+ movdqu 112(%rsi),%xmm7
+ pxor 16(%rsp),%xmm6
+ pxor %xmm13,%xmm11
+ pxor %xmm5,%xmm2
+ pxor %xmm1,%xmm7
+ movdqu %xmm6,64(%rdi)
+ movdqu %xmm11,80(%rdi)
+ movdqu %xmm2,96(%rdi)
+ movdqu %xmm7,112(%rdi)
+ je .Ldone4x
+
+ movdqa 32(%rsp),%xmm6
+ leaq 128(%rsi),%rsi
+ xorq %r9,%r9
+ movdqa %xmm6,0(%rsp)
+ movdqa %xmm10,16(%rsp)
+ leaq 128(%rdi),%rdi
+ movdqa %xmm14,32(%rsp)
+ subq $128,%rdx
+ movdqa %xmm8,48(%rsp)
+ jmp .Loop_tail4x
+
+.align 32
+.L192_or_more4x:
+ movdqu 0(%rsi),%xmm6
+ movdqu 16(%rsi),%xmm11
+ movdqu 32(%rsi),%xmm2
+ movdqu 48(%rsi),%xmm7
+ pxor 0(%rsp),%xmm6
+ pxor %xmm12,%xmm11
+ pxor %xmm4,%xmm2
+ pxor %xmm0,%xmm7
+
+ movdqu %xmm6,0(%rdi)
+ movdqu 64(%rsi),%xmm6
+ movdqu %xmm11,16(%rdi)
+ movdqu 80(%rsi),%xmm11
+ movdqu %xmm2,32(%rdi)
+ movdqu 96(%rsi),%xmm2
+ movdqu %xmm7,48(%rdi)
+ movdqu 112(%rsi),%xmm7
+ leaq 128(%rsi),%rsi
+ pxor 16(%rsp),%xmm6
+ pxor %xmm13,%xmm11
+ pxor %xmm5,%xmm2
+ pxor %xmm1,%xmm7
+
+ movdqu %xmm6,64(%rdi)
+ movdqu 0(%rsi),%xmm6
+ movdqu %xmm11,80(%rdi)
+ movdqu 16(%rsi),%xmm11
+ movdqu %xmm2,96(%rdi)
+ movdqu 32(%rsi),%xmm2
+ movdqu %xmm7,112(%rdi)
+ leaq 128(%rdi),%rdi
+ movdqu 48(%rsi),%xmm7
+ pxor 32(%rsp),%xmm6
+ pxor %xmm10,%xmm11
+ pxor %xmm14,%xmm2
+ pxor %xmm8,%xmm7
+ movdqu %xmm6,0(%rdi)
+ movdqu %xmm11,16(%rdi)
+ movdqu %xmm2,32(%rdi)
+ movdqu %xmm7,48(%rdi)
+ je .Ldone4x
+
+ movdqa 48(%rsp),%xmm6
+ leaq 64(%rsi),%rsi
+ xorq %r9,%r9
+ movdqa %xmm6,0(%rsp)
+ movdqa %xmm15,16(%rsp)
+ leaq 64(%rdi),%rdi
+ movdqa %xmm9,32(%rsp)
+ subq $192,%rdx
+ movdqa %xmm3,48(%rsp)
+
+.Loop_tail4x:
+ movzbl (%rsi,%r9,1),%eax
+ movzbl (%rsp,%r9,1),%ecx
+ leaq 1(%r9),%r9
+ xorl %ecx,%eax
+ movb %al,-1(%rdi,%r9,1)
+ decq %rdx
+ jnz .Loop_tail4x
+
+.Ldone4x:
+ leaq -8(%r10),%rsp
+
+.L4x_epilogue:
+ ret
+ENDPROC(chacha20_ssse3)
+#endif /* CONFIG_AS_SSSE3 */
+
+#ifdef CONFIG_AS_AVX2
+.align 32
+ENTRY(chacha20_avx2)
+.Lchacha20_avx2:
+ cmpq $0,%rdx
+ je .L8x_epilogue
+ leaq 8(%rsp),%r10
+
+ subq $0x280+8,%rsp
+ andq $-32,%rsp
+ vzeroupper
+
+ vbroadcasti128 .Lsigma(%rip),%ymm11
+ vbroadcasti128 (%rcx),%ymm3
+ vbroadcasti128 16(%rcx),%ymm15
+ vbroadcasti128 (%r8),%ymm7
+ leaq 256(%rsp),%rcx
+ leaq 512(%rsp),%rax
+ leaq .Lrot16(%rip),%r9
+ leaq .Lrot24(%rip),%r11
+
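+	/* Eight blocks word-sliced across ymm registers; block counters come
+	 * from .Lincy (0,2,4,6,1,3,5,7), and the final vperm2i128 shuffles
+	 * restore sequential block order.
+	 */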
+ vpshufd $0x00,%ymm11,%ymm8
+ vpshufd $0x55,%ymm11,%ymm9
+ vmovdqa %ymm8,128-256(%rcx)
+ vpshufd $0xaa,%ymm11,%ymm10
+ vmovdqa %ymm9,160-256(%rcx)
+ vpshufd $0xff,%ymm11,%ymm11
+ vmovdqa %ymm10,192-256(%rcx)
+ vmovdqa %ymm11,224-256(%rcx)
+
+ vpshufd $0x00,%ymm3,%ymm0
+ vpshufd $0x55,%ymm3,%ymm1
+ vmovdqa %ymm0,256-256(%rcx)
+ vpshufd $0xaa,%ymm3,%ymm2
+ vmovdqa %ymm1,288-256(%rcx)
+ vpshufd $0xff,%ymm3,%ymm3
+ vmovdqa %ymm2,320-256(%rcx)
+ vmovdqa %ymm3,352-256(%rcx)
+
+ vpshufd $0x00,%ymm15,%ymm12
+ vpshufd $0x55,%ymm15,%ymm13
+ vmovdqa %ymm12,384-512(%rax)
+ vpshufd $0xaa,%ymm15,%ymm14
+ vmovdqa %ymm13,416-512(%rax)
+ vpshufd $0xff,%ymm15,%ymm15
+ vmovdqa %ymm14,448-512(%rax)
+ vmovdqa %ymm15,480-512(%rax)
+
+ vpshufd $0x00,%ymm7,%ymm4
+ vpshufd $0x55,%ymm7,%ymm5
+ vpaddd .Lincy(%rip),%ymm4,%ymm4
+ vpshufd $0xaa,%ymm7,%ymm6
+ vmovdqa %ymm5,544-512(%rax)
+ vpshufd $0xff,%ymm7,%ymm7
+ vmovdqa %ymm6,576-512(%rax)
+ vmovdqa %ymm7,608-512(%rax)
+
+ jmp .Loop_enter8x
+
+.align 32
+.Loop_outer8x:
+ vmovdqa 128-256(%rcx),%ymm8
+ vmovdqa 160-256(%rcx),%ymm9
+ vmovdqa 192-256(%rcx),%ymm10
+ vmovdqa 224-256(%rcx),%ymm11
+ vmovdqa 256-256(%rcx),%ymm0
+ vmovdqa 288-256(%rcx),%ymm1
+ vmovdqa 320-256(%rcx),%ymm2
+ vmovdqa 352-256(%rcx),%ymm3
+ vmovdqa 384-512(%rax),%ymm12
+ vmovdqa 416-512(%rax),%ymm13
+ vmovdqa 448-512(%rax),%ymm14
+ vmovdqa 480-512(%rax),%ymm15
+ vmovdqa 512-512(%rax),%ymm4
+ vmovdqa 544-512(%rax),%ymm5
+ vmovdqa 576-512(%rax),%ymm6
+ vmovdqa 608-512(%rax),%ymm7
+ vpaddd .Leight(%rip),%ymm4,%ymm4
+
+.Loop_enter8x:
+ vmovdqa %ymm14,64(%rsp)
+ vmovdqa %ymm15,96(%rsp)
+ vbroadcasti128 (%r9),%ymm15
+ vmovdqa %ymm4,512-512(%rax)
+ movl $10,%eax
+ jmp .Loop8x
+
+.align 32
+.Loop8x:
+ vpaddd %ymm0,%ymm8,%ymm8
+ vpxor %ymm4,%ymm8,%ymm4
+ vpshufb %ymm15,%ymm4,%ymm4
+ vpaddd %ymm1,%ymm9,%ymm9
+ vpxor %ymm5,%ymm9,%ymm5
+ vpshufb %ymm15,%ymm5,%ymm5
+ vpaddd %ymm4,%ymm12,%ymm12
+ vpxor %ymm0,%ymm12,%ymm0
+ vpslld $12,%ymm0,%ymm14
+ vpsrld $20,%ymm0,%ymm0
+ vpor %ymm0,%ymm14,%ymm0
+ vbroadcasti128 (%r11),%ymm14
+ vpaddd %ymm5,%ymm13,%ymm13
+ vpxor %ymm1,%ymm13,%ymm1
+ vpslld $12,%ymm1,%ymm15
+ vpsrld $20,%ymm1,%ymm1
+ vpor %ymm1,%ymm15,%ymm1
+ vpaddd %ymm0,%ymm8,%ymm8
+ vpxor %ymm4,%ymm8,%ymm4
+ vpshufb %ymm14,%ymm4,%ymm4
+ vpaddd %ymm1,%ymm9,%ymm9
+ vpxor %ymm5,%ymm9,%ymm5
+ vpshufb %ymm14,%ymm5,%ymm5
+ vpaddd %ymm4,%ymm12,%ymm12
+ vpxor %ymm0,%ymm12,%ymm0
+ vpslld $7,%ymm0,%ymm15
+ vpsrld $25,%ymm0,%ymm0
+ vpor %ymm0,%ymm15,%ymm0
+ vbroadcasti128 (%r9),%ymm15
+ vpaddd %ymm5,%ymm13,%ymm13
+ vpxor %ymm1,%ymm13,%ymm1
+ vpslld $7,%ymm1,%ymm14
+ vpsrld $25,%ymm1,%ymm1
+ vpor %ymm1,%ymm14,%ymm1
+ vmovdqa %ymm12,0(%rsp)
+ vmovdqa %ymm13,32(%rsp)
+ vmovdqa 64(%rsp),%ymm12
+ vmovdqa 96(%rsp),%ymm13
+ vpaddd %ymm2,%ymm10,%ymm10
+ vpxor %ymm6,%ymm10,%ymm6
+ vpshufb %ymm15,%ymm6,%ymm6
+ vpaddd %ymm3,%ymm11,%ymm11
+ vpxor %ymm7,%ymm11,%ymm7
+ vpshufb %ymm15,%ymm7,%ymm7
+ vpaddd %ymm6,%ymm12,%ymm12
+ vpxor %ymm2,%ymm12,%ymm2
+ vpslld $12,%ymm2,%ymm14
+ vpsrld $20,%ymm2,%ymm2
+ vpor %ymm2,%ymm14,%ymm2
+ vbroadcasti128 (%r11),%ymm14
+ vpaddd %ymm7,%ymm13,%ymm13
+ vpxor %ymm3,%ymm13,%ymm3
+ vpslld $12,%ymm3,%ymm15
+ vpsrld $20,%ymm3,%ymm3
+ vpor %ymm3,%ymm15,%ymm3
+ vpaddd %ymm2,%ymm10,%ymm10
+ vpxor %ymm6,%ymm10,%ymm6
+ vpshufb %ymm14,%ymm6,%ymm6
+ vpaddd %ymm3,%ymm11,%ymm11
+ vpxor %ymm7,%ymm11,%ymm7
+ vpshufb %ymm14,%ymm7,%ymm7
+ vpaddd %ymm6,%ymm12,%ymm12
+ vpxor %ymm2,%ymm12,%ymm2
+ vpslld $7,%ymm2,%ymm15
+ vpsrld $25,%ymm2,%ymm2
+ vpor %ymm2,%ymm15,%ymm2
+ vbroadcasti128 (%r9),%ymm15
+ vpaddd %ymm7,%ymm13,%ymm13
+ vpxor %ymm3,%ymm13,%ymm3
+ vpslld $7,%ymm3,%ymm14
+ vpsrld $25,%ymm3,%ymm3
+ vpor %ymm3,%ymm14,%ymm3
+ vpaddd %ymm1,%ymm8,%ymm8
+ vpxor %ymm7,%ymm8,%ymm7
+ vpshufb %ymm15,%ymm7,%ymm7
+ vpaddd %ymm2,%ymm9,%ymm9
+ vpxor %ymm4,%ymm9,%ymm4
+ vpshufb %ymm15,%ymm4,%ymm4
+ vpaddd %ymm7,%ymm12,%ymm12
+ vpxor %ymm1,%ymm12,%ymm1
+ vpslld $12,%ymm1,%ymm14
+ vpsrld $20,%ymm1,%ymm1
+ vpor %ymm1,%ymm14,%ymm1
+ vbroadcasti128 (%r11),%ymm14
+ vpaddd %ymm4,%ymm13,%ymm13
+ vpxor %ymm2,%ymm13,%ymm2
+ vpslld $12,%ymm2,%ymm15
+ vpsrld $20,%ymm2,%ymm2
+ vpor %ymm2,%ymm15,%ymm2
+ vpaddd %ymm1,%ymm8,%ymm8
+ vpxor %ymm7,%ymm8,%ymm7
+ vpshufb %ymm14,%ymm7,%ymm7
+ vpaddd %ymm2,%ymm9,%ymm9
+ vpxor %ymm4,%ymm9,%ymm4
+ vpshufb %ymm14,%ymm4,%ymm4
+ vpaddd %ymm7,%ymm12,%ymm12
+ vpxor %ymm1,%ymm12,%ymm1
+ vpslld $7,%ymm1,%ymm15
+ vpsrld $25,%ymm1,%ymm1
+ vpor %ymm1,%ymm15,%ymm1
+ vbroadcasti128 (%r9),%ymm15
+ vpaddd %ymm4,%ymm13,%ymm13
+ vpxor %ymm2,%ymm13,%ymm2
+ vpslld $7,%ymm2,%ymm14
+ vpsrld $25,%ymm2,%ymm2
+ vpor %ymm2,%ymm14,%ymm2
+ vmovdqa %ymm12,64(%rsp)
+ vmovdqa %ymm13,96(%rsp)
+ vmovdqa 0(%rsp),%ymm12
+ vmovdqa 32(%rsp),%ymm13
+ vpaddd %ymm3,%ymm10,%ymm10
+ vpxor %ymm5,%ymm10,%ymm5
+ vpshufb %ymm15,%ymm5,%ymm5
+ vpaddd %ymm0,%ymm11,%ymm11
+ vpxor %ymm6,%ymm11,%ymm6
+ vpshufb %ymm15,%ymm6,%ymm6
+ vpaddd %ymm5,%ymm12,%ymm12
+ vpxor %ymm3,%ymm12,%ymm3
+ vpslld $12,%ymm3,%ymm14
+ vpsrld $20,%ymm3,%ymm3
+ vpor %ymm3,%ymm14,%ymm3
+ vbroadcasti128 (%r11),%ymm14
+ vpaddd %ymm6,%ymm13,%ymm13
+ vpxor %ymm0,%ymm13,%ymm0
+ vpslld $12,%ymm0,%ymm15
+ vpsrld $20,%ymm0,%ymm0
+ vpor %ymm0,%ymm15,%ymm0
+ vpaddd %ymm3,%ymm10,%ymm10
+ vpxor %ymm5,%ymm10,%ymm5
+ vpshufb %ymm14,%ymm5,%ymm5
+ vpaddd %ymm0,%ymm11,%ymm11
+ vpxor %ymm6,%ymm11,%ymm6
+ vpshufb %ymm14,%ymm6,%ymm6
+ vpaddd %ymm5,%ymm12,%ymm12
+ vpxor %ymm3,%ymm12,%ymm3
+ vpslld $7,%ymm3,%ymm15
+ vpsrld $25,%ymm3,%ymm3
+ vpor %ymm3,%ymm15,%ymm3
+ vbroadcasti128 (%r9),%ymm15
+ vpaddd %ymm6,%ymm13,%ymm13
+ vpxor %ymm0,%ymm13,%ymm0
+ vpslld $7,%ymm0,%ymm14
+ vpsrld $25,%ymm0,%ymm0
+ vpor %ymm0,%ymm14,%ymm0
+ decl %eax
+ jnz .Loop8x
+
+ leaq 512(%rsp),%rax
+ vpaddd 128-256(%rcx),%ymm8,%ymm8
+ vpaddd 160-256(%rcx),%ymm9,%ymm9
+ vpaddd 192-256(%rcx),%ymm10,%ymm10
+ vpaddd 224-256(%rcx),%ymm11,%ymm11
+
+ vpunpckldq %ymm9,%ymm8,%ymm14
+ vpunpckldq %ymm11,%ymm10,%ymm15
+ vpunpckhdq %ymm9,%ymm8,%ymm8
+ vpunpckhdq %ymm11,%ymm10,%ymm10
+ vpunpcklqdq %ymm15,%ymm14,%ymm9
+ vpunpckhqdq %ymm15,%ymm14,%ymm14
+ vpunpcklqdq %ymm10,%ymm8,%ymm11
+ vpunpckhqdq %ymm10,%ymm8,%ymm8
+ vpaddd 256-256(%rcx),%ymm0,%ymm0
+ vpaddd 288-256(%rcx),%ymm1,%ymm1
+ vpaddd 320-256(%rcx),%ymm2,%ymm2
+ vpaddd 352-256(%rcx),%ymm3,%ymm3
+
+ vpunpckldq %ymm1,%ymm0,%ymm10
+ vpunpckldq %ymm3,%ymm2,%ymm15
+ vpunpckhdq %ymm1,%ymm0,%ymm0
+ vpunpckhdq %ymm3,%ymm2,%ymm2
+ vpunpcklqdq %ymm15,%ymm10,%ymm1
+ vpunpckhqdq %ymm15,%ymm10,%ymm10
+ vpunpcklqdq %ymm2,%ymm0,%ymm3
+ vpunpckhqdq %ymm2,%ymm0,%ymm0
+ vperm2i128 $0x20,%ymm1,%ymm9,%ymm15
+ vperm2i128 $0x31,%ymm1,%ymm9,%ymm1
+ vperm2i128 $0x20,%ymm10,%ymm14,%ymm9
+ vperm2i128 $0x31,%ymm10,%ymm14,%ymm10
+ vperm2i128 $0x20,%ymm3,%ymm11,%ymm14
+ vperm2i128 $0x31,%ymm3,%ymm11,%ymm3
+ vperm2i128 $0x20,%ymm0,%ymm8,%ymm11
+ vperm2i128 $0x31,%ymm0,%ymm8,%ymm0
+ vmovdqa %ymm15,0(%rsp)
+ vmovdqa %ymm9,32(%rsp)
+ vmovdqa 64(%rsp),%ymm15
+ vmovdqa 96(%rsp),%ymm9
+
+ vpaddd 384-512(%rax),%ymm12,%ymm12
+ vpaddd 416-512(%rax),%ymm13,%ymm13
+ vpaddd 448-512(%rax),%ymm15,%ymm15
+ vpaddd 480-512(%rax),%ymm9,%ymm9
+
+ vpunpckldq %ymm13,%ymm12,%ymm2
+ vpunpckldq %ymm9,%ymm15,%ymm8
+ vpunpckhdq %ymm13,%ymm12,%ymm12
+ vpunpckhdq %ymm9,%ymm15,%ymm15
+ vpunpcklqdq %ymm8,%ymm2,%ymm13
+ vpunpckhqdq %ymm8,%ymm2,%ymm2
+ vpunpcklqdq %ymm15,%ymm12,%ymm9
+ vpunpckhqdq %ymm15,%ymm12,%ymm12
+ vpaddd 512-512(%rax),%ymm4,%ymm4
+ vpaddd 544-512(%rax),%ymm5,%ymm5
+ vpaddd 576-512(%rax),%ymm6,%ymm6
+ vpaddd 608-512(%rax),%ymm7,%ymm7
+
+ vpunpckldq %ymm5,%ymm4,%ymm15
+ vpunpckldq %ymm7,%ymm6,%ymm8
+ vpunpckhdq %ymm5,%ymm4,%ymm4
+ vpunpckhdq %ymm7,%ymm6,%ymm6
+ vpunpcklqdq %ymm8,%ymm15,%ymm5
+ vpunpckhqdq %ymm8,%ymm15,%ymm15
+ vpunpcklqdq %ymm6,%ymm4,%ymm7
+ vpunpckhqdq %ymm6,%ymm4,%ymm4
+ vperm2i128 $0x20,%ymm5,%ymm13,%ymm8
+ vperm2i128 $0x31,%ymm5,%ymm13,%ymm5
+ vperm2i128 $0x20,%ymm15,%ymm2,%ymm13
+ vperm2i128 $0x31,%ymm15,%ymm2,%ymm15
+ vperm2i128 $0x20,%ymm7,%ymm9,%ymm2
+ vperm2i128 $0x31,%ymm7,%ymm9,%ymm7
+ vperm2i128 $0x20,%ymm4,%ymm12,%ymm9
+ vperm2i128 $0x31,%ymm4,%ymm12,%ymm4
+ vmovdqa 0(%rsp),%ymm6
+ vmovdqa 32(%rsp),%ymm12
+
+ cmpq $512,%rdx
+ jb .Ltail8x
+
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vpxor 64(%rsi),%ymm1,%ymm1
+ vpxor 96(%rsi),%ymm5,%ymm5
+ leaq 128(%rsi),%rsi
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ vmovdqu %ymm1,64(%rdi)
+ vmovdqu %ymm5,96(%rdi)
+ leaq 128(%rdi),%rdi
+
+ vpxor 0(%rsi),%ymm12,%ymm12
+ vpxor 32(%rsi),%ymm13,%ymm13
+ vpxor 64(%rsi),%ymm10,%ymm10
+ vpxor 96(%rsi),%ymm15,%ymm15
+ leaq 128(%rsi),%rsi
+ vmovdqu %ymm12,0(%rdi)
+ vmovdqu %ymm13,32(%rdi)
+ vmovdqu %ymm10,64(%rdi)
+ vmovdqu %ymm15,96(%rdi)
+ leaq 128(%rdi),%rdi
+
+ vpxor 0(%rsi),%ymm14,%ymm14
+ vpxor 32(%rsi),%ymm2,%ymm2
+ vpxor 64(%rsi),%ymm3,%ymm3
+ vpxor 96(%rsi),%ymm7,%ymm7
+ leaq 128(%rsi),%rsi
+ vmovdqu %ymm14,0(%rdi)
+ vmovdqu %ymm2,32(%rdi)
+ vmovdqu %ymm3,64(%rdi)
+ vmovdqu %ymm7,96(%rdi)
+ leaq 128(%rdi),%rdi
+
+ vpxor 0(%rsi),%ymm11,%ymm11
+ vpxor 32(%rsi),%ymm9,%ymm9
+ vpxor 64(%rsi),%ymm0,%ymm0
+ vpxor 96(%rsi),%ymm4,%ymm4
+ leaq 128(%rsi),%rsi
+ vmovdqu %ymm11,0(%rdi)
+ vmovdqu %ymm9,32(%rdi)
+ vmovdqu %ymm0,64(%rdi)
+ vmovdqu %ymm4,96(%rdi)
+ leaq 128(%rdi),%rdi
+
+ subq $512,%rdx
+ jnz .Loop_outer8x
+
+ jmp .Ldone8x
+
+.Ltail8x:
+ cmpq $448,%rdx
+ jae .L448_or_more8x
+ cmpq $384,%rdx
+ jae .L384_or_more8x
+ cmpq $320,%rdx
+ jae .L320_or_more8x
+ cmpq $256,%rdx
+ jae .L256_or_more8x
+ cmpq $192,%rdx
+ jae .L192_or_more8x
+ cmpq $128,%rdx
+ jae .L128_or_more8x
+ cmpq $64,%rdx
+ jae .L64_or_more8x
+
+ xorq %r9,%r9
+ vmovdqa %ymm6,0(%rsp)
+ vmovdqa %ymm8,32(%rsp)
+ jmp .Loop_tail8x
+
+.align 32
+.L64_or_more8x:
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ je .Ldone8x
+
+ leaq 64(%rsi),%rsi
+ xorq %r9,%r9
+ vmovdqa %ymm1,0(%rsp)
+ leaq 64(%rdi),%rdi
+ subq $64,%rdx
+ vmovdqa %ymm5,32(%rsp)
+ jmp .Loop_tail8x
+
+.align 32
+.L128_or_more8x:
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vpxor 64(%rsi),%ymm1,%ymm1
+ vpxor 96(%rsi),%ymm5,%ymm5
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ vmovdqu %ymm1,64(%rdi)
+ vmovdqu %ymm5,96(%rdi)
+ je .Ldone8x
+
+ leaq 128(%rsi),%rsi
+ xorq %r9,%r9
+ vmovdqa %ymm12,0(%rsp)
+ leaq 128(%rdi),%rdi
+ subq $128,%rdx
+ vmovdqa %ymm13,32(%rsp)
+ jmp .Loop_tail8x
+
+.align 32
+.L192_or_more8x:
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vpxor 64(%rsi),%ymm1,%ymm1
+ vpxor 96(%rsi),%ymm5,%ymm5
+ vpxor 128(%rsi),%ymm12,%ymm12
+ vpxor 160(%rsi),%ymm13,%ymm13
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ vmovdqu %ymm1,64(%rdi)
+ vmovdqu %ymm5,96(%rdi)
+ vmovdqu %ymm12,128(%rdi)
+ vmovdqu %ymm13,160(%rdi)
+ je .Ldone8x
+
+ leaq 192(%rsi),%rsi
+ xorq %r9,%r9
+ vmovdqa %ymm10,0(%rsp)
+ leaq 192(%rdi),%rdi
+ subq $192,%rdx
+ vmovdqa %ymm15,32(%rsp)
+ jmp .Loop_tail8x
+
+.align 32
+.L256_or_more8x:
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vpxor 64(%rsi),%ymm1,%ymm1
+ vpxor 96(%rsi),%ymm5,%ymm5
+ vpxor 128(%rsi),%ymm12,%ymm12
+ vpxor 160(%rsi),%ymm13,%ymm13
+ vpxor 192(%rsi),%ymm10,%ymm10
+ vpxor 224(%rsi),%ymm15,%ymm15
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ vmovdqu %ymm1,64(%rdi)
+ vmovdqu %ymm5,96(%rdi)
+ vmovdqu %ymm12,128(%rdi)
+ vmovdqu %ymm13,160(%rdi)
+ vmovdqu %ymm10,192(%rdi)
+ vmovdqu %ymm15,224(%rdi)
+ je .Ldone8x
+
+ leaq 256(%rsi),%rsi
+ xorq %r9,%r9
+ vmovdqa %ymm14,0(%rsp)
+ leaq 256(%rdi),%rdi
+ subq $256,%rdx
+ vmovdqa %ymm2,32(%rsp)
+ jmp .Loop_tail8x
+
+.align 32
+.L320_or_more8x:
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vpxor 64(%rsi),%ymm1,%ymm1
+ vpxor 96(%rsi),%ymm5,%ymm5
+ vpxor 128(%rsi),%ymm12,%ymm12
+ vpxor 160(%rsi),%ymm13,%ymm13
+ vpxor 192(%rsi),%ymm10,%ymm10
+ vpxor 224(%rsi),%ymm15,%ymm15
+ vpxor 256(%rsi),%ymm14,%ymm14
+ vpxor 288(%rsi),%ymm2,%ymm2
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ vmovdqu %ymm1,64(%rdi)
+ vmovdqu %ymm5,96(%rdi)
+ vmovdqu %ymm12,128(%rdi)
+ vmovdqu %ymm13,160(%rdi)
+ vmovdqu %ymm10,192(%rdi)
+ vmovdqu %ymm15,224(%rdi)
+ vmovdqu %ymm14,256(%rdi)
+ vmovdqu %ymm2,288(%rdi)
+ je .Ldone8x
+
+ leaq 320(%rsi),%rsi
+ xorq %r9,%r9
+ vmovdqa %ymm3,0(%rsp)
+ leaq 320(%rdi),%rdi
+ subq $320,%rdx
+ vmovdqa %ymm7,32(%rsp)
+ jmp .Loop_tail8x
+
+.align 32
+.L384_or_more8x:
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vpxor 64(%rsi),%ymm1,%ymm1
+ vpxor 96(%rsi),%ymm5,%ymm5
+ vpxor 128(%rsi),%ymm12,%ymm12
+ vpxor 160(%rsi),%ymm13,%ymm13
+ vpxor 192(%rsi),%ymm10,%ymm10
+ vpxor 224(%rsi),%ymm15,%ymm15
+ vpxor 256(%rsi),%ymm14,%ymm14
+ vpxor 288(%rsi),%ymm2,%ymm2
+ vpxor 320(%rsi),%ymm3,%ymm3
+ vpxor 352(%rsi),%ymm7,%ymm7
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ vmovdqu %ymm1,64(%rdi)
+ vmovdqu %ymm5,96(%rdi)
+ vmovdqu %ymm12,128(%rdi)
+ vmovdqu %ymm13,160(%rdi)
+ vmovdqu %ymm10,192(%rdi)
+ vmovdqu %ymm15,224(%rdi)
+ vmovdqu %ymm14,256(%rdi)
+ vmovdqu %ymm2,288(%rdi)
+ vmovdqu %ymm3,320(%rdi)
+ vmovdqu %ymm7,352(%rdi)
+ je .Ldone8x
+
+ leaq 384(%rsi),%rsi
+ xorq %r9,%r9
+ vmovdqa %ymm11,0(%rsp)
+ leaq 384(%rdi),%rdi
+ subq $384,%rdx
+ vmovdqa %ymm9,32(%rsp)
+ jmp .Loop_tail8x
+
+.align 32
+.L448_or_more8x:
+ vpxor 0(%rsi),%ymm6,%ymm6
+ vpxor 32(%rsi),%ymm8,%ymm8
+ vpxor 64(%rsi),%ymm1,%ymm1
+ vpxor 96(%rsi),%ymm5,%ymm5
+ vpxor 128(%rsi),%ymm12,%ymm12
+ vpxor 160(%rsi),%ymm13,%ymm13
+ vpxor 192(%rsi),%ymm10,%ymm10
+ vpxor 224(%rsi),%ymm15,%ymm15
+ vpxor 256(%rsi),%ymm14,%ymm14
+ vpxor 288(%rsi),%ymm2,%ymm2
+ vpxor 320(%rsi),%ymm3,%ymm3
+ vpxor 352(%rsi),%ymm7,%ymm7
+ vpxor 384(%rsi),%ymm11,%ymm11
+ vpxor 416(%rsi),%ymm9,%ymm9
+ vmovdqu %ymm6,0(%rdi)
+ vmovdqu %ymm8,32(%rdi)
+ vmovdqu %ymm1,64(%rdi)
+ vmovdqu %ymm5,96(%rdi)
+ vmovdqu %ymm12,128(%rdi)
+ vmovdqu %ymm13,160(%rdi)
+ vmovdqu %ymm10,192(%rdi)
+ vmovdqu %ymm15,224(%rdi)
+ vmovdqu %ymm14,256(%rdi)
+ vmovdqu %ymm2,288(%rdi)
+ vmovdqu %ymm3,320(%rdi)
+ vmovdqu %ymm7,352(%rdi)
+ vmovdqu %ymm11,384(%rdi)
+ vmovdqu %ymm9,416(%rdi)
+ je .Ldone8x
+
+ leaq 448(%rsi),%rsi
+ xorq %r9,%r9
+ vmovdqa %ymm0,0(%rsp)
+ leaq 448(%rdi),%rdi
+ subq $448,%rdx
+ vmovdqa %ymm4,32(%rsp)
+
+.Loop_tail8x:
+ movzbl (%rsi,%r9,1),%eax
+ movzbl (%rsp,%r9,1),%ecx
+ leaq 1(%r9),%r9
+ xorl %ecx,%eax
+ movb %al,-1(%rdi,%r9,1)
+ decq %rdx
+ jnz .Loop_tail8x
+
+.Ldone8x:
+ vzeroall
+ leaq -8(%r10),%rsp
+
+.L8x_epilogue:
+ ret
+ENDPROC(chacha20_avx2)
+#endif /* CONFIG_AS_AVX2 */
+
+#ifdef CONFIG_AS_AVX512
+.align 32
+ENTRY(chacha20_avx512)
+.Lchacha20_avx512:
+ cmpq $0,%rdx
+ je .Lavx512_epilogue
+ leaq 8(%rsp),%r10
+
+ cmpq $512,%rdx
+ ja .Lchacha20_16x
+
+ subq $64+8,%rsp
+ andq $-64,%rsp
+ vbroadcasti32x4 .Lsigma(%rip),%zmm0
+ vbroadcasti32x4 (%rcx),%zmm1
+ vbroadcasti32x4 16(%rcx),%zmm2
+ vbroadcasti32x4 (%r8),%zmm3
+
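+	/* Up to four blocks at once: each 128-bit lane of a zmm register is
+	 * one block; .Lzeroz sets the lane counters to 0..3 and .Lfourz
+	 * advances them by 4. vprold does the rotates directly.
+	 */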
+ vmovdqa32 %zmm0,%zmm16
+ vmovdqa32 %zmm1,%zmm17
+ vmovdqa32 %zmm2,%zmm18
+ vpaddd .Lzeroz(%rip),%zmm3,%zmm3
+ vmovdqa32 .Lfourz(%rip),%zmm20
+ movq $10,%r8
+ vmovdqa32 %zmm3,%zmm19
+ jmp .Loop_avx512
+
+.align 16
+.Loop_outer_avx512:
+ vmovdqa32 %zmm16,%zmm0
+ vmovdqa32 %zmm17,%zmm1
+ vmovdqa32 %zmm18,%zmm2
+ vpaddd %zmm20,%zmm19,%zmm3
+ movq $10,%r8
+ vmovdqa32 %zmm3,%zmm19
+ jmp .Loop_avx512
+
+.align 32
+.Loop_avx512:
+ vpaddd %zmm1,%zmm0,%zmm0
+ vpxord %zmm0,%zmm3,%zmm3
+ vprold $16,%zmm3,%zmm3
+ vpaddd %zmm3,%zmm2,%zmm2
+ vpxord %zmm2,%zmm1,%zmm1
+ vprold $12,%zmm1,%zmm1
+ vpaddd %zmm1,%zmm0,%zmm0
+ vpxord %zmm0,%zmm3,%zmm3
+ vprold $8,%zmm3,%zmm3
+ vpaddd %zmm3,%zmm2,%zmm2
+ vpxord %zmm2,%zmm1,%zmm1
+ vprold $7,%zmm1,%zmm1
+ vpshufd $78,%zmm2,%zmm2
+ vpshufd $57,%zmm1,%zmm1
+ vpshufd $147,%zmm3,%zmm3
+ vpaddd %zmm1,%zmm0,%zmm0
+ vpxord %zmm0,%zmm3,%zmm3
+ vprold $16,%zmm3,%zmm3
+ vpaddd %zmm3,%zmm2,%zmm2
+ vpxord %zmm2,%zmm1,%zmm1
+ vprold $12,%zmm1,%zmm1
+ vpaddd %zmm1,%zmm0,%zmm0
+ vpxord %zmm0,%zmm3,%zmm3
+ vprold $8,%zmm3,%zmm3
+ vpaddd %zmm3,%zmm2,%zmm2
+ vpxord %zmm2,%zmm1,%zmm1
+ vprold $7,%zmm1,%zmm1
+ vpshufd $78,%zmm2,%zmm2
+ vpshufd $147,%zmm1,%zmm1
+ vpshufd $57,%zmm3,%zmm3
+ decq %r8
+ jnz .Loop_avx512
+ vpaddd %zmm16,%zmm0,%zmm0
+ vpaddd %zmm17,%zmm1,%zmm1
+ vpaddd %zmm18,%zmm2,%zmm2
+ vpaddd %zmm19,%zmm3,%zmm3
+
+ subq $64,%rdx
+ jb .Ltail64_avx512
+
+ vpxor 0(%rsi),%xmm0,%xmm4
+ vpxor 16(%rsi),%xmm1,%xmm5
+ vpxor 32(%rsi),%xmm2,%xmm6
+ vpxor 48(%rsi),%xmm3,%xmm7
+ leaq 64(%rsi),%rsi
+
+ vmovdqu %xmm4,0(%rdi)
+ vmovdqu %xmm5,16(%rdi)
+ vmovdqu %xmm6,32(%rdi)
+ vmovdqu %xmm7,48(%rdi)
+ leaq 64(%rdi),%rdi
+
+ jz .Ldone_avx512
+
+ vextracti32x4 $1,%zmm0,%xmm4
+ vextracti32x4 $1,%zmm1,%xmm5
+ vextracti32x4 $1,%zmm2,%xmm6
+ vextracti32x4 $1,%zmm3,%xmm7
+
+ subq $64,%rdx
+ jb .Ltail_avx512
+
+ vpxor 0(%rsi),%xmm4,%xmm4
+ vpxor 16(%rsi),%xmm5,%xmm5
+ vpxor 32(%rsi),%xmm6,%xmm6
+ vpxor 48(%rsi),%xmm7,%xmm7
+ leaq 64(%rsi),%rsi
+
+ vmovdqu %xmm4,0(%rdi)
+ vmovdqu %xmm5,16(%rdi)
+ vmovdqu %xmm6,32(%rdi)
+ vmovdqu %xmm7,48(%rdi)
+ leaq 64(%rdi),%rdi
+
+ jz .Ldone_avx512
+
+ vextracti32x4 $2,%zmm0,%xmm4
+ vextracti32x4 $2,%zmm1,%xmm5
+ vextracti32x4 $2,%zmm2,%xmm6
+ vextracti32x4 $2,%zmm3,%xmm7
+
+ subq $64,%rdx
+ jb .Ltail_avx512
+
+ vpxor 0(%rsi),%xmm4,%xmm4
+ vpxor 16(%rsi),%xmm5,%xmm5
+ vpxor 32(%rsi),%xmm6,%xmm6
+ vpxor 48(%rsi),%xmm7,%xmm7
+ leaq 64(%rsi),%rsi
+
+ vmovdqu %xmm4,0(%rdi)
+ vmovdqu %xmm5,16(%rdi)
+ vmovdqu %xmm6,32(%rdi)
+ vmovdqu %xmm7,48(%rdi)
+ leaq 64(%rdi),%rdi
+
+ jz .Ldone_avx512
+
+ vextracti32x4 $3,%zmm0,%xmm4
+ vextracti32x4 $3,%zmm1,%xmm5
+ vextracti32x4 $3,%zmm2,%xmm6
+ vextracti32x4 $3,%zmm3,%xmm7
+
+ subq $64,%rdx
+ jb .Ltail_avx512
+
+ vpxor 0(%rsi),%xmm4,%xmm4
+ vpxor 16(%rsi),%xmm5,%xmm5
+ vpxor 32(%rsi),%xmm6,%xmm6
+ vpxor 48(%rsi),%xmm7,%xmm7
+ leaq 64(%rsi),%rsi
+
+ vmovdqu %xmm4,0(%rdi)
+ vmovdqu %xmm5,16(%rdi)
+ vmovdqu %xmm6,32(%rdi)
+ vmovdqu %xmm7,48(%rdi)
+ leaq 64(%rdi),%rdi
+
+ jnz .Loop_outer_avx512
+
+ jmp .Ldone_avx512
+
+.align 16
+.Ltail64_avx512:
+ vmovdqa %xmm0,0(%rsp)
+ vmovdqa %xmm1,16(%rsp)
+ vmovdqa %xmm2,32(%rsp)
+ vmovdqa %xmm3,48(%rsp)
+ addq $64,%rdx
+ jmp .Loop_tail_avx512
+
+.align 16
+.Ltail_avx512:
+ vmovdqa %xmm4,0(%rsp)
+ vmovdqa %xmm5,16(%rsp)
+ vmovdqa %xmm6,32(%rsp)
+ vmovdqa %xmm7,48(%rsp)
+ addq $64,%rdx
+
+.Loop_tail_avx512:
+ movzbl (%rsi,%r8,1),%eax
+ movzbl (%rsp,%r8,1),%ecx
+ leaq 1(%r8),%r8
+ xorl %ecx,%eax
+ movb %al,-1(%rdi,%r8,1)
+ decq %rdx
+ jnz .Loop_tail_avx512
+
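+	/* Overwrite the keystream spilled to the stack with non-secret
+	 * constants (%zmm16 holds the broadcast sigma words here).
+	 */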
+ vmovdqa32 %zmm16,0(%rsp)
+
+.Ldone_avx512:
+ vzeroall
+ leaq -8(%r10),%rsp
+
+.Lavx512_epilogue:
+ ret
+
+.align 32
+.Lchacha20_16x:
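+	/* 16-way path: the state is transposed across the zmm registers so
+	 * that each iteration produces sixteen 64-byte blocks (1024 bytes).
+	 */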
+ leaq 8(%rsp),%r10
+
+ subq $64+8,%rsp
+ andq $-64,%rsp
+ vzeroupper
+
+ leaq .Lsigma(%rip),%r9
+ vbroadcasti32x4 (%r9),%zmm3
+ vbroadcasti32x4 (%rcx),%zmm7
+ vbroadcasti32x4 16(%rcx),%zmm11
+ vbroadcasti32x4 (%r8),%zmm15
+
+ vpshufd $0x00,%zmm3,%zmm0
+ vpshufd $0x55,%zmm3,%zmm1
+ vpshufd $0xaa,%zmm3,%zmm2
+ vpshufd $0xff,%zmm3,%zmm3
+ vmovdqa64 %zmm0,%zmm16
+ vmovdqa64 %zmm1,%zmm17
+ vmovdqa64 %zmm2,%zmm18
+ vmovdqa64 %zmm3,%zmm19
+
+ vpshufd $0x00,%zmm7,%zmm4
+ vpshufd $0x55,%zmm7,%zmm5
+ vpshufd $0xaa,%zmm7,%zmm6
+ vpshufd $0xff,%zmm7,%zmm7
+ vmovdqa64 %zmm4,%zmm20
+ vmovdqa64 %zmm5,%zmm21
+ vmovdqa64 %zmm6,%zmm22
+ vmovdqa64 %zmm7,%zmm23
+
+ vpshufd $0x00,%zmm11,%zmm8
+ vpshufd $0x55,%zmm11,%zmm9
+ vpshufd $0xaa,%zmm11,%zmm10
+ vpshufd $0xff,%zmm11,%zmm11
+ vmovdqa64 %zmm8,%zmm24
+ vmovdqa64 %zmm9,%zmm25
+ vmovdqa64 %zmm10,%zmm26
+ vmovdqa64 %zmm11,%zmm27
+
+ vpshufd $0x00,%zmm15,%zmm12
+ vpshufd $0x55,%zmm15,%zmm13
+ vpshufd $0xaa,%zmm15,%zmm14
+ vpshufd $0xff,%zmm15,%zmm15
+ vpaddd .Lincz(%rip),%zmm12,%zmm12
+ vmovdqa64 %zmm12,%zmm28
+ vmovdqa64 %zmm13,%zmm29
+ vmovdqa64 %zmm14,%zmm30
+ vmovdqa64 %zmm15,%zmm31
+
+ movl $10,%eax
+ jmp .Loop16x
+
+.align 32
+.Loop_outer16x:
+ vpbroadcastd 0(%r9),%zmm0
+ vpbroadcastd 4(%r9),%zmm1
+ vpbroadcastd 8(%r9),%zmm2
+ vpbroadcastd 12(%r9),%zmm3
+ vpaddd .Lsixteen(%rip),%zmm28,%zmm28
+ vmovdqa64 %zmm20,%zmm4
+ vmovdqa64 %zmm21,%zmm5
+ vmovdqa64 %zmm22,%zmm6
+ vmovdqa64 %zmm23,%zmm7
+ vmovdqa64 %zmm24,%zmm8
+ vmovdqa64 %zmm25,%zmm9
+ vmovdqa64 %zmm26,%zmm10
+ vmovdqa64 %zmm27,%zmm11
+ vmovdqa64 %zmm28,%zmm12
+ vmovdqa64 %zmm29,%zmm13
+ vmovdqa64 %zmm30,%zmm14
+ vmovdqa64 %zmm31,%zmm15
+
+ vmovdqa64 %zmm0,%zmm16
+ vmovdqa64 %zmm1,%zmm17
+ vmovdqa64 %zmm2,%zmm18
+ vmovdqa64 %zmm3,%zmm19
+
+ movl $10,%eax
+ jmp .Loop16x
+
+.align 32
+.Loop16x:
+ vpaddd %zmm4,%zmm0,%zmm0
+ vpaddd %zmm5,%zmm1,%zmm1
+ vpaddd %zmm6,%zmm2,%zmm2
+ vpaddd %zmm7,%zmm3,%zmm3
+ vpxord %zmm0,%zmm12,%zmm12
+ vpxord %zmm1,%zmm13,%zmm13
+ vpxord %zmm2,%zmm14,%zmm14
+ vpxord %zmm3,%zmm15,%zmm15
+ vprold $16,%zmm12,%zmm12
+ vprold $16,%zmm13,%zmm13
+ vprold $16,%zmm14,%zmm14
+ vprold $16,%zmm15,%zmm15
+ vpaddd %zmm12,%zmm8,%zmm8
+ vpaddd %zmm13,%zmm9,%zmm9
+ vpaddd %zmm14,%zmm10,%zmm10
+ vpaddd %zmm15,%zmm11,%zmm11
+ vpxord %zmm8,%zmm4,%zmm4
+ vpxord %zmm9,%zmm5,%zmm5
+ vpxord %zmm10,%zmm6,%zmm6
+ vpxord %zmm11,%zmm7,%zmm7
+ vprold $12,%zmm4,%zmm4
+ vprold $12,%zmm5,%zmm5
+ vprold $12,%zmm6,%zmm6
+ vprold $12,%zmm7,%zmm7
+ vpaddd %zmm4,%zmm0,%zmm0
+ vpaddd %zmm5,%zmm1,%zmm1
+ vpaddd %zmm6,%zmm2,%zmm2
+ vpaddd %zmm7,%zmm3,%zmm3
+ vpxord %zmm0,%zmm12,%zmm12
+ vpxord %zmm1,%zmm13,%zmm13
+ vpxord %zmm2,%zmm14,%zmm14
+ vpxord %zmm3,%zmm15,%zmm15
+ vprold $8,%zmm12,%zmm12
+ vprold $8,%zmm13,%zmm13
+ vprold $8,%zmm14,%zmm14
+ vprold $8,%zmm15,%zmm15
+ vpaddd %zmm12,%zmm8,%zmm8
+ vpaddd %zmm13,%zmm9,%zmm9
+ vpaddd %zmm14,%zmm10,%zmm10
+ vpaddd %zmm15,%zmm11,%zmm11
+ vpxord %zmm8,%zmm4,%zmm4
+ vpxord %zmm9,%zmm5,%zmm5
+ vpxord %zmm10,%zmm6,%zmm6
+ vpxord %zmm11,%zmm7,%zmm7
+ vprold $7,%zmm4,%zmm4
+ vprold $7,%zmm5,%zmm5
+ vprold $7,%zmm6,%zmm6
+ vprold $7,%zmm7,%zmm7
+ vpaddd %zmm5,%zmm0,%zmm0
+ vpaddd %zmm6,%zmm1,%zmm1
+ vpaddd %zmm7,%zmm2,%zmm2
+ vpaddd %zmm4,%zmm3,%zmm3
+ vpxord %zmm0,%zmm15,%zmm15
+ vpxord %zmm1,%zmm12,%zmm12
+ vpxord %zmm2,%zmm13,%zmm13
+ vpxord %zmm3,%zmm14,%zmm14
+ vprold $16,%zmm15,%zmm15
+ vprold $16,%zmm12,%zmm12
+ vprold $16,%zmm13,%zmm13
+ vprold $16,%zmm14,%zmm14
+ vpaddd %zmm15,%zmm10,%zmm10
+ vpaddd %zmm12,%zmm11,%zmm11
+ vpaddd %zmm13,%zmm8,%zmm8
+ vpaddd %zmm14,%zmm9,%zmm9
+ vpxord %zmm10,%zmm5,%zmm5
+ vpxord %zmm11,%zmm6,%zmm6
+ vpxord %zmm8,%zmm7,%zmm7
+ vpxord %zmm9,%zmm4,%zmm4
+ vprold $12,%zmm5,%zmm5
+ vprold $12,%zmm6,%zmm6
+ vprold $12,%zmm7,%zmm7
+ vprold $12,%zmm4,%zmm4
+ vpaddd %zmm5,%zmm0,%zmm0
+ vpaddd %zmm6,%zmm1,%zmm1
+ vpaddd %zmm7,%zmm2,%zmm2
+ vpaddd %zmm4,%zmm3,%zmm3
+ vpxord %zmm0,%zmm15,%zmm15
+ vpxord %zmm1,%zmm12,%zmm12
+ vpxord %zmm2,%zmm13,%zmm13
+ vpxord %zmm3,%zmm14,%zmm14
+ vprold $8,%zmm15,%zmm15
+ vprold $8,%zmm12,%zmm12
+ vprold $8,%zmm13,%zmm13
+ vprold $8,%zmm14,%zmm14
+ vpaddd %zmm15,%zmm10,%zmm10
+ vpaddd %zmm12,%zmm11,%zmm11
+ vpaddd %zmm13,%zmm8,%zmm8
+ vpaddd %zmm14,%zmm9,%zmm9
+ vpxord %zmm10,%zmm5,%zmm5
+ vpxord %zmm11,%zmm6,%zmm6
+ vpxord %zmm8,%zmm7,%zmm7
+ vpxord %zmm9,%zmm4,%zmm4
+ vprold $7,%zmm5,%zmm5
+ vprold $7,%zmm6,%zmm6
+ vprold $7,%zmm7,%zmm7
+ vprold $7,%zmm4,%zmm4
+ decl %eax
+ jnz .Loop16x
+
+ vpaddd %zmm16,%zmm0,%zmm0
+ vpaddd %zmm17,%zmm1,%zmm1
+ vpaddd %zmm18,%zmm2,%zmm2
+ vpaddd %zmm19,%zmm3,%zmm3
+
+ vpunpckldq %zmm1,%zmm0,%zmm18
+ vpunpckldq %zmm3,%zmm2,%zmm19
+ vpunpckhdq %zmm1,%zmm0,%zmm0
+ vpunpckhdq %zmm3,%zmm2,%zmm2
+ vpunpcklqdq %zmm19,%zmm18,%zmm1
+ vpunpckhqdq %zmm19,%zmm18,%zmm18
+ vpunpcklqdq %zmm2,%zmm0,%zmm3
+ vpunpckhqdq %zmm2,%zmm0,%zmm0
+ vpaddd %zmm20,%zmm4,%zmm4
+ vpaddd %zmm21,%zmm5,%zmm5
+ vpaddd %zmm22,%zmm6,%zmm6
+ vpaddd %zmm23,%zmm7,%zmm7
+
+ vpunpckldq %zmm5,%zmm4,%zmm2
+ vpunpckldq %zmm7,%zmm6,%zmm19
+ vpunpckhdq %zmm5,%zmm4,%zmm4
+ vpunpckhdq %zmm7,%zmm6,%zmm6
+ vpunpcklqdq %zmm19,%zmm2,%zmm5
+ vpunpckhqdq %zmm19,%zmm2,%zmm2
+ vpunpcklqdq %zmm6,%zmm4,%zmm7
+ vpunpckhqdq %zmm6,%zmm4,%zmm4
+ vshufi32x4 $0x44,%zmm5,%zmm1,%zmm19
+ vshufi32x4 $0xee,%zmm5,%zmm1,%zmm5
+ vshufi32x4 $0x44,%zmm2,%zmm18,%zmm1
+ vshufi32x4 $0xee,%zmm2,%zmm18,%zmm2
+ vshufi32x4 $0x44,%zmm7,%zmm3,%zmm18
+ vshufi32x4 $0xee,%zmm7,%zmm3,%zmm7
+ vshufi32x4 $0x44,%zmm4,%zmm0,%zmm3
+ vshufi32x4 $0xee,%zmm4,%zmm0,%zmm4
+ vpaddd %zmm24,%zmm8,%zmm8
+ vpaddd %zmm25,%zmm9,%zmm9
+ vpaddd %zmm26,%zmm10,%zmm10
+ vpaddd %zmm27,%zmm11,%zmm11
+
+ vpunpckldq %zmm9,%zmm8,%zmm6
+ vpunpckldq %zmm11,%zmm10,%zmm0
+ vpunpckhdq %zmm9,%zmm8,%zmm8
+ vpunpckhdq %zmm11,%zmm10,%zmm10
+ vpunpcklqdq %zmm0,%zmm6,%zmm9
+ vpunpckhqdq %zmm0,%zmm6,%zmm6
+ vpunpcklqdq %zmm10,%zmm8,%zmm11
+ vpunpckhqdq %zmm10,%zmm8,%zmm8
+ vpaddd %zmm28,%zmm12,%zmm12
+ vpaddd %zmm29,%zmm13,%zmm13
+ vpaddd %zmm30,%zmm14,%zmm14
+ vpaddd %zmm31,%zmm15,%zmm15
+
+ vpunpckldq %zmm13,%zmm12,%zmm10
+ vpunpckldq %zmm15,%zmm14,%zmm0
+ vpunpckhdq %zmm13,%zmm12,%zmm12
+ vpunpckhdq %zmm15,%zmm14,%zmm14
+ vpunpcklqdq %zmm0,%zmm10,%zmm13
+ vpunpckhqdq %zmm0,%zmm10,%zmm10
+ vpunpcklqdq %zmm14,%zmm12,%zmm15
+ vpunpckhqdq %zmm14,%zmm12,%zmm12
+ vshufi32x4 $0x44,%zmm13,%zmm9,%zmm0
+ vshufi32x4 $0xee,%zmm13,%zmm9,%zmm13
+ vshufi32x4 $0x44,%zmm10,%zmm6,%zmm9
+ vshufi32x4 $0xee,%zmm10,%zmm6,%zmm10
+ vshufi32x4 $0x44,%zmm15,%zmm11,%zmm6
+ vshufi32x4 $0xee,%zmm15,%zmm11,%zmm15
+ vshufi32x4 $0x44,%zmm12,%zmm8,%zmm11
+ vshufi32x4 $0xee,%zmm12,%zmm8,%zmm12
+ vshufi32x4 $0x88,%zmm0,%zmm19,%zmm16
+ vshufi32x4 $0xdd,%zmm0,%zmm19,%zmm19
+ vshufi32x4 $0x88,%zmm13,%zmm5,%zmm0
+ vshufi32x4 $0xdd,%zmm13,%zmm5,%zmm13
+ vshufi32x4 $0x88,%zmm9,%zmm1,%zmm17
+ vshufi32x4 $0xdd,%zmm9,%zmm1,%zmm1
+ vshufi32x4 $0x88,%zmm10,%zmm2,%zmm9
+ vshufi32x4 $0xdd,%zmm10,%zmm2,%zmm10
+ vshufi32x4 $0x88,%zmm6,%zmm18,%zmm14
+ vshufi32x4 $0xdd,%zmm6,%zmm18,%zmm18
+ vshufi32x4 $0x88,%zmm15,%zmm7,%zmm6
+ vshufi32x4 $0xdd,%zmm15,%zmm7,%zmm15
+ vshufi32x4 $0x88,%zmm11,%zmm3,%zmm8
+ vshufi32x4 $0xdd,%zmm11,%zmm3,%zmm3
+ vshufi32x4 $0x88,%zmm12,%zmm4,%zmm11
+ vshufi32x4 $0xdd,%zmm12,%zmm4,%zmm12
+ cmpq $1024,%rdx
+ jb .Ltail16x
+
+ vpxord 0(%rsi),%zmm16,%zmm16
+ vpxord 64(%rsi),%zmm17,%zmm17
+ vpxord 128(%rsi),%zmm14,%zmm14
+ vpxord 192(%rsi),%zmm8,%zmm8
+ vmovdqu32 %zmm16,0(%rdi)
+ vmovdqu32 %zmm17,64(%rdi)
+ vmovdqu32 %zmm14,128(%rdi)
+ vmovdqu32 %zmm8,192(%rdi)
+
+ vpxord 256(%rsi),%zmm19,%zmm19
+ vpxord 320(%rsi),%zmm1,%zmm1
+ vpxord 384(%rsi),%zmm18,%zmm18
+ vpxord 448(%rsi),%zmm3,%zmm3
+ vmovdqu32 %zmm19,256(%rdi)
+ vmovdqu32 %zmm1,320(%rdi)
+ vmovdqu32 %zmm18,384(%rdi)
+ vmovdqu32 %zmm3,448(%rdi)
+
+ vpxord 512(%rsi),%zmm0,%zmm0
+ vpxord 576(%rsi),%zmm9,%zmm9
+ vpxord 640(%rsi),%zmm6,%zmm6
+ vpxord 704(%rsi),%zmm11,%zmm11
+ vmovdqu32 %zmm0,512(%rdi)
+ vmovdqu32 %zmm9,576(%rdi)
+ vmovdqu32 %zmm6,640(%rdi)
+ vmovdqu32 %zmm11,704(%rdi)
+
+ vpxord 768(%rsi),%zmm13,%zmm13
+ vpxord 832(%rsi),%zmm10,%zmm10
+ vpxord 896(%rsi),%zmm15,%zmm15
+ vpxord 960(%rsi),%zmm12,%zmm12
+ leaq 1024(%rsi),%rsi
+ vmovdqu32 %zmm13,768(%rdi)
+ vmovdqu32 %zmm10,832(%rdi)
+ vmovdqu32 %zmm15,896(%rdi)
+ vmovdqu32 %zmm12,960(%rdi)
+ leaq 1024(%rdi),%rdi
+
+ subq $1024,%rdx
+ jnz .Loop_outer16x
+
+ jmp .Ldone16x
+
+.align 32
+.Ltail16x:
+ xorq %r9,%r9
+ subq %rsi,%rdi
+ cmpq $64,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm16,%zmm16
+ vmovdqu32 %zmm16,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm17,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $128,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm17,%zmm17
+ vmovdqu32 %zmm17,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm14,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $192,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm14,%zmm14
+ vmovdqu32 %zmm14,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm8,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $256,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm8,%zmm8
+ vmovdqu32 %zmm8,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm19,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $320,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm19,%zmm19
+ vmovdqu32 %zmm19,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm1,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $384,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm1,%zmm1
+ vmovdqu32 %zmm1,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm18,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $448,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm18,%zmm18
+ vmovdqu32 %zmm18,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm3,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $512,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm3,%zmm3
+ vmovdqu32 %zmm3,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm0,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $576,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm0,%zmm0
+ vmovdqu32 %zmm0,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm9,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $640,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm9,%zmm9
+ vmovdqu32 %zmm9,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm6,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $704,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm6,%zmm6
+ vmovdqu32 %zmm6,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm11,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $768,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm11,%zmm11
+ vmovdqu32 %zmm11,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm13,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $832,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm13,%zmm13
+ vmovdqu32 %zmm13,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm10,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $896,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm10,%zmm10
+ vmovdqu32 %zmm10,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm15,%zmm16
+ leaq 64(%rsi),%rsi
+
+ cmpq $960,%rdx
+ jb .Less_than_64_16x
+ vpxord (%rsi),%zmm15,%zmm15
+ vmovdqu32 %zmm15,(%rdi,%rsi,1)
+ je .Ldone16x
+ vmovdqa32 %zmm12,%zmm16
+ leaq 64(%rsi),%rsi
+
+.Less_than_64_16x:
+ vmovdqa32 %zmm16,0(%rsp)
+ leaq (%rdi,%rsi,1),%rdi
+ andq $63,%rdx
+
+.Loop_tail16x:
+ movzbl (%rsi,%r9,1),%eax
+ movzbl (%rsp,%r9,1),%ecx
+ leaq 1(%r9),%r9
+ xorl %ecx,%eax
+ movb %al,-1(%rdi,%r9,1)
+ decq %rdx
+ jnz .Loop_tail16x
+
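+	/* Zero the stack scratch area that held keystream bytes. */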
+ vpxord %zmm16,%zmm16,%zmm16
+ vmovdqa32 %zmm16,0(%rsp)
+
+.Ldone16x:
+ vzeroall
+ leaq -8(%r10),%rsp
+
+.L16x_epilogue:
+ ret
+ENDPROC(chacha20_avx512)
+
+.align 32
+ENTRY(chacha20_avx512vl)
+ cmpq $0,%rdx
+ je .Lavx512vl_epilogue
+
+ leaq 8(%rsp),%r10
+
+ cmpq $128,%rdx
+ ja .Lchacha20_8xvl
+
+ subq $64+8,%rsp
+ andq $-64,%rsp
+ vbroadcasti128 .Lsigma(%rip),%ymm0
+ vbroadcasti128 (%rcx),%ymm1
+ vbroadcasti128 16(%rcx),%ymm2
+ vbroadcasti128 (%r8),%ymm3
+
+ vmovdqa32 %ymm0,%ymm16
+ vmovdqa32 %ymm1,%ymm17
+ vmovdqa32 %ymm2,%ymm18
+ vpaddd .Lzeroz(%rip),%ymm3,%ymm3
+ vmovdqa32 .Ltwoy(%rip),%ymm20
+ movq $10,%r8
+ vmovdqa32 %ymm3,%ymm19
+ jmp .Loop_avx512vl
+
+.align 16
+.Loop_outer_avx512vl:
+ vmovdqa32 %ymm18,%ymm2
+ vpaddd %ymm20,%ymm19,%ymm3
+ movq $10,%r8
+ vmovdqa32 %ymm3,%ymm19
+ jmp .Loop_avx512vl
+
+.align 32
+.Loop_avx512vl:
+ vpaddd %ymm1,%ymm0,%ymm0
+ vpxor %ymm0,%ymm3,%ymm3
+ vprold $16,%ymm3,%ymm3
+ vpaddd %ymm3,%ymm2,%ymm2
+ vpxor %ymm2,%ymm1,%ymm1
+ vprold $12,%ymm1,%ymm1
+ vpaddd %ymm1,%ymm0,%ymm0
+ vpxor %ymm0,%ymm3,%ymm3
+ vprold $8,%ymm3,%ymm3
+ vpaddd %ymm3,%ymm2,%ymm2
+ vpxor %ymm2,%ymm1,%ymm1
+ vprold $7,%ymm1,%ymm1
+ vpshufd $78,%ymm2,%ymm2
+ vpshufd $57,%ymm1,%ymm1
+ vpshufd $147,%ymm3,%ymm3
+ vpaddd %ymm1,%ymm0,%ymm0
+ vpxor %ymm0,%ymm3,%ymm3
+ vprold $16,%ymm3,%ymm3
+ vpaddd %ymm3,%ymm2,%ymm2
+ vpxor %ymm2,%ymm1,%ymm1
+ vprold $12,%ymm1,%ymm1
+ vpaddd %ymm1,%ymm0,%ymm0
+ vpxor %ymm0,%ymm3,%ymm3
+ vprold $8,%ymm3,%ymm3
+ vpaddd %ymm3,%ymm2,%ymm2
+ vpxor %ymm2,%ymm1,%ymm1
+ vprold $7,%ymm1,%ymm1
+ vpshufd $78,%ymm2,%ymm2
+ vpshufd $147,%ymm1,%ymm1
+ vpshufd $57,%ymm3,%ymm3
+ decq %r8
+ jnz .Loop_avx512vl
+ vpaddd %ymm16,%ymm0,%ymm0
+ vpaddd %ymm17,%ymm1,%ymm1
+ vpaddd %ymm18,%ymm2,%ymm2
+ vpaddd %ymm19,%ymm3,%ymm3
+
+ subq $64,%rdx
+ jb .Ltail64_avx512vl
+
+ vpxor 0(%rsi),%xmm0,%xmm4
+ vpxor 16(%rsi),%xmm1,%xmm5
+ vpxor 32(%rsi),%xmm2,%xmm6
+ vpxor 48(%rsi),%xmm3,%xmm7
+ leaq 64(%rsi),%rsi
+
+ vmovdqu %xmm4,0(%rdi)
+ vmovdqu %xmm5,16(%rdi)
+ vmovdqu %xmm6,32(%rdi)
+ vmovdqu %xmm7,48(%rdi)
+ leaq 64(%rdi),%rdi
+
+ jz .Ldone_avx512vl
+
+ vextracti128 $1,%ymm0,%xmm4
+ vextracti128 $1,%ymm1,%xmm5
+ vextracti128 $1,%ymm2,%xmm6
+ vextracti128 $1,%ymm3,%xmm7
+
+ subq $64,%rdx
+ jb .Ltail_avx512vl
+
+ vpxor 0(%rsi),%xmm4,%xmm4
+ vpxor 16(%rsi),%xmm5,%xmm5
+ vpxor 32(%rsi),%xmm6,%xmm6
+ vpxor 48(%rsi),%xmm7,%xmm7
+ leaq 64(%rsi),%rsi
+
+ vmovdqu %xmm4,0(%rdi)
+ vmovdqu %xmm5,16(%rdi)
+ vmovdqu %xmm6,32(%rdi)
+ vmovdqu %xmm7,48(%rdi)
+ leaq 64(%rdi),%rdi
+
+ vmovdqa32 %ymm16,%ymm0
+ vmovdqa32 %ymm17,%ymm1
+ jnz .Loop_outer_avx512vl
+
+ jmp .Ldone_avx512vl
+
+.align 16
+.Ltail64_avx512vl:
+ vmovdqa %xmm0,0(%rsp)
+ vmovdqa %xmm1,16(%rsp)
+ vmovdqa %xmm2,32(%rsp)
+ vmovdqa %xmm3,48(%rsp)
+ addq $64,%rdx
+ jmp .Loop_tail_avx512vl
+
+.align 16
+.Ltail_avx512vl:
+ vmovdqa %xmm4,0(%rsp)
+ vmovdqa %xmm5,16(%rsp)
+ vmovdqa %xmm6,32(%rsp)
+ vmovdqa %xmm7,48(%rsp)
+ addq $64,%rdx
+
+.Loop_tail_avx512vl:
+ movzbl (%rsi,%r8,1),%eax
+ movzbl (%rsp,%r8,1),%ecx
+ leaq 1(%r8),%r8
+ xorl %ecx,%eax
+ movb %al,-1(%rdi,%r8,1)
+ decq %rdx
+ jnz .Loop_tail_avx512vl
+
+ vmovdqa32 %ymm16,0(%rsp)
+ vmovdqa32 %ymm16,32(%rsp)
+
+.Ldone_avx512vl:
+ vzeroall
+ leaq -8(%r10),%rsp
+.Lavx512vl_epilogue:
+ ret
+
+.align 32
+.Lchacha20_8xvl:
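+	/* AVX512VL 8-way path: 256-bit ymm registers hold the transposed
+	 * state, producing eight 64-byte blocks (512 bytes) per iteration.
+	 */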
+ leaq 8(%rsp),%r10
+ subq $64+8,%rsp
+ andq $-64,%rsp
+ vzeroupper
+
+ leaq .Lsigma(%rip),%r9
+ vbroadcasti128 (%r9),%ymm3
+ vbroadcasti128 (%rcx),%ymm7
+ vbroadcasti128 16(%rcx),%ymm11
+ vbroadcasti128 (%r8),%ymm15
+
+ vpshufd $0x00,%ymm3,%ymm0
+ vpshufd $0x55,%ymm3,%ymm1
+ vpshufd $0xaa,%ymm3,%ymm2
+ vpshufd $0xff,%ymm3,%ymm3
+ vmovdqa64 %ymm0,%ymm16
+ vmovdqa64 %ymm1,%ymm17
+ vmovdqa64 %ymm2,%ymm18
+ vmovdqa64 %ymm3,%ymm19
+
+ vpshufd $0x00,%ymm7,%ymm4
+ vpshufd $0x55,%ymm7,%ymm5
+ vpshufd $0xaa,%ymm7,%ymm6
+ vpshufd $0xff,%ymm7,%ymm7
+ vmovdqa64 %ymm4,%ymm20
+ vmovdqa64 %ymm5,%ymm21
+ vmovdqa64 %ymm6,%ymm22
+ vmovdqa64 %ymm7,%ymm23
+
+ vpshufd $0x00,%ymm11,%ymm8
+ vpshufd $0x55,%ymm11,%ymm9
+ vpshufd $0xaa,%ymm11,%ymm10
+ vpshufd $0xff,%ymm11,%ymm11
+ vmovdqa64 %ymm8,%ymm24
+ vmovdqa64 %ymm9,%ymm25
+ vmovdqa64 %ymm10,%ymm26
+ vmovdqa64 %ymm11,%ymm27
+
+ vpshufd $0x00,%ymm15,%ymm12
+ vpshufd $0x55,%ymm15,%ymm13
+ vpshufd $0xaa,%ymm15,%ymm14
+ vpshufd $0xff,%ymm15,%ymm15
+ vpaddd .Lincy(%rip),%ymm12,%ymm12
+ vmovdqa64 %ymm12,%ymm28
+ vmovdqa64 %ymm13,%ymm29
+ vmovdqa64 %ymm14,%ymm30
+ vmovdqa64 %ymm15,%ymm31
+
+ movl $10,%eax
+ jmp .Loop8xvl
+
+.align 32
+.Loop_outer8xvl:
+
+
+ vpbroadcastd 8(%r9),%ymm2
+ vpbroadcastd 12(%r9),%ymm3
+ vpaddd .Leight(%rip),%ymm28,%ymm28
+ vmovdqa64 %ymm20,%ymm4
+ vmovdqa64 %ymm21,%ymm5
+ vmovdqa64 %ymm22,%ymm6
+ vmovdqa64 %ymm23,%ymm7
+ vmovdqa64 %ymm24,%ymm8
+ vmovdqa64 %ymm25,%ymm9
+ vmovdqa64 %ymm26,%ymm10
+ vmovdqa64 %ymm27,%ymm11
+ vmovdqa64 %ymm28,%ymm12
+ vmovdqa64 %ymm29,%ymm13
+ vmovdqa64 %ymm30,%ymm14
+ vmovdqa64 %ymm31,%ymm15
+
+ vmovdqa64 %ymm0,%ymm16
+ vmovdqa64 %ymm1,%ymm17
+ vmovdqa64 %ymm2,%ymm18
+ vmovdqa64 %ymm3,%ymm19
+
+ movl $10,%eax
+ jmp .Loop8xvl
+
+.align 32
+.Loop8xvl:
+ vpaddd %ymm4,%ymm0,%ymm0
+ vpaddd %ymm5,%ymm1,%ymm1
+ vpaddd %ymm6,%ymm2,%ymm2
+ vpaddd %ymm7,%ymm3,%ymm3
+ vpxor %ymm0,%ymm12,%ymm12
+ vpxor %ymm1,%ymm13,%ymm13
+ vpxor %ymm2,%ymm14,%ymm14
+ vpxor %ymm3,%ymm15,%ymm15
+ vprold $16,%ymm12,%ymm12
+ vprold $16,%ymm13,%ymm13
+ vprold $16,%ymm14,%ymm14
+ vprold $16,%ymm15,%ymm15
+ vpaddd %ymm12,%ymm8,%ymm8
+ vpaddd %ymm13,%ymm9,%ymm9
+ vpaddd %ymm14,%ymm10,%ymm10
+ vpaddd %ymm15,%ymm11,%ymm11
+ vpxor %ymm8,%ymm4,%ymm4
+ vpxor %ymm9,%ymm5,%ymm5
+ vpxor %ymm10,%ymm6,%ymm6
+ vpxor %ymm11,%ymm7,%ymm7
+ vprold $12,%ymm4,%ymm4
+ vprold $12,%ymm5,%ymm5
+ vprold $12,%ymm6,%ymm6
+ vprold $12,%ymm7,%ymm7
+ vpaddd %ymm4,%ymm0,%ymm0
+ vpaddd %ymm5,%ymm1,%ymm1
+ vpaddd %ymm6,%ymm2,%ymm2
+ vpaddd %ymm7,%ymm3,%ymm3
+ vpxor %ymm0,%ymm12,%ymm12
+ vpxor %ymm1,%ymm13,%ymm13
+ vpxor %ymm2,%ymm14,%ymm14
+ vpxor %ymm3,%ymm15,%ymm15
+ vprold $8,%ymm12,%ymm12
+ vprold $8,%ymm13,%ymm13
+ vprold $8,%ymm14,%ymm14
+ vprold $8,%ymm15,%ymm15
+ vpaddd %ymm12,%ymm8,%ymm8
+ vpaddd %ymm13,%ymm9,%ymm9
+ vpaddd %ymm14,%ymm10,%ymm10
+ vpaddd %ymm15,%ymm11,%ymm11
+ vpxor %ymm8,%ymm4,%ymm4
+ vpxor %ymm9,%ymm5,%ymm5
+ vpxor %ymm10,%ymm6,%ymm6
+ vpxor %ymm11,%ymm7,%ymm7
+ vprold $7,%ymm4,%ymm4
+ vprold $7,%ymm5,%ymm5
+ vprold $7,%ymm6,%ymm6
+ vprold $7,%ymm7,%ymm7
+ vpaddd %ymm5,%ymm0,%ymm0
+ vpaddd %ymm6,%ymm1,%ymm1
+ vpaddd %ymm7,%ymm2,%ymm2
+ vpaddd %ymm4,%ymm3,%ymm3
+ vpxor %ymm0,%ymm15,%ymm15
+ vpxor %ymm1,%ymm12,%ymm12
+ vpxor %ymm2,%ymm13,%ymm13
+ vpxor %ymm3,%ymm14,%ymm14
+ vprold $16,%ymm15,%ymm15
+ vprold $16,%ymm12,%ymm12
+ vprold $16,%ymm13,%ymm13
+ vprold $16,%ymm14,%ymm14
+ vpaddd %ymm15,%ymm10,%ymm10
+ vpaddd %ymm12,%ymm11,%ymm11
+ vpaddd %ymm13,%ymm8,%ymm8
+ vpaddd %ymm14,%ymm9,%ymm9
+ vpxor %ymm10,%ymm5,%ymm5
+ vpxor %ymm11,%ymm6,%ymm6
+ vpxor %ymm8,%ymm7,%ymm7
+ vpxor %ymm9,%ymm4,%ymm4
+ vprold $12,%ymm5,%ymm5
+ vprold $12,%ymm6,%ymm6
+ vprold $12,%ymm7,%ymm7
+ vprold $12,%ymm4,%ymm4
+ vpaddd %ymm5,%ymm0,%ymm0
+ vpaddd %ymm6,%ymm1,%ymm1
+ vpaddd %ymm7,%ymm2,%ymm2
+ vpaddd %ymm4,%ymm3,%ymm3
+ vpxor %ymm0,%ymm15,%ymm15
+ vpxor %ymm1,%ymm12,%ymm12
+ vpxor %ymm2,%ymm13,%ymm13
+ vpxor %ymm3,%ymm14,%ymm14
+ vprold $8,%ymm15,%ymm15
+ vprold $8,%ymm12,%ymm12
+ vprold $8,%ymm13,%ymm13
+ vprold $8,%ymm14,%ymm14
+ vpaddd %ymm15,%ymm10,%ymm10
+ vpaddd %ymm12,%ymm11,%ymm11
+ vpaddd %ymm13,%ymm8,%ymm8
+ vpaddd %ymm14,%ymm9,%ymm9
+ vpxor %ymm10,%ymm5,%ymm5
+ vpxor %ymm11,%ymm6,%ymm6
+ vpxor %ymm8,%ymm7,%ymm7
+ vpxor %ymm9,%ymm4,%ymm4
+ vprold $7,%ymm5,%ymm5
+ vprold $7,%ymm6,%ymm6
+ vprold $7,%ymm7,%ymm7
+ vprold $7,%ymm4,%ymm4
+ decl %eax
+ jnz .Loop8xvl
+
+ vpaddd %ymm16,%ymm0,%ymm0
+ vpaddd %ymm17,%ymm1,%ymm1
+ vpaddd %ymm18,%ymm2,%ymm2
+ vpaddd %ymm19,%ymm3,%ymm3
+
+ vpunpckldq %ymm1,%ymm0,%ymm18
+ vpunpckldq %ymm3,%ymm2,%ymm19
+ vpunpckhdq %ymm1,%ymm0,%ymm0
+ vpunpckhdq %ymm3,%ymm2,%ymm2
+ vpunpcklqdq %ymm19,%ymm18,%ymm1
+ vpunpckhqdq %ymm19,%ymm18,%ymm18
+ vpunpcklqdq %ymm2,%ymm0,%ymm3
+ vpunpckhqdq %ymm2,%ymm0,%ymm0
+ vpaddd %ymm20,%ymm4,%ymm4
+ vpaddd %ymm21,%ymm5,%ymm5
+ vpaddd %ymm22,%ymm6,%ymm6
+ vpaddd %ymm23,%ymm7,%ymm7
+
+ vpunpckldq %ymm5,%ymm4,%ymm2
+ vpunpckldq %ymm7,%ymm6,%ymm19
+ vpunpckhdq %ymm5,%ymm4,%ymm4
+ vpunpckhdq %ymm7,%ymm6,%ymm6
+ vpunpcklqdq %ymm19,%ymm2,%ymm5
+ vpunpckhqdq %ymm19,%ymm2,%ymm2
+ vpunpcklqdq %ymm6,%ymm4,%ymm7
+ vpunpckhqdq %ymm6,%ymm4,%ymm4
+ vshufi32x4 $0,%ymm5,%ymm1,%ymm19
+ vshufi32x4 $3,%ymm5,%ymm1,%ymm5
+ vshufi32x4 $0,%ymm2,%ymm18,%ymm1
+ vshufi32x4 $3,%ymm2,%ymm18,%ymm2
+ vshufi32x4 $0,%ymm7,%ymm3,%ymm18
+ vshufi32x4 $3,%ymm7,%ymm3,%ymm7
+ vshufi32x4 $0,%ymm4,%ymm0,%ymm3
+ vshufi32x4 $3,%ymm4,%ymm0,%ymm4
+ vpaddd %ymm24,%ymm8,%ymm8
+ vpaddd %ymm25,%ymm9,%ymm9
+ vpaddd %ymm26,%ymm10,%ymm10
+ vpaddd %ymm27,%ymm11,%ymm11
+
+ vpunpckldq %ymm9,%ymm8,%ymm6
+ vpunpckldq %ymm11,%ymm10,%ymm0
+ vpunpckhdq %ymm9,%ymm8,%ymm8
+ vpunpckhdq %ymm11,%ymm10,%ymm10
+ vpunpcklqdq %ymm0,%ymm6,%ymm9
+ vpunpckhqdq %ymm0,%ymm6,%ymm6
+ vpunpcklqdq %ymm10,%ymm8,%ymm11
+ vpunpckhqdq %ymm10,%ymm8,%ymm8
+ vpaddd %ymm28,%ymm12,%ymm12
+ vpaddd %ymm29,%ymm13,%ymm13
+ vpaddd %ymm30,%ymm14,%ymm14
+ vpaddd %ymm31,%ymm15,%ymm15
+
+ vpunpckldq %ymm13,%ymm12,%ymm10
+ vpunpckldq %ymm15,%ymm14,%ymm0
+ vpunpckhdq %ymm13,%ymm12,%ymm12
+ vpunpckhdq %ymm15,%ymm14,%ymm14
+ vpunpcklqdq %ymm0,%ymm10,%ymm13
+ vpunpckhqdq %ymm0,%ymm10,%ymm10
+ vpunpcklqdq %ymm14,%ymm12,%ymm15
+ vpunpckhqdq %ymm14,%ymm12,%ymm12
+ vperm2i128 $0x20,%ymm13,%ymm9,%ymm0
+ vperm2i128 $0x31,%ymm13,%ymm9,%ymm13
+ vperm2i128 $0x20,%ymm10,%ymm6,%ymm9
+ vperm2i128 $0x31,%ymm10,%ymm6,%ymm10
+ vperm2i128 $0x20,%ymm15,%ymm11,%ymm6
+ vperm2i128 $0x31,%ymm15,%ymm11,%ymm15
+ vperm2i128 $0x20,%ymm12,%ymm8,%ymm11
+ vperm2i128 $0x31,%ymm12,%ymm8,%ymm12
+ cmpq $512,%rdx
+ jb .Ltail8xvl
+
+ movl $0x80,%eax
+ vpxord 0(%rsi),%ymm19,%ymm19
+ vpxor 32(%rsi),%ymm0,%ymm0
+ vpxor 64(%rsi),%ymm5,%ymm5
+ vpxor 96(%rsi),%ymm13,%ymm13
+ leaq (%rsi,%rax,1),%rsi
+ vmovdqu32 %ymm19,0(%rdi)
+ vmovdqu %ymm0,32(%rdi)
+ vmovdqu %ymm5,64(%rdi)
+ vmovdqu %ymm13,96(%rdi)
+ leaq (%rdi,%rax,1),%rdi
+
+ vpxor 0(%rsi),%ymm1,%ymm1
+ vpxor 32(%rsi),%ymm9,%ymm9
+ vpxor 64(%rsi),%ymm2,%ymm2
+ vpxor 96(%rsi),%ymm10,%ymm10
+ leaq (%rsi,%rax,1),%rsi
+ vmovdqu %ymm1,0(%rdi)
+ vmovdqu %ymm9,32(%rdi)
+ vmovdqu %ymm2,64(%rdi)
+ vmovdqu %ymm10,96(%rdi)
+ leaq (%rdi,%rax,1),%rdi
+
+ vpxord 0(%rsi),%ymm18,%ymm18
+ vpxor 32(%rsi),%ymm6,%ymm6
+ vpxor 64(%rsi),%ymm7,%ymm7
+ vpxor 96(%rsi),%ymm15,%ymm15
+ leaq (%rsi,%rax,1),%rsi
+ vmovdqu32 %ymm18,0(%rdi)
+ vmovdqu %ymm6,32(%rdi)
+ vmovdqu %ymm7,64(%rdi)
+ vmovdqu %ymm15,96(%rdi)
+ leaq (%rdi,%rax,1),%rdi
+
+ vpxor 0(%rsi),%ymm3,%ymm3
+ vpxor 32(%rsi),%ymm11,%ymm11
+ vpxor 64(%rsi),%ymm4,%ymm4
+ vpxor 96(%rsi),%ymm12,%ymm12
+ leaq (%rsi,%rax,1),%rsi
+ vmovdqu %ymm3,0(%rdi)
+ vmovdqu %ymm11,32(%rdi)
+ vmovdqu %ymm4,64(%rdi)
+ vmovdqu %ymm12,96(%rdi)
+ leaq (%rdi,%rax,1),%rdi
+
+ vpbroadcastd 0(%r9),%ymm0
+ vpbroadcastd 4(%r9),%ymm1
+
+ subq $512,%rdx
+ jnz .Loop_outer8xvl
+
+ jmp .Ldone8xvl
+
+.align 32
+.Ltail8xvl:
+ vmovdqa64 %ymm19,%ymm8
+ xorq %r9,%r9
+ subq %rsi,%rdi
+ cmpq $64,%rdx
+ jb .Less_than_64_8xvl
+ vpxor 0(%rsi),%ymm8,%ymm8
+ vpxor 32(%rsi),%ymm0,%ymm0
+ vmovdqu %ymm8,0(%rdi,%rsi,1)
+ vmovdqu %ymm0,32(%rdi,%rsi,1)
+ je .Ldone8xvl
+ vmovdqa %ymm5,%ymm8
+ vmovdqa %ymm13,%ymm0
+ leaq 64(%rsi),%rsi
+
+ cmpq $128,%rdx
+ jb .Less_than_64_8xvl
+ vpxor 0(%rsi),%ymm5,%ymm5
+ vpxor 32(%rsi),%ymm13,%ymm13
+ vmovdqu %ymm5,0(%rdi,%rsi,1)
+ vmovdqu %ymm13,32(%rdi,%rsi,1)
+ je .Ldone8xvl
+ vmovdqa %ymm1,%ymm8
+ vmovdqa %ymm9,%ymm0
+ leaq 64(%rsi),%rsi
+
+ cmpq $192,%rdx
+ jb .Less_than_64_8xvl
+ vpxor 0(%rsi),%ymm1,%ymm1
+ vpxor 32(%rsi),%ymm9,%ymm9
+ vmovdqu %ymm1,0(%rdi,%rsi,1)
+ vmovdqu %ymm9,32(%rdi,%rsi,1)
+ je .Ldone8xvl
+ vmovdqa %ymm2,%ymm8
+ vmovdqa %ymm10,%ymm0
+ leaq 64(%rsi),%rsi
+
+ cmpq $256,%rdx
+ jb .Less_than_64_8xvl
+ vpxor 0(%rsi),%ymm2,%ymm2
+ vpxor 32(%rsi),%ymm10,%ymm10
+ vmovdqu %ymm2,0(%rdi,%rsi,1)
+ vmovdqu %ymm10,32(%rdi,%rsi,1)
+ je .Ldone8xvl
+ vmovdqa32 %ymm18,%ymm8
+ vmovdqa %ymm6,%ymm0
+ leaq 64(%rsi),%rsi
+
+ cmpq $320,%rdx
+ jb .Less_than_64_8xvl
+ vpxord 0(%rsi),%ymm18,%ymm18
+ vpxor 32(%rsi),%ymm6,%ymm6
+ vmovdqu32 %ymm18,0(%rdi,%rsi,1)
+ vmovdqu %ymm6,32(%rdi,%rsi,1)
+ je .Ldone8xvl
+ vmovdqa %ymm7,%ymm8
+ vmovdqa %ymm15,%ymm0
+ leaq 64(%rsi),%rsi
+
+ cmpq $384,%rdx
+ jb .Less_than_64_8xvl
+ vpxor 0(%rsi),%ymm7,%ymm7
+ vpxor 32(%rsi),%ymm15,%ymm15
+ vmovdqu %ymm7,0(%rdi,%rsi,1)
+ vmovdqu %ymm15,32(%rdi,%rsi,1)
+ je .Ldone8xvl
+ vmovdqa %ymm3,%ymm8
+ vmovdqa %ymm11,%ymm0
+ leaq 64(%rsi),%rsi
+
+ cmpq $448,%rdx
+ jb .Less_than_64_8xvl
+ vpxor 0(%rsi),%ymm3,%ymm3
+ vpxor 32(%rsi),%ymm11,%ymm11
+ vmovdqu %ymm3,0(%rdi,%rsi,1)
+ vmovdqu %ymm11,32(%rdi,%rsi,1)
+ je .Ldone8xvl
+ vmovdqa %ymm4,%ymm8
+ vmovdqa %ymm12,%ymm0
+ leaq 64(%rsi),%rsi
+
+.Less_than_64_8xvl:
+ vmovdqa %ymm8,0(%rsp)
+ vmovdqa %ymm0,32(%rsp)
+ leaq (%rdi,%rsi,1),%rdi
+ andq $63,%rdx
+
+.Loop_tail8xvl:
+ movzbl (%rsi,%r9,1),%eax
+ movzbl (%rsp,%r9,1),%ecx
+ leaq 1(%r9),%r9
+ xorl %ecx,%eax
+ movb %al,-1(%rdi,%r9,1)
+ decq %rdx
+ jnz .Loop_tail8xvl
+
+ vpxor %ymm8,%ymm8,%ymm8
+ vmovdqa %ymm8,0(%rsp)
+ vmovdqa %ymm8,32(%rsp)
+
+.Ldone8xvl:
+ vzeroall
+ leaq -8(%r10),%rsp
+.L8xvl_epilogue:
+ ret
+ENDPROC(chacha20_avx512vl)
+
+#endif /* CONFIG_AS_AVX512 */
diff --git a/lib/zinc/chacha20/chacha20.c b/lib/zinc/chacha20/chacha20.c
new file mode 100644
index 000000000000..bad09c39eed0
--- /dev/null
+++ b/lib/zinc/chacha20/chacha20.c
@@ -0,0 +1,242 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#include <zinc/chacha20.h>
+
+#include <linux/kernel.h>
+#include <crypto/algapi.h>
+
+#if defined(CONFIG_X86_64)
+#include <asm/fpu/api.h>
+#include <asm/cpufeature.h>
+#include <asm/processor.h>
+#include <asm/intel-family.h>
+#ifdef CONFIG_AS_SSSE3
+asmlinkage void hchacha20_ssse3(u8 *derived_key, const u8 *nonce, const u8 *key);
+asmlinkage void chacha20_ssse3(u8 *out, const u8 *in, const size_t len,
+			       const u32 key[8], const u32 counter[4]);
+#endif
+#ifdef CONFIG_AS_AVX2
+asmlinkage void chacha20_avx2(u8 *out, const u8 *in, const size_t len,
+			      const u32 key[8], const u32 counter[4]);
+#endif
+#ifdef CONFIG_AS_AVX512
+asmlinkage void chacha20_avx512(u8 *out, const u8 *in, const size_t len,
+				const u32 key[8], const u32 counter[4]);
+asmlinkage void chacha20_avx512vl(u8 *out, const u8 *in, const size_t len,
+				  const u32 key[8], const u32 counter[4]);
+#endif
+
+static bool chacha20_use_ssse3 __ro_after_init;
+static bool chacha20_use_avx2 __ro_after_init;
+static bool chacha20_use_avx512 __ro_after_init;
+static bool chacha20_use_avx512vl __ro_after_init;
+
+void __init chacha20_fpu_init(void)
+{
+#ifndef CONFIG_UML
+ chacha20_use_ssse3 = boot_cpu_has(X86_FEATURE_SSSE3);
+ chacha20_use_avx2 = boot_cpu_has(X86_FEATURE_AVX) && boot_cpu_has(X86_FEATURE_AVX2) &&
+ cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM, NULL);
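+	/* Skylake-X downclocks significantly when executing 512-bit-wide
+	 * instructions, so it is excluded here in favor of the 256-bit
+	 * AVX512VL code path below.
+	 */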
+	chacha20_use_avx512 = boot_cpu_has(X86_FEATURE_AVX) &&
+			      boot_cpu_has(X86_FEATURE_AVX2) &&
+			      boot_cpu_has(X86_FEATURE_AVX512F) &&
+			      cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM | XFEATURE_MASK_AVX512, NULL) &&
+			      boot_cpu_data.x86_model != INTEL_FAM6_SKYLAKE_X;
+	chacha20_use_avx512vl = boot_cpu_has(X86_FEATURE_AVX) &&
+				boot_cpu_has(X86_FEATURE_AVX2) &&
+				boot_cpu_has(X86_FEATURE_AVX512F) &&
+				boot_cpu_has(X86_FEATURE_AVX512VL) &&
+				cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM | XFEATURE_MASK_AVX512, NULL);
+#endif
+}
+#elif defined(CONFIG_ARM) || defined(CONFIG_ARM64)
+asmlinkage void chacha20_arm(u8 *out, const u8 *in, const size_t len,
+			     const u32 key[8], const u32 counter[4]);
+#if IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && (!defined(__LINUX_ARM_ARCH__) || __LINUX_ARM_ARCH__ >= 7)
+#define ARM_USE_NEON
+#include <asm/hwcap.h>
+#include <asm/neon.h>
+asmlinkage void chacha20_neon(u8 *out, const u8 *in, const size_t len,
+			      const u32 key[8], const u32 counter[4]);
+#endif
+static bool chacha20_use_neon __ro_after_init;
+void __init chacha20_fpu_init(void)
+{
+#if defined(CONFIG_ARM64)
+ chacha20_use_neon = elf_hwcap & HWCAP_ASIMD;
+#elif defined(CONFIG_ARM)
+ chacha20_use_neon = elf_hwcap & HWCAP_NEON;
+#endif
+}
+#elif defined(CONFIG_MIPS) && defined(CONFIG_CPU_MIPS32_R2)
+asmlinkage void chacha20_mips(u8 *out, const u8 *in, const size_t len,
+			      const u32 key[8], const u32 counter[4]);
+void __init chacha20_fpu_init(void) { }
+#else
+void __init chacha20_fpu_init(void) { }
+#endif
+
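+/* "expa" "nd 3" "2-by" "te k" -- the ChaCha constant words, little-endian. */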
+#define EXPAND_32_BYTE_K 0x61707865U, 0x3320646eU, 0x79622d32U, 0x6b206574U
+
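+/* The RFC 7539 quarter round: four add-rotate-xor steps, rotating by 16,
+ * 12, 8, and 7 bits. DOUBLE_ROUND below applies it to the columns and then
+ * the diagonals of the 4x4 state; ten double rounds give ChaCha20 its
+ * twenty rounds.
+ */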
+#define QUARTER_ROUND(x, a, b, c, d) ( \
+ x[a] += x[b], \
+ x[d] = rol32((x[d] ^ x[a]), 16), \
+ x[c] += x[d], \
+ x[b] = rol32((x[b] ^ x[c]), 12), \
+ x[a] += x[b], \
+ x[d] = rol32((x[d] ^ x[a]), 8), \
+ x[c] += x[d], \
+ x[b] = rol32((x[b] ^ x[c]), 7) \
+)
+
+#define C(i, j) ((i) * 4 + (j))
+
+#define DOUBLE_ROUND(x) ( \
+ /* Column Round */ \
+ QUARTER_ROUND(x, C(0, 0), C(1, 0), C(2, 0), C(3, 0)), \
+ QUARTER_ROUND(x, C(0, 1), C(1, 1), C(2, 1), C(3, 1)), \
+ QUARTER_ROUND(x, C(0, 2), C(1, 2), C(2, 2), C(3, 2)), \
+ QUARTER_ROUND(x, C(0, 3), C(1, 3), C(2, 3), C(3, 3)), \
+ /* Diagonal Round */ \
+ QUARTER_ROUND(x, C(0, 0), C(1, 1), C(2, 2), C(3, 3)), \
+ QUARTER_ROUND(x, C(0, 1), C(1, 2), C(2, 3), C(3, 0)), \
+ QUARTER_ROUND(x, C(0, 2), C(1, 3), C(2, 0), C(3, 1)), \
+ QUARTER_ROUND(x, C(0, 3), C(1, 0), C(2, 1), C(3, 2)) \
+)
+
+#define TWENTY_ROUNDS(x) ( \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x), \
+ DOUBLE_ROUND(x) \
+)
+
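+/* Generate one 64-byte keystream block from the state and advance the
+ * 32-bit block counter in state[12].
+ */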
+static void chacha20_block_generic(__le32 *stream, u32 *state)
+{
+ u32 x[CHACHA20_BLOCK_SIZE / sizeof(u32)];
+ int i;
+
+ for (i = 0; i < ARRAY_SIZE(x); ++i)
+ x[i] = state[i];
+
+ TWENTY_ROUNDS(x);
+
+ for (i = 0; i < ARRAY_SIZE(x); ++i)
+ stream[i] = cpu_to_le32(x[i] + state[i]);
+
+ ++state[12];
+}
+
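+/* Portable fallback: copy in to out (the buffers may alias exactly), then
+ * XOR the keystream over it block by block. A minimal usage sketch
+ * (hypothetical caller; the RFC 7539 layout puts the block counter in
+ * counter[0] and the 96-bit nonce in counter[1..3]):
+ *
+ *	u32 counter[4] = { 0, n0, n1, n2 };
+ *	chacha20_generic(buf, buf, len, key_words, counter);
+ */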
+static void chacha20_generic(u8 *out, const u8 *in, u32 len,
+			     const u32 key[8], const u32 counter[4])
+{
+ __le32 buf[CHACHA20_BLOCK_SIZE / sizeof(__le32)];
+ u32 x[] = {
+ EXPAND_32_BYTE_K,
+ key[0], key[1], key[2], key[3],
+ key[4], key[5], key[6], key[7],
+ counter[0], counter[1], counter[2], counter[3]
+ };
+
+ if (out != in)
+ memcpy(out, in, len);
+
+ while (len >= CHACHA20_BLOCK_SIZE) {
+ chacha20_block_generic(buf, x);
+ crypto_xor(out, (u8 *)buf, CHACHA20_BLOCK_SIZE);
+ len -= CHACHA20_BLOCK_SIZE;
+ out += CHACHA20_BLOCK_SIZE;
+ }
+ if (len) {
+ chacha20_block_generic(buf, x);
+ crypto_xor(out, (u8 *)buf, len);
+ }
+}
+
+void chacha20(struct chacha20_ctx *state, u8 *dst, const u8 *src, u32 len, bool have_simd)
+{
+ if (!have_simd
+#if defined(CONFIG_X86_64)
+ || !chacha20_use_ssse3
+#elif defined(ARM_USE_NEON)
+ || !chacha20_use_neon
+#endif
+ )
+ goto no_simd;
+
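+	/* Dispatch from the widest vector unit downward; any implementation
+	 * that is not compiled in or was not detected at boot is skipped, and
+	 * the last resort is the generic C code below.
+	 */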
+#if defined(CONFIG_X86_64)
+#ifdef CONFIG_AS_AVX512
+ if (chacha20_use_avx512) {
+ chacha20_avx512(dst, src, len, state->key, state->counter);
+ goto out;
+ }
+ if (chacha20_use_avx512vl) {
+ chacha20_avx512vl(dst, src, len, state->key, state->counter);
+ goto out;
+ }
+#endif
+#ifdef CONFIG_AS_AVX2
+ if (chacha20_use_avx2) {
+ chacha20_avx2(dst, src, len, state->key, state->counter);
+ goto out;
+ }
+#endif
+#ifdef CONFIG_AS_SSSE3
+ chacha20_ssse3(dst, src, len, state->key, state->counter);
+ goto out;
+#endif
+#elif defined(ARM_USE_NEON)
+ chacha20_neon(dst, src, len, state->key, state->counter);
+ goto out;
+#endif
+
+no_simd:
+#if defined(CONFIG_ARM) || defined(CONFIG_ARM64)
+ chacha20_arm(dst, src, len, state->key, state->counter);
+ goto out;
+#elif defined(CONFIG_MIPS) && defined(CONFIG_CPU_MIPS32_R2)
+ chacha20_mips(dst, src, len, state->key, state->counter);
+ goto out;
+#endif
+
+ chacha20_generic(dst, src, len, state->key, state->counter);
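+	/* This goto is a fallthrough no-op, but it keeps the "out" label
+	 * referenced on configurations with no accelerated path compiled in,
+	 * avoiding an unused-label warning.
+	 */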
+ goto out;
+
+out:
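+	/* Advance the block counter by DIV_ROUND_UP(len, 64), i.e. by every
+	 * whole or partial 64-byte block consumed above.
+	 */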
+ state->counter[0] += (len + 63) / 64;
+}
+EXPORT_SYMBOL(chacha20);
+
+static void hchacha20_generic(u8 derived_key[CHACHA20_KEY_SIZE],
+			      const u8 nonce[HCHACHA20_NONCE_SIZE],
+			      const u8 key[HCHACHA20_KEY_SIZE])
+{
+ __le32 *out = (__force __le32 *)derived_key;
+ u32 x[] = {
+ EXPAND_32_BYTE_K,
+		le32_to_cpup((__le32 *)(key + 0)), le32_to_cpup((__le32 *)(key + 4)),
+		le32_to_cpup((__le32 *)(key + 8)), le32_to_cpup((__le32 *)(key + 12)),
+		le32_to_cpup((__le32 *)(key + 16)), le32_to_cpup((__le32 *)(key + 20)),
+		le32_to_cpup((__le32 *)(key + 24)), le32_to_cpup((__le32 *)(key + 28)),
+		le32_to_cpup((__le32 *)(nonce + 0)), le32_to_cpup((__le32 *)(nonce + 4)),
+		le32_to_cpup((__le32 *)(nonce + 8)), le32_to_cpup((__le32 *)(nonce + 12))
+ };
+
+ TWENTY_ROUNDS(x);
+
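+	/* HChaCha20 skips the final addition of the input state and outputs
+	 * only the first and last rows (words 0-3 and 12-15), per the XChaCha
+	 * construction.
+	 */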
+ out[0] = cpu_to_le32(x[0]);
+ out[1] = cpu_to_le32(x[1]);
+ out[2] = cpu_to_le32(x[2]);
+ out[3] = cpu_to_le32(x[3]);
+ out[4] = cpu_to_le32(x[12]);
+ out[5] = cpu_to_le32(x[13]);
+ out[6] = cpu_to_le32(x[14]);
+ out[7] = cpu_to_le32(x[15]);
+}
+
+void hchacha20(u8 derived_key[CHACHA20_KEY_SIZE],
+	       const u8 nonce[HCHACHA20_NONCE_SIZE],
+	       const u8 key[HCHACHA20_KEY_SIZE], bool have_simd)
+{
+#if defined(CONFIG_X86_64) && defined(CONFIG_AS_SSSE3)
+ if (have_simd && chacha20_use_ssse3) {
+ hchacha20_ssse3(derived_key, nonce, key);
+ return;
+ }
+#endif
+ hchacha20_generic(derived_key, nonce, key);
+}
+/* Deliberately not EXPORT_SYMBOL'd, since there are few reasons why somebody
+ * should be using this directly, rather than via xchacha20. Revisit only in
+ * the unlikely event that somebody has a good reason to export this.
+ */
diff --git a/lib/zinc/chacha20poly1305.c b/lib/zinc/chacha20poly1305.c
new file mode 100644
index 000000000000..6ac744d50b49
--- /dev/null
+++ b/lib/zinc/chacha20poly1305.c
@@ -0,0 +1,286 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#include <zinc/chacha20poly1305.h>
+#include <zinc/chacha20.h>
+#include <zinc/poly1305.h>
+#include <zinc/simd.h>
+
+#include <linux/kernel.h>
+#include <crypto/scatterwalk.h>
+
+static const u8 pad0[16] = { 0 };
+
+static struct crypto_alg chacha20_alg = {
+ .cra_blocksize = 1,
+ .cra_alignmask = sizeof(u32) - 1
+};
+static struct crypto_blkcipher chacha20_cipher = {
+ .base = {
+ .__crt_alg = &chacha20_alg
+ }
+};
+static struct blkcipher_desc chacha20_desc = {
+ .tfm = &chacha20_cipher
+};
+
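+/* RFC 7539 AEAD layout: the Poly1305 one-time key is the first 32 bytes of
+ * keystream at block counter 0, the plaintext is encrypted starting at block
+ * counter 1, and the tag is computed over
+ * ad || pad16(ad) || ciphertext || pad16(ciphertext) ||
+ * le64(ad_len) || le64(ciphertext_len).
+ */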
+static inline void __chacha20poly1305_encrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN],
+ bool have_simd)
+{
+ struct poly1305_ctx poly1305_state;
+ struct chacha20_ctx chacha20_state;
+ union {
+ u8 block0[POLY1305_KEY_SIZE];
+ __le64 lens[2];
+ } b = {{ 0 }};
+
+ chacha20_init(&chacha20_state, key, nonce);
+ chacha20(&chacha20_state, b.block0, b.block0, sizeof(b.block0), have_simd);
+ poly1305_init(&poly1305_state, b.block0, have_simd);
+
+ poly1305_update(&poly1305_state, ad, ad_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - ad_len) & 0xf, have_simd);
+
+ chacha20(&chacha20_state, dst, src, src_len, have_simd);
+
+ poly1305_update(&poly1305_state, dst, src_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - src_len) & 0xf, have_simd);
+
+ b.lens[0] = cpu_to_le64(ad_len);
+ b.lens[1] = cpu_to_le64(src_len);
+ poly1305_update(&poly1305_state, (u8 *)b.lens, sizeof(b.lens), have_simd);
+
+ poly1305_finish(&poly1305_state, dst + src_len, have_simd);
+
+ memzero_explicit(&chacha20_state, sizeof(chacha20_state));
+ memzero_explicit(&b, sizeof(b));
+}
+
+void chacha20poly1305_encrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN])
+{
+ bool have_simd;
+
+ have_simd = simd_get();
+ __chacha20poly1305_encrypt(dst, src, src_len, ad, ad_len, nonce, key, have_simd);
+ simd_put(have_simd);
+}
+EXPORT_SYMBOL(chacha20poly1305_encrypt);
+
+bool chacha20poly1305_encrypt_sg(struct scatterlist *dst, struct scatterlist *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN],
+ bool have_simd)
+{
+ struct poly1305_ctx poly1305_state;
+ struct chacha20_ctx chacha20_state;
+ int ret = 0;
+ struct blkcipher_walk walk;
+ union {
+ u8 block0[POLY1305_KEY_SIZE];
+ u8 mac[POLY1305_MAC_SIZE];
+ __le64 lens[2];
+ } b = {{ 0 }};
+
+ chacha20_init(&chacha20_state, key, nonce);
+ chacha20(&chacha20_state, b.block0, b.block0, sizeof(b.block0), have_simd);
+ poly1305_init(&poly1305_state, b.block0, have_simd);
+
+ poly1305_update(&poly1305_state, ad, ad_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - ad_len) & 0xf, have_simd);
+
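+	/* Walk the scatterlist in multiples of the 64-byte ChaCha20 block so
+	 * that the stream position stays block-aligned across scatterlist
+	 * element boundaries; only the final chunk may be a partial block.
+	 */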
+ if (likely(src_len)) {
+ blkcipher_walk_init(&walk, dst, src, src_len);
+ ret = blkcipher_walk_virt_block(&chacha20_desc, &walk, CHACHA20_BLOCK_SIZE);
+ while (walk.nbytes >= CHACHA20_BLOCK_SIZE) {
+ size_t chunk_len = rounddown(walk.nbytes, CHACHA20_BLOCK_SIZE);
+
+ chacha20(&chacha20_state, walk.dst.virt.addr, walk.src.virt.addr, chunk_len, have_simd);
+ poly1305_update(&poly1305_state, walk.dst.virt.addr, chunk_len, have_simd);
+ ret = blkcipher_walk_done(&chacha20_desc, &walk, walk.nbytes % CHACHA20_BLOCK_SIZE);
+ }
+ if (walk.nbytes) {
+ chacha20(&chacha20_state, walk.dst.virt.addr, walk.src.virt.addr, walk.nbytes, have_simd);
+ poly1305_update(&poly1305_state, walk.dst.virt.addr, walk.nbytes, have_simd);
+ ret = blkcipher_walk_done(&chacha20_desc, &walk, 0);
+ }
+ }
+ if (unlikely(ret))
+ goto err;
+
+ poly1305_update(&poly1305_state, pad0, (0x10 - src_len) & 0xf, have_simd);
+
+ b.lens[0] = cpu_to_le64(ad_len);
+ b.lens[1] = cpu_to_le64(src_len);
+ poly1305_update(&poly1305_state, (u8 *)b.lens, sizeof(b.lens), have_simd);
+
+ poly1305_finish(&poly1305_state, b.mac, have_simd);
+ scatterwalk_map_and_copy(b.mac, dst, src_len, sizeof(b.mac), 1);
+err:
+ memzero_explicit(&chacha20_state, sizeof(chacha20_state));
+ memzero_explicit(&b, sizeof(b));
+ return !ret;
+}
+EXPORT_SYMBOL(chacha20poly1305_encrypt_sg);
+
+static inline bool __chacha20poly1305_decrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN],
+ bool have_simd)
+{
+ struct poly1305_ctx poly1305_state;
+ struct chacha20_ctx chacha20_state;
+ int ret;
+ size_t dst_len;
+ union {
+ u8 block0[POLY1305_KEY_SIZE];
+ u8 mac[POLY1305_MAC_SIZE];
+ __le64 lens[2];
+ } b = {{ 0 }};
+
+ if (unlikely(src_len < POLY1305_MAC_SIZE))
+ return false;
+
+ chacha20_init(&chacha20_state, key, nonce);
+ chacha20(&chacha20_state, b.block0, b.block0, sizeof(b.block0), have_simd);
+ poly1305_init(&poly1305_state, b.block0, have_simd);
+
+ poly1305_update(&poly1305_state, ad, ad_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - ad_len) & 0xf, have_simd);
+
+ dst_len = src_len - POLY1305_MAC_SIZE;
+ poly1305_update(&poly1305_state, src, dst_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - dst_len) & 0xf, have_simd);
+
+ b.lens[0] = cpu_to_le64(ad_len);
+ b.lens[1] = cpu_to_le64(dst_len);
+ poly1305_update(&poly1305_state, (u8 *)b.lens, sizeof(b.lens), have_simd);
+
+ poly1305_finish(&poly1305_state, b.mac, have_simd);
+
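+	/* Constant-time tag comparison; the ciphertext is decrypted only if
+	 * the tag is valid.
+	 */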
+ ret = crypto_memneq(b.mac, src + dst_len, POLY1305_MAC_SIZE);
+ if (likely(!ret))
+ chacha20(&chacha20_state, dst, src, dst_len, have_simd);
+
+ memzero_explicit(&chacha20_state, sizeof(chacha20_state));
+ memzero_explicit(&b, sizeof(b));
+
+ return !ret;
+}
+
+bool chacha20poly1305_decrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN])
+{
+ bool have_simd, ret;
+
+ have_simd = simd_get();
+ ret = __chacha20poly1305_decrypt(dst, src, src_len, ad, ad_len, nonce, key, have_simd);
+ simd_put(have_simd);
+ return ret;
+}
+EXPORT_SYMBOL(chacha20poly1305_decrypt);
+
+bool chacha20poly1305_decrypt_sg(struct scatterlist *dst, struct scatterlist *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u64 nonce, const u8 key[CHACHA20POLY1305_KEYLEN],
+ bool have_simd)
+{
+ struct poly1305_ctx poly1305_state;
+ struct chacha20_ctx chacha20_state;
+ struct blkcipher_walk walk;
+ int ret = 0;
+ size_t dst_len;
+ union {
+ u8 block0[POLY1305_KEY_SIZE];
+ struct {
+ u8 read_mac[POLY1305_MAC_SIZE];
+ u8 computed_mac[POLY1305_MAC_SIZE];
+ };
+ __le64 lens[2];
+ } b = {{ 0 }};
+
+ if (unlikely(src_len < POLY1305_MAC_SIZE))
+ return false;
+
+ chacha20_init(&chacha20_state, key, nonce);
+ chacha20(&chacha20_state, b.block0, b.block0, sizeof(b.block0), have_simd);
+ poly1305_init(&poly1305_state, b.block0, have_simd);
+
+ poly1305_update(&poly1305_state, ad, ad_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - ad_len) & 0xf, have_simd);
+
+ dst_len = src_len - POLY1305_MAC_SIZE;
+ if (likely(dst_len)) {
+ blkcipher_walk_init(&walk, dst, src, dst_len);
+ ret = blkcipher_walk_virt_block(&chacha20_desc, &walk, CHACHA20_BLOCK_SIZE);
+ while (walk.nbytes >= CHACHA20_BLOCK_SIZE) {
+ size_t chunk_len = rounddown(walk.nbytes, CHACHA20_BLOCK_SIZE);
+
+ poly1305_update(&poly1305_state, walk.src.virt.addr, chunk_len, have_simd);
+ chacha20(&chacha20_state, walk.dst.virt.addr, walk.src.virt.addr, chunk_len, have_simd);
+ ret = blkcipher_walk_done(&chacha20_desc, &walk, walk.nbytes % CHACHA20_BLOCK_SIZE);
+ }
+ if (walk.nbytes) {
+ poly1305_update(&poly1305_state, walk.src.virt.addr, walk.nbytes, have_simd);
+ chacha20(&chacha20_state, walk.dst.virt.addr, walk.src.virt.addr, walk.nbytes, have_simd);
+ ret = blkcipher_walk_done(&chacha20_desc, &walk, 0);
+ }
+ }
+ if (unlikely(ret))
+ goto err;
+
+ poly1305_update(&poly1305_state, pad0, (0x10 - dst_len) & 0xf, have_simd);
+
+ b.lens[0] = cpu_to_le64(ad_len);
+ b.lens[1] = cpu_to_le64(dst_len);
+ poly1305_update(&poly1305_state, (u8 *)b.lens, sizeof(b.lens), have_simd);
+
+ poly1305_finish(&poly1305_state, b.computed_mac, have_simd);
+
+ scatterwalk_map_and_copy(b.read_mac, src, dst_len, POLY1305_MAC_SIZE, 0);
+ ret = crypto_memneq(b.read_mac, b.computed_mac, POLY1305_MAC_SIZE);
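+	/* Unlike the linear variant above, the data has already been
+	 * decrypted into dst by this point, so on tag mismatch the caller
+	 * must discard the output.
+	 */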
+err:
+ memzero_explicit(&chacha20_state, sizeof(chacha20_state));
+ memzero_explicit(&b, sizeof(b));
+ return !ret;
+}
+EXPORT_SYMBOL(chacha20poly1305_decrypt_sg);
+
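+/* XChaCha20-Poly1305: HChaCha20 mixes the key with the first 16 bytes of
+ * the 24-byte nonce to derive a subkey; the remaining 8 nonce bytes then
+ * serve as the 64-bit nonce for the ordinary construction above.
+ */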
+void xchacha20poly1305_encrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u8 nonce[XCHACHA20POLY1305_NONCELEN],
+ const u8 key[CHACHA20POLY1305_KEYLEN])
+{
+ bool have_simd = simd_get();
+ u8 derived_key[CHACHA20POLY1305_KEYLEN] __aligned(16);
+
+ hchacha20(derived_key, nonce, key, have_simd);
+	__chacha20poly1305_encrypt(dst, src, src_len, ad, ad_len,
+				   le64_to_cpup((__le64 *)(nonce + 16)),
+				   derived_key, have_simd);
+ memzero_explicit(derived_key, CHACHA20POLY1305_KEYLEN);
+ simd_put(have_simd);
+}
+EXPORT_SYMBOL(xchacha20poly1305_encrypt);
+
+bool xchacha20poly1305_decrypt(u8 *dst, const u8 *src, const size_t src_len,
+ const u8 *ad, const size_t ad_len,
+ const u8 nonce[XCHACHA20POLY1305_NONCELEN],
+ const u8 key[CHACHA20POLY1305_KEYLEN])
+{
+ bool ret, have_simd = simd_get();
+ u8 derived_key[CHACHA20POLY1305_KEYLEN] __aligned(16);
+
+ hchacha20(derived_key, nonce, key, have_simd);
+	ret = __chacha20poly1305_decrypt(dst, src, src_len, ad, ad_len,
+					 le64_to_cpup((__le64 *)(nonce + 16)),
+					 derived_key, have_simd);
+ memzero_explicit(derived_key, CHACHA20POLY1305_KEYLEN);
+ simd_put(have_simd);
+ return ret;
+}
+EXPORT_SYMBOL(xchacha20poly1305_decrypt);
+
+#include "selftest/chacha20poly1305.h"
diff --git a/lib/zinc/curve25519/curve25519-arm.S b/lib/zinc/curve25519/curve25519-arm.S
new file mode 100644
index 000000000000..b8e2c1d96482
--- /dev/null
+++ b/lib/zinc/curve25519/curve25519-arm.S
@@ -0,0 +1,2110 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ *
+ * Based on algorithms from Daniel J. Bernstein and Peter Schwabe.
+ */
+
+#if IS_ENABLED(CONFIG_KERNEL_MODE_NEON)
+
+#include <linux/linkage.h>
+
+ .text
+ .fpu neon
+ .align 4
+
+ENTRY(curve25519_neon)
+ vpush {q4,q5,q6,q7}
+ mov r12,sp
+ sub r3,sp,#736
+ and r3,r3,#0xffffffe0
+ mov sp,r3
+ strd r4,[sp,#0]
+ strd r6,[sp,#8]
+ strd r8,[sp,#16]
+ strd r10,[sp,#24]
+ str r12,[sp,#480]
+ str r14,[sp,#484]
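+	/* The next three moves are no-ops, seemingly retained from the
+	 * machine-generated (qhasm) original.
+	 */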
+ mov r0,r0
+ mov r1,r1
+ mov r2,r2
+ add r3,sp,#32
+ ldr r4,=0
+ ldr r5,=254
+ vmov.i32 q0,#1
+ vshr.u64 q1,q0,#7
+ vshr.u64 q0,q0,#8
+ vmov.i32 d4,#19
+ vmov.i32 d5,#38
+ add r6,sp,#512
+ vst1.8 {d2-d3},[r6,: 128]
+ add r6,sp,#528
+ vst1.8 {d0-d1},[r6,: 128]
+ add r6,sp,#544
+ vst1.8 {d4-d5},[r6,: 128]
+ add r6,r3,#0
+ vmov.i32 q2,#0
+ vst1.8 {d4-d5},[r6,: 128]!
+ vst1.8 {d4-d5},[r6,: 128]!
+ vst1.8 d4,[r6,: 64]
+ add r6,r3,#0
+ ldr r7,=960
+ sub r7,r7,#2
+ neg r7,r7
+ sub r7,r7,r7,LSL #7
+ str r7,[r6]
+ add r6,sp,#704
+ vld1.8 {d4-d5},[r1]!
+ vld1.8 {d6-d7},[r1]
+ vst1.8 {d4-d5},[r6,: 128]!
+ vst1.8 {d6-d7},[r6,: 128]
+ sub r1,r6,#16
+ ldrb r6,[r1]
+ and r6,r6,#248
+ strb r6,[r1]
+ ldrb r6,[r1,#31]
+ and r6,r6,#127
+ orr r6,r6,#64
+ strb r6,[r1,#31]
+ vmov.i64 q2,#0xffffffff
+ vshr.u64 q3,q2,#7
+ vshr.u64 q2,q2,#6
+ vld1.8 {d8},[r2]
+ vld1.8 {d10},[r2]
+ add r2,r2,#6
+ vld1.8 {d12},[r2]
+ vld1.8 {d14},[r2]
+ add r2,r2,#6
+ vld1.8 {d16},[r2]
+ add r2,r2,#4
+ vld1.8 {d18},[r2]
+ vld1.8 {d20},[r2]
+ add r2,r2,#6
+ vld1.8 {d22},[r2]
+ add r2,r2,#2
+ vld1.8 {d24},[r2]
+ vld1.8 {d26},[r2]
+ vshr.u64 q5,q5,#26
+ vshr.u64 q6,q6,#3
+ vshr.u64 q7,q7,#29
+ vshr.u64 q8,q8,#6
+ vshr.u64 q10,q10,#25
+ vshr.u64 q11,q11,#3
+ vshr.u64 q12,q12,#12
+ vshr.u64 q13,q13,#38
+ vand q4,q4,q2
+ vand q6,q6,q2
+ vand q8,q8,q2
+ vand q10,q10,q2
+ vand q2,q12,q2
+ vand q5,q5,q3
+ vand q7,q7,q3
+ vand q9,q9,q3
+ vand q11,q11,q3
+ vand q3,q13,q3
+ add r2,r3,#48
+ vadd.i64 q12,q4,q1
+ vadd.i64 q13,q10,q1
+ vshr.s64 q12,q12,#26
+ vshr.s64 q13,q13,#26
+ vadd.i64 q5,q5,q12
+ vshl.i64 q12,q12,#26
+ vadd.i64 q14,q5,q0
+ vadd.i64 q11,q11,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q15,q11,q0
+ vsub.i64 q4,q4,q12
+ vshr.s64 q12,q14,#25
+ vsub.i64 q10,q10,q13
+ vshr.s64 q13,q15,#25
+ vadd.i64 q6,q6,q12
+ vshl.i64 q12,q12,#25
+ vadd.i64 q14,q6,q1
+ vadd.i64 q2,q2,q13
+ vsub.i64 q5,q5,q12
+ vshr.s64 q12,q14,#26
+ vshl.i64 q13,q13,#25
+ vadd.i64 q14,q2,q1
+ vadd.i64 q7,q7,q12
+ vshl.i64 q12,q12,#26
+ vadd.i64 q15,q7,q0
+ vsub.i64 q11,q11,q13
+ vshr.s64 q13,q14,#26
+ vsub.i64 q6,q6,q12
+ vshr.s64 q12,q15,#25
+ vadd.i64 q3,q3,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q14,q3,q0
+ vadd.i64 q8,q8,q12
+ vshl.i64 q12,q12,#25
+ vadd.i64 q15,q8,q1
+ add r2,r2,#8
+ vsub.i64 q2,q2,q13
+ vshr.s64 q13,q14,#25
+ vsub.i64 q7,q7,q12
+ vshr.s64 q12,q15,#26
+ vadd.i64 q14,q13,q13
+ vadd.i64 q9,q9,q12
+ vtrn.32 d12,d14
+ vshl.i64 q12,q12,#26
+ vtrn.32 d13,d15
+ vadd.i64 q0,q9,q0
+ vadd.i64 q4,q4,q14
+ vst1.8 d12,[r2,: 64]!
+ vshl.i64 q6,q13,#4
+ vsub.i64 q7,q8,q12
+ vshr.s64 q0,q0,#25
+ vadd.i64 q4,q4,q6
+ vadd.i64 q6,q10,q0
+ vshl.i64 q0,q0,#25
+ vadd.i64 q8,q6,q1
+ vadd.i64 q4,q4,q13
+ vshl.i64 q10,q13,#25
+ vadd.i64 q1,q4,q1
+ vsub.i64 q0,q9,q0
+ vshr.s64 q8,q8,#26
+ vsub.i64 q3,q3,q10
+ vtrn.32 d14,d0
+ vshr.s64 q1,q1,#26
+ vtrn.32 d15,d1
+ vadd.i64 q0,q11,q8
+ vst1.8 d14,[r2,: 64]
+ vshl.i64 q7,q8,#26
+ vadd.i64 q5,q5,q1
+ vtrn.32 d4,d6
+ vshl.i64 q1,q1,#26
+ vtrn.32 d5,d7
+ vsub.i64 q3,q6,q7
+ add r2,r2,#16
+ vsub.i64 q1,q4,q1
+ vst1.8 d4,[r2,: 64]
+ vtrn.32 d6,d0
+ vtrn.32 d7,d1
+ sub r2,r2,#8
+ vtrn.32 d2,d10
+ vtrn.32 d3,d11
+ vst1.8 d6,[r2,: 64]
+ sub r2,r2,#24
+ vst1.8 d2,[r2,: 64]
+ add r2,r3,#96
+ vmov.i32 q0,#0
+ vmov.i64 d2,#0xff
+ vmov.i64 d3,#0
+ vshr.u32 q1,q1,#7
+ vst1.8 {d2-d3},[r2,: 128]!
+ vst1.8 {d0-d1},[r2,: 128]!
+ vst1.8 d0,[r2,: 64]
+ add r2,r3,#144
+ vmov.i32 q0,#0
+ vst1.8 {d0-d1},[r2,: 128]!
+ vst1.8 {d0-d1},[r2,: 128]!
+ vst1.8 d0,[r2,: 64]
+ add r2,r3,#240
+ vmov.i32 q0,#0
+ vmov.i64 d2,#0xff
+ vmov.i64 d3,#0
+ vshr.u32 q1,q1,#7
+ vst1.8 {d2-d3},[r2,: 128]!
+ vst1.8 {d0-d1},[r2,: 128]!
+ vst1.8 d0,[r2,: 64]
+ add r2,r3,#48
+ add r6,r3,#192
+ vld1.8 {d0-d1},[r2,: 128]!
+ vld1.8 {d2-d3},[r2,: 128]!
+ vld1.8 {d4},[r2,: 64]
+ vst1.8 {d0-d1},[r6,: 128]!
+ vst1.8 {d2-d3},[r6,: 128]!
+ vst1.8 d4,[r6,: 64]
+ .Lmainloop:
+ mov r2,r5,LSR #3
+ and r6,r5,#7
+ ldrb r2,[r1,r2]
+ mov r2,r2,LSR r6
+ and r2,r2,#1
+ str r5,[sp,#488]
+ eor r4,r4,r2
+ str r2,[sp,#492]
+ neg r2,r4
+ add r4,r3,#96
+ add r5,r3,#192
+ add r6,r3,#144
+ vld1.8 {d8-d9},[r4,: 128]!
+ add r7,r3,#240
+ vld1.8 {d10-d11},[r5,: 128]!
+ veor q6,q4,q5
+ vld1.8 {d14-d15},[r6,: 128]!
+ vdup.i32 q8,r2
+ vld1.8 {d18-d19},[r7,: 128]!
+ veor q10,q7,q9
+ vld1.8 {d22-d23},[r4,: 128]!
+ vand q6,q6,q8
+ vld1.8 {d24-d25},[r5,: 128]!
+ vand q10,q10,q8
+ vld1.8 {d26-d27},[r6,: 128]!
+ veor q4,q4,q6
+ vld1.8 {d28-d29},[r7,: 128]!
+ veor q5,q5,q6
+ vld1.8 {d0},[r4,: 64]
+ veor q6,q7,q10
+ vld1.8 {d2},[r5,: 64]
+ veor q7,q9,q10
+ vld1.8 {d4},[r6,: 64]
+ veor q9,q11,q12
+ vld1.8 {d6},[r7,: 64]
+ veor q10,q0,q1
+ sub r2,r4,#32
+ vand q9,q9,q8
+ sub r4,r5,#32
+ vand q10,q10,q8
+ sub r5,r6,#32
+ veor q11,q11,q9
+ sub r6,r7,#32
+ veor q0,q0,q10
+ veor q9,q12,q9
+ veor q1,q1,q10
+ veor q10,q13,q14
+ veor q12,q2,q3
+ vand q10,q10,q8
+ vand q8,q12,q8
+ veor q12,q13,q10
+ veor q2,q2,q8
+ veor q10,q14,q10
+ veor q3,q3,q8
+ vadd.i32 q8,q4,q6
+ vsub.i32 q4,q4,q6
+ vst1.8 {d16-d17},[r2,: 128]!
+ vadd.i32 q6,q11,q12
+ vst1.8 {d8-d9},[r5,: 128]!
+ vsub.i32 q4,q11,q12
+ vst1.8 {d12-d13},[r2,: 128]!
+ vadd.i32 q6,q0,q2
+ vst1.8 {d8-d9},[r5,: 128]!
+ vsub.i32 q0,q0,q2
+ vst1.8 d12,[r2,: 64]
+ vadd.i32 q2,q5,q7
+ vst1.8 d0,[r5,: 64]
+ vsub.i32 q0,q5,q7
+ vst1.8 {d4-d5},[r4,: 128]!
+ vadd.i32 q2,q9,q10
+ vst1.8 {d0-d1},[r6,: 128]!
+ vsub.i32 q0,q9,q10
+ vst1.8 {d4-d5},[r4,: 128]!
+ vadd.i32 q2,q1,q3
+ vst1.8 {d0-d1},[r6,: 128]!
+ vsub.i32 q0,q1,q3
+ vst1.8 d4,[r4,: 64]
+ vst1.8 d0,[r6,: 64]
+ add r2,sp,#544
+ add r4,r3,#96
+ add r5,r3,#144
+ vld1.8 {d0-d1},[r2,: 128]
+ vld1.8 {d2-d3},[r4,: 128]!
+ vld1.8 {d4-d5},[r5,: 128]!
+ vzip.i32 q1,q2
+ vld1.8 {d6-d7},[r4,: 128]!
+ vld1.8 {d8-d9},[r5,: 128]!
+ vshl.i32 q5,q1,#1
+ vzip.i32 q3,q4
+ vshl.i32 q6,q2,#1
+ vld1.8 {d14},[r4,: 64]
+ vshl.i32 q8,q3,#1
+ vld1.8 {d15},[r5,: 64]
+ vshl.i32 q9,q4,#1
+ vmul.i32 d21,d7,d1
+ vtrn.32 d14,d15
+ vmul.i32 q11,q4,q0
+ vmul.i32 q0,q7,q0
+ vmull.s32 q12,d2,d2
+ vmlal.s32 q12,d11,d1
+ vmlal.s32 q12,d12,d0
+ vmlal.s32 q12,d13,d23
+ vmlal.s32 q12,d16,d22
+ vmlal.s32 q12,d7,d21
+ vmull.s32 q10,d2,d11
+ vmlal.s32 q10,d4,d1
+ vmlal.s32 q10,d13,d0
+ vmlal.s32 q10,d6,d23
+ vmlal.s32 q10,d17,d22
+ vmull.s32 q13,d10,d4
+ vmlal.s32 q13,d11,d3
+ vmlal.s32 q13,d13,d1
+ vmlal.s32 q13,d16,d0
+ vmlal.s32 q13,d17,d23
+ vmlal.s32 q13,d8,d22
+ vmull.s32 q1,d10,d5
+ vmlal.s32 q1,d11,d4
+ vmlal.s32 q1,d6,d1
+ vmlal.s32 q1,d17,d0
+ vmlal.s32 q1,d8,d23
+ vmull.s32 q14,d10,d6
+ vmlal.s32 q14,d11,d13
+ vmlal.s32 q14,d4,d4
+ vmlal.s32 q14,d17,d1
+ vmlal.s32 q14,d18,d0
+ vmlal.s32 q14,d9,d23
+ vmull.s32 q11,d10,d7
+ vmlal.s32 q11,d11,d6
+ vmlal.s32 q11,d12,d5
+ vmlal.s32 q11,d8,d1
+ vmlal.s32 q11,d19,d0
+ vmull.s32 q15,d10,d8
+ vmlal.s32 q15,d11,d17
+ vmlal.s32 q15,d12,d6
+ vmlal.s32 q15,d13,d5
+ vmlal.s32 q15,d19,d1
+ vmlal.s32 q15,d14,d0
+ vmull.s32 q2,d10,d9
+ vmlal.s32 q2,d11,d8
+ vmlal.s32 q2,d12,d7
+ vmlal.s32 q2,d13,d6
+ vmlal.s32 q2,d14,d1
+ vmull.s32 q0,d15,d1
+ vmlal.s32 q0,d10,d14
+ vmlal.s32 q0,d11,d19
+ vmlal.s32 q0,d12,d8
+ vmlal.s32 q0,d13,d17
+ vmlal.s32 q0,d6,d6
+ add r2,sp,#512
+ vld1.8 {d18-d19},[r2,: 128]
+ vmull.s32 q3,d16,d7
+ vmlal.s32 q3,d10,d15
+ vmlal.s32 q3,d11,d14
+ vmlal.s32 q3,d12,d9
+ vmlal.s32 q3,d13,d8
+ add r2,sp,#528
+ vld1.8 {d8-d9},[r2,: 128]
+ vadd.i64 q5,q12,q9
+ vadd.i64 q6,q15,q9
+ vshr.s64 q5,q5,#26
+ vshr.s64 q6,q6,#26
+ vadd.i64 q7,q10,q5
+ vshl.i64 q5,q5,#26
+ vadd.i64 q8,q7,q4
+ vadd.i64 q2,q2,q6
+ vshl.i64 q6,q6,#26
+ vadd.i64 q10,q2,q4
+ vsub.i64 q5,q12,q5
+ vshr.s64 q8,q8,#25
+ vsub.i64 q6,q15,q6
+ vshr.s64 q10,q10,#25
+ vadd.i64 q12,q13,q8
+ vshl.i64 q8,q8,#25
+ vadd.i64 q13,q12,q9
+ vadd.i64 q0,q0,q10
+ vsub.i64 q7,q7,q8
+ vshr.s64 q8,q13,#26
+ vshl.i64 q10,q10,#25
+ vadd.i64 q13,q0,q9
+ vadd.i64 q1,q1,q8
+ vshl.i64 q8,q8,#26
+ vadd.i64 q15,q1,q4
+ vsub.i64 q2,q2,q10
+ vshr.s64 q10,q13,#26
+ vsub.i64 q8,q12,q8
+ vshr.s64 q12,q15,#25
+ vadd.i64 q3,q3,q10
+ vshl.i64 q10,q10,#26
+ vadd.i64 q13,q3,q4
+ vadd.i64 q14,q14,q12
+ add r2,r3,#288
+ vshl.i64 q12,q12,#25
+ add r4,r3,#336
+ vadd.i64 q15,q14,q9
+ add r2,r2,#8
+ vsub.i64 q0,q0,q10
+ add r4,r4,#8
+ vshr.s64 q10,q13,#25
+ vsub.i64 q1,q1,q12
+ vshr.s64 q12,q15,#26
+ vadd.i64 q13,q10,q10
+ vadd.i64 q11,q11,q12
+ vtrn.32 d16,d2
+ vshl.i64 q12,q12,#26
+ vtrn.32 d17,d3
+ vadd.i64 q1,q11,q4
+ vadd.i64 q4,q5,q13
+ vst1.8 d16,[r2,: 64]!
+ vshl.i64 q5,q10,#4
+ vst1.8 d17,[r4,: 64]!
+ vsub.i64 q8,q14,q12
+ vshr.s64 q1,q1,#25
+ vadd.i64 q4,q4,q5
+ vadd.i64 q5,q6,q1
+ vshl.i64 q1,q1,#25
+ vadd.i64 q6,q5,q9
+ vadd.i64 q4,q4,q10
+ vshl.i64 q10,q10,#25
+ vadd.i64 q9,q4,q9
+ vsub.i64 q1,q11,q1
+ vshr.s64 q6,q6,#26
+ vsub.i64 q3,q3,q10
+ vtrn.32 d16,d2
+ vshr.s64 q9,q9,#26
+ vtrn.32 d17,d3
+ vadd.i64 q1,q2,q6
+ vst1.8 d16,[r2,: 64]
+ vshl.i64 q2,q6,#26
+ vst1.8 d17,[r4,: 64]
+ vadd.i64 q6,q7,q9
+ vtrn.32 d0,d6
+ vshl.i64 q7,q9,#26
+ vtrn.32 d1,d7
+ vsub.i64 q2,q5,q2
+ add r2,r2,#16
+ vsub.i64 q3,q4,q7
+ vst1.8 d0,[r2,: 64]
+ add r4,r4,#16
+ vst1.8 d1,[r4,: 64]
+ vtrn.32 d4,d2
+ vtrn.32 d5,d3
+ sub r2,r2,#8
+ sub r4,r4,#8
+ vtrn.32 d6,d12
+ vtrn.32 d7,d13
+ vst1.8 d4,[r2,: 64]
+ vst1.8 d5,[r4,: 64]
+ sub r2,r2,#24
+ sub r4,r4,#24
+ vst1.8 d6,[r2,: 64]
+ vst1.8 d7,[r4,: 64]
+ add r2,r3,#240
+ add r4,r3,#96
+ vld1.8 {d0-d1},[r4,: 128]!
+ vld1.8 {d2-d3},[r4,: 128]!
+ vld1.8 {d4},[r4,: 64]
+ add r4,r3,#144
+ vld1.8 {d6-d7},[r4,: 128]!
+ vtrn.32 q0,q3
+ vld1.8 {d8-d9},[r4,: 128]!
+ vshl.i32 q5,q0,#4
+ vtrn.32 q1,q4
+ vshl.i32 q6,q3,#4
+ vadd.i32 q5,q5,q0
+ vadd.i32 q6,q6,q3
+ vshl.i32 q7,q1,#4
+ vld1.8 {d5},[r4,: 64]
+ vshl.i32 q8,q4,#4
+ vtrn.32 d4,d5
+ vadd.i32 q7,q7,q1
+ vadd.i32 q8,q8,q4
+ vld1.8 {d18-d19},[r2,: 128]!
+ vshl.i32 q10,q2,#4
+ vld1.8 {d22-d23},[r2,: 128]!
+ vadd.i32 q10,q10,q2
+ vld1.8 {d24},[r2,: 64]
+ vadd.i32 q5,q5,q0
+ add r2,r3,#192
+ vld1.8 {d26-d27},[r2,: 128]!
+ vadd.i32 q6,q6,q3
+ vld1.8 {d28-d29},[r2,: 128]!
+ vadd.i32 q8,q8,q4
+ vld1.8 {d25},[r2,: 64]
+ vadd.i32 q10,q10,q2
+ vtrn.32 q9,q13
+ vadd.i32 q7,q7,q1
+ vadd.i32 q5,q5,q0
+ vtrn.32 q11,q14
+ vadd.i32 q6,q6,q3
+ add r2,sp,#560
+ vadd.i32 q10,q10,q2
+ vtrn.32 d24,d25
+ vst1.8 {d12-d13},[r2,: 128]
+ vshl.i32 q6,q13,#1
+ add r2,sp,#576
+ vst1.8 {d20-d21},[r2,: 128]
+ vshl.i32 q10,q14,#1
+ add r2,sp,#592
+ vst1.8 {d12-d13},[r2,: 128]
+ vshl.i32 q15,q12,#1
+ vadd.i32 q8,q8,q4
+ vext.32 d10,d31,d30,#0
+ vadd.i32 q7,q7,q1
+ add r2,sp,#608
+ vst1.8 {d16-d17},[r2,: 128]
+ vmull.s32 q8,d18,d5
+ vmlal.s32 q8,d26,d4
+ vmlal.s32 q8,d19,d9
+ vmlal.s32 q8,d27,d3
+ vmlal.s32 q8,d22,d8
+ vmlal.s32 q8,d28,d2
+ vmlal.s32 q8,d23,d7
+ vmlal.s32 q8,d29,d1
+ vmlal.s32 q8,d24,d6
+ vmlal.s32 q8,d25,d0
+ add r2,sp,#624
+ vst1.8 {d14-d15},[r2,: 128]
+ vmull.s32 q2,d18,d4
+ vmlal.s32 q2,d12,d9
+ vmlal.s32 q2,d13,d8
+ vmlal.s32 q2,d19,d3
+ vmlal.s32 q2,d22,d2
+ vmlal.s32 q2,d23,d1
+ vmlal.s32 q2,d24,d0
+ add r2,sp,#640
+ vst1.8 {d20-d21},[r2,: 128]
+ vmull.s32 q7,d18,d9
+ vmlal.s32 q7,d26,d3
+ vmlal.s32 q7,d19,d8
+ vmlal.s32 q7,d27,d2
+ vmlal.s32 q7,d22,d7
+ vmlal.s32 q7,d28,d1
+ vmlal.s32 q7,d23,d6
+ vmlal.s32 q7,d29,d0
+ add r2,sp,#656
+ vst1.8 {d10-d11},[r2,: 128]
+ vmull.s32 q5,d18,d3
+ vmlal.s32 q5,d19,d2
+ vmlal.s32 q5,d22,d1
+ vmlal.s32 q5,d23,d0
+ vmlal.s32 q5,d12,d8
+ add r2,sp,#672
+ vst1.8 {d16-d17},[r2,: 128]
+ vmull.s32 q4,d18,d8
+ vmlal.s32 q4,d26,d2
+ vmlal.s32 q4,d19,d7
+ vmlal.s32 q4,d27,d1
+ vmlal.s32 q4,d22,d6
+ vmlal.s32 q4,d28,d0
+ vmull.s32 q8,d18,d7
+ vmlal.s32 q8,d26,d1
+ vmlal.s32 q8,d19,d6
+ vmlal.s32 q8,d27,d0
+ add r2,sp,#576
+ vld1.8 {d20-d21},[r2,: 128]
+ vmlal.s32 q7,d24,d21
+ vmlal.s32 q7,d25,d20
+ vmlal.s32 q4,d23,d21
+ vmlal.s32 q4,d29,d20
+ vmlal.s32 q8,d22,d21
+ vmlal.s32 q8,d28,d20
+ vmlal.s32 q5,d24,d20
+ add r2,sp,#576
+ vst1.8 {d14-d15},[r2,: 128]
+ vmull.s32 q7,d18,d6
+ vmlal.s32 q7,d26,d0
+ add r2,sp,#656
+ vld1.8 {d30-d31},[r2,: 128]
+ vmlal.s32 q2,d30,d21
+ vmlal.s32 q7,d19,d21
+ vmlal.s32 q7,d27,d20
+ add r2,sp,#624
+ vld1.8 {d26-d27},[r2,: 128]
+ vmlal.s32 q4,d25,d27
+ vmlal.s32 q8,d29,d27
+ vmlal.s32 q8,d25,d26
+ vmlal.s32 q7,d28,d27
+ vmlal.s32 q7,d29,d26
+ add r2,sp,#608
+ vld1.8 {d28-d29},[r2,: 128]
+ vmlal.s32 q4,d24,d29
+ vmlal.s32 q8,d23,d29
+ vmlal.s32 q8,d24,d28
+ vmlal.s32 q7,d22,d29
+ vmlal.s32 q7,d23,d28
+ add r2,sp,#608
+ vst1.8 {d8-d9},[r2,: 128]
+ add r2,sp,#560
+ vld1.8 {d8-d9},[r2,: 128]
+ vmlal.s32 q7,d24,d9
+ vmlal.s32 q7,d25,d31
+ vmull.s32 q1,d18,d2
+ vmlal.s32 q1,d19,d1
+ vmlal.s32 q1,d22,d0
+ vmlal.s32 q1,d24,d27
+ vmlal.s32 q1,d23,d20
+ vmlal.s32 q1,d12,d7
+ vmlal.s32 q1,d13,d6
+ vmull.s32 q6,d18,d1
+ vmlal.s32 q6,d19,d0
+ vmlal.s32 q6,d23,d27
+ vmlal.s32 q6,d22,d20
+ vmlal.s32 q6,d24,d26
+ vmull.s32 q0,d18,d0
+ vmlal.s32 q0,d22,d27
+ vmlal.s32 q0,d23,d26
+ vmlal.s32 q0,d24,d31
+ vmlal.s32 q0,d19,d20
+ add r2,sp,#640
+ vld1.8 {d18-d19},[r2,: 128]
+ vmlal.s32 q2,d18,d7
+ vmlal.s32 q2,d19,d6
+ vmlal.s32 q5,d18,d6
+ vmlal.s32 q5,d19,d21
+ vmlal.s32 q1,d18,d21
+ vmlal.s32 q1,d19,d29
+ vmlal.s32 q0,d18,d28
+ vmlal.s32 q0,d19,d9
+ vmlal.s32 q6,d18,d29
+ vmlal.s32 q6,d19,d28
+ add r2,sp,#592
+ vld1.8 {d18-d19},[r2,: 128]
+ add r2,sp,#512
+ vld1.8 {d22-d23},[r2,: 128]
+ vmlal.s32 q5,d19,d7
+ vmlal.s32 q0,d18,d21
+ vmlal.s32 q0,d19,d29
+ vmlal.s32 q6,d18,d6
+ add r2,sp,#528
+ vld1.8 {d6-d7},[r2,: 128]
+ vmlal.s32 q6,d19,d21
+ add r2,sp,#576
+ vld1.8 {d18-d19},[r2,: 128]
+ vmlal.s32 q0,d30,d8
+ add r2,sp,#672
+ vld1.8 {d20-d21},[r2,: 128]
+ vmlal.s32 q5,d30,d29
+ add r2,sp,#608
+ vld1.8 {d24-d25},[r2,: 128]
+ vmlal.s32 q1,d30,d28
+ vadd.i64 q13,q0,q11
+ vadd.i64 q14,q5,q11
+ vmlal.s32 q6,d30,d9
+ vshr.s64 q4,q13,#26
+ vshr.s64 q13,q14,#26
+ vadd.i64 q7,q7,q4
+ vshl.i64 q4,q4,#26
+ vadd.i64 q14,q7,q3
+ vadd.i64 q9,q9,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q15,q9,q3
+ vsub.i64 q0,q0,q4
+ vshr.s64 q4,q14,#25
+ vsub.i64 q5,q5,q13
+ vshr.s64 q13,q15,#25
+ vadd.i64 q6,q6,q4
+ vshl.i64 q4,q4,#25
+ vadd.i64 q14,q6,q11
+ vadd.i64 q2,q2,q13
+ vsub.i64 q4,q7,q4
+ vshr.s64 q7,q14,#26
+ vshl.i64 q13,q13,#25
+ vadd.i64 q14,q2,q11
+ vadd.i64 q8,q8,q7
+ vshl.i64 q7,q7,#26
+ vadd.i64 q15,q8,q3
+ vsub.i64 q9,q9,q13
+ vshr.s64 q13,q14,#26
+ vsub.i64 q6,q6,q7
+ vshr.s64 q7,q15,#25
+ vadd.i64 q10,q10,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q14,q10,q3
+ vadd.i64 q1,q1,q7
+ add r2,r3,#144
+ vshl.i64 q7,q7,#25
+ add r4,r3,#96
+ vadd.i64 q15,q1,q11
+ add r2,r2,#8
+ vsub.i64 q2,q2,q13
+ add r4,r4,#8
+ vshr.s64 q13,q14,#25
+ vsub.i64 q7,q8,q7
+ vshr.s64 q8,q15,#26
+ vadd.i64 q14,q13,q13
+ vadd.i64 q12,q12,q8
+ vtrn.32 d12,d14
+ vshl.i64 q8,q8,#26
+ vtrn.32 d13,d15
+ vadd.i64 q3,q12,q3
+ vadd.i64 q0,q0,q14
+ vst1.8 d12,[r2,: 64]!
+ vshl.i64 q7,q13,#4
+ vst1.8 d13,[r4,: 64]!
+ vsub.i64 q1,q1,q8
+ vshr.s64 q3,q3,#25
+ vadd.i64 q0,q0,q7
+ vadd.i64 q5,q5,q3
+ vshl.i64 q3,q3,#25
+ vadd.i64 q6,q5,q11
+ vadd.i64 q0,q0,q13
+ vshl.i64 q7,q13,#25
+ vadd.i64 q8,q0,q11
+ vsub.i64 q3,q12,q3
+ vshr.s64 q6,q6,#26
+ vsub.i64 q7,q10,q7
+ vtrn.32 d2,d6
+ vshr.s64 q8,q8,#26
+ vtrn.32 d3,d7
+ vadd.i64 q3,q9,q6
+ vst1.8 d2,[r2,: 64]
+ vshl.i64 q6,q6,#26
+ vst1.8 d3,[r4,: 64]
+ vadd.i64 q1,q4,q8
+ vtrn.32 d4,d14
+ vshl.i64 q4,q8,#26
+ vtrn.32 d5,d15
+ vsub.i64 q5,q5,q6
+ add r2,r2,#16
+ vsub.i64 q0,q0,q4
+ vst1.8 d4,[r2,: 64]
+ add r4,r4,#16
+ vst1.8 d5,[r4,: 64]
+ vtrn.32 d10,d6
+ vtrn.32 d11,d7
+ sub r2,r2,#8
+ sub r4,r4,#8
+ vtrn.32 d0,d2
+ vtrn.32 d1,d3
+ vst1.8 d10,[r2,: 64]
+ vst1.8 d11,[r4,: 64]
+ sub r2,r2,#24
+ sub r4,r4,#24
+ vst1.8 d0,[r2,: 64]
+ vst1.8 d1,[r4,: 64]
+ add r2,r3,#288
+ add r4,r3,#336
+ vld1.8 {d0-d1},[r2,: 128]!
+ vld1.8 {d2-d3},[r4,: 128]!
+ vsub.i32 q0,q0,q1
+ vld1.8 {d2-d3},[r2,: 128]!
+ vld1.8 {d4-d5},[r4,: 128]!
+ vsub.i32 q1,q1,q2
+ add r5,r3,#240
+ vld1.8 {d4},[r2,: 64]
+ vld1.8 {d6},[r4,: 64]
+ vsub.i32 q2,q2,q3
+ vst1.8 {d0-d1},[r5,: 128]!
+ vst1.8 {d2-d3},[r5,: 128]!
+ vst1.8 d4,[r5,: 64]
+ add r2,r3,#144
+ add r4,r3,#96
+ add r5,r3,#144
+ add r6,r3,#192
+ vld1.8 {d0-d1},[r2,: 128]!
+ vld1.8 {d2-d3},[r4,: 128]!
+ vsub.i32 q2,q0,q1
+ vadd.i32 q0,q0,q1
+ vld1.8 {d2-d3},[r2,: 128]!
+ vld1.8 {d6-d7},[r4,: 128]!
+ vsub.i32 q4,q1,q3
+ vadd.i32 q1,q1,q3
+ vld1.8 {d6},[r2,: 64]
+ vld1.8 {d10},[r4,: 64]
+ vsub.i32 q6,q3,q5
+ vadd.i32 q3,q3,q5
+ vst1.8 {d4-d5},[r5,: 128]!
+ vst1.8 {d0-d1},[r6,: 128]!
+ vst1.8 {d8-d9},[r5,: 128]!
+ vst1.8 {d2-d3},[r6,: 128]!
+ vst1.8 d12,[r5,: 64]
+ vst1.8 d6,[r6,: 64]
+ add r2,r3,#0
+ add r4,r3,#240
+ vld1.8 {d0-d1},[r4,: 128]!
+ vld1.8 {d2-d3},[r4,: 128]!
+ vld1.8 {d4},[r4,: 64]
+ add r4,r3,#336
+ vld1.8 {d6-d7},[r4,: 128]!
+ vtrn.32 q0,q3
+ vld1.8 {d8-d9},[r4,: 128]!
+ vshl.i32 q5,q0,#4
+ vtrn.32 q1,q4
+ vshl.i32 q6,q3,#4
+ vadd.i32 q5,q5,q0
+ vadd.i32 q6,q6,q3
+ vshl.i32 q7,q1,#4
+ vld1.8 {d5},[r4,: 64]
+ vshl.i32 q8,q4,#4
+ vtrn.32 d4,d5
+ vadd.i32 q7,q7,q1
+ vadd.i32 q8,q8,q4
+ vld1.8 {d18-d19},[r2,: 128]!
+ vshl.i32 q10,q2,#4
+ vld1.8 {d22-d23},[r2,: 128]!
+ vadd.i32 q10,q10,q2
+ vld1.8 {d24},[r2,: 64]
+ vadd.i32 q5,q5,q0
+ add r2,r3,#288
+ vld1.8 {d26-d27},[r2,: 128]!
+ vadd.i32 q6,q6,q3
+ vld1.8 {d28-d29},[r2,: 128]!
+ vadd.i32 q8,q8,q4
+ vld1.8 {d25},[r2,: 64]
+ vadd.i32 q10,q10,q2
+ vtrn.32 q9,q13
+ vadd.i32 q7,q7,q1
+ vadd.i32 q5,q5,q0
+ vtrn.32 q11,q14
+ vadd.i32 q6,q6,q3
+ add r2,sp,#560
+ vadd.i32 q10,q10,q2
+ vtrn.32 d24,d25
+ vst1.8 {d12-d13},[r2,: 128]
+ vshl.i32 q6,q13,#1
+ add r2,sp,#576
+ vst1.8 {d20-d21},[r2,: 128]
+ vshl.i32 q10,q14,#1
+ add r2,sp,#592
+ vst1.8 {d12-d13},[r2,: 128]
+ vshl.i32 q15,q12,#1
+ vadd.i32 q8,q8,q4
+ vext.32 d10,d31,d30,#0
+ vadd.i32 q7,q7,q1
+ add r2,sp,#608
+ vst1.8 {d16-d17},[r2,: 128]
+ vmull.s32 q8,d18,d5
+ vmlal.s32 q8,d26,d4
+ vmlal.s32 q8,d19,d9
+ vmlal.s32 q8,d27,d3
+ vmlal.s32 q8,d22,d8
+ vmlal.s32 q8,d28,d2
+ vmlal.s32 q8,d23,d7
+ vmlal.s32 q8,d29,d1
+ vmlal.s32 q8,d24,d6
+ vmlal.s32 q8,d25,d0
+ add r2,sp,#624
+ vst1.8 {d14-d15},[r2,: 128]
+ vmull.s32 q2,d18,d4
+ vmlal.s32 q2,d12,d9
+ vmlal.s32 q2,d13,d8
+ vmlal.s32 q2,d19,d3
+ vmlal.s32 q2,d22,d2
+ vmlal.s32 q2,d23,d1
+ vmlal.s32 q2,d24,d0
+ add r2,sp,#640
+ vst1.8 {d20-d21},[r2,: 128]
+ vmull.s32 q7,d18,d9
+ vmlal.s32 q7,d26,d3
+ vmlal.s32 q7,d19,d8
+ vmlal.s32 q7,d27,d2
+ vmlal.s32 q7,d22,d7
+ vmlal.s32 q7,d28,d1
+ vmlal.s32 q7,d23,d6
+ vmlal.s32 q7,d29,d0
+ add r2,sp,#656
+ vst1.8 {d10-d11},[r2,: 128]
+ vmull.s32 q5,d18,d3
+ vmlal.s32 q5,d19,d2
+ vmlal.s32 q5,d22,d1
+ vmlal.s32 q5,d23,d0
+ vmlal.s32 q5,d12,d8
+ add r2,sp,#672
+ vst1.8 {d16-d17},[r2,: 128]
+ vmull.s32 q4,d18,d8
+ vmlal.s32 q4,d26,d2
+ vmlal.s32 q4,d19,d7
+ vmlal.s32 q4,d27,d1
+ vmlal.s32 q4,d22,d6
+ vmlal.s32 q4,d28,d0
+ vmull.s32 q8,d18,d7
+ vmlal.s32 q8,d26,d1
+ vmlal.s32 q8,d19,d6
+ vmlal.s32 q8,d27,d0
+ add r2,sp,#576
+ vld1.8 {d20-d21},[r2,: 128]
+ vmlal.s32 q7,d24,d21
+ vmlal.s32 q7,d25,d20
+ vmlal.s32 q4,d23,d21
+ vmlal.s32 q4,d29,d20
+ vmlal.s32 q8,d22,d21
+ vmlal.s32 q8,d28,d20
+ vmlal.s32 q5,d24,d20
+ add r2,sp,#576
+ vst1.8 {d14-d15},[r2,: 128]
+ vmull.s32 q7,d18,d6
+ vmlal.s32 q7,d26,d0
+ add r2,sp,#656
+ vld1.8 {d30-d31},[r2,: 128]
+ vmlal.s32 q2,d30,d21
+ vmlal.s32 q7,d19,d21
+ vmlal.s32 q7,d27,d20
+ add r2,sp,#624
+ vld1.8 {d26-d27},[r2,: 128]
+ vmlal.s32 q4,d25,d27
+ vmlal.s32 q8,d29,d27
+ vmlal.s32 q8,d25,d26
+ vmlal.s32 q7,d28,d27
+ vmlal.s32 q7,d29,d26
+ add r2,sp,#608
+ vld1.8 {d28-d29},[r2,: 128]
+ vmlal.s32 q4,d24,d29
+ vmlal.s32 q8,d23,d29
+ vmlal.s32 q8,d24,d28
+ vmlal.s32 q7,d22,d29
+ vmlal.s32 q7,d23,d28
+ add r2,sp,#608
+ vst1.8 {d8-d9},[r2,: 128]
+ add r2,sp,#560
+ vld1.8 {d8-d9},[r2,: 128]
+ vmlal.s32 q7,d24,d9
+ vmlal.s32 q7,d25,d31
+ vmull.s32 q1,d18,d2
+ vmlal.s32 q1,d19,d1
+ vmlal.s32 q1,d22,d0
+ vmlal.s32 q1,d24,d27
+ vmlal.s32 q1,d23,d20
+ vmlal.s32 q1,d12,d7
+ vmlal.s32 q1,d13,d6
+ vmull.s32 q6,d18,d1
+ vmlal.s32 q6,d19,d0
+ vmlal.s32 q6,d23,d27
+ vmlal.s32 q6,d22,d20
+ vmlal.s32 q6,d24,d26
+ vmull.s32 q0,d18,d0
+ vmlal.s32 q0,d22,d27
+ vmlal.s32 q0,d23,d26
+ vmlal.s32 q0,d24,d31
+ vmlal.s32 q0,d19,d20
+ add r2,sp,#640
+ vld1.8 {d18-d19},[r2,: 128]
+ vmlal.s32 q2,d18,d7
+ vmlal.s32 q2,d19,d6
+ vmlal.s32 q5,d18,d6
+ vmlal.s32 q5,d19,d21
+ vmlal.s32 q1,d18,d21
+ vmlal.s32 q1,d19,d29
+ vmlal.s32 q0,d18,d28
+ vmlal.s32 q0,d19,d9
+ vmlal.s32 q6,d18,d29
+ vmlal.s32 q6,d19,d28
+ add r2,sp,#592
+ vld1.8 {d18-d19},[r2,: 128]
+ add r2,sp,#512
+ vld1.8 {d22-d23},[r2,: 128]
+ vmlal.s32 q5,d19,d7
+ vmlal.s32 q0,d18,d21
+ vmlal.s32 q0,d19,d29
+ vmlal.s32 q6,d18,d6
+ add r2,sp,#528
+ vld1.8 {d6-d7},[r2,: 128]
+ vmlal.s32 q6,d19,d21
+ add r2,sp,#576
+ vld1.8 {d18-d19},[r2,: 128]
+ vmlal.s32 q0,d30,d8
+ add r2,sp,#672
+ vld1.8 {d20-d21},[r2,: 128]
+ vmlal.s32 q5,d30,d29
+ add r2,sp,#608
+ vld1.8 {d24-d25},[r2,: 128]
+ vmlal.s32 q1,d30,d28
+ vadd.i64 q13,q0,q11
+ vadd.i64 q14,q5,q11
+ vmlal.s32 q6,d30,d9
+ vshr.s64 q4,q13,#26
+ vshr.s64 q13,q14,#26
+ vadd.i64 q7,q7,q4
+ vshl.i64 q4,q4,#26
+ vadd.i64 q14,q7,q3
+ vadd.i64 q9,q9,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q15,q9,q3
+ vsub.i64 q0,q0,q4
+ vshr.s64 q4,q14,#25
+ vsub.i64 q5,q5,q13
+ vshr.s64 q13,q15,#25
+ vadd.i64 q6,q6,q4
+ vshl.i64 q4,q4,#25
+ vadd.i64 q14,q6,q11
+ vadd.i64 q2,q2,q13
+ vsub.i64 q4,q7,q4
+ vshr.s64 q7,q14,#26
+ vshl.i64 q13,q13,#25
+ vadd.i64 q14,q2,q11
+ vadd.i64 q8,q8,q7
+ vshl.i64 q7,q7,#26
+ vadd.i64 q15,q8,q3
+ vsub.i64 q9,q9,q13
+ vshr.s64 q13,q14,#26
+ vsub.i64 q6,q6,q7
+ vshr.s64 q7,q15,#25
+ vadd.i64 q10,q10,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q14,q10,q3
+ vadd.i64 q1,q1,q7
+ add r2,r3,#288
+ vshl.i64 q7,q7,#25
+ add r4,r3,#96
+ vadd.i64 q15,q1,q11
+ add r2,r2,#8
+ vsub.i64 q2,q2,q13
+ add r4,r4,#8
+ vshr.s64 q13,q14,#25
+ vsub.i64 q7,q8,q7
+ vshr.s64 q8,q15,#26
+ vadd.i64 q14,q13,q13
+ vadd.i64 q12,q12,q8
+ vtrn.32 d12,d14
+ vshl.i64 q8,q8,#26
+ vtrn.32 d13,d15
+ vadd.i64 q3,q12,q3
+ vadd.i64 q0,q0,q14
+ vst1.8 d12,[r2,: 64]!
+ vshl.i64 q7,q13,#4
+ vst1.8 d13,[r4,: 64]!
+ vsub.i64 q1,q1,q8
+ vshr.s64 q3,q3,#25
+ vadd.i64 q0,q0,q7
+ vadd.i64 q5,q5,q3
+ vshl.i64 q3,q3,#25
+ vadd.i64 q6,q5,q11
+ vadd.i64 q0,q0,q13
+ vshl.i64 q7,q13,#25
+ vadd.i64 q8,q0,q11
+ vsub.i64 q3,q12,q3
+ vshr.s64 q6,q6,#26
+ vsub.i64 q7,q10,q7
+ vtrn.32 d2,d6
+ vshr.s64 q8,q8,#26
+ vtrn.32 d3,d7
+ vadd.i64 q3,q9,q6
+ vst1.8 d2,[r2,: 64]
+ vshl.i64 q6,q6,#26
+ vst1.8 d3,[r4,: 64]
+ vadd.i64 q1,q4,q8
+ vtrn.32 d4,d14
+ vshl.i64 q4,q8,#26
+ vtrn.32 d5,d15
+ vsub.i64 q5,q5,q6
+ add r2,r2,#16
+ vsub.i64 q0,q0,q4
+ vst1.8 d4,[r2,: 64]
+ add r4,r4,#16
+ vst1.8 d5,[r4,: 64]
+ vtrn.32 d10,d6
+ vtrn.32 d11,d7
+ sub r2,r2,#8
+ sub r4,r4,#8
+ vtrn.32 d0,d2
+ vtrn.32 d1,d3
+ vst1.8 d10,[r2,: 64]
+ vst1.8 d11,[r4,: 64]
+ sub r2,r2,#24
+ sub r4,r4,#24
+ vst1.8 d0,[r2,: 64]
+ vst1.8 d1,[r4,: 64]
+ add r2,sp,#544
+ add r4,r3,#144
+ add r5,r3,#192
+ vld1.8 {d0-d1},[r2,: 128]
+ vld1.8 {d2-d3},[r4,: 128]!
+ vld1.8 {d4-d5},[r5,: 128]!
+ vzip.i32 q1,q2
+ vld1.8 {d6-d7},[r4,: 128]!
+ vld1.8 {d8-d9},[r5,: 128]!
+ vshl.i32 q5,q1,#1
+ vzip.i32 q3,q4
+ vshl.i32 q6,q2,#1
+ vld1.8 {d14},[r4,: 64]
+ vshl.i32 q8,q3,#1
+ vld1.8 {d15},[r5,: 64]
+ vshl.i32 q9,q4,#1
+ vmul.i32 d21,d7,d1
+ vtrn.32 d14,d15
+ vmul.i32 q11,q4,q0
+ vmul.i32 q0,q7,q0
+ vmull.s32 q12,d2,d2
+ vmlal.s32 q12,d11,d1
+ vmlal.s32 q12,d12,d0
+ vmlal.s32 q12,d13,d23
+ vmlal.s32 q12,d16,d22
+ vmlal.s32 q12,d7,d21
+ vmull.s32 q10,d2,d11
+ vmlal.s32 q10,d4,d1
+ vmlal.s32 q10,d13,d0
+ vmlal.s32 q10,d6,d23
+ vmlal.s32 q10,d17,d22
+ vmull.s32 q13,d10,d4
+ vmlal.s32 q13,d11,d3
+ vmlal.s32 q13,d13,d1
+ vmlal.s32 q13,d16,d0
+ vmlal.s32 q13,d17,d23
+ vmlal.s32 q13,d8,d22
+ vmull.s32 q1,d10,d5
+ vmlal.s32 q1,d11,d4
+ vmlal.s32 q1,d6,d1
+ vmlal.s32 q1,d17,d0
+ vmlal.s32 q1,d8,d23
+ vmull.s32 q14,d10,d6
+ vmlal.s32 q14,d11,d13
+ vmlal.s32 q14,d4,d4
+ vmlal.s32 q14,d17,d1
+ vmlal.s32 q14,d18,d0
+ vmlal.s32 q14,d9,d23
+ vmull.s32 q11,d10,d7
+ vmlal.s32 q11,d11,d6
+ vmlal.s32 q11,d12,d5
+ vmlal.s32 q11,d8,d1
+ vmlal.s32 q11,d19,d0
+ vmull.s32 q15,d10,d8
+ vmlal.s32 q15,d11,d17
+ vmlal.s32 q15,d12,d6
+ vmlal.s32 q15,d13,d5
+ vmlal.s32 q15,d19,d1
+ vmlal.s32 q15,d14,d0
+ vmull.s32 q2,d10,d9
+ vmlal.s32 q2,d11,d8
+ vmlal.s32 q2,d12,d7
+ vmlal.s32 q2,d13,d6
+ vmlal.s32 q2,d14,d1
+ vmull.s32 q0,d15,d1
+ vmlal.s32 q0,d10,d14
+ vmlal.s32 q0,d11,d19
+ vmlal.s32 q0,d12,d8
+ vmlal.s32 q0,d13,d17
+ vmlal.s32 q0,d6,d6
+ add r2,sp,#512
+ vld1.8 {d18-d19},[r2,: 128]
+ vmull.s32 q3,d16,d7
+ vmlal.s32 q3,d10,d15
+ vmlal.s32 q3,d11,d14
+ vmlal.s32 q3,d12,d9
+ vmlal.s32 q3,d13,d8
+ add r2,sp,#528
+ vld1.8 {d8-d9},[r2,: 128]
+ vadd.i64 q5,q12,q9
+ vadd.i64 q6,q15,q9
+ vshr.s64 q5,q5,#26
+ vshr.s64 q6,q6,#26
+ vadd.i64 q7,q10,q5
+ vshl.i64 q5,q5,#26
+ vadd.i64 q8,q7,q4
+ vadd.i64 q2,q2,q6
+ vshl.i64 q6,q6,#26
+ vadd.i64 q10,q2,q4
+ vsub.i64 q5,q12,q5
+ vshr.s64 q8,q8,#25
+ vsub.i64 q6,q15,q6
+ vshr.s64 q10,q10,#25
+ vadd.i64 q12,q13,q8
+ vshl.i64 q8,q8,#25
+ vadd.i64 q13,q12,q9
+ vadd.i64 q0,q0,q10
+ vsub.i64 q7,q7,q8
+ vshr.s64 q8,q13,#26
+ vshl.i64 q10,q10,#25
+ vadd.i64 q13,q0,q9
+ vadd.i64 q1,q1,q8
+ vshl.i64 q8,q8,#26
+ vadd.i64 q15,q1,q4
+ vsub.i64 q2,q2,q10
+ vshr.s64 q10,q13,#26
+ vsub.i64 q8,q12,q8
+ vshr.s64 q12,q15,#25
+ vadd.i64 q3,q3,q10
+ vshl.i64 q10,q10,#26
+ vadd.i64 q13,q3,q4
+ vadd.i64 q14,q14,q12
+ add r2,r3,#144
+ vshl.i64 q12,q12,#25
+ add r4,r3,#192
+ vadd.i64 q15,q14,q9
+ add r2,r2,#8
+ vsub.i64 q0,q0,q10
+ add r4,r4,#8
+ vshr.s64 q10,q13,#25
+ vsub.i64 q1,q1,q12
+ vshr.s64 q12,q15,#26
+ vadd.i64 q13,q10,q10
+ vadd.i64 q11,q11,q12
+ vtrn.32 d16,d2
+ vshl.i64 q12,q12,#26
+ vtrn.32 d17,d3
+ vadd.i64 q1,q11,q4
+ vadd.i64 q4,q5,q13
+ vst1.8 d16,[r2,: 64]!
+ vshl.i64 q5,q10,#4
+ vst1.8 d17,[r4,: 64]!
+ vsub.i64 q8,q14,q12
+ vshr.s64 q1,q1,#25
+ vadd.i64 q4,q4,q5
+ vadd.i64 q5,q6,q1
+ vshl.i64 q1,q1,#25
+ vadd.i64 q6,q5,q9
+ vadd.i64 q4,q4,q10
+ vshl.i64 q10,q10,#25
+ vadd.i64 q9,q4,q9
+ vsub.i64 q1,q11,q1
+ vshr.s64 q6,q6,#26
+ vsub.i64 q3,q3,q10
+ vtrn.32 d16,d2
+ vshr.s64 q9,q9,#26
+ vtrn.32 d17,d3
+ vadd.i64 q1,q2,q6
+ vst1.8 d16,[r2,: 64]
+ vshl.i64 q2,q6,#26
+ vst1.8 d17,[r4,: 64]
+ vadd.i64 q6,q7,q9
+ vtrn.32 d0,d6
+ vshl.i64 q7,q9,#26
+ vtrn.32 d1,d7
+ vsub.i64 q2,q5,q2
+ add r2,r2,#16
+ vsub.i64 q3,q4,q7
+ vst1.8 d0,[r2,: 64]
+ add r4,r4,#16
+ vst1.8 d1,[r4,: 64]
+ vtrn.32 d4,d2
+ vtrn.32 d5,d3
+ sub r2,r2,#8
+ sub r4,r4,#8
+ vtrn.32 d6,d12
+ vtrn.32 d7,d13
+ vst1.8 d4,[r2,: 64]
+ vst1.8 d5,[r4,: 64]
+ sub r2,r2,#24
+ sub r4,r4,#24
+ vst1.8 d6,[r2,: 64]
+ vst1.8 d7,[r4,: 64]
+ add r2,r3,#336
+ add r4,r3,#288
+ vld1.8 {d0-d1},[r2,: 128]!
+ vld1.8 {d2-d3},[r4,: 128]!
+ vadd.i32 q0,q0,q1
+ vld1.8 {d2-d3},[r2,: 128]!
+ vld1.8 {d4-d5},[r4,: 128]!
+ vadd.i32 q1,q1,q2
+ add r5,r3,#288
+ vld1.8 {d4},[r2,: 64]
+ vld1.8 {d6},[r4,: 64]
+ vadd.i32 q2,q2,q3
+ vst1.8 {d0-d1},[r5,: 128]!
+ vst1.8 {d2-d3},[r5,: 128]!
+ vst1.8 d4,[r5,: 64]
+ add r2,r3,#48
+ add r4,r3,#144
+ vld1.8 {d0-d1},[r4,: 128]!
+ vld1.8 {d2-d3},[r4,: 128]!
+ vld1.8 {d4},[r4,: 64]
+ add r4,r3,#288
+ vld1.8 {d6-d7},[r4,: 128]!
+ vtrn.32 q0,q3
+ vld1.8 {d8-d9},[r4,: 128]!
+ vshl.i32 q5,q0,#4
+ vtrn.32 q1,q4
+ vshl.i32 q6,q3,#4
+ vadd.i32 q5,q5,q0
+ vadd.i32 q6,q6,q3
+ vshl.i32 q7,q1,#4
+ vld1.8 {d5},[r4,: 64]
+ vshl.i32 q8,q4,#4
+ vtrn.32 d4,d5
+ vadd.i32 q7,q7,q1
+ vadd.i32 q8,q8,q4
+ vld1.8 {d18-d19},[r2,: 128]!
+ vshl.i32 q10,q2,#4
+ vld1.8 {d22-d23},[r2,: 128]!
+ vadd.i32 q10,q10,q2
+ vld1.8 {d24},[r2,: 64]
+ vadd.i32 q5,q5,q0
+ add r2,r3,#240
+ vld1.8 {d26-d27},[r2,: 128]!
+ vadd.i32 q6,q6,q3
+ vld1.8 {d28-d29},[r2,: 128]!
+ vadd.i32 q8,q8,q4
+ vld1.8 {d25},[r2,: 64]
+ vadd.i32 q10,q10,q2
+ vtrn.32 q9,q13
+ vadd.i32 q7,q7,q1
+ vadd.i32 q5,q5,q0
+ vtrn.32 q11,q14
+ vadd.i32 q6,q6,q3
+ add r2,sp,#560
+ vadd.i32 q10,q10,q2
+ vtrn.32 d24,d25
+ vst1.8 {d12-d13},[r2,: 128]
+ vshl.i32 q6,q13,#1
+ add r2,sp,#576
+ vst1.8 {d20-d21},[r2,: 128]
+ vshl.i32 q10,q14,#1
+ add r2,sp,#592
+ vst1.8 {d12-d13},[r2,: 128]
+ vshl.i32 q15,q12,#1
+ vadd.i32 q8,q8,q4
+ vext.32 d10,d31,d30,#0
+ vadd.i32 q7,q7,q1
+ add r2,sp,#608
+ vst1.8 {d16-d17},[r2,: 128]
+ vmull.s32 q8,d18,d5
+ vmlal.s32 q8,d26,d4
+ vmlal.s32 q8,d19,d9
+ vmlal.s32 q8,d27,d3
+ vmlal.s32 q8,d22,d8
+ vmlal.s32 q8,d28,d2
+ vmlal.s32 q8,d23,d7
+ vmlal.s32 q8,d29,d1
+ vmlal.s32 q8,d24,d6
+ vmlal.s32 q8,d25,d0
+ add r2,sp,#624
+ vst1.8 {d14-d15},[r2,: 128]
+ vmull.s32 q2,d18,d4
+ vmlal.s32 q2,d12,d9
+ vmlal.s32 q2,d13,d8
+ vmlal.s32 q2,d19,d3
+ vmlal.s32 q2,d22,d2
+ vmlal.s32 q2,d23,d1
+ vmlal.s32 q2,d24,d0
+ add r2,sp,#640
+ vst1.8 {d20-d21},[r2,: 128]
+ vmull.s32 q7,d18,d9
+ vmlal.s32 q7,d26,d3
+ vmlal.s32 q7,d19,d8
+ vmlal.s32 q7,d27,d2
+ vmlal.s32 q7,d22,d7
+ vmlal.s32 q7,d28,d1
+ vmlal.s32 q7,d23,d6
+ vmlal.s32 q7,d29,d0
+ add r2,sp,#656
+ vst1.8 {d10-d11},[r2,: 128]
+ vmull.s32 q5,d18,d3
+ vmlal.s32 q5,d19,d2
+ vmlal.s32 q5,d22,d1
+ vmlal.s32 q5,d23,d0
+ vmlal.s32 q5,d12,d8
+ add r2,sp,#672
+ vst1.8 {d16-d17},[r2,: 128]
+ vmull.s32 q4,d18,d8
+ vmlal.s32 q4,d26,d2
+ vmlal.s32 q4,d19,d7
+ vmlal.s32 q4,d27,d1
+ vmlal.s32 q4,d22,d6
+ vmlal.s32 q4,d28,d0
+ vmull.s32 q8,d18,d7
+ vmlal.s32 q8,d26,d1
+ vmlal.s32 q8,d19,d6
+ vmlal.s32 q8,d27,d0
+ add r2,sp,#576
+ vld1.8 {d20-d21},[r2,: 128]
+ vmlal.s32 q7,d24,d21
+ vmlal.s32 q7,d25,d20
+ vmlal.s32 q4,d23,d21
+ vmlal.s32 q4,d29,d20
+ vmlal.s32 q8,d22,d21
+ vmlal.s32 q8,d28,d20
+ vmlal.s32 q5,d24,d20
+ add r2,sp,#576
+ vst1.8 {d14-d15},[r2,: 128]
+ vmull.s32 q7,d18,d6
+ vmlal.s32 q7,d26,d0
+ add r2,sp,#656
+ vld1.8 {d30-d31},[r2,: 128]
+ vmlal.s32 q2,d30,d21
+ vmlal.s32 q7,d19,d21
+ vmlal.s32 q7,d27,d20
+ add r2,sp,#624
+ vld1.8 {d26-d27},[r2,: 128]
+ vmlal.s32 q4,d25,d27
+ vmlal.s32 q8,d29,d27
+ vmlal.s32 q8,d25,d26
+ vmlal.s32 q7,d28,d27
+ vmlal.s32 q7,d29,d26
+ add r2,sp,#608
+ vld1.8 {d28-d29},[r2,: 128]
+ vmlal.s32 q4,d24,d29
+ vmlal.s32 q8,d23,d29
+ vmlal.s32 q8,d24,d28
+ vmlal.s32 q7,d22,d29
+ vmlal.s32 q7,d23,d28
+ add r2,sp,#608
+ vst1.8 {d8-d9},[r2,: 128]
+ add r2,sp,#560
+ vld1.8 {d8-d9},[r2,: 128]
+ vmlal.s32 q7,d24,d9
+ vmlal.s32 q7,d25,d31
+ vmull.s32 q1,d18,d2
+ vmlal.s32 q1,d19,d1
+ vmlal.s32 q1,d22,d0
+ vmlal.s32 q1,d24,d27
+ vmlal.s32 q1,d23,d20
+ vmlal.s32 q1,d12,d7
+ vmlal.s32 q1,d13,d6
+ vmull.s32 q6,d18,d1
+ vmlal.s32 q6,d19,d0
+ vmlal.s32 q6,d23,d27
+ vmlal.s32 q6,d22,d20
+ vmlal.s32 q6,d24,d26
+ vmull.s32 q0,d18,d0
+ vmlal.s32 q0,d22,d27
+ vmlal.s32 q0,d23,d26
+ vmlal.s32 q0,d24,d31
+ vmlal.s32 q0,d19,d20
+ add r2,sp,#640
+ vld1.8 {d18-d19},[r2,: 128]
+ vmlal.s32 q2,d18,d7
+ vmlal.s32 q2,d19,d6
+ vmlal.s32 q5,d18,d6
+ vmlal.s32 q5,d19,d21
+ vmlal.s32 q1,d18,d21
+ vmlal.s32 q1,d19,d29
+ vmlal.s32 q0,d18,d28
+ vmlal.s32 q0,d19,d9
+ vmlal.s32 q6,d18,d29
+ vmlal.s32 q6,d19,d28
+ add r2,sp,#592
+ vld1.8 {d18-d19},[r2,: 128]
+ add r2,sp,#512
+ vld1.8 {d22-d23},[r2,: 128]
+ vmlal.s32 q5,d19,d7
+ vmlal.s32 q0,d18,d21
+ vmlal.s32 q0,d19,d29
+ vmlal.s32 q6,d18,d6
+ add r2,sp,#528
+ vld1.8 {d6-d7},[r2,: 128]
+ vmlal.s32 q6,d19,d21
+ add r2,sp,#576
+ vld1.8 {d18-d19},[r2,: 128]
+ vmlal.s32 q0,d30,d8
+ add r2,sp,#672
+ vld1.8 {d20-d21},[r2,: 128]
+ vmlal.s32 q5,d30,d29
+ add r2,sp,#608
+ vld1.8 {d24-d25},[r2,: 128]
+ vmlal.s32 q1,d30,d28
+ vadd.i64 q13,q0,q11
+ vadd.i64 q14,q5,q11
+ vmlal.s32 q6,d30,d9
+ vshr.s64 q4,q13,#26
+ vshr.s64 q13,q14,#26
+ vadd.i64 q7,q7,q4
+ vshl.i64 q4,q4,#26
+ vadd.i64 q14,q7,q3
+ vadd.i64 q9,q9,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q15,q9,q3
+ vsub.i64 q0,q0,q4
+ vshr.s64 q4,q14,#25
+ vsub.i64 q5,q5,q13
+ vshr.s64 q13,q15,#25
+ vadd.i64 q6,q6,q4
+ vshl.i64 q4,q4,#25
+ vadd.i64 q14,q6,q11
+ vadd.i64 q2,q2,q13
+ vsub.i64 q4,q7,q4
+ vshr.s64 q7,q14,#26
+ vshl.i64 q13,q13,#25
+ vadd.i64 q14,q2,q11
+ vadd.i64 q8,q8,q7
+ vshl.i64 q7,q7,#26
+ vadd.i64 q15,q8,q3
+ vsub.i64 q9,q9,q13
+ vshr.s64 q13,q14,#26
+ vsub.i64 q6,q6,q7
+ vshr.s64 q7,q15,#25
+ vadd.i64 q10,q10,q13
+ vshl.i64 q13,q13,#26
+ vadd.i64 q14,q10,q3
+ vadd.i64 q1,q1,q7
+ add r2,r3,#240
+ vshl.i64 q7,q7,#25
+ add r4,r3,#144
+ vadd.i64 q15,q1,q11
+ add r2,r2,#8
+ vsub.i64 q2,q2,q13
+ add r4,r4,#8
+ vshr.s64 q13,q14,#25
+ vsub.i64 q7,q8,q7
+ vshr.s64 q8,q15,#26
+ vadd.i64 q14,q13,q13
+ vadd.i64 q12,q12,q8
+ vtrn.32 d12,d14
+ vshl.i64 q8,q8,#26
+ vtrn.32 d13,d15
+ vadd.i64 q3,q12,q3
+ vadd.i64 q0,q0,q14
+ vst1.8 d12,[r2,: 64]!
+ vshl.i64 q7,q13,#4
+ vst1.8 d13,[r4,: 64]!
+ vsub.i64 q1,q1,q8
+ vshr.s64 q3,q3,#25
+ vadd.i64 q0,q0,q7
+ vadd.i64 q5,q5,q3
+ vshl.i64 q3,q3,#25
+ vadd.i64 q6,q5,q11
+ vadd.i64 q0,q0,q13
+ vshl.i64 q7,q13,#25
+ vadd.i64 q8,q0,q11
+ vsub.i64 q3,q12,q3
+ vshr.s64 q6,q6,#26
+ vsub.i64 q7,q10,q7
+ vtrn.32 d2,d6
+ vshr.s64 q8,q8,#26
+ vtrn.32 d3,d7
+ vadd.i64 q3,q9,q6
+ vst1.8 d2,[r2,: 64]
+ vshl.i64 q6,q6,#26
+ vst1.8 d3,[r4,: 64]
+ vadd.i64 q1,q4,q8
+ vtrn.32 d4,d14
+ vshl.i64 q4,q8,#26
+ vtrn.32 d5,d15
+ vsub.i64 q5,q5,q6
+ add r2,r2,#16
+ vsub.i64 q0,q0,q4
+ vst1.8 d4,[r2,: 64]
+ add r4,r4,#16
+ vst1.8 d5,[r4,: 64]
+ vtrn.32 d10,d6
+ vtrn.32 d11,d7
+ sub r2,r2,#8
+ sub r4,r4,#8
+ vtrn.32 d0,d2
+ vtrn.32 d1,d3
+ vst1.8 d10,[r2,: 64]
+ vst1.8 d11,[r4,: 64]
+ sub r2,r2,#24
+ sub r4,r4,#24
+ vst1.8 d0,[r2,: 64]
+ vst1.8 d1,[r4,: 64]
+ ldr r2,[sp,#488]
+ ldr r4,[sp,#492]
+ subs r5,r2,#1
+ bge .Lmainloop
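+	@ End of the Montgomery ladder: copy the element at r3+144 to
+	@ r3+336 and invert it.  .Linvertloop below drives a fixed
+	@ square-and-multiply addition chain computing z^(2^255 - 21),
+	@ i.e. 1/z mod 2^255 - 19; r5 holds the number of squarings for
+	@ each of the twelve passes (2, 1, 1, 5, 10, 20, 10, 50, 100,
+	@ 50, 5, 0).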
+ add r1,r3,#144
+ add r2,r3,#336
+ vld1.8 {d0-d1},[r1,: 128]!
+ vld1.8 {d2-d3},[r1,: 128]!
+ vld1.8 {d4},[r1,: 64]
+ vst1.8 {d0-d1},[r2,: 128]!
+ vst1.8 {d2-d3},[r2,: 128]!
+ vst1.8 d4,[r2,: 64]
+ ldr r1,=0
+ .Linvertloop:
+ add r2,r3,#144
+ ldr r4,=0
+ ldr r5,=2
+ cmp r1,#1
+ ldreq r5,=1
+ addeq r2,r3,#336
+ addeq r4,r3,#48
+ cmp r1,#2
+ ldreq r5,=1
+ addeq r2,r3,#48
+ cmp r1,#3
+ ldreq r5,=5
+ addeq r4,r3,#336
+ cmp r1,#4
+ ldreq r5,=10
+ cmp r1,#5
+ ldreq r5,=20
+ cmp r1,#6
+ ldreq r5,=10
+ addeq r2,r3,#336
+ addeq r4,r3,#336
+ cmp r1,#7
+ ldreq r5,=50
+ cmp r1,#8
+ ldreq r5,=100
+ cmp r1,#9
+ ldreq r5,=50
+ addeq r2,r3,#336
+ cmp r1,#10
+ ldreq r5,=5
+ addeq r2,r3,#48
+ cmp r1,#11
+ ldreq r5,=0
+ addeq r2,r3,#96
+ add r6,r3,#144
+ add r7,r3,#288
+ vld1.8 {d0-d1},[r6,: 128]!
+ vld1.8 {d2-d3},[r6,: 128]!
+ vld1.8 {d4},[r6,: 64]
+ vst1.8 {d0-d1},[r7,: 128]!
+ vst1.8 {d2-d3},[r7,: 128]!
+ vst1.8 d4,[r7,: 64]
+ cmp r5,#0
+ beq .Lskipsquaringloop
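+	@ Each pass through .Lsquaringloop squares the element at r3+288
+	@ in place; r5 counts the remaining squarings.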
+ .Lsquaringloop:
+ add r6,r3,#288
+ add r7,r3,#288
+ add r8,r3,#288
+ vmov.i32 q0,#19
+ vmov.i32 q1,#0
+ vmov.i32 q2,#1
+ vzip.i32 q1,q2
+ vld1.8 {d4-d5},[r7,: 128]!
+ vld1.8 {d6-d7},[r7,: 128]!
+ vld1.8 {d9},[r7,: 64]
+ vld1.8 {d10-d11},[r6,: 128]!
+ add r7,sp,#416
+ vld1.8 {d12-d13},[r6,: 128]!
+ vmul.i32 q7,q2,q0
+ vld1.8 {d8},[r6,: 64]
+ vext.32 d17,d11,d10,#1
+ vmul.i32 q9,q3,q0
+ vext.32 d16,d10,d8,#1
+ vshl.u32 q10,q5,q1
+ vext.32 d22,d14,d4,#1
+ vext.32 d24,d18,d6,#1
+ vshl.u32 q13,q6,q1
+ vshl.u32 d28,d8,d2
+ vrev64.i32 d22,d22
+ vmul.i32 d1,d9,d1
+ vrev64.i32 d24,d24
+ vext.32 d29,d8,d13,#1
+ vext.32 d0,d1,d9,#1
+ vrev64.i32 d0,d0
+ vext.32 d2,d9,d1,#1
+ vext.32 d23,d15,d5,#1
+ vmull.s32 q4,d20,d4
+ vrev64.i32 d23,d23
+ vmlal.s32 q4,d21,d1
+ vrev64.i32 d2,d2
+ vmlal.s32 q4,d26,d19
+ vext.32 d3,d5,d15,#1
+ vmlal.s32 q4,d27,d18
+ vrev64.i32 d3,d3
+ vmlal.s32 q4,d28,d15
+ vext.32 d14,d12,d11,#1
+ vmull.s32 q5,d16,d23
+ vext.32 d15,d13,d12,#1
+ vmlal.s32 q5,d17,d4
+ vst1.8 d8,[r7,: 64]!
+ vmlal.s32 q5,d14,d1
+ vext.32 d12,d9,d8,#0
+ vmlal.s32 q5,d15,d19
+ vmov.i64 d13,#0
+ vmlal.s32 q5,d29,d18
+ vext.32 d25,d19,d7,#1
+ vmlal.s32 q6,d20,d5
+ vrev64.i32 d25,d25
+ vmlal.s32 q6,d21,d4
+ vst1.8 d11,[r7,: 64]!
+ vmlal.s32 q6,d26,d1
+ vext.32 d9,d10,d10,#0
+ vmlal.s32 q6,d27,d19
+ vmov.i64 d8,#0
+ vmlal.s32 q6,d28,d18
+ vmlal.s32 q4,d16,d24
+ vmlal.s32 q4,d17,d5
+ vmlal.s32 q4,d14,d4
+ vst1.8 d12,[r7,: 64]!
+ vmlal.s32 q4,d15,d1
+ vext.32 d10,d13,d12,#0
+ vmlal.s32 q4,d29,d19
+ vmov.i64 d11,#0
+ vmlal.s32 q5,d20,d6
+ vmlal.s32 q5,d21,d5
+ vmlal.s32 q5,d26,d4
+ vext.32 d13,d8,d8,#0
+ vmlal.s32 q5,d27,d1
+ vmov.i64 d12,#0
+ vmlal.s32 q5,d28,d19
+ vst1.8 d9,[r7,: 64]!
+ vmlal.s32 q6,d16,d25
+ vmlal.s32 q6,d17,d6
+ vst1.8 d10,[r7,: 64]
+ vmlal.s32 q6,d14,d5
+ vext.32 d8,d11,d10,#0
+ vmlal.s32 q6,d15,d4
+ vmov.i64 d9,#0
+ vmlal.s32 q6,d29,d1
+ vmlal.s32 q4,d20,d7
+ vmlal.s32 q4,d21,d6
+ vmlal.s32 q4,d26,d5
+ vext.32 d11,d12,d12,#0
+ vmlal.s32 q4,d27,d4
+ vmov.i64 d10,#0
+ vmlal.s32 q4,d28,d1
+ vmlal.s32 q5,d16,d0
+ sub r6,r7,#32
+ vmlal.s32 q5,d17,d7
+ vmlal.s32 q5,d14,d6
+ vext.32 d30,d9,d8,#0
+ vmlal.s32 q5,d15,d5
+ vld1.8 {d31},[r6,: 64]!
+ vmlal.s32 q5,d29,d4
+ vmlal.s32 q15,d20,d0
+ vext.32 d0,d6,d18,#1
+ vmlal.s32 q15,d21,d25
+ vrev64.i32 d0,d0
+ vmlal.s32 q15,d26,d24
+ vext.32 d1,d7,d19,#1
+ vext.32 d7,d10,d10,#0
+ vmlal.s32 q15,d27,d23
+ vrev64.i32 d1,d1
+ vld1.8 {d6},[r6,: 64]
+ vmlal.s32 q15,d28,d22
+ vmlal.s32 q3,d16,d4
+ add r6,r6,#24
+ vmlal.s32 q3,d17,d2
+ vext.32 d4,d31,d30,#0
+ vmov d17,d11
+ vmlal.s32 q3,d14,d1
+ vext.32 d11,d13,d13,#0
+ vext.32 d13,d30,d30,#0
+ vmlal.s32 q3,d15,d0
+ vext.32 d1,d8,d8,#0
+ vmlal.s32 q3,d29,d3
+ vld1.8 {d5},[r6,: 64]
+ sub r6,r6,#16
+ vext.32 d10,d6,d6,#0
+ vmov.i32 q1,#0xffffffff
+ vshl.i64 q4,q1,#25
+ add r7,sp,#512
+ vld1.8 {d14-d15},[r7,: 128]
+ vadd.i64 q9,q2,q7
+ vshl.i64 q1,q1,#26
+ vshr.s64 q10,q9,#26
+ vld1.8 {d0},[r6,: 64]!
+ vadd.i64 q5,q5,q10
+ vand q9,q9,q1
+ vld1.8 {d16},[r6,: 64]!
+ add r6,sp,#528
+ vld1.8 {d20-d21},[r6,: 128]
+ vadd.i64 q11,q5,q10
+ vsub.i64 q2,q2,q9
+ vshr.s64 q9,q11,#25
+ vext.32 d12,d5,d4,#0
+ vand q11,q11,q4
+ vadd.i64 q0,q0,q9
+ vmov d19,d7
+ vadd.i64 q3,q0,q7
+ vsub.i64 q5,q5,q11
+ vshr.s64 q11,q3,#26
+ vext.32 d18,d11,d10,#0
+ vand q3,q3,q1
+ vadd.i64 q8,q8,q11
+ vadd.i64 q11,q8,q10
+ vsub.i64 q0,q0,q3
+ vshr.s64 q3,q11,#25
+ vand q11,q11,q4
+ vadd.i64 q3,q6,q3
+ vadd.i64 q6,q3,q7
+ vsub.i64 q8,q8,q11
+ vshr.s64 q11,q6,#26
+ vand q6,q6,q1
+ vadd.i64 q9,q9,q11
+ vadd.i64 d25,d19,d21
+ vsub.i64 q3,q3,q6
+ vshr.s64 d23,d25,#25
+ vand q4,q12,q4
+ vadd.i64 d21,d23,d23
+ vshl.i64 d25,d23,#4
+ vadd.i64 d21,d21,d23
+ vadd.i64 d25,d25,d21
+ vadd.i64 d4,d4,d25
+ vzip.i32 q0,q8
+ vadd.i64 d12,d4,d14
+ add r6,r8,#8
+ vst1.8 d0,[r6,: 64]
+ vsub.i64 d19,d19,d9
+ add r6,r6,#16
+ vst1.8 d16,[r6,: 64]
+ vshr.s64 d22,d12,#26
+ vand q0,q6,q1
+ vadd.i64 d10,d10,d22
+ vzip.i32 q3,q9
+ vsub.i64 d4,d4,d0
+ sub r6,r6,#8
+ vst1.8 d6,[r6,: 64]
+ add r6,r6,#16
+ vst1.8 d18,[r6,: 64]
+ vzip.i32 q2,q5
+ sub r6,r6,#32
+ vst1.8 d4,[r6,: 64]
+ subs r5,r5,#1
+ bhi .Lsquaringloop
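+	@ Multiply step of the chain: the element at r3+288 is multiplied
+	@ by the operand selected in .Linvertloop (pointer r2), with the
+	@ product written back to r3+144.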
+ .Lskipsquaringloop:
+	mov r2,r2 @ no-op
+ add r5,r3,#288
+ add r6,r3,#144
+ vmov.i32 q0,#19
+ vmov.i32 q1,#0
+ vmov.i32 q2,#1
+ vzip.i32 q1,q2
+ vld1.8 {d4-d5},[r5,: 128]!
+ vld1.8 {d6-d7},[r5,: 128]!
+ vld1.8 {d9},[r5,: 64]
+ vld1.8 {d10-d11},[r2,: 128]!
+ add r5,sp,#416
+ vld1.8 {d12-d13},[r2,: 128]!
+ vmul.i32 q7,q2,q0
+ vld1.8 {d8},[r2,: 64]
+ vext.32 d17,d11,d10,#1
+ vmul.i32 q9,q3,q0
+ vext.32 d16,d10,d8,#1
+ vshl.u32 q10,q5,q1
+ vext.32 d22,d14,d4,#1
+ vext.32 d24,d18,d6,#1
+ vshl.u32 q13,q6,q1
+ vshl.u32 d28,d8,d2
+ vrev64.i32 d22,d22
+ vmul.i32 d1,d9,d1
+ vrev64.i32 d24,d24
+ vext.32 d29,d8,d13,#1
+ vext.32 d0,d1,d9,#1
+ vrev64.i32 d0,d0
+ vext.32 d2,d9,d1,#1
+ vext.32 d23,d15,d5,#1
+ vmull.s32 q4,d20,d4
+ vrev64.i32 d23,d23
+ vmlal.s32 q4,d21,d1
+ vrev64.i32 d2,d2
+ vmlal.s32 q4,d26,d19
+ vext.32 d3,d5,d15,#1
+ vmlal.s32 q4,d27,d18
+ vrev64.i32 d3,d3
+ vmlal.s32 q4,d28,d15
+ vext.32 d14,d12,d11,#1
+ vmull.s32 q5,d16,d23
+ vext.32 d15,d13,d12,#1
+ vmlal.s32 q5,d17,d4
+ vst1.8 d8,[r5,: 64]!
+ vmlal.s32 q5,d14,d1
+ vext.32 d12,d9,d8,#0
+ vmlal.s32 q5,d15,d19
+ vmov.i64 d13,#0
+ vmlal.s32 q5,d29,d18
+ vext.32 d25,d19,d7,#1
+ vmlal.s32 q6,d20,d5
+ vrev64.i32 d25,d25
+ vmlal.s32 q6,d21,d4
+ vst1.8 d11,[r5,: 64]!
+ vmlal.s32 q6,d26,d1
+ vext.32 d9,d10,d10,#0
+ vmlal.s32 q6,d27,d19
+ vmov.i64 d8,#0
+ vmlal.s32 q6,d28,d18
+ vmlal.s32 q4,d16,d24
+ vmlal.s32 q4,d17,d5
+ vmlal.s32 q4,d14,d4
+ vst1.8 d12,[r5,: 64]!
+ vmlal.s32 q4,d15,d1
+ vext.32 d10,d13,d12,#0
+ vmlal.s32 q4,d29,d19
+ vmov.i64 d11,#0
+ vmlal.s32 q5,d20,d6
+ vmlal.s32 q5,d21,d5
+ vmlal.s32 q5,d26,d4
+ vext.32 d13,d8,d8,#0
+ vmlal.s32 q5,d27,d1
+ vmov.i64 d12,#0
+ vmlal.s32 q5,d28,d19
+ vst1.8 d9,[r5,: 64]!
+ vmlal.s32 q6,d16,d25
+ vmlal.s32 q6,d17,d6
+ vst1.8 d10,[r5,: 64]
+ vmlal.s32 q6,d14,d5
+ vext.32 d8,d11,d10,#0
+ vmlal.s32 q6,d15,d4
+ vmov.i64 d9,#0
+ vmlal.s32 q6,d29,d1
+ vmlal.s32 q4,d20,d7
+ vmlal.s32 q4,d21,d6
+ vmlal.s32 q4,d26,d5
+ vext.32 d11,d12,d12,#0
+ vmlal.s32 q4,d27,d4
+ vmov.i64 d10,#0
+ vmlal.s32 q4,d28,d1
+ vmlal.s32 q5,d16,d0
+ sub r2,r5,#32
+ vmlal.s32 q5,d17,d7
+ vmlal.s32 q5,d14,d6
+ vext.32 d30,d9,d8,#0
+ vmlal.s32 q5,d15,d5
+ vld1.8 {d31},[r2,: 64]!
+ vmlal.s32 q5,d29,d4
+ vmlal.s32 q15,d20,d0
+ vext.32 d0,d6,d18,#1
+ vmlal.s32 q15,d21,d25
+ vrev64.i32 d0,d0
+ vmlal.s32 q15,d26,d24
+ vext.32 d1,d7,d19,#1
+ vext.32 d7,d10,d10,#0
+ vmlal.s32 q15,d27,d23
+ vrev64.i32 d1,d1
+ vld1.8 {d6},[r2,: 64]
+ vmlal.s32 q15,d28,d22
+ vmlal.s32 q3,d16,d4
+ add r2,r2,#24
+ vmlal.s32 q3,d17,d2
+ vext.32 d4,d31,d30,#0
+ vmov d17,d11
+ vmlal.s32 q3,d14,d1
+ vext.32 d11,d13,d13,#0
+ vext.32 d13,d30,d30,#0
+ vmlal.s32 q3,d15,d0
+ vext.32 d1,d8,d8,#0
+ vmlal.s32 q3,d29,d3
+ vld1.8 {d5},[r2,: 64]
+ sub r2,r2,#16
+ vext.32 d10,d6,d6,#0
+ vmov.i32 q1,#0xffffffff
+ vshl.i64 q4,q1,#25
+ add r5,sp,#512
+ vld1.8 {d14-d15},[r5,: 128]
+ vadd.i64 q9,q2,q7
+ vshl.i64 q1,q1,#26
+ vshr.s64 q10,q9,#26
+ vld1.8 {d0},[r2,: 64]!
+ vadd.i64 q5,q5,q10
+ vand q9,q9,q1
+ vld1.8 {d16},[r2,: 64]!
+ add r2,sp,#528
+ vld1.8 {d20-d21},[r2,: 128]
+ vadd.i64 q11,q5,q10
+ vsub.i64 q2,q2,q9
+ vshr.s64 q9,q11,#25
+ vext.32 d12,d5,d4,#0
+ vand q11,q11,q4
+ vadd.i64 q0,q0,q9
+ vmov d19,d7
+ vadd.i64 q3,q0,q7
+ vsub.i64 q5,q5,q11
+ vshr.s64 q11,q3,#26
+ vext.32 d18,d11,d10,#0
+ vand q3,q3,q1
+ vadd.i64 q8,q8,q11
+ vadd.i64 q11,q8,q10
+ vsub.i64 q0,q0,q3
+ vshr.s64 q3,q11,#25
+ vand q11,q11,q4
+ vadd.i64 q3,q6,q3
+ vadd.i64 q6,q3,q7
+ vsub.i64 q8,q8,q11
+ vshr.s64 q11,q6,#26
+ vand q6,q6,q1
+ vadd.i64 q9,q9,q11
+ vadd.i64 d25,d19,d21
+ vsub.i64 q3,q3,q6
+ vshr.s64 d23,d25,#25
+ vand q4,q12,q4
+ vadd.i64 d21,d23,d23
+ vshl.i64 d25,d23,#4
+ vadd.i64 d21,d21,d23
+ vadd.i64 d25,d25,d21
+ vadd.i64 d4,d4,d25
+ vzip.i32 q0,q8
+ vadd.i64 d12,d4,d14
+ add r2,r6,#8
+ vst1.8 d0,[r2,: 64]
+ vsub.i64 d19,d19,d9
+ add r2,r2,#16
+ vst1.8 d16,[r2,: 64]
+ vshr.s64 d22,d12,#26
+ vand q0,q6,q1
+ vadd.i64 d10,d10,d22
+ vzip.i32 q3,q9
+ vsub.i64 d4,d4,d0
+ sub r2,r2,#8
+ vst1.8 d6,[r2,: 64]
+ add r2,r2,#16
+ vst1.8 d18,[r2,: 64]
+ vzip.i32 q2,q5
+ sub r2,r2,#32
+ vst1.8 d4,[r2,: 64]
+ cmp r4,#0
+ beq .Lskippostcopy
+ add r2,r3,#144
+	mov r4,r4 @ no-op
+ vld1.8 {d0-d1},[r2,: 128]!
+ vld1.8 {d2-d3},[r2,: 128]!
+ vld1.8 {d4},[r2,: 64]
+ vst1.8 {d0-d1},[r4,: 128]!
+ vst1.8 {d2-d3},[r4,: 128]!
+ vst1.8 d4,[r4,: 64]
+ .Lskippostcopy:
+ cmp r1,#1
+ bne .Lskipfinalcopy
+ add r2,r3,#288
+ add r4,r3,#144
+ vld1.8 {d0-d1},[r2,: 128]!
+ vld1.8 {d2-d3},[r2,: 128]!
+ vld1.8 {d4},[r2,: 64]
+ vst1.8 {d0-d1},[r4,: 128]!
+ vst1.8 {d2-d3},[r4,: 128]!
+ vst1.8 d4,[r4,: 64]
+ .Lskipfinalcopy:
+ add r1,r1,#1
+ cmp r1,#12
+ blo .Linvertloop
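+	@ Inversion done: load the ten 25/26-bit limbs of the result from
+	@ r3+144 into r2-r10 and r1, reduce them fully mod 2^255 - 19,
+	@ and pack them into the eight 32-bit output words at r0.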
+ add r1,r3,#144
+ ldr r2,[r1],#4
+ ldr r3,[r1],#4
+ ldr r4,[r1],#4
+ ldr r5,[r1],#4
+ ldr r6,[r1],#4
+ ldr r7,[r1],#4
+ ldr r8,[r1],#4
+ ldr r9,[r1],#4
+ ldr r10,[r1],#4
+ ldr r1,[r1]
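+	@ Freeze: propagate a carry seeded with 19*(top limb) + 2^24
+	@ through all ten limbs to determine the multiple of p to
+	@ subtract, then subtract it limb by limb (the three adds into
+	@ r2 below multiply the final carry by 19 = 16 + 2 + 1).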
+ add r11,r1,r1,LSL #4
+ add r11,r11,r1,LSL #1
+ add r11,r11,#16777216
+ mov r11,r11,ASR #25
+ add r11,r11,r2
+ mov r11,r11,ASR #26
+ add r11,r11,r3
+ mov r11,r11,ASR #25
+ add r11,r11,r4
+ mov r11,r11,ASR #26
+ add r11,r11,r5
+ mov r11,r11,ASR #25
+ add r11,r11,r6
+ mov r11,r11,ASR #26
+ add r11,r11,r7
+ mov r11,r11,ASR #25
+ add r11,r11,r8
+ mov r11,r11,ASR #26
+ add r11,r11,r9
+ mov r11,r11,ASR #25
+ add r11,r11,r10
+ mov r11,r11,ASR #26
+ add r11,r11,r1
+ mov r11,r11,ASR #25
+ add r2,r2,r11
+ add r2,r2,r11,LSL #1
+ add r2,r2,r11,LSL #4
+ mov r11,r2,ASR #26
+ add r3,r3,r11
+ sub r2,r2,r11,LSL #26
+ mov r11,r3,ASR #25
+ add r4,r4,r11
+ sub r3,r3,r11,LSL #25
+ mov r11,r4,ASR #26
+ add r5,r5,r11
+ sub r4,r4,r11,LSL #26
+ mov r11,r5,ASR #25
+ add r6,r6,r11
+ sub r5,r5,r11,LSL #25
+ mov r11,r6,ASR #26
+ add r7,r7,r11
+ sub r6,r6,r11,LSL #26
+ mov r11,r7,ASR #25
+ add r8,r8,r11
+ sub r7,r7,r11,LSL #25
+ mov r11,r8,ASR #26
+ add r9,r9,r11
+ sub r8,r8,r11,LSL #26
+ mov r11,r9,ASR #25
+ add r10,r10,r11
+ sub r9,r9,r11,LSL #25
+ mov r11,r10,ASR #26
+ add r1,r1,r11
+ sub r10,r10,r11,LSL #26
+ mov r11,r1,ASR #25
+ sub r1,r1,r11,LSL #25
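+	@ Repack the ten reduced limbs (weights 2^0, 2^26, 2^51, 2^77,
+	@ 2^102, 2^128, 2^153, 2^179, 2^204, 2^230) into eight 32-bit
+	@ words.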
+ add r2,r2,r3,LSL #26
+ mov r3,r3,LSR #6
+ add r3,r3,r4,LSL #19
+ mov r4,r4,LSR #13
+ add r4,r4,r5,LSL #13
+ mov r5,r5,LSR #19
+ add r5,r5,r6,LSL #6
+ add r6,r7,r8,LSL #25
+ mov r7,r8,LSR #7
+ add r7,r7,r9,LSL #19
+ mov r8,r9,LSR #13
+ add r8,r8,r10,LSL #12
+ mov r9,r10,LSR #20
+ add r1,r9,r1,LSL #6
+ str r2,[r0],#4
+ str r3,[r0],#4
+ str r4,[r0],#4
+ str r5,[r0],#4
+ str r6,[r0],#4
+ str r7,[r0],#4
+ str r8,[r0],#4
+ str r1,[r0]
+ ldrd r4,[sp,#0]
+ ldrd r6,[sp,#8]
+ ldrd r8,[sp,#16]
+ ldrd r10,[sp,#24]
+ ldr r12,[sp,#480]
+ ldr r14,[sp,#484]
+ ldr r0,=0
+ mov sp,r12
+ vpop {q4,q5,q6,q7}
+ bx lr
+ENDPROC(curve25519_neon)
+#endif
diff --git a/lib/zinc/curve25519/curve25519-arm.h b/lib/zinc/curve25519/curve25519-arm.h
new file mode 100644
index 000000000000..4271ab582933
--- /dev/null
+++ b/lib/zinc/curve25519/curve25519-arm.h
@@ -0,0 +1,16 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#include <asm/hwcap.h>
+#include <asm/neon.h>
+#include <asm/simd.h>
+asmlinkage void curve25519_neon(u8 mypublic[CURVE25519_POINT_SIZE],
+				const u8 secret[CURVE25519_POINT_SIZE],
+				const u8 basepoint[CURVE25519_POINT_SIZE]);
+static bool curve25519_use_neon __ro_after_init;
+void __init curve25519_fpu_init(void)
+{
+ curve25519_use_neon = elf_hwcap & HWCAP_NEON;
+}
diff --git a/lib/zinc/curve25519/curve25519-fiat32.h b/lib/zinc/curve25519/curve25519-fiat32.h
new file mode 100644
index 000000000000..f1e21a416a31
--- /dev/null
+++ b/lib/zinc/curve25519/curve25519-fiat32.h
@@ -0,0 +1,838 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2016 The fiat-crypto Authors.
+ * Copyright (C) 2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ *
+ * This is a machine-generated formally verified implementation of curve25519 DH from:
+ * https://github.com/mit-plv/fiat-crypto
+ */
+
+/* fe means field element. Here the field is \Z/(2^255-19). An element t,
+ * entries t[0]...t[9], represents the integer t[0]+2^26 t[1]+2^51 t[2]+2^77
+ * t[3]+2^102 t[4]+...+2^230 t[9].
+ * fe limbs are bounded by 1.125*2^26,1.125*2^25,1.125*2^26,1.125*2^25,etc.
+ * Multiplication and carrying produce fe from fe_loose.
+ */
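+/* For example, 1 is t = {1, 0, ..., 0} and 2^26 is t = {0, 1, 0, ..., 0}:
+ * even-indexed limbs hold 26 bits and odd-indexed limbs 25 bits, so the
+ * limb weights run 2^0, 2^26, 2^51, 2^77, ...
+ */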
+typedef struct fe { u32 v[10]; } fe;
+
+/* fe_loose limbs are bounded by 3.375*2^26,3.375*2^25,3.375*2^26,3.375*2^25,etc.
+ * Addition and subtraction produce fe_loose from (fe, fe).
+ */
+typedef struct fe_loose { u32 v[10]; } fe_loose;
+
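+/* Unpack a 32-byte little-endian value into ten 25/26-bit limbs.  The
+ * inline comments track how many bits of each 32-bit input word have
+ * been consumed and how many remain.
+ */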
+static __always_inline void fe_frombytes_impl(u32 h[10], const u8 *s)
+{
+ /* Ignores top bit of s. */
+ u32 a0 = le32_to_cpup((__force __le32 *)(s));
+ u32 a1 = le32_to_cpup((__force __le32 *)(s+4));
+ u32 a2 = le32_to_cpup((__force __le32 *)(s+8));
+ u32 a3 = le32_to_cpup((__force __le32 *)(s+12));
+ u32 a4 = le32_to_cpup((__force __le32 *)(s+16));
+ u32 a5 = le32_to_cpup((__force __le32 *)(s+20));
+ u32 a6 = le32_to_cpup((__force __le32 *)(s+24));
+ u32 a7 = le32_to_cpup((__force __le32 *)(s+28));
+ h[0] = a0&((1<<26)-1); /* 26 used, 32-26 left. 26 */
+ h[1] = (a0>>26) | ((a1&((1<<19)-1))<< 6); /* (32-26) + 19 = 6+19 = 25 */
+ h[2] = (a1>>19) | ((a2&((1<<13)-1))<<13); /* (32-19) + 13 = 13+13 = 26 */
+ h[3] = (a2>>13) | ((a3&((1<< 6)-1))<<19); /* (32-13) + 6 = 19+ 6 = 25 */
+ h[4] = (a3>> 6); /* (32- 6) = 26 */
+ h[5] = a4&((1<<25)-1); /* 25 */
+ h[6] = (a4>>25) | ((a5&((1<<19)-1))<< 7); /* (32-25) + 19 = 7+19 = 26 */
+ h[7] = (a5>>19) | ((a6&((1<<12)-1))<<13); /* (32-19) + 12 = 13+12 = 25 */
+ h[8] = (a6>>12) | ((a7&((1<< 6)-1))<<20); /* (32-12) + 6 = 20+ 6 = 26 */
+ h[9] = (a7>> 6)&((1<<25)-1); /* 25 */
+}
+
+static __always_inline void fe_frombytes(fe *h, const u8 *s)
+{
+ fe_frombytes_impl(h->v, s);
+}
+
+static __always_inline u8 /*bool*/ addcarryx_u25(u8 /*bool*/ c, u32 a, u32 b, u32 *low)
+{
+ /* This function extracts 25 bits of result and 1 bit of carry (26 total), so
+ * a 32-bit intermediate is sufficient.
+ */
+ u32 x = a + b + c;
+ *low = x & ((1 << 25) - 1);
+ return (x >> 25) & 1;
+}
+
+static __always_inline u8 /*bool*/ addcarryx_u26(u8 /*bool*/ c, u32 a, u32 b, u32 *low)
+{
+ /* This function extracts 26 bits of result and 1 bit of carry (27 total), so
+ * a 32-bit intermediate is sufficient.
+ */
+ u32 x = a + b + c;
+ *low = x & ((1 << 26) - 1);
+ return (x >> 26) & 1;
+}
+
+static __always_inline u8 /*bool*/ subborrow_u25(u8 /*bool*/ c, u32 a, u32 b, u32 *low)
+{
+ /* This function extracts 25 bits of result and 1 bit of borrow (26 total), so
+ * a 32-bit intermediate is sufficient.
+ */
+ u32 x = a - b - c;
+ *low = x & ((1 << 25) - 1);
+ return x >> 31;
+}
+
+static __always_inline u8 /*bool*/ subborrow_u26(u8 /*bool*/ c, u32 a, u32 b, u32 *low)
+{
+ /* This function extracts 26 bits of result and 1 bit of borrow (27 total), so
+ * a 32-bit intermediate is sufficient.
+ */
+ u32 x = a - b - c;
+ *low = x & ((1 << 26) - 1);
+ return x >> 31;
+}
+
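+/* Constant-time select: returns z when t is zero and nz otherwise,
+ * with no data-dependent branch.
+ */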
+static __always_inline u32 cmovznz32(u32 t, u32 z, u32 nz)
+{
+ t = -!!t; /* all set if nonzero, 0 if 0 */
+ return (t&nz) | ((~t)&z);
+}
+
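+/* Reduce to the canonical representative in [0, p): subtract
+ * p = 2^255 - 19 limb by limb and, if that borrows, add p back under
+ * the all-ones mask produced by cmovznz32.
+ */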
+static __always_inline void fe_freeze(u32 out[10], const u32 in1[10])
+{
+ { const u32 x17 = in1[9];
+ { const u32 x18 = in1[8];
+ { const u32 x16 = in1[7];
+ { const u32 x14 = in1[6];
+ { const u32 x12 = in1[5];
+ { const u32 x10 = in1[4];
+ { const u32 x8 = in1[3];
+ { const u32 x6 = in1[2];
+ { const u32 x4 = in1[1];
+ { const u32 x2 = in1[0];
+ { u32 x20; u8/*bool*/ x21 = subborrow_u26(0x0, x2, 0x3ffffed, &x20);
+ { u32 x23; u8/*bool*/ x24 = subborrow_u25(x21, x4, 0x1ffffff, &x23);
+ { u32 x26; u8/*bool*/ x27 = subborrow_u26(x24, x6, 0x3ffffff, &x26);
+ { u32 x29; u8/*bool*/ x30 = subborrow_u25(x27, x8, 0x1ffffff, &x29);
+ { u32 x32; u8/*bool*/ x33 = subborrow_u26(x30, x10, 0x3ffffff, &x32);
+ { u32 x35; u8/*bool*/ x36 = subborrow_u25(x33, x12, 0x1ffffff, &x35);
+ { u32 x38; u8/*bool*/ x39 = subborrow_u26(x36, x14, 0x3ffffff, &x38);
+ { u32 x41; u8/*bool*/ x42 = subborrow_u25(x39, x16, 0x1ffffff, &x41);
+ { u32 x44; u8/*bool*/ x45 = subborrow_u26(x42, x18, 0x3ffffff, &x44);
+ { u32 x47; u8/*bool*/ x48 = subborrow_u25(x45, x17, 0x1ffffff, &x47);
+ { u32 x49 = cmovznz32(x48, 0x0, 0xffffffff);
+ { u32 x50 = (x49 & 0x3ffffed);
+ { u32 x52; u8/*bool*/ x53 = addcarryx_u26(0x0, x20, x50, &x52);
+ { u32 x54 = (x49 & 0x1ffffff);
+ { u32 x56; u8/*bool*/ x57 = addcarryx_u25(x53, x23, x54, &x56);
+ { u32 x58 = (x49 & 0x3ffffff);
+ { u32 x60; u8/*bool*/ x61 = addcarryx_u26(x57, x26, x58, &x60);
+ { u32 x62 = (x49 & 0x1ffffff);
+ { u32 x64; u8/*bool*/ x65 = addcarryx_u25(x61, x29, x62, &x64);
+ { u32 x66 = (x49 & 0x3ffffff);
+ { u32 x68; u8/*bool*/ x69 = addcarryx_u26(x65, x32, x66, &x68);
+ { u32 x70 = (x49 & 0x1ffffff);
+ { u32 x72; u8/*bool*/ x73 = addcarryx_u25(x69, x35, x70, &x72);
+ { u32 x74 = (x49 & 0x3ffffff);
+ { u32 x76; u8/*bool*/ x77 = addcarryx_u26(x73, x38, x74, &x76);
+ { u32 x78 = (x49 & 0x1ffffff);
+ { u32 x80; u8/*bool*/ x81 = addcarryx_u25(x77, x41, x78, &x80);
+ { u32 x82 = (x49 & 0x3ffffff);
+ { u32 x84; u8/*bool*/ x85 = addcarryx_u26(x81, x44, x82, &x84);
+ { u32 x86 = (x49 & 0x1ffffff);
+ { u32 x88; addcarryx_u25(x85, x47, x86, &x88);
+ out[0] = x52;
+ out[1] = x56;
+ out[2] = x60;
+ out[3] = x64;
+ out[4] = x68;
+ out[5] = x72;
+ out[6] = x76;
+ out[7] = x80;
+ out[8] = x84;
+ out[9] = x88;
+ }}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}
+}
+
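+/* Serialize to 32 little-endian bytes; freezing first makes the
+ * encoding unique.
+ */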
+static __always_inline void fe_tobytes(u8 s[32], const fe *f)
+{
+ u32 h[10];
+ fe_freeze(h, f->v);
+ s[0] = h[0] >> 0;
+ s[1] = h[0] >> 8;
+ s[2] = h[0] >> 16;
+ s[3] = (h[0] >> 24) | (h[1] << 2);
+ s[4] = h[1] >> 6;
+ s[5] = h[1] >> 14;
+ s[6] = (h[1] >> 22) | (h[2] << 3);
+ s[7] = h[2] >> 5;
+ s[8] = h[2] >> 13;
+ s[9] = (h[2] >> 21) | (h[3] << 5);
+ s[10] = h[3] >> 3;
+ s[11] = h[3] >> 11;
+ s[12] = (h[3] >> 19) | (h[4] << 6);
+ s[13] = h[4] >> 2;
+ s[14] = h[4] >> 10;
+ s[15] = h[4] >> 18;
+ s[16] = h[5] >> 0;
+ s[17] = h[5] >> 8;
+ s[18] = h[5] >> 16;
+ s[19] = (h[5] >> 24) | (h[6] << 1);
+ s[20] = h[6] >> 7;
+ s[21] = h[6] >> 15;
+ s[22] = (h[6] >> 23) | (h[7] << 3);
+ s[23] = h[7] >> 5;
+ s[24] = h[7] >> 13;
+ s[25] = (h[7] >> 21) | (h[8] << 4);
+ s[26] = h[8] >> 4;
+ s[27] = h[8] >> 12;
+ s[28] = (h[8] >> 20) | (h[9] << 6);
+ s[29] = h[9] >> 2;
+ s[30] = h[9] >> 10;
+ s[31] = h[9] >> 18;
+}
+
+/* h = f */
+static __always_inline void fe_copy(fe *h, const fe *f)
+{
+ memmove(h, f, sizeof(u32) * 10);
+}
+
+static __always_inline void fe_copy_lt(fe_loose *h, const fe *f)
+{
+ memmove(h, f, sizeof(u32) * 10);
+}
+
+/* h = 0 */
+static __always_inline void fe_0(fe *h)
+{
+ memset(h, 0, sizeof(u32) * 10);
+}
+
+/* h = 1 */
+static __always_inline void fe_1(fe *h)
+{
+ memset(h, 0, sizeof(u32) * 10);
+ h->v[0] = 1;
+}
+
+static void fe_add_impl(u32 out[10], const u32 in1[10], const u32 in2[10])
+{
+ { const u32 x20 = in1[9];
+ { const u32 x21 = in1[8];
+ { const u32 x19 = in1[7];
+ { const u32 x17 = in1[6];
+ { const u32 x15 = in1[5];
+ { const u32 x13 = in1[4];
+ { const u32 x11 = in1[3];
+ { const u32 x9 = in1[2];
+ { const u32 x7 = in1[1];
+ { const u32 x5 = in1[0];
+ { const u32 x38 = in2[9];
+ { const u32 x39 = in2[8];
+ { const u32 x37 = in2[7];
+ { const u32 x35 = in2[6];
+ { const u32 x33 = in2[5];
+ { const u32 x31 = in2[4];
+ { const u32 x29 = in2[3];
+ { const u32 x27 = in2[2];
+ { const u32 x25 = in2[1];
+ { const u32 x23 = in2[0];
+ out[0] = (x5 + x23);
+ out[1] = (x7 + x25);
+ out[2] = (x9 + x27);
+ out[3] = (x11 + x29);
+ out[4] = (x13 + x31);
+ out[5] = (x15 + x33);
+ out[6] = (x17 + x35);
+ out[7] = (x19 + x37);
+ out[8] = (x21 + x39);
+ out[9] = (x20 + x38);
+ }}}}}}}}}}}}}}}}}}}}
+}
+
+/* h = f + g
+ * Can overlap h with f or g.
+ */
+static __always_inline void fe_add(fe_loose *h, const fe *f, const fe *g)
+{
+ fe_add_impl(h->v, f->v, g->v);
+}
+
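+/* The constants added below are the limbs of 2*p (0x7ffffda =
+ * 2*(2^26 - 19), then alternating 2*(2^25 - 1) and 2*(2^26 - 1)), so
+ * each limb difference stays nonnegative; the result is an fe_loose
+ * congruent to in1 - in2.
+ */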
+static void fe_sub_impl(u32 out[10], const u32 in1[10], const u32 in2[10])
+{
+ { const u32 x20 = in1[9];
+ { const u32 x21 = in1[8];
+ { const u32 x19 = in1[7];
+ { const u32 x17 = in1[6];
+ { const u32 x15 = in1[5];
+ { const u32 x13 = in1[4];
+ { const u32 x11 = in1[3];
+ { const u32 x9 = in1[2];
+ { const u32 x7 = in1[1];
+ { const u32 x5 = in1[0];
+ { const u32 x38 = in2[9];
+ { const u32 x39 = in2[8];
+ { const u32 x37 = in2[7];
+ { const u32 x35 = in2[6];
+ { const u32 x33 = in2[5];
+ { const u32 x31 = in2[4];
+ { const u32 x29 = in2[3];
+ { const u32 x27 = in2[2];
+ { const u32 x25 = in2[1];
+ { const u32 x23 = in2[0];
+ out[0] = ((0x7ffffda + x5) - x23);
+ out[1] = ((0x3fffffe + x7) - x25);
+ out[2] = ((0x7fffffe + x9) - x27);
+ out[3] = ((0x3fffffe + x11) - x29);
+ out[4] = ((0x7fffffe + x13) - x31);
+ out[5] = ((0x3fffffe + x15) - x33);
+ out[6] = ((0x7fffffe + x17) - x35);
+ out[7] = ((0x3fffffe + x19) - x37);
+ out[8] = ((0x7fffffe + x21) - x39);
+ out[9] = ((0x3fffffe + x20) - x38);
+ }}}}}}}}}}}}}}}}}}}}
+}
+
+/* h = f - g
+ * Can overlap h with f or g.
+ */
+static __always_inline void fe_sub(fe_loose *h, const fe *f, const fe *g)
+{
+ fe_sub_impl(h->v, f->v, g->v);
+}
+
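+/* Schoolbook multiplication of two 10-limb elements.  Products of two
+ * odd-indexed limbs pick up a factor of 2 because those limbs carry
+ * half-weight in the mixed 25/26-bit radix; partial products above
+ * limb 9 are folded back in multiplied by 19 (computed as
+ * (x << 4) + (x << 1) + x) since 2^255 = 19 mod p; a final carry chain
+ * brings every limb back within its bound.
+ */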
+static void fe_mul_impl(u32 out[10], const u32 in1[10], const u32 in2[10])
+{
+ { const u32 x20 = in1[9];
+ { const u32 x21 = in1[8];
+ { const u32 x19 = in1[7];
+ { const u32 x17 = in1[6];
+ { const u32 x15 = in1[5];
+ { const u32 x13 = in1[4];
+ { const u32 x11 = in1[3];
+ { const u32 x9 = in1[2];
+ { const u32 x7 = in1[1];
+ { const u32 x5 = in1[0];
+ { const u32 x38 = in2[9];
+ { const u32 x39 = in2[8];
+ { const u32 x37 = in2[7];
+ { const u32 x35 = in2[6];
+ { const u32 x33 = in2[5];
+ { const u32 x31 = in2[4];
+ { const u32 x29 = in2[3];
+ { const u32 x27 = in2[2];
+ { const u32 x25 = in2[1];
+ { const u32 x23 = in2[0];
+ { u64 x40 = ((u64)x23 * x5);
+ { u64 x41 = (((u64)x23 * x7) + ((u64)x25 * x5));
+ { u64 x42 = ((((u64)(0x2 * x25) * x7) + ((u64)x23 * x9)) + ((u64)x27 * x5));
+ { u64 x43 = (((((u64)x25 * x9) + ((u64)x27 * x7)) + ((u64)x23 * x11)) + ((u64)x29 * x5));
+ { u64 x44 = (((((u64)x27 * x9) + (0x2 * (((u64)x25 * x11) + ((u64)x29 * x7)))) + ((u64)x23 * x13)) + ((u64)x31 * x5));
+ { u64 x45 = (((((((u64)x27 * x11) + ((u64)x29 * x9)) + ((u64)x25 * x13)) + ((u64)x31 * x7)) + ((u64)x23 * x15)) + ((u64)x33 * x5));
+ { u64 x46 = (((((0x2 * ((((u64)x29 * x11) + ((u64)x25 * x15)) + ((u64)x33 * x7))) + ((u64)x27 * x13)) + ((u64)x31 * x9)) + ((u64)x23 * x17)) + ((u64)x35 * x5));
+ { u64 x47 = (((((((((u64)x29 * x13) + ((u64)x31 * x11)) + ((u64)x27 * x15)) + ((u64)x33 * x9)) + ((u64)x25 * x17)) + ((u64)x35 * x7)) + ((u64)x23 * x19)) + ((u64)x37 * x5));
+ { u64 x48 = (((((((u64)x31 * x13) + (0x2 * (((((u64)x29 * x15) + ((u64)x33 * x11)) + ((u64)x25 * x19)) + ((u64)x37 * x7)))) + ((u64)x27 * x17)) + ((u64)x35 * x9)) + ((u64)x23 * x21)) + ((u64)x39 * x5));
+ { u64 x49 = (((((((((((u64)x31 * x15) + ((u64)x33 * x13)) + ((u64)x29 * x17)) + ((u64)x35 * x11)) + ((u64)x27 * x19)) + ((u64)x37 * x9)) + ((u64)x25 * x21)) + ((u64)x39 * x7)) + ((u64)x23 * x20)) + ((u64)x38 * x5));
+ { u64 x50 = (((((0x2 * ((((((u64)x33 * x15) + ((u64)x29 * x19)) + ((u64)x37 * x11)) + ((u64)x25 * x20)) + ((u64)x38 * x7))) + ((u64)x31 * x17)) + ((u64)x35 * x13)) + ((u64)x27 * x21)) + ((u64)x39 * x9));
+ { u64 x51 = (((((((((u64)x33 * x17) + ((u64)x35 * x15)) + ((u64)x31 * x19)) + ((u64)x37 * x13)) + ((u64)x29 * x21)) + ((u64)x39 * x11)) + ((u64)x27 * x20)) + ((u64)x38 * x9));
+ { u64 x52 = (((((u64)x35 * x17) + (0x2 * (((((u64)x33 * x19) + ((u64)x37 * x15)) + ((u64)x29 * x20)) + ((u64)x38 * x11)))) + ((u64)x31 * x21)) + ((u64)x39 * x13));
+ { u64 x53 = (((((((u64)x35 * x19) + ((u64)x37 * x17)) + ((u64)x33 * x21)) + ((u64)x39 * x15)) + ((u64)x31 * x20)) + ((u64)x38 * x13));
+ { u64 x54 = (((0x2 * ((((u64)x37 * x19) + ((u64)x33 * x20)) + ((u64)x38 * x15))) + ((u64)x35 * x21)) + ((u64)x39 * x17));
+ { u64 x55 = (((((u64)x37 * x21) + ((u64)x39 * x19)) + ((u64)x35 * x20)) + ((u64)x38 * x17));
+ { u64 x56 = (((u64)x39 * x21) + (0x2 * (((u64)x37 * x20) + ((u64)x38 * x19))));
+ { u64 x57 = (((u64)x39 * x20) + ((u64)x38 * x21));
+ { u64 x58 = ((u64)(0x2 * x38) * x20);
+ { u64 x59 = (x48 + (x58 << 0x4));
+ { u64 x60 = (x59 + (x58 << 0x1));
+ { u64 x61 = (x60 + x58);
+ { u64 x62 = (x47 + (x57 << 0x4));
+ { u64 x63 = (x62 + (x57 << 0x1));
+ { u64 x64 = (x63 + x57);
+ { u64 x65 = (x46 + (x56 << 0x4));
+ { u64 x66 = (x65 + (x56 << 0x1));
+ { u64 x67 = (x66 + x56);
+ { u64 x68 = (x45 + (x55 << 0x4));
+ { u64 x69 = (x68 + (x55 << 0x1));
+ { u64 x70 = (x69 + x55);
+ { u64 x71 = (x44 + (x54 << 0x4));
+ { u64 x72 = (x71 + (x54 << 0x1));
+ { u64 x73 = (x72 + x54);
+ { u64 x74 = (x43 + (x53 << 0x4));
+ { u64 x75 = (x74 + (x53 << 0x1));
+ { u64 x76 = (x75 + x53);
+ { u64 x77 = (x42 + (x52 << 0x4));
+ { u64 x78 = (x77 + (x52 << 0x1));
+ { u64 x79 = (x78 + x52);
+ { u64 x80 = (x41 + (x51 << 0x4));
+ { u64 x81 = (x80 + (x51 << 0x1));
+ { u64 x82 = (x81 + x51);
+ { u64 x83 = (x40 + (x50 << 0x4));
+ { u64 x84 = (x83 + (x50 << 0x1));
+ { u64 x85 = (x84 + x50);
+ { u64 x86 = (x85 >> 0x1a);
+ { u32 x87 = ((u32)x85 & 0x3ffffff);
+ { u64 x88 = (x86 + x82);
+ { u64 x89 = (x88 >> 0x19);
+ { u32 x90 = ((u32)x88 & 0x1ffffff);
+ { u64 x91 = (x89 + x79);
+ { u64 x92 = (x91 >> 0x1a);
+ { u32 x93 = ((u32)x91 & 0x3ffffff);
+ { u64 x94 = (x92 + x76);
+ { u64 x95 = (x94 >> 0x19);
+ { u32 x96 = ((u32)x94 & 0x1ffffff);
+ { u64 x97 = (x95 + x73);
+ { u64 x98 = (x97 >> 0x1a);
+ { u32 x99 = ((u32)x97 & 0x3ffffff);
+ { u64 x100 = (x98 + x70);
+ { u64 x101 = (x100 >> 0x19);
+ { u32 x102 = ((u32)x100 & 0x1ffffff);
+ { u64 x103 = (x101 + x67);
+ { u64 x104 = (x103 >> 0x1a);
+ { u32 x105 = ((u32)x103 & 0x3ffffff);
+ { u64 x106 = (x104 + x64);
+ { u64 x107 = (x106 >> 0x19);
+ { u32 x108 = ((u32)x106 & 0x1ffffff);
+ { u64 x109 = (x107 + x61);
+ { u64 x110 = (x109 >> 0x1a);
+ { u32 x111 = ((u32)x109 & 0x3ffffff);
+ { u64 x112 = (x110 + x49);
+ { u64 x113 = (x112 >> 0x19);
+ { u32 x114 = ((u32)x112 & 0x1ffffff);
+ { u64 x115 = (x87 + (0x13 * x113));
+ { u32 x116 = (u32) (x115 >> 0x1a);
+ { u32 x117 = ((u32)x115 & 0x3ffffff);
+ { u32 x118 = (x116 + x90);
+ { u32 x119 = (x118 >> 0x19);
+ { u32 x120 = (x118 & 0x1ffffff);
+ out[0] = x117;
+ out[1] = x120;
+ out[2] = (x119 + x93);
+ out[3] = x96;
+ out[4] = x99;
+ out[5] = x102;
+ out[6] = x105;
+ out[7] = x108;
+ out[8] = x111;
+ out[9] = x114;
+ }}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}
+}
+
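+/* The t/l suffixes name the argument bounds: t for a tight fe, l for
+ * an fe_loose; e.g. fe_mul_tlt takes a loose f and a tight g and
+ * produces a tight h.
+ */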
+static __always_inline void fe_mul_ttt(fe *h, const fe *f, const fe *g)
+{
+ fe_mul_impl(h->v, f->v, g->v);
+}
+
+static __always_inline void fe_mul_tlt(fe *h, const fe_loose *f, const fe *g)
+{
+ fe_mul_impl(h->v, f->v, g->v);
+}
+
+static __always_inline void fe_mul_tll(fe *h, const fe_loose *f, const fe_loose *g)
+{
+ fe_mul_impl(h->v, f->v, g->v);
+}
+
+static void fe_sqr_impl(u32 out[10], const u32 in1[10])
+{
+ { const u32 x17 = in1[9];
+ { const u32 x18 = in1[8];
+ { const u32 x16 = in1[7];
+ { const u32 x14 = in1[6];
+ { const u32 x12 = in1[5];
+ { const u32 x10 = in1[4];
+ { const u32 x8 = in1[3];
+ { const u32 x6 = in1[2];
+ { const u32 x4 = in1[1];
+ { const u32 x2 = in1[0];
+ { u64 x19 = ((u64)x2 * x2);
+ { u64 x20 = ((u64)(0x2 * x2) * x4);
+ { u64 x21 = (0x2 * (((u64)x4 * x4) + ((u64)x2 * x6)));
+ { u64 x22 = (0x2 * (((u64)x4 * x6) + ((u64)x2 * x8)));
+ { u64 x23 = ((((u64)x6 * x6) + ((u64)(0x4 * x4) * x8)) + ((u64)(0x2 * x2) * x10));
+ { u64 x24 = (0x2 * ((((u64)x6 * x8) + ((u64)x4 * x10)) + ((u64)x2 * x12)));
+ { u64 x25 = (0x2 * (((((u64)x8 * x8) + ((u64)x6 * x10)) + ((u64)x2 * x14)) + ((u64)(0x2 * x4) * x12)));
+ { u64 x26 = (0x2 * (((((u64)x8 * x10) + ((u64)x6 * x12)) + ((u64)x4 * x14)) + ((u64)x2 * x16)));
+ { u64 x27 = (((u64)x10 * x10) + (0x2 * ((((u64)x6 * x14) + ((u64)x2 * x18)) + (0x2 * (((u64)x4 * x16) + ((u64)x8 * x12))))));
+ { u64 x28 = (0x2 * ((((((u64)x10 * x12) + ((u64)x8 * x14)) + ((u64)x6 * x16)) + ((u64)x4 * x18)) + ((u64)x2 * x17)));
+ { u64 x29 = (0x2 * (((((u64)x12 * x12) + ((u64)x10 * x14)) + ((u64)x6 * x18)) + (0x2 * (((u64)x8 * x16) + ((u64)x4 * x17)))));
+ { u64 x30 = (0x2 * (((((u64)x12 * x14) + ((u64)x10 * x16)) + ((u64)x8 * x18)) + ((u64)x6 * x17)));
+ { u64 x31 = (((u64)x14 * x14) + (0x2 * (((u64)x10 * x18) + (0x2 * (((u64)x12 * x16) + ((u64)x8 * x17))))));
+ { u64 x32 = (0x2 * ((((u64)x14 * x16) + ((u64)x12 * x18)) + ((u64)x10 * x17)));
+ { u64 x33 = (0x2 * ((((u64)x16 * x16) + ((u64)x14 * x18)) + ((u64)(0x2 * x12) * x17)));
+ { u64 x34 = (0x2 * (((u64)x16 * x18) + ((u64)x14 * x17)));
+ { u64 x35 = (((u64)x18 * x18) + ((u64)(0x4 * x16) * x17));
+ { u64 x36 = ((u64)(0x2 * x18) * x17);
+ { u64 x37 = ((u64)(0x2 * x17) * x17);
+ { u64 x38 = (x27 + (x37 << 0x4));
+ { u64 x39 = (x38 + (x37 << 0x1));
+ { u64 x40 = (x39 + x37);
+ { u64 x41 = (x26 + (x36 << 0x4));
+ { u64 x42 = (x41 + (x36 << 0x1));
+ { u64 x43 = (x42 + x36);
+ { u64 x44 = (x25 + (x35 << 0x4));
+ { u64 x45 = (x44 + (x35 << 0x1));
+ { u64 x46 = (x45 + x35);
+ { u64 x47 = (x24 + (x34 << 0x4));
+ { u64 x48 = (x47 + (x34 << 0x1));
+ { u64 x49 = (x48 + x34);
+ { u64 x50 = (x23 + (x33 << 0x4));
+ { u64 x51 = (x50 + (x33 << 0x1));
+ { u64 x52 = (x51 + x33);
+ { u64 x53 = (x22 + (x32 << 0x4));
+ { u64 x54 = (x53 + (x32 << 0x1));
+ { u64 x55 = (x54 + x32);
+ { u64 x56 = (x21 + (x31 << 0x4));
+ { u64 x57 = (x56 + (x31 << 0x1));
+ { u64 x58 = (x57 + x31);
+ { u64 x59 = (x20 + (x30 << 0x4));
+ { u64 x60 = (x59 + (x30 << 0x1));
+ { u64 x61 = (x60 + x30);
+ { u64 x62 = (x19 + (x29 << 0x4));
+ { u64 x63 = (x62 + (x29 << 0x1));
+ { u64 x64 = (x63 + x29);
+ { u64 x65 = (x64 >> 0x1a);
+ { u32 x66 = ((u32)x64 & 0x3ffffff);
+ { u64 x67 = (x65 + x61);
+ { u64 x68 = (x67 >> 0x19);
+ { u32 x69 = ((u32)x67 & 0x1ffffff);
+ { u64 x70 = (x68 + x58);
+ { u64 x71 = (x70 >> 0x1a);
+ { u32 x72 = ((u32)x70 & 0x3ffffff);
+ { u64 x73 = (x71 + x55);
+ { u64 x74 = (x73 >> 0x19);
+ { u32 x75 = ((u32)x73 & 0x1ffffff);
+ { u64 x76 = (x74 + x52);
+ { u64 x77 = (x76 >> 0x1a);
+ { u32 x78 = ((u32)x76 & 0x3ffffff);
+ { u64 x79 = (x77 + x49);
+ { u64 x80 = (x79 >> 0x19);
+ { u32 x81 = ((u32)x79 & 0x1ffffff);
+ { u64 x82 = (x80 + x46);
+ { u64 x83 = (x82 >> 0x1a);
+ { u32 x84 = ((u32)x82 & 0x3ffffff);
+ { u64 x85 = (x83 + x43);
+ { u64 x86 = (x85 >> 0x19);
+ { u32 x87 = ((u32)x85 & 0x1ffffff);
+ { u64 x88 = (x86 + x40);
+ { u64 x89 = (x88 >> 0x1a);
+ { u32 x90 = ((u32)x88 & 0x3ffffff);
+ { u64 x91 = (x89 + x28);
+ { u64 x92 = (x91 >> 0x19);
+ { u32 x93 = ((u32)x91 & 0x1ffffff);
+ { u64 x94 = (x66 + (0x13 * x92));
+ { u32 x95 = (u32) (x94 >> 0x1a);
+ { u32 x96 = ((u32)x94 & 0x3ffffff);
+ { u32 x97 = (x95 + x69);
+ { u32 x98 = (x97 >> 0x19);
+ { u32 x99 = (x97 & 0x1ffffff);
+ out[0] = x96;
+ out[1] = x99;
+ out[2] = (x98 + x72);
+ out[3] = x75;
+ out[4] = x78;
+ out[5] = x81;
+ out[6] = x84;
+ out[7] = x87;
+ out[8] = x90;
+ out[9] = x93;
+ }}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}
+}
+
+static __always_inline void fe_sq_tl(fe *h, const fe_loose *f)
+{
+ fe_sqr_impl(h->v, f->v);
+}
+
+static __always_inline void fe_sq_tt(fe *h, const fe *f)
+{
+ fe_sqr_impl(h->v, f->v);
+}
+
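+/* Invert via Fermat's little theorem: a fixed chain of 254 squarings
+ * and 11 multiplications computing z^(2^255 - 21) = z^(p - 2) = 1/z.
+ */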
+static __always_inline void fe_loose_invert(fe *out, const fe_loose *z)
+{
+ fe t0;
+ fe t1;
+ fe t2;
+ fe t3;
+ int i;
+
+ fe_sq_tl(&t0, z);
+ fe_sq_tt(&t1, &t0);
+ for (i = 1; i < 2; ++i)
+ fe_sq_tt(&t1, &t1);
+ fe_mul_tlt(&t1, z, &t1);
+ fe_mul_ttt(&t0, &t0, &t1);
+ fe_sq_tt(&t2, &t0);
+ fe_mul_ttt(&t1, &t1, &t2);
+ fe_sq_tt(&t2, &t1);
+ for (i = 1; i < 5; ++i)
+ fe_sq_tt(&t2, &t2);
+ fe_mul_ttt(&t1, &t2, &t1);
+ fe_sq_tt(&t2, &t1);
+ for (i = 1; i < 10; ++i)
+ fe_sq_tt(&t2, &t2);
+ fe_mul_ttt(&t2, &t2, &t1);
+ fe_sq_tt(&t3, &t2);
+ for (i = 1; i < 20; ++i)
+ fe_sq_tt(&t3, &t3);
+ fe_mul_ttt(&t2, &t3, &t2);
+ fe_sq_tt(&t2, &t2);
+ for (i = 1; i < 10; ++i)
+ fe_sq_tt(&t2, &t2);
+ fe_mul_ttt(&t1, &t2, &t1);
+ fe_sq_tt(&t2, &t1);
+ for (i = 1; i < 50; ++i)
+ fe_sq_tt(&t2, &t2);
+ fe_mul_ttt(&t2, &t2, &t1);
+ fe_sq_tt(&t3, &t2);
+ for (i = 1; i < 100; ++i)
+ fe_sq_tt(&t3, &t3);
+ fe_mul_ttt(&t2, &t3, &t2);
+ fe_sq_tt(&t2, &t2);
+ for (i = 1; i < 50; ++i)
+ fe_sq_tt(&t2, &t2);
+ fe_mul_ttt(&t1, &t2, &t1);
+ fe_sq_tt(&t1, &t1);
+ for (i = 1; i < 5; ++i)
+ fe_sq_tt(&t1, &t1);
+ fe_mul_ttt(out, &t1, &t0);
+}
+
+static __always_inline void fe_invert(fe *out, const fe *z)
+{
+ fe_loose l;
+ fe_copy_lt(&l, z);
+ fe_loose_invert(out, &l);
+}
+
+/* Replace (f,g) with (g,f) if b == 1;
+ * replace (f,g) with (f,g) if b == 0.
+ *
+ * Preconditions: b in {0,1}
+ */
+static __always_inline void fe_cswap(fe *f, fe *g, unsigned int b)
+{
+ unsigned i;
+ b = 0-b;
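+	/* b is now 0 or 0xffffffff, a mask for the branch-free swap */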
+ for (i = 0; i < 10; i++) {
+ u32 x = f->v[i] ^ g->v[i];
+ x &= b;
+ f->v[i] ^= x;
+ g->v[i] ^= x;
+ }
+}
+
+/* NOTE: based on fiat-crypto fe_mul, edited for in2=121666, 0, 0.*/
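+/* 121666 = (A + 2)/4 for the curve constant A = 486662; it is the a24
+ * multiplier in the Montgomery ladder.
+ */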
+static __always_inline void fe_mul_121666_impl(u32 out[10], const u32 in1[10])
+{
+ { const u32 x20 = in1[9];
+ { const u32 x21 = in1[8];
+ { const u32 x19 = in1[7];
+ { const u32 x17 = in1[6];
+ { const u32 x15 = in1[5];
+ { const u32 x13 = in1[4];
+ { const u32 x11 = in1[3];
+ { const u32 x9 = in1[2];
+ { const u32 x7 = in1[1];
+ { const u32 x5 = in1[0];
+ { const u32 x38 = 0;
+ { const u32 x39 = 0;
+ { const u32 x37 = 0;
+ { const u32 x35 = 0;
+ { const u32 x33 = 0;
+ { const u32 x31 = 0;
+ { const u32 x29 = 0;
+ { const u32 x27 = 0;
+ { const u32 x25 = 0;
+ { const u32 x23 = 121666;
+ { u64 x40 = ((u64)x23 * x5);
+ { u64 x41 = (((u64)x23 * x7) + ((u64)x25 * x5));
+ { u64 x42 = ((((u64)(0x2 * x25) * x7) + ((u64)x23 * x9)) + ((u64)x27 * x5));
+ { u64 x43 = (((((u64)x25 * x9) + ((u64)x27 * x7)) + ((u64)x23 * x11)) + ((u64)x29 * x5));
+ { u64 x44 = (((((u64)x27 * x9) + (0x2 * (((u64)x25 * x11) + ((u64)x29 * x7)))) + ((u64)x23 * x13)) + ((u64)x31 * x5));
+ { u64 x45 = (((((((u64)x27 * x11) + ((u64)x29 * x9)) + ((u64)x25 * x13)) + ((u64)x31 * x7)) + ((u64)x23 * x15)) + ((u64)x33 * x5));
+ { u64 x46 = (((((0x2 * ((((u64)x29 * x11) + ((u64)x25 * x15)) + ((u64)x33 * x7))) + ((u64)x27 * x13)) + ((u64)x31 * x9)) + ((u64)x23 * x17)) + ((u64)x35 * x5));
+ { u64 x47 = (((((((((u64)x29 * x13) + ((u64)x31 * x11)) + ((u64)x27 * x15)) + ((u64)x33 * x9)) + ((u64)x25 * x17)) + ((u64)x35 * x7)) + ((u64)x23 * x19)) + ((u64)x37 * x5));
+ { u64 x48 = (((((((u64)x31 * x13) + (0x2 * (((((u64)x29 * x15) + ((u64)x33 * x11)) + ((u64)x25 * x19)) + ((u64)x37 * x7)))) + ((u64)x27 * x17)) + ((u64)x35 * x9)) + ((u64)x23 * x21)) + ((u64)x39 * x5));
+ { u64 x49 = (((((((((((u64)x31 * x15) + ((u64)x33 * x13)) + ((u64)x29 * x17)) + ((u64)x35 * x11)) + ((u64)x27 * x19)) + ((u64)x37 * x9)) + ((u64)x25 * x21)) + ((u64)x39 * x7)) + ((u64)x23 * x20)) + ((u64)x38 * x5));
+ { u64 x50 = (((((0x2 * ((((((u64)x33 * x15) + ((u64)x29 * x19)) + ((u64)x37 * x11)) + ((u64)x25 * x20)) + ((u64)x38 * x7))) + ((u64)x31 * x17)) + ((u64)x35 * x13)) + ((u64)x27 * x21)) + ((u64)x39 * x9));
+ { u64 x51 = (((((((((u64)x33 * x17) + ((u64)x35 * x15)) + ((u64)x31 * x19)) + ((u64)x37 * x13)) + ((u64)x29 * x21)) + ((u64)x39 * x11)) + ((u64)x27 * x20)) + ((u64)x38 * x9));
+ { u64 x52 = (((((u64)x35 * x17) + (0x2 * (((((u64)x33 * x19) + ((u64)x37 * x15)) + ((u64)x29 * x20)) + ((u64)x38 * x11)))) + ((u64)x31 * x21)) + ((u64)x39 * x13));
+ { u64 x53 = (((((((u64)x35 * x19) + ((u64)x37 * x17)) + ((u64)x33 * x21)) + ((u64)x39 * x15)) + ((u64)x31 * x20)) + ((u64)x38 * x13));
+ { u64 x54 = (((0x2 * ((((u64)x37 * x19) + ((u64)x33 * x20)) + ((u64)x38 * x15))) + ((u64)x35 * x21)) + ((u64)x39 * x17));
+ { u64 x55 = (((((u64)x37 * x21) + ((u64)x39 * x19)) + ((u64)x35 * x20)) + ((u64)x38 * x17));
+ { u64 x56 = (((u64)x39 * x21) + (0x2 * (((u64)x37 * x20) + ((u64)x38 * x19))));
+ { u64 x57 = (((u64)x39 * x20) + ((u64)x38 * x21));
+ { u64 x58 = ((u64)(0x2 * x38) * x20);
+ { u64 x59 = (x48 + (x58 << 0x4));
+ { u64 x60 = (x59 + (x58 << 0x1));
+ { u64 x61 = (x60 + x58);
+ { u64 x62 = (x47 + (x57 << 0x4));
+ { u64 x63 = (x62 + (x57 << 0x1));
+ { u64 x64 = (x63 + x57);
+ { u64 x65 = (x46 + (x56 << 0x4));
+ { u64 x66 = (x65 + (x56 << 0x1));
+ { u64 x67 = (x66 + x56);
+ { u64 x68 = (x45 + (x55 << 0x4));
+ { u64 x69 = (x68 + (x55 << 0x1));
+ { u64 x70 = (x69 + x55);
+ { u64 x71 = (x44 + (x54 << 0x4));
+ { u64 x72 = (x71 + (x54 << 0x1));
+ { u64 x73 = (x72 + x54);
+ { u64 x74 = (x43 + (x53 << 0x4));
+ { u64 x75 = (x74 + (x53 << 0x1));
+ { u64 x76 = (x75 + x53);
+ { u64 x77 = (x42 + (x52 << 0x4));
+ { u64 x78 = (x77 + (x52 << 0x1));
+ { u64 x79 = (x78 + x52);
+ { u64 x80 = (x41 + (x51 << 0x4));
+ { u64 x81 = (x80 + (x51 << 0x1));
+ { u64 x82 = (x81 + x51);
+ { u64 x83 = (x40 + (x50 << 0x4));
+ { u64 x84 = (x83 + (x50 << 0x1));
+ { u64 x85 = (x84 + x50);
+ { u64 x86 = (x85 >> 0x1a);
+ { u32 x87 = ((u32)x85 & 0x3ffffff);
+ { u64 x88 = (x86 + x82);
+ { u64 x89 = (x88 >> 0x19);
+ { u32 x90 = ((u32)x88 & 0x1ffffff);
+ { u64 x91 = (x89 + x79);
+ { u64 x92 = (x91 >> 0x1a);
+ { u32 x93 = ((u32)x91 & 0x3ffffff);
+ { u64 x94 = (x92 + x76);
+ { u64 x95 = (x94 >> 0x19);
+ { u32 x96 = ((u32)x94 & 0x1ffffff);
+ { u64 x97 = (x95 + x73);
+ { u64 x98 = (x97 >> 0x1a);
+ { u32 x99 = ((u32)x97 & 0x3ffffff);
+ { u64 x100 = (x98 + x70);
+ { u64 x101 = (x100 >> 0x19);
+ { u32 x102 = ((u32)x100 & 0x1ffffff);
+ { u64 x103 = (x101 + x67);
+ { u64 x104 = (x103 >> 0x1a);
+ { u32 x105 = ((u32)x103 & 0x3ffffff);
+ { u64 x106 = (x104 + x64);
+ { u64 x107 = (x106 >> 0x19);
+ { u32 x108 = ((u32)x106 & 0x1ffffff);
+ { u64 x109 = (x107 + x61);
+ { u64 x110 = (x109 >> 0x1a);
+ { u32 x111 = ((u32)x109 & 0x3ffffff);
+ { u64 x112 = (x110 + x49);
+ { u64 x113 = (x112 >> 0x19);
+ { u32 x114 = ((u32)x112 & 0x1ffffff);
+ { u64 x115 = (x87 + (0x13 * x113));
+ { u32 x116 = (u32) (x115 >> 0x1a);
+ { u32 x117 = ((u32)x115 & 0x3ffffff);
+ { u32 x118 = (x116 + x90);
+ { u32 x119 = (x118 >> 0x19);
+ { u32 x120 = (x118 & 0x1ffffff);
+ out[0] = x117;
+ out[1] = x120;
+ out[2] = (x119 + x93);
+ out[3] = x96;
+ out[4] = x99;
+ out[5] = x102;
+ out[6] = x105;
+ out[7] = x108;
+ out[8] = x111;
+ out[9] = x114;
+ }}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}}
+}
+
+static __always_inline void fe_mul121666(fe *h, const fe_loose *f)
+{
+ fe_mul_121666_impl(h->v, f->v);
+}
+
+static void curve25519_generic(u8 out[CURVE25519_POINT_SIZE], const u8 scalar[CURVE25519_POINT_SIZE], const u8 point[CURVE25519_POINT_SIZE])
+{
+ fe x1, x2, z2, x3, z3, tmp0, tmp1;
+ fe_loose x2l, z2l, x3l, tmp0l, tmp1l;
+ unsigned swap = 0;
+ int pos;
+ u8 e[32];
+
+ memcpy(e, scalar, 32);
+ normalize_secret(e);
+
+ /* The following implementation was transcribed to Coq and proven to
+ * correspond to unary scalar multiplication in affine coordinates given that
+ * x1 != 0 is the x coordinate of some point on the curve. It was also checked
+ * in Coq that doing a ladderstep with x1 = x3 = 0 gives z2' = z3' = 0, and z2
+ * = z3 = 0 gives z2' = z3' = 0. The statement was quantified over the
+ * underlying field, so it applies to Curve25519 itself and the quadratic
+ * twist of Curve25519. It was not proven in Coq that prime-field arithmetic
+ * correctly simulates extension-field arithmetic on prime-field values.
+ * The decoding of the byte array representation of e was not considered.
+ * Specification of Montgomery curves in affine coordinates:
+ * <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Spec/MontgomeryCurve.v#L27>
+ * Proof that these form a group that is isomorphic to a Weierstrass curve:
+ * <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Curves/Montgomery/AffineProofs.v#L35>
+ * Coq transcription and correctness proof of the loop (where scalarbits=255):
+ * <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Curves/Montgomery/XZ.v#L118>
+ * <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Curves/Montgomery/XZProofs.v#L278>
+ * preconditions: 0 <= e < 2^255 (not necessarily e < order), fe_invert(0) = 0
+ */
+ fe_frombytes(&x1, point);
+ fe_1(&x2);
+ fe_0(&z2);
+ fe_copy(&x3, &x1);
+ fe_1(&z3);
+
+ for (pos = 254; pos >= 0; --pos) {
+ /* loop invariant as of right before the test, for the case where x1 != 0:
+ * pos >= -1; if z2 = 0 then x2 is nonzero; if z3 = 0 then x3 is nonzero
+ * let r := e >> (pos+1) in the following equalities of projective points:
+ * to_xz (r*P) === if swap then (x3, z3) else (x2, z2)
+ * to_xz ((r+1)*P) === if swap then (x2, z2) else (x3, z3)
+ * x1 is the nonzero x coordinate of the nonzero point (r*P-(r+1)*P)
+ */
+ unsigned b = 1 & (e[pos / 8] >> (pos & 7));
+ swap ^= b;
+ fe_cswap(&x2, &x3, swap);
+ fe_cswap(&z2, &z3, swap);
+ swap = b;
+ /* Coq transcription of ladderstep formula (called from transcribed loop):
+ * <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Curves/Montgomery/XZ.v#L89>
+ * <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Curves/Montgomery/XZProofs.v#L131>
+ * x1 != 0 <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Curves/Montgomery/XZProofs.v#L217>
+ * x1 = 0 <https://github.com/mit-plv/fiat-crypto/blob/2456d821825521f7e03e65882cc3521795b0320f/src/Curves/Montgomery/XZProofs.v#L147>
+ */
+ fe_sub(&tmp0l, &x3, &z3);
+ fe_sub(&tmp1l, &x2, &z2);
+ fe_add(&x2l, &x2, &z2);
+ fe_add(&z2l, &x3, &z3);
+ fe_mul_tll(&z3, &tmp0l, &x2l);
+ fe_mul_tll(&z2, &z2l, &tmp1l);
+ fe_sq_tl(&tmp0, &tmp1l);
+ fe_sq_tl(&tmp1, &x2l);
+ fe_add(&x3l, &z3, &z2);
+ fe_sub(&z2l, &z3, &z2);
+ fe_mul_ttt(&x2, &tmp1, &tmp0);
+ fe_sub(&tmp1l, &tmp1, &tmp0);
+ fe_sq_tl(&z2, &z2l);
+ fe_mul121666(&z3, &tmp1l);
+ fe_sq_tl(&x3, &x3l);
+ fe_add(&tmp0l, &tmp0, &z3);
+ fe_mul_ttt(&z3, &x1, &z2);
+ fe_mul_tll(&z2, &tmp1l, &tmp0l);
+ }
+ /* here pos=-1, so r=e, so to_xz (e*P) === if swap then (x3, z3) else (x2, z2) */
+ fe_cswap(&x2, &x3, swap);
+ fe_cswap(&z2, &z3, swap);
+
+ fe_invert(&z2, &z2);
+ fe_mul_ttt(&x2, &x2, &z2);
+ fe_tobytes(out, &x2);
+
+ memzero_explicit(&x1, sizeof(x1));
+ memzero_explicit(&x2, sizeof(x2));
+ memzero_explicit(&z2, sizeof(z2));
+ memzero_explicit(&x3, sizeof(x3));
+ memzero_explicit(&z3, sizeof(z3));
+ memzero_explicit(&tmp0, sizeof(tmp0));
+ memzero_explicit(&tmp1, sizeof(tmp1));
+ memzero_explicit(&x2l, sizeof(x2l));
+ memzero_explicit(&z2l, sizeof(z2l));
+ memzero_explicit(&x3l, sizeof(x3l));
+ memzero_explicit(&tmp0l, sizeof(tmp0l));
+ memzero_explicit(&tmp1l, sizeof(tmp1l));
+ memzero_explicit(&e, sizeof(e));
+}
diff --git a/lib/zinc/curve25519/curve25519-hacl64.h b/lib/zinc/curve25519/curve25519-hacl64.h
new file mode 100644
index 000000000000..4fd95cbd9a83
--- /dev/null
+++ b/lib/zinc/curve25519/curve25519-hacl64.h
@@ -0,0 +1,751 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2016-2017 INRIA and Microsoft Corporation.
+ * Copyright (C) 2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ *
+ * This is a machine-generated, formally verified implementation of curve25519 DH from:
+ * https://github.com/mitls/hacl-star
+ */
+
+typedef __uint128_t u128;
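+
+/* Returns an all-one mask if x == y and zero otherwise, without branching:
+ * ~(x ^ y) is all ones exactly on equality, the shift-and cascade ANDs all
+ * 64 bits together into bit 63, and the arithmetic shift broadcasts it.
+ */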
+static __always_inline u64 u64_eq_mask(u64 x, u64 y)
+{
+ x = ~(x ^ y);
+ x &= x << 32;
+ x &= x << 16;
+ x &= x << 8;
+ x &= x << 4;
+ x &= x << 2;
+ x &= x << 1;
+ return ((s64)x) >> 63;
+}
+
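+/* Constant-time mask for x >= y: all ones when true, zero otherwise. The
+ * sign-based borrow trick is valid for the sub-2^63 limb values used here.
+ */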
+static __always_inline u64 u64_gte_mask(u64 x, u64 y)
+{
+ u64 low63 = ~((u64)((s64)((s64)(x & 0x7fffffffffffffffLLU) - (s64)(y & 0x7fffffffffffffffLLU)) >> 63));
+ u64 high_bit = ~((u64)((s64)((s64)(x & 0x8000000000000000LLU) - (s64)(y & 0x8000000000000000LLU)) >> 63));
+ return low63 & high_bit;
+}
+
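+/* Folds the carry above bit 51 of the top limb back into the bottom limb:
+ * limbs are 51 bits wide, so the top limb overflows at 2^255, and
+ * 2^255 == 19 (mod p).
+ */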
+static __always_inline void modulo_carry_top(u64 *b)
+{
+ u64 b4 = b[4];
+ u64 b0 = b[0];
+ u64 b4_ = b4 & 0x7ffffffffffffLLU;
+ u64 b0_ = b0 + 19 * (b4 >> 51);
+ b[4] = b4_;
+ b[0] = b0_;
+}
+
+static __always_inline void fproduct_copy_from_wide_(u64 *output, u128 *input)
+{
+ {
+ u128 xi = input[0];
+ output[0] = ((u64)(xi));
+ }
+ {
+ u128 xi = input[1];
+ output[1] = ((u64)(xi));
+ }
+ {
+ u128 xi = input[2];
+ output[2] = ((u64)(xi));
+ }
+ {
+ u128 xi = input[3];
+ output[3] = ((u64)(xi));
+ }
+ {
+ u128 xi = input[4];
+ output[4] = ((u64)(xi));
+ }
+}
+
+static __always_inline void fproduct_sum_scalar_multiplication_(u128 *output, u64 *input, u64 s)
+{
+ output[0] += (u128)input[0] * s;
+ output[1] += (u128)input[1] * s;
+ output[2] += (u128)input[2] * s;
+ output[3] += (u128)input[3] * s;
+ output[4] += (u128)input[4] * s;
+}
+
+static __always_inline void fproduct_carry_wide_(u128 *tmp)
+{
+ {
+ u32 ctr = 0;
+ u128 tctr = tmp[ctr];
+ u128 tctrp1 = tmp[ctr + 1];
+ u64 r0 = ((u64)(tctr)) & 0x7ffffffffffffLLU;
+ u128 c = ((tctr) >> (51));
+ tmp[ctr] = ((u128)(r0));
+ tmp[ctr + 1] = ((tctrp1) + (c));
+ }
+ {
+ u32 ctr = 1;
+ u128 tctr = tmp[ctr];
+ u128 tctrp1 = tmp[ctr + 1];
+ u64 r0 = ((u64)(tctr)) & 0x7ffffffffffffLLU;
+ u128 c = ((tctr) >> (51));
+ tmp[ctr] = ((u128)(r0));
+ tmp[ctr + 1] = ((tctrp1) + (c));
+ }
+
+ {
+ u32 ctr = 2;
+ u128 tctr = tmp[ctr];
+ u128 tctrp1 = tmp[ctr + 1];
+ u64 r0 = ((u64)(tctr)) & 0x7ffffffffffffLLU;
+ u128 c = ((tctr) >> (51));
+ tmp[ctr] = ((u128)(r0));
+ tmp[ctr + 1] = ((tctrp1) + (c));
+ }
+ {
+ u32 ctr = 3;
+ u128 tctr = tmp[ctr];
+ u128 tctrp1 = tmp[ctr + 1];
+ u64 r0 = ((u64)(tctr)) & 0x7ffffffffffffLLU;
+ u128 c = ((tctr) >> (51));
+ tmp[ctr] = ((u128)(r0));
+ tmp[ctr + 1] = ((tctrp1) + (c));
+ }
+}
+
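+/* Rotates the limbs up by one position, i.e. multiplies by 2^51, folding
+ * the limb that wraps past 2^255 back in as 19 times itself.
+ */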
+static __always_inline void fmul_shift_reduce(u64 *output)
+{
+ u64 tmp = output[4];
+ u64 b0;
+ {
+ u32 ctr = 5 - 0 - 1;
+ u64 z = output[ctr - 1];
+ output[ctr] = z;
+ }
+ {
+ u32 ctr = 5 - 1 - 1;
+ u64 z = output[ctr - 1];
+ output[ctr] = z;
+ }
+ {
+ u32 ctr = 5 - 2 - 1;
+ u64 z = output[ctr - 1];
+ output[ctr] = z;
+ }
+ {
+ u32 ctr = 5 - 3 - 1;
+ u64 z = output[ctr - 1];
+ output[ctr] = z;
+ }
+ output[0] = tmp;
+ b0 = output[0];
+ output[0] = 19 * b0;
+}
+
+static __always_inline void fmul_mul_shift_reduce_(u128 *output, u64 *input, u64 *input21)
+{
+ u32 i;
+ u64 input2i;
+ {
+ u64 input2i = input21[0];
+ fproduct_sum_scalar_multiplication_(output, input, input2i);
+ fmul_shift_reduce(input);
+ }
+ {
+ u64 input2i = input21[1];
+ fproduct_sum_scalar_multiplication_(output, input, input2i);
+ fmul_shift_reduce(input);
+ }
+ {
+ u64 input2i = input21[2];
+ fproduct_sum_scalar_multiplication_(output, input, input2i);
+ fmul_shift_reduce(input);
+ }
+ {
+ u64 input2i = input21[3];
+ fproduct_sum_scalar_multiplication_(output, input, input2i);
+ fmul_shift_reduce(input);
+ }
+ i = 4;
+ input2i = input21[i];
+ fproduct_sum_scalar_multiplication_(output, input, input2i);
+}
+
+static __always_inline void fmul_fmul(u64 *output, u64 *input, u64 *input21)
+{
+ u64 tmp[5];
+ memcpy(tmp, input, 5 * sizeof(*input));
+ {
+ u128 b4;
+ u128 b0;
+ u128 b4_;
+ u128 b0_;
+ u64 i0;
+ u64 i1;
+ u64 i0_;
+ u64 i1_;
+ u128 t[5] = { 0 };
+ fmul_mul_shift_reduce_(t, tmp, input21);
+ fproduct_carry_wide_(t);
+ b4 = t[4];
+ b0 = t[0];
+ b4_ = ((b4) & (((u128)(0x7ffffffffffffLLU))));
+ b0_ = ((b0) + (((u128)(19) * (((u64)(((b4) >> (51))))))));
+ t[4] = b4_;
+ t[0] = b0_;
+ fproduct_copy_from_wide_(output, t);
+ i0 = output[0];
+ i1 = output[1];
+ i0_ = i0 & 0x7ffffffffffffLLU;
+ i1_ = i1 + (i0 >> 51);
+ output[0] = i0_;
+ output[1] = i1_;
+ }
+}
+
+static __always_inline void fsquare_fsquare__(u128 *tmp, u64 *output)
+{
+ u64 r0 = output[0];
+ u64 r1 = output[1];
+ u64 r2 = output[2];
+ u64 r3 = output[3];
+ u64 r4 = output[4];
+ u64 d0 = r0 * 2;
+ u64 d1 = r1 * 2;
+ u64 d2 = r2 * 2 * 19;
+ u64 d419 = r4 * 19;
+ u64 d4 = d419 * 2;
+ u128 s0 = ((((((u128)(r0) * (r0))) + (((u128)(d4) * (r1))))) + (((u128)(d2) * (r3))));
+ u128 s1 = ((((((u128)(d0) * (r1))) + (((u128)(d4) * (r2))))) + (((u128)(r3 * 19) * (r3))));
+ u128 s2 = ((((((u128)(d0) * (r2))) + (((u128)(r1) * (r1))))) + (((u128)(d4) * (r3))));
+ u128 s3 = ((((((u128)(d0) * (r3))) + (((u128)(d1) * (r2))))) + (((u128)(r4) * (d419))));
+ u128 s4 = ((((((u128)(d0) * (r4))) + (((u128)(d1) * (r3))))) + (((u128)(r2) * (r2))));
+ tmp[0] = s0;
+ tmp[1] = s1;
+ tmp[2] = s2;
+ tmp[3] = s3;
+ tmp[4] = s4;
+}
+
+static __always_inline void fsquare_fsquare_(u128 *tmp, u64 *output)
+{
+ u128 b4;
+ u128 b0;
+ u128 b4_;
+ u128 b0_;
+ u64 i0;
+ u64 i1;
+ u64 i0_;
+ u64 i1_;
+ fsquare_fsquare__(tmp, output);
+ fproduct_carry_wide_(tmp);
+ b4 = tmp[4];
+ b0 = tmp[0];
+ b4_ = ((b4) & (((u128)(0x7ffffffffffffLLU))));
+ b0_ = ((b0) + (((u128)(19) * (((u64)(((b4) >> (51))))))));
+ tmp[4] = b4_;
+ tmp[0] = b0_;
+ fproduct_copy_from_wide_(output, tmp);
+ i0 = output[0];
+ i1 = output[1];
+ i0_ = i0 & 0x7ffffffffffffLLU;
+ i1_ = i1 + (i0 >> 51);
+ output[0] = i0_;
+ output[1] = i1_;
+}
+
+static __always_inline void fsquare_fsquare_times_(u64 *output, u128 *tmp, u32 count1)
+{
+ u32 i;
+ fsquare_fsquare_(tmp, output);
+ for (i = 1; i < count1; ++i)
+ fsquare_fsquare_(tmp, output);
+}
+
+static __always_inline void fsquare_fsquare_times(u64 *output, u64 *input, u32 count1)
+{
+ u128 t[5];
+ memcpy(output, input, 5 * sizeof(*input));
+ fsquare_fsquare_times_(output, t, count1);
+}
+
+static __always_inline void fsquare_fsquare_times_inplace(u64 *output, u32 count1)
+{
+ u128 t[5];
+ fsquare_fsquare_times_(output, t, count1);
+}
+
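+/* Computes 1/z = z^(2^255 - 21) mod p via a fixed square-and-multiply
+ * addition chain (Fermat inversion), so the running time does not depend
+ * on z.
+ */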
+static __always_inline void crecip_crecip(u64 *out, u64 *z)
+{
+ u64 buf[20] = { 0 };
+ u64 *a0 = buf;
+ u64 *t00 = buf + 5;
+ u64 *b0 = buf + 10;
+ u64 *t01;
+ u64 *b1;
+ u64 *c0;
+ u64 *a;
+ u64 *t0;
+ u64 *b;
+ u64 *c;
+ fsquare_fsquare_times(a0, z, 1);
+ fsquare_fsquare_times(t00, a0, 2);
+ fmul_fmul(b0, t00, z);
+ fmul_fmul(a0, b0, a0);
+ fsquare_fsquare_times(t00, a0, 1);
+ fmul_fmul(b0, t00, b0);
+ fsquare_fsquare_times(t00, b0, 5);
+ t01 = buf + 5;
+ b1 = buf + 10;
+ c0 = buf + 15;
+ fmul_fmul(b1, t01, b1);
+ fsquare_fsquare_times(t01, b1, 10);
+ fmul_fmul(c0, t01, b1);
+ fsquare_fsquare_times(t01, c0, 20);
+ fmul_fmul(t01, t01, c0);
+ fsquare_fsquare_times_inplace(t01, 10);
+ fmul_fmul(b1, t01, b1);
+ fsquare_fsquare_times(t01, b1, 50);
+ a = buf;
+ t0 = buf + 5;
+ b = buf + 10;
+ c = buf + 15;
+ fmul_fmul(c, t0, b);
+ fsquare_fsquare_times(t0, c, 100);
+ fmul_fmul(t0, t0, c);
+ fsquare_fsquare_times_inplace(t0, 50);
+ fmul_fmul(t0, t0, b);
+ fsquare_fsquare_times_inplace(t0, 5);
+ fmul_fmul(out, t0, a);
+}
+
+static __always_inline void fsum(u64 *a, u64 *b)
+{
+ a[0] += b[0];
+ a[1] += b[1];
+ a[2] += b[2];
+ a[3] += b[3];
+ a[4] += b[4];
+}
+
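+/* Computes a := b - a. A multiple of p (here 8p) is added to b's limbs
+ * first so that each per-limb subtraction cannot underflow.
+ */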
+static __always_inline void fdifference(u64 *a, u64 *b)
+{
+ u64 tmp[5] = { 0 };
+ u64 b0;
+ u64 b1;
+ u64 b2;
+ u64 b3;
+ u64 b4;
+ memcpy(tmp, b, 5 * sizeof(*b));
+ b0 = tmp[0];
+ b1 = tmp[1];
+ b2 = tmp[2];
+ b3 = tmp[3];
+ b4 = tmp[4];
+ tmp[0] = b0 + 0x3fffffffffff68LLU;
+ tmp[1] = b1 + 0x3ffffffffffff8LLU;
+ tmp[2] = b2 + 0x3ffffffffffff8LLU;
+ tmp[3] = b3 + 0x3ffffffffffff8LLU;
+ tmp[4] = b4 + 0x3ffffffffffff8LLU;
+ {
+ u64 xi = a[0];
+ u64 yi = tmp[0];
+ a[0] = yi - xi;
+ }
+ {
+ u64 xi = a[1];
+ u64 yi = tmp[1];
+ a[1] = yi - xi;
+ }
+ {
+ u64 xi = a[2];
+ u64 yi = tmp[2];
+ a[2] = yi - xi;
+ }
+ {
+ u64 xi = a[3];
+ u64 yi = tmp[3];
+ a[3] = yi - xi;
+ }
+ {
+ u64 xi = a[4];
+ u64 yi = tmp[4];
+ a[4] = yi - xi;
+ }
+}
+
+static __always_inline void fscalar(u64 *output, u64 *b, u64 s)
+{
+ u128 tmp[5];
+ u128 b4;
+ u128 b0;
+ u128 b4_;
+ u128 b0_;
+ {
+ u64 xi = b[0];
+ tmp[0] = ((u128)(xi) * (s));
+ }
+ {
+ u64 xi = b[1];
+ tmp[1] = ((u128)(xi) * (s));
+ }
+ {
+ u64 xi = b[2];
+ tmp[2] = ((u128)(xi) * (s));
+ }
+ {
+ u64 xi = b[3];
+ tmp[3] = ((u128)(xi) * (s));
+ }
+ {
+ u64 xi = b[4];
+ tmp[4] = ((u128)(xi) * (s));
+ }
+ fproduct_carry_wide_(tmp);
+ b4 = tmp[4];
+ b0 = tmp[0];
+ b4_ = ((b4) & (((u128)(0x7ffffffffffffLLU))));
+ b0_ = ((b0) + (((u128)(19) * (((u64)(((b4) >> (51))))))));
+ tmp[4] = b4_;
+ tmp[0] = b0_;
+ fproduct_copy_from_wide_(output, tmp);
+}
+
+static __always_inline void fmul(u64 *output, u64 *a, u64 *b)
+{
+ fmul_fmul(output, a, b);
+}
+
+static __always_inline void crecip(u64 *output, u64 *input)
+{
+ crecip_crecip(output, input);
+}
+
+static __always_inline void point_swap_conditional_step(u64 *a, u64 *b, u64 swap1, u32 ctr)
+{
+ u32 i = ctr - 1;
+ u64 ai = a[i];
+ u64 bi = b[i];
+ u64 x = swap1 & (ai ^ bi);
+ u64 ai1 = ai ^ x;
+ u64 bi1 = bi ^ x;
+ a[i] = ai1;
+ b[i] = bi1;
+}
+
+static __always_inline void point_swap_conditional5(u64 *a, u64 *b, u64 swap1)
+{
+ point_swap_conditional_step(a, b, swap1, 5);
+ point_swap_conditional_step(a, b, swap1, 4);
+ point_swap_conditional_step(a, b, swap1, 3);
+ point_swap_conditional_step(a, b, swap1, 2);
+ point_swap_conditional_step(a, b, swap1, 1);
+}
+
+static __always_inline void point_swap_conditional(u64 *a, u64 *b, u64 iswap)
+{
+ u64 swap1 = 0 - iswap;
+ point_swap_conditional5(a, b, swap1);
+ point_swap_conditional5(a + 5, b + 5, swap1);
+}
+
+static __always_inline void point_copy(u64 *output, u64 *input)
+{
+ memcpy(output, input, 5 * sizeof(*input));
+ memcpy(output + 5, input + 5, 5 * sizeof(*input));
+}
+
+static __always_inline void addanddouble_fmonty(u64 *pp, u64 *ppq, u64 *p, u64 *pq, u64 *qmqp)
+{
+ u64 *qx = qmqp;
+ u64 *x2 = pp;
+ u64 *z2 = pp + 5;
+ u64 *x3 = ppq;
+ u64 *z3 = ppq + 5;
+ u64 *x = p;
+ u64 *z = p + 5;
+ u64 *xprime = pq;
+ u64 *zprime = pq + 5;
+ u64 buf[40] = { 0 };
+ u64 *origx = buf;
+ u64 *origxprime0 = buf + 5;
+ u64 *xxprime0;
+ u64 *zzprime0;
+ u64 *origxprime;
+ xxprime0 = buf + 25;
+ zzprime0 = buf + 30;
+ memcpy(origx, x, 5 * sizeof(*x));
+ fsum(x, z);
+ fdifference(z, origx);
+ memcpy(origxprime0, xprime, 5 * sizeof(*xprime));
+ fsum(xprime, zprime);
+ fdifference(zprime, origxprime0);
+ fmul(xxprime0, xprime, z);
+ fmul(zzprime0, x, zprime);
+ origxprime = buf + 5;
+ {
+ u64 *xx0;
+ u64 *zz0;
+ u64 *xxprime;
+ u64 *zzprime;
+ u64 *zzzprime;
+ xx0 = buf + 15;
+ zz0 = buf + 20;
+ xxprime = buf + 25;
+ zzprime = buf + 30;
+ zzzprime = buf + 35;
+ memcpy(origxprime, xxprime, 5 * sizeof(*xxprime));
+ fsum(xxprime, zzprime);
+ fdifference(zzprime, origxprime);
+ fsquare_fsquare_times(x3, xxprime, 1);
+ fsquare_fsquare_times(zzzprime, zzprime, 1);
+ fmul(z3, zzzprime, qx);
+ fsquare_fsquare_times(xx0, x, 1);
+ fsquare_fsquare_times(zz0, z, 1);
+ {
+ u64 *zzz;
+ u64 *xx;
+ u64 *zz;
+ u64 scalar;
+ zzz = buf + 10;
+ xx = buf + 15;
+ zz = buf + 20;
+ fmul(x2, xx, zz);
+ fdifference(zz, xx);
+ scalar = 121665;
+ fscalar(zzz, zz, scalar);
+ fsum(zzz, xx);
+ fmul(z2, zzz, zz);
+ }
+ }
+}
+
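+/* One step of the Montgomery ladder: conditionally swap the two working
+ * points according to the current scalar bit (the top bit of byt), perform
+ * the combined differential add-and-double, then swap back.
+ */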
+static __always_inline void ladder_smallloop_cmult_small_loop_step(u64 *nq, u64 *nqpq, u64 *nq2, u64 *nqpq2, u64 *q, u8 byt)
+{
+ u64 bit0 = (u64)(byt >> 7);
+ u64 bit;
+ point_swap_conditional(nq, nqpq, bit0);
+ addanddouble_fmonty(nq2, nqpq2, nq, nqpq, q);
+ bit = (u64)(byt >> 7);
+ point_swap_conditional(nq2, nqpq2, bit);
+}
+
+static __always_inline void ladder_smallloop_cmult_small_loop_double_step(u64 *nq, u64 *nqpq, u64 *nq2, u64 *nqpq2, u64 *q, u8 byt)
+{
+ u8 byt1;
+ ladder_smallloop_cmult_small_loop_step(nq, nqpq, nq2, nqpq2, q, byt);
+ byt1 = byt << 1;
+ ladder_smallloop_cmult_small_loop_step(nq2, nqpq2, nq, nqpq, q, byt1);
+}
+
+static __always_inline void ladder_smallloop_cmult_small_loop(u64 *nq, u64 *nqpq, u64 *nq2, u64 *nqpq2, u64 *q, u8 byt, u32 i)
+{
+ while (i--) {
+ ladder_smallloop_cmult_small_loop_double_step(nq, nqpq, nq2, nqpq2, q, byt);
+ byt <<= 2;
+ }
+}
+
+static __always_inline void ladder_bigloop_cmult_big_loop(u8 *n1, u64 *nq, u64 *nqpq, u64 *nq2, u64 *nqpq2, u64 *q, u32 i)
+{
+ while (i--) {
+ u8 byte = n1[i];
+ ladder_smallloop_cmult_small_loop(nq, nqpq, nq2, nqpq2, q, byte, 4);
+ }
+}
+
+static __always_inline void ladder_cmult(u64 *result, u8 *n1, u64 *q)
+{
+ u64 point_buf[40] = { 0 };
+ u64 *nq = point_buf;
+ u64 *nqpq = point_buf + 10;
+ u64 *nq2 = point_buf + 20;
+ u64 *nqpq2 = point_buf + 30;
+ point_copy(nqpq, q);
+ nq[0] = 1;
+ ladder_bigloop_cmult_big_loop(n1, nq, nqpq, nq2, nqpq2, q, 32);
+ point_copy(result, nq);
+}
+
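+/* Expands the 32-byte little-endian input into five 51-bit limbs using
+ * overlapping 8-byte loads; the final shift-and-mask discards bit 255 of
+ * the input, per RFC 7748's u-coordinate decoding.
+ */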
+static __always_inline void format_fexpand(u64 *output, const u8 *input)
+{
+ const u8 *x00 = input + 6;
+ const u8 *x01 = input + 12;
+ const u8 *x02 = input + 19;
+ const u8 *x0 = input + 24;
+ u64 i0, i1, i2, i3, i4, output0, output1, output2, output3, output4;
+ i0 = le64_to_cpup((__force __le64 *)input);
+ i1 = le64_to_cpup((__force __le64 *)x00);
+ i2 = le64_to_cpup((__force __le64 *)x01);
+ i3 = le64_to_cpup((__force __le64 *)x02);
+ i4 = le64_to_cpup((__force __le64 *)x0);
+ output0 = i0 & 0x7ffffffffffffLLU;
+ output1 = i1 >> 3 & 0x7ffffffffffffLLU;
+ output2 = i2 >> 6 & 0x7ffffffffffffLLU;
+ output3 = i3 >> 1 & 0x7ffffffffffffLLU;
+ output4 = i4 >> 12 & 0x7ffffffffffffLLU;
+ output[0] = output0;
+ output[1] = output1;
+ output[2] = output2;
+ output[3] = output3;
+ output[4] = output4;
+}
+
+static __always_inline void format_fcontract_first_carry_pass(u64 *input)
+{
+ u64 t0 = input[0];
+ u64 t1 = input[1];
+ u64 t2 = input[2];
+ u64 t3 = input[3];
+ u64 t4 = input[4];
+ u64 t1_ = t1 + (t0 >> 51);
+ u64 t0_ = t0 & 0x7ffffffffffffLLU;
+ u64 t2_ = t2 + (t1_ >> 51);
+ u64 t1__ = t1_ & 0x7ffffffffffffLLU;
+ u64 t3_ = t3 + (t2_ >> 51);
+ u64 t2__ = t2_ & 0x7ffffffffffffLLU;
+ u64 t4_ = t4 + (t3_ >> 51);
+ u64 t3__ = t3_ & 0x7ffffffffffffLLU;
+ input[0] = t0_;
+ input[1] = t1__;
+ input[2] = t2__;
+ input[3] = t3__;
+ input[4] = t4_;
+}
+
+static __always_inline void format_fcontract_first_carry_full(u64 *input)
+{
+ format_fcontract_first_carry_pass(input);
+ modulo_carry_top(input);
+}
+
+static __always_inline void format_fcontract_second_carry_pass(u64 *input)
+{
+ u64 t0 = input[0];
+ u64 t1 = input[1];
+ u64 t2 = input[2];
+ u64 t3 = input[3];
+ u64 t4 = input[4];
+ u64 t1_ = t1 + (t0 >> 51);
+ u64 t0_ = t0 & 0x7ffffffffffffLLU;
+ u64 t2_ = t2 + (t1_ >> 51);
+ u64 t1__ = t1_ & 0x7ffffffffffffLLU;
+ u64 t3_ = t3 + (t2_ >> 51);
+ u64 t2__ = t2_ & 0x7ffffffffffffLLU;
+ u64 t4_ = t4 + (t3_ >> 51);
+ u64 t3__ = t3_ & 0x7ffffffffffffLLU;
+ input[0] = t0_;
+ input[1] = t1__;
+ input[2] = t2__;
+ input[3] = t3__;
+ input[4] = t4_;
+}
+
+static __always_inline void format_fcontract_second_carry_full(u64 *input)
+{
+ u64 i0;
+ u64 i1;
+ u64 i0_;
+ u64 i1_;
+ format_fcontract_second_carry_pass(input);
+ modulo_carry_top(input);
+ i0 = input[0];
+ i1 = input[1];
+ i0_ = i0 & 0x7ffffffffffffLLU;
+ i1_ = i1 + (i0 >> 51);
+ input[0] = i0_;
+ input[1] = i1_;
+}
+
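+/* Final constant-time reduction: subtracts p = 2^255 - 19 exactly when the
+ * value lies in [p, 2p), selected by comparison masks rather than branches.
+ */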
+static __always_inline void format_fcontract_trim(u64 *input)
+{
+ u64 a0 = input[0];
+ u64 a1 = input[1];
+ u64 a2 = input[2];
+ u64 a3 = input[3];
+ u64 a4 = input[4];
+ u64 mask0 = u64_gte_mask(a0, 0x7ffffffffffedLLU);
+ u64 mask1 = u64_eq_mask(a1, 0x7ffffffffffffLLU);
+ u64 mask2 = u64_eq_mask(a2, 0x7ffffffffffffLLU);
+ u64 mask3 = u64_eq_mask(a3, 0x7ffffffffffffLLU);
+ u64 mask4 = u64_eq_mask(a4, 0x7ffffffffffffLLU);
+ u64 mask = (((mask0 & mask1) & mask2) & mask3) & mask4;
+ u64 a0_ = a0 - (0x7ffffffffffedLLU & mask);
+ u64 a1_ = a1 - (0x7ffffffffffffLLU & mask);
+ u64 a2_ = a2 - (0x7ffffffffffffLLU & mask);
+ u64 a3_ = a3 - (0x7ffffffffffffLLU & mask);
+ u64 a4_ = a4 - (0x7ffffffffffffLLU & mask);
+ input[0] = a0_;
+ input[1] = a1_;
+ input[2] = a2_;
+ input[3] = a3_;
+ input[4] = a4_;
+}
+
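+/* Packs five 51-bit limbs back into the 32-byte little-endian wire format
+ * as four 64-bit words.
+ */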
+static __always_inline void format_fcontract_store(u8 *output, u64 *input)
+{
+ u64 t0 = input[0];
+ u64 t1 = input[1];
+ u64 t2 = input[2];
+ u64 t3 = input[3];
+ u64 t4 = input[4];
+ u64 o0 = t1 << 51 | t0;
+ u64 o1 = t2 << 38 | t1 >> 13;
+ u64 o2 = t3 << 25 | t2 >> 26;
+ u64 o3 = t4 << 12 | t3 >> 39;
+ u8 *b0 = output;
+ u8 *b1 = output + 8;
+ u8 *b2 = output + 16;
+ u8 *b3 = output + 24;
+ *(__force __le64 *)b0 = cpu_to_le64(o0);
+ *(__force __le64 *)b1 = cpu_to_le64(o1);
+ *(__force __le64 *)b2 = cpu_to_le64(o2);
+ *(__force __le64 *)b3 = cpu_to_le64(o3);
+}
+
+static __always_inline void format_fcontract(u8 *output, u64 *input)
+{
+ format_fcontract_first_carry_full(input);
+ format_fcontract_second_carry_full(input);
+ format_fcontract_trim(input);
+ format_fcontract_store(output, input);
+}
+
+static __always_inline void format_scalar_of_point(u8 *scalar, u64 *point)
+{
+ u64 *x = point;
+ u64 *z = point + 5;
+ u64 buf[10] __aligned(32) = { 0 };
+ u64 *zmone = buf;
+ u64 *sc = buf + 5;
+ crecip(zmone, z);
+ fmul(sc, x, zmone);
+ format_fcontract(scalar, sc);
+}
+
+static void curve25519_generic(u8 mypublic[CURVE25519_POINT_SIZE], const u8 secret[CURVE25519_POINT_SIZE], const u8 basepoint[CURVE25519_POINT_SIZE])
+{
+ u64 buf0[10] __aligned(32) = { 0 };
+ u64 *x0 = buf0;
+ u64 *z = buf0 + 5;
+ u64 *q;
+ format_fexpand(x0, basepoint);
+ z[0] = 1;
+ q = buf0;
+ {
+ u8 e[32] __aligned(32) = { 0 };
+ u8 *scalar;
+ memcpy(e, secret, 32);
+ normalize_secret(e);
+ scalar = e;
+ {
+ u64 buf[15] = { 0 };
+ u64 *nq = buf;
+ u64 *x = nq;
+ x[0] = 1;
+ ladder_cmult(nq, scalar, q);
+ format_scalar_of_point(mypublic, nq);
+ memzero_explicit(buf, sizeof(buf));
+ }
+ memzero_explicit(e, sizeof(e));
+ }
+ memzero_explicit(buf0, sizeof(buf0));
+}
diff --git a/lib/zinc/curve25519/curve25519-x86_64.h b/lib/zinc/curve25519/curve25519-x86_64.h
new file mode 100644
index 000000000000..b1c3766702f2
--- /dev/null
+++ b/lib/zinc/curve25519/curve25519-x86_64.h
@@ -0,0 +1,2060 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (c) 2017 Armando Faz <armfazh@...unicamp.br>. All Rights Reserved.
+ * Copyright (C) 2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright (C) 2018 Samuel Neves <sneves@....uc.pt>. All Rights Reserved.
+ */
+
+#include <asm/cpufeature.h>
+#include <asm/processor.h>
+
+static bool curve25519_use_bmi2 __ro_after_init;
+static bool curve25519_use_adx __ro_after_init;
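+/* The ADX code path also relies on MULX, so it requires BMI2 as well. */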
+void __init curve25519_fpu_init(void)
+{
+ curve25519_use_bmi2 = boot_cpu_has(X86_FEATURE_BMI2);
+ curve25519_use_adx = boot_cpu_has(X86_FEATURE_BMI2) && boot_cpu_has(X86_FEATURE_ADX);
+}
+
+enum { NUM_WORDS_ELTFP25519 = 4 };
+typedef __aligned(32) u64 eltfp25519_1w[NUM_WORDS_ELTFP25519];
+typedef __aligned(32) u64 eltfp25519_1w_buffer[2 * NUM_WORDS_ELTFP25519];
+
+#define mul_eltfp25519_1w_adx(c, a, b) do { \
+ mul_256x256_integer_adx(m.buffer, a, b); \
+ red_eltfp25519_1w_adx(c, m.buffer); \
+} while (0)
+
+#define mul_eltfp25519_1w_bmi2(c, a, b) do { \
+ mul_256x256_integer_bmi2(m.buffer, a, b); \
+ red_eltfp25519_1w_bmi2(c, m.buffer); \
+} while (0)
+
+#define sqr_eltfp25519_1w_adx(a) do { \
+ sqr_256x256_integer_adx(m.buffer, a); \
+ red_eltfp25519_1w_adx(a, m.buffer); \
+} while (0)
+
+#define sqr_eltfp25519_1w_bmi2(a) do { \
+ sqr_256x256_integer_bmi2(m.buffer, a); \
+ red_eltfp25519_1w_bmi2(a, m.buffer); \
+} while (0)
+
+#define mul_eltfp25519_2w_adx(c, a, b) do { \
+ mul2_256x256_integer_adx(m.buffer, a, b); \
+ red_eltfp25519_2w_adx(c, m.buffer); \
+} while (0)
+
+#define mul_eltfp25519_2w_bmi2(c, a, b) do { \
+ mul2_256x256_integer_bmi2(m.buffer, a, b); \
+ red_eltfp25519_2w_bmi2(c, m.buffer); \
+} while (0)
+
+#define sqr_eltfp25519_2w_adx(a) do { \
+ sqr2_256x256_integer_adx(m.buffer, a); \
+ red_eltfp25519_2w_adx(a, m.buffer); \
+} while (0)
+
+#define sqr_eltfp25519_2w_bmi2(a) do { \
+ sqr2_256x256_integer_bmi2(m.buffer, a); \
+ red_eltfp25519_2w_bmi2(a, m.buffer); \
+} while (0)
+
+#define sqrn_eltfp25519_1w_adx(a, times) do { \
+ int ____counter = (times); \
+ while (____counter-- > 0) \
+ sqr_eltfp25519_1w_adx(a); \
+} while (0)
+
+#define sqrn_eltfp25519_1w_bmi2(a, times) do { \
+ int ____counter = (times); \
+ while (____counter-- > 0) \
+ sqr_eltfp25519_1w_bmi2(a); \
+} while (0)
+
+#define copy_eltfp25519_1w(C, A) do { \
+ (C)[0] = (A)[0]; \
+ (C)[1] = (A)[1]; \
+ (C)[2] = (A)[2]; \
+ (C)[3] = (A)[3]; \
+} while (0)
+
+#define setzero_eltfp25519_1w(C) do { \
+ (C)[0] = 0; \
+ (C)[1] = 0; \
+ (C)[2] = 0; \
+ (C)[3] = 0; \
+} while (0)
+
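+/* Precomputed constants for the fixed-base ladder (scalar multiplication of
+ * the canonical base point); each of the 252 entries is one 256-bit field
+ * element.
+ */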
+__aligned(32) static const u64 table_ladder_8k[252 * NUM_WORDS_ELTFP25519] = {
+ /* 1 */ 0xfffffffffffffff3UL, 0xffffffffffffffffUL, 0xffffffffffffffffUL, 0x5fffffffffffffffUL,
+ /* 2 */ 0x6b8220f416aafe96UL, 0x82ebeb2b4f566a34UL, 0xd5a9a5b075a5950fUL, 0x5142b2cf4b2488f4UL,
+ /* 3 */ 0x6aaebc750069680cUL, 0x89cf7820a0f99c41UL, 0x2a58d9183b56d0f4UL, 0x4b5aca80e36011a4UL,
+ /* 4 */ 0x329132348c29745dUL, 0xf4a2e616e1642fd7UL, 0x1e45bb03ff67bc34UL, 0x306912d0f42a9b4aUL,
+ /* 5 */ 0xff886507e6af7154UL, 0x04f50e13dfeec82fUL, 0xaa512fe82abab5ceUL, 0x174e251a68d5f222UL,
+ /* 6 */ 0xcf96700d82028898UL, 0x1743e3370a2c02c5UL, 0x379eec98b4e86eaaUL, 0x0c59888a51e0482eUL,
+ /* 7 */ 0xfbcbf1d699b5d189UL, 0xacaef0d58e9fdc84UL, 0xc1c20d06231f7614UL, 0x2938218da274f972UL,
+ /* 8 */ 0xf6af49beff1d7f18UL, 0xcc541c22387ac9c2UL, 0x96fcc9ef4015c56bUL, 0x69c1627c690913a9UL,
+ /* 9 */ 0x7a86fd2f4733db0eUL, 0xfdb8c4f29e087de9UL, 0x095e4b1a8ea2a229UL, 0x1ad7a7c829b37a79UL,
+ /* 10 */ 0x342d89cad17ea0c0UL, 0x67bedda6cced2051UL, 0x19ca31bf2bb42f74UL, 0x3df7b4c84980acbbUL,
+ /* 11 */ 0xa8c6444dc80ad883UL, 0xb91e440366e3ab85UL, 0xc215cda00164f6d8UL, 0x3d867c6ef247e668UL,
+ /* 12 */ 0xc7dd582bcc3e658cUL, 0xfd2c4748ee0e5528UL, 0xa0fd9b95cc9f4f71UL, 0x7529d871b0675ddfUL,
+ /* 13 */ 0xb8f568b42d3cbd78UL, 0x1233011b91f3da82UL, 0x2dce6ccd4a7c3b62UL, 0x75e7fc8e9e498603UL,
+ /* 14 */ 0x2f4f13f1fcd0b6ecUL, 0xf1a8ca1f29ff7a45UL, 0xc249c1a72981e29bUL, 0x6ebe0dbb8c83b56aUL,
+ /* 15 */ 0x7114fa8d170bb222UL, 0x65a2dcd5bf93935fUL, 0xbdc41f68b59c979aUL, 0x2f0eef79a2ce9289UL,
+ /* 16 */ 0x42ecbf0c083c37ceUL, 0x2930bc09ec496322UL, 0xf294b0c19cfeac0dUL, 0x3780aa4bedfabb80UL,
+ /* 17 */ 0x56c17d3e7cead929UL, 0xe7cb4beb2e5722c5UL, 0x0ce931732dbfe15aUL, 0x41b883c7621052f8UL,
+ /* 18 */ 0xdbf75ca0c3d25350UL, 0x2936be086eb1e351UL, 0xc936e03cb4a9b212UL, 0x1d45bf82322225aaUL,
+ /* 19 */ 0xe81ab1036a024cc5UL, 0xe212201c304c9a72UL, 0xc5d73fba6832b1fcUL, 0x20ffdb5a4d839581UL,
+ /* 20 */ 0xa283d367be5d0fadUL, 0x6c2b25ca8b164475UL, 0x9d4935467caaf22eUL, 0x5166408eee85ff49UL,
+ /* 21 */ 0x3c67baa2fab4e361UL, 0xb3e433c67ef35cefUL, 0x5259729241159b1cUL, 0x6a621892d5b0ab33UL,
+ /* 22 */ 0x20b74a387555cdcbUL, 0x532aa10e1208923fUL, 0xeaa17b7762281dd1UL, 0x61ab3443f05c44bfUL,
+ /* 23 */ 0x257a6c422324def8UL, 0x131c6c1017e3cf7fUL, 0x23758739f630a257UL, 0x295a407a01a78580UL,
+ /* 24 */ 0xf8c443246d5da8d9UL, 0x19d775450c52fa5dUL, 0x2afcfc92731bf83dUL, 0x7d10c8e81b2b4700UL,
+ /* 25 */ 0xc8e0271f70baa20bUL, 0x993748867ca63957UL, 0x5412efb3cb7ed4bbUL, 0x3196d36173e62975UL,
+ /* 26 */ 0xde5bcad141c7dffcUL, 0x47cc8cd2b395c848UL, 0xa34cd942e11af3cbUL, 0x0256dbf2d04ecec2UL,
+ /* 27 */ 0x875ab7e94b0e667fUL, 0xcad4dd83c0850d10UL, 0x47f12e8f4e72c79fUL, 0x5f1a87bb8c85b19bUL,
+ /* 28 */ 0x7ae9d0b6437f51b8UL, 0x12c7ce5518879065UL, 0x2ade09fe5cf77aeeUL, 0x23a05a2f7d2c5627UL,
+ /* 29 */ 0x5908e128f17c169aUL, 0xf77498dd8ad0852dUL, 0x74b4c4ceab102f64UL, 0x183abadd10139845UL,
+ /* 30 */ 0xb165ba8daa92aaacUL, 0xd5c5ef9599386705UL, 0xbe2f8f0cf8fc40d1UL, 0x2701e635ee204514UL,
+ /* 31 */ 0x629fa80020156514UL, 0xf223868764a8c1ceUL, 0x5b894fff0b3f060eUL, 0x60d9944cf708a3faUL,
+ /* 32 */ 0xaeea001a1c7a201fUL, 0xebf16a633ee2ce63UL, 0x6f7709594c7a07e1UL, 0x79b958150d0208cbUL,
+ /* 33 */ 0x24b55e5301d410e7UL, 0xe3a34edff3fdc84dUL, 0xd88768e4904032d8UL, 0x131384427b3aaeecUL,
+ /* 34 */ 0x8405e51286234f14UL, 0x14dc4739adb4c529UL, 0xb8a2b5b250634ffdUL, 0x2fe2a94ad8a7ff93UL,
+ /* 35 */ 0xec5c57efe843faddUL, 0x2843ce40f0bb9918UL, 0xa4b561d6cf3d6305UL, 0x743629bde8fb777eUL,
+ /* 36 */ 0x343edd46bbaf738fUL, 0xed981828b101a651UL, 0xa401760b882c797aUL, 0x1fc223e28dc88730UL,
+ /* 37 */ 0x48604e91fc0fba0eUL, 0xb637f78f052c6fa4UL, 0x91ccac3d09e9239cUL, 0x23f7eed4437a687cUL,
+ /* 38 */ 0x5173b1118d9bd800UL, 0x29d641b63189d4a7UL, 0xfdbf177988bbc586UL, 0x2959894fcad81df5UL,
+ /* 39 */ 0xaebc8ef3b4bbc899UL, 0x4148995ab26992b9UL, 0x24e20b0134f92cfbUL, 0x40d158894a05dee8UL,
+ /* 40 */ 0x46b00b1185af76f6UL, 0x26bac77873187a79UL, 0x3dc0bf95ab8fff5fUL, 0x2a608bd8945524d7UL,
+ /* 41 */ 0x26449588bd446302UL, 0x7c4bc21c0388439cUL, 0x8e98a4f383bd11b2UL, 0x26218d7bc9d876b9UL,
+ /* 42 */ 0xe3081542997c178aUL, 0x3c2d29a86fb6606fUL, 0x5c217736fa279374UL, 0x7dde05734afeb1faUL,
+ /* 43 */ 0x3bf10e3906d42babUL, 0xe4f7803e1980649cUL, 0xe6053bf89595bf7aUL, 0x394faf38da245530UL,
+ /* 44 */ 0x7a8efb58896928f4UL, 0xfbc778e9cc6a113cUL, 0x72670ce330af596fUL, 0x48f222a81d3d6cf7UL,
+ /* 45 */ 0xf01fce410d72caa7UL, 0x5a20ecc7213b5595UL, 0x7bc21165c1fa1483UL, 0x07f89ae31da8a741UL,
+ /* 46 */ 0x05d2c2b4c6830ff9UL, 0xd43e330fc6316293UL, 0xa5a5590a96d3a904UL, 0x705edb91a65333b6UL,
+ /* 47 */ 0x048ee15e0bb9a5f7UL, 0x3240cfca9e0aaf5dUL, 0x8f4b71ceedc4a40bUL, 0x621c0da3de544a6dUL,
+ /* 48 */ 0x92872836a08c4091UL, 0xce8375b010c91445UL, 0x8a72eb524f276394UL, 0x2667fcfa7ec83635UL,
+ /* 49 */ 0x7f4c173345e8752aUL, 0x061b47feee7079a5UL, 0x25dd9afa9f86ff34UL, 0x3780cef5425dc89cUL,
+ /* 50 */ 0x1a46035a513bb4e9UL, 0x3e1ef379ac575adaUL, 0xc78c5f1c5fa24b50UL, 0x321a967634fd9f22UL,
+ /* 51 */ 0x946707b8826e27faUL, 0x3dca84d64c506fd0UL, 0xc189218075e91436UL, 0x6d9284169b3b8484UL,
+ /* 52 */ 0x3a67e840383f2ddfUL, 0x33eec9a30c4f9b75UL, 0x3ec7c86fa783ef47UL, 0x26ec449fbac9fbc4UL,
+ /* 53 */ 0x5c0f38cba09b9e7dUL, 0x81168cc762a3478cUL, 0x3e23b0d306fc121cUL, 0x5a238aa0a5efdcddUL,
+ /* 54 */ 0x1ba26121c4ea43ffUL, 0x36f8c77f7c8832b5UL, 0x88fbea0b0adcf99aUL, 0x5ca9938ec25bebf9UL,
+ /* 55 */ 0xd5436a5e51fccda0UL, 0x1dbc4797c2cd893bUL, 0x19346a65d3224a08UL, 0x0f5034e49b9af466UL,
+ /* 56 */ 0xf23c3967a1e0b96eUL, 0xe58b08fa867a4d88UL, 0xfb2fabc6a7341679UL, 0x2a75381eb6026946UL,
+ /* 57 */ 0xc80a3be4c19420acUL, 0x66b1f6c681f2b6dcUL, 0x7cf7036761e93388UL, 0x25abbbd8a660a4c4UL,
+ /* 58 */ 0x91ea12ba14fd5198UL, 0x684950fc4a3cffa9UL, 0xf826842130f5ad28UL, 0x3ea988f75301a441UL,
+ /* 59 */ 0xc978109a695f8c6fUL, 0x1746eb4a0530c3f3UL, 0x444d6d77b4459995UL, 0x75952b8c054e5cc7UL,
+ /* 60 */ 0xa3703f7915f4d6aaUL, 0x66c346202f2647d8UL, 0xd01469df811d644bUL, 0x77fea47d81a5d71fUL,
+ /* 61 */ 0xc5e9529ef57ca381UL, 0x6eeeb4b9ce2f881aUL, 0xb6e91a28e8009bd6UL, 0x4b80be3e9afc3fecUL,
+ /* 62 */ 0x7e3773c526aed2c5UL, 0x1b4afcb453c9a49dUL, 0xa920bdd7baffb24dUL, 0x7c54699f122d400eUL,
+ /* 63 */ 0xef46c8e14fa94bc8UL, 0xe0b074ce2952ed5eUL, 0xbea450e1dbd885d5UL, 0x61b68649320f712cUL,
+ /* 64 */ 0x8a485f7309ccbdd1UL, 0xbd06320d7d4d1a2dUL, 0x25232973322dbef4UL, 0x445dc4758c17f770UL,
+ /* 65 */ 0xdb0434177cc8933cUL, 0xed6fe82175ea059fUL, 0x1efebefdc053db34UL, 0x4adbe867c65daf99UL,
+ /* 66 */ 0x3acd71a2a90609dfUL, 0xe5e991856dd04050UL, 0x1ec69b688157c23cUL, 0x697427f6885cfe4dUL,
+ /* 67 */ 0xd7be7b9b65e1a851UL, 0xa03d28d522c536ddUL, 0x28399d658fd2b645UL, 0x49e5b7e17c2641e1UL,
+ /* 68 */ 0x6f8c3a98700457a4UL, 0x5078f0a25ebb6778UL, 0xd13c3ccbc382960fUL, 0x2e003258a7df84b1UL,
+ /* 69 */ 0x8ad1f39be6296a1cUL, 0xc1eeaa652a5fbfb2UL, 0x33ee0673fd26f3cbUL, 0x59256173a69d2cccUL,
+ /* 70 */ 0x41ea07aa4e18fc41UL, 0xd9fc19527c87a51eUL, 0xbdaacb805831ca6fUL, 0x445b652dc916694fUL,
+ /* 71 */ 0xce92a3a7f2172315UL, 0x1edc282de11b9964UL, 0xa1823aafe04c314aUL, 0x790a2d94437cf586UL,
+ /* 72 */ 0x71c447fb93f6e009UL, 0x8922a56722845276UL, 0xbf70903b204f5169UL, 0x2f7a89891ba319feUL,
+ /* 73 */ 0x02a08eb577e2140cUL, 0xed9a4ed4427bdcf4UL, 0x5253ec44e4323cd1UL, 0x3e88363c14e9355bUL,
+ /* 74 */ 0xaa66c14277110b8cUL, 0x1ae0391610a23390UL, 0x2030bd12c93fc2a2UL, 0x3ee141579555c7abUL,
+ /* 75 */ 0x9214de3a6d6e7d41UL, 0x3ccdd88607f17efeUL, 0x674f1288f8e11217UL, 0x5682250f329f93d0UL,
+ /* 76 */ 0x6cf00b136d2e396eUL, 0x6e4cf86f1014debfUL, 0x5930b1b5bfcc4e83UL, 0x047069b48aba16b6UL,
+ /* 77 */ 0x0d4ce4ab69b20793UL, 0xb24db91a97d0fb9eUL, 0xcdfa50f54e00d01dUL, 0x221b1085368bddb5UL,
+ /* 78 */ 0xe7e59468b1e3d8d2UL, 0x53c56563bd122f93UL, 0xeee8a903e0663f09UL, 0x61efa662cbbe3d42UL,
+ /* 79 */ 0x2cf8ddddde6eab2aUL, 0x9bf80ad51435f231UL, 0x5deadacec9f04973UL, 0x29275b5d41d29b27UL,
+ /* 80 */ 0xcfde0f0895ebf14fUL, 0xb9aab96b054905a7UL, 0xcae80dd9a1c420fdUL, 0x0a63bf2f1673bbc7UL,
+ /* 81 */ 0x092f6e11958fbc8cUL, 0x672a81e804822fadUL, 0xcac8351560d52517UL, 0x6f3f7722c8f192f8UL,
+ /* 82 */ 0xf8ba90ccc2e894b7UL, 0x2c7557a438ff9f0dUL, 0x894d1d855ae52359UL, 0x68e122157b743d69UL,
+ /* 83 */ 0xd87e5570cfb919f3UL, 0x3f2cdecd95798db9UL, 0x2121154710c0a2ceUL, 0x3c66a115246dc5b2UL,
+ /* 84 */ 0xcbedc562294ecb72UL, 0xba7143c36a280b16UL, 0x9610c2efd4078b67UL, 0x6144735d946a4b1eUL,
+ /* 85 */ 0x536f111ed75b3350UL, 0x0211db8c2041d81bUL, 0xf93cb1000e10413cUL, 0x149dfd3c039e8876UL,
+ /* 86 */ 0xd479dde46b63155bUL, 0xb66e15e93c837976UL, 0xdafde43b1f13e038UL, 0x5fafda1a2e4b0b35UL,
+ /* 87 */ 0x3600bbdf17197581UL, 0x3972050bbe3cd2c2UL, 0x5938906dbdd5be86UL, 0x34fce5e43f9b860fUL,
+ /* 88 */ 0x75a8a4cd42d14d02UL, 0x828dabc53441df65UL, 0x33dcabedd2e131d3UL, 0x3ebad76fb814d25fUL,
+ /* 89 */ 0xd4906f566f70e10fUL, 0x5d12f7aa51690f5aUL, 0x45adb16e76cefcf2UL, 0x01f768aead232999UL,
+ /* 90 */ 0x2b6cc77b6248febdUL, 0x3cd30628ec3aaffdUL, 0xce1c0b80d4ef486aUL, 0x4c3bff2ea6f66c23UL,
+ /* 91 */ 0x3f2ec4094aeaeb5fUL, 0x61b19b286e372ca7UL, 0x5eefa966de2a701dUL, 0x23b20565de55e3efUL,
+ /* 92 */ 0xe301ca5279d58557UL, 0x07b2d4ce27c2874fUL, 0xa532cd8a9dcf1d67UL, 0x2a52fee23f2bff56UL,
+ /* 93 */ 0x8624efb37cd8663dUL, 0xbbc7ac20ffbd7594UL, 0x57b85e9c82d37445UL, 0x7b3052cb86a6ec66UL,
+ /* 94 */ 0x3482f0ad2525e91eUL, 0x2cb68043d28edca0UL, 0xaf4f6d052e1b003aUL, 0x185f8c2529781b0aUL,
+ /* 95 */ 0xaa41de5bd80ce0d6UL, 0x9407b2416853e9d6UL, 0x563ec36e357f4c3aUL, 0x4cc4b8dd0e297bceUL,
+ /* 96 */ 0xa2fc1a52ffb8730eUL, 0x1811f16e67058e37UL, 0x10f9a366cddf4ee1UL, 0x72f4a0c4a0b9f099UL,
+ /* 97 */ 0x8c16c06f663f4ea7UL, 0x693b3af74e970fbaUL, 0x2102e7f1d69ec345UL, 0x0ba53cbc968a8089UL,
+ /* 98 */ 0xca3d9dc7fea15537UL, 0x4c6824bb51536493UL, 0xb9886314844006b1UL, 0x40d2a72ab454cc60UL,
+ /* 99 */ 0x5936a1b712570975UL, 0x91b9d648debda657UL, 0x3344094bb64330eaUL, 0x006ba10d12ee51d0UL,
+ /* 100 */ 0x19228468f5de5d58UL, 0x0eb12f4c38cc05b0UL, 0xa1039f9dd5601990UL, 0x4502d4ce4fff0e0bUL,
+ /* 101 */ 0xeb2054106837c189UL, 0xd0f6544c6dd3b93cUL, 0x40727064c416d74fUL, 0x6e15c6114b502ef0UL,
+ /* 102 */ 0x4df2a398cfb1a76bUL, 0x11256c7419f2f6b1UL, 0x4a497962066e6043UL, 0x705b3aab41355b44UL,
+ /* 103 */ 0x365ef536d797b1d8UL, 0x00076bd622ddf0dbUL, 0x3bbf33b0e0575a88UL, 0x3777aa05c8e4ca4dUL,
+ /* 104 */ 0x392745c85578db5fUL, 0x6fda4149dbae5ae2UL, 0xb1f0b00b8adc9867UL, 0x09963437d36f1da3UL,
+ /* 105 */ 0x7e824e90a5dc3853UL, 0xccb5f6641f135cbdUL, 0x6736d86c87ce8fccUL, 0x625f3ce26604249fUL,
+ /* 106 */ 0xaf8ac8059502f63fUL, 0x0c05e70a2e351469UL, 0x35292e9c764b6305UL, 0x1a394360c7e23ac3UL,
+ /* 107 */ 0xd5c6d53251183264UL, 0x62065abd43c2b74fUL, 0xb5fbf5d03b973f9bUL, 0x13a3da3661206e5eUL,
+ /* 108 */ 0xc6bd5837725d94e5UL, 0x18e30912205016c5UL, 0x2088ce1570033c68UL, 0x7fba1f495c837987UL,
+ /* 109 */ 0x5a8c7423f2f9079dUL, 0x1735157b34023fc5UL, 0xe4f9b49ad2fab351UL, 0x6691ff72c878e33cUL,
+ /* 110 */ 0x122c2adedc5eff3eUL, 0xf8dd4bf1d8956cf4UL, 0xeb86205d9e9e5bdaUL, 0x049b92b9d975c743UL,
+ /* 111 */ 0xa5379730b0f6c05aUL, 0x72a0ffacc6f3a553UL, 0xb0032c34b20dcd6dUL, 0x470e9dbc88d5164aUL,
+ /* 112 */ 0xb19cf10ca237c047UL, 0xb65466711f6c81a2UL, 0xb3321bd16dd80b43UL, 0x48c14f600c5fbe8eUL,
+ /* 113 */ 0x66451c264aa6c803UL, 0xb66e3904a4fa7da6UL, 0xd45f19b0b3128395UL, 0x31602627c3c9bc10UL,
+ /* 114 */ 0x3120dc4832e4e10dUL, 0xeb20c46756c717f7UL, 0x00f52e3f67280294UL, 0x566d4fc14730c509UL,
+ /* 115 */ 0x7e3a5d40fd837206UL, 0xc1e926dc7159547aUL, 0x216730fba68d6095UL, 0x22e8c3843f69cea7UL,
+ /* 116 */ 0x33d074e8930e4b2bUL, 0xb6e4350e84d15816UL, 0x5534c26ad6ba2365UL, 0x7773c12f89f1f3f3UL,
+ /* 117 */ 0x8cba404da57962aaUL, 0x5b9897a81999ce56UL, 0x508e862f121692fcUL, 0x3a81907fa093c291UL,
+ /* 118 */ 0x0dded0ff4725a510UL, 0x10d8cc10673fc503UL, 0x5b9d151c9f1f4e89UL, 0x32a5c1d5cb09a44cUL,
+ /* 119 */ 0x1e0aa442b90541fbUL, 0x5f85eb7cc1b485dbUL, 0xbee595ce8a9df2e5UL, 0x25e496c722422236UL,
+ /* 120 */ 0x5edf3c46cd0fe5b9UL, 0x34e75a7ed2a43388UL, 0xe488de11d761e352UL, 0x0e878a01a085545cUL,
+ /* 121 */ 0xba493c77e021bb04UL, 0x2b4d1843c7df899aUL, 0x9ea37a487ae80d67UL, 0x67a9958011e41794UL,
+ /* 122 */ 0x4b58051a6697b065UL, 0x47e33f7d8d6ba6d4UL, 0xbb4da8d483ca46c1UL, 0x68becaa181c2db0dUL,
+ /* 123 */ 0x8d8980e90b989aa5UL, 0xf95eb14a2c93c99bUL, 0x51c6c7c4796e73a2UL, 0x6e228363b5efb569UL,
+ /* 124 */ 0xc6bbc0b02dd624c8UL, 0x777eb47dec8170eeUL, 0x3cde15a004cfafa9UL, 0x1dc6bc087160bf9bUL,
+ /* 125 */ 0x2e07e043eec34002UL, 0x18e9fc677a68dc7fUL, 0xd8da03188bd15b9aUL, 0x48fbc3bb00568253UL,
+ /* 126 */ 0x57547d4cfb654ce1UL, 0xd3565b82a058e2adUL, 0xf63eaf0bbf154478UL, 0x47531ef114dfbb18UL,
+ /* 127 */ 0xe1ec630a4278c587UL, 0x5507d546ca8e83f3UL, 0x85e135c63adc0c2bUL, 0x0aa7efa85682844eUL,
+ /* 128 */ 0x72691ba8b3e1f615UL, 0x32b4e9701fbe3ffaUL, 0x97b6d92e39bb7868UL, 0x2cfe53dea02e39e8UL,
+ /* 129 */ 0x687392cd85cd52b0UL, 0x27ff66c910e29831UL, 0x97134556a9832d06UL, 0x269bb0360a84f8a0UL,
+ /* 130 */ 0x706e55457643f85cUL, 0x3734a48c9b597d1bUL, 0x7aee91e8c6efa472UL, 0x5cd6abc198a9d9e0UL,
+ /* 131 */ 0x0e04de06cb3ce41aUL, 0xd8c6eb893402e138UL, 0x904659bb686e3772UL, 0x7215c371746ba8c8UL,
+ /* 132 */ 0xfd12a97eeae4a2d9UL, 0x9514b7516394f2c5UL, 0x266fd5809208f294UL, 0x5c847085619a26b9UL,
+ /* 133 */ 0x52985410fed694eaUL, 0x3c905b934a2ed254UL, 0x10bb47692d3be467UL, 0x063b3d2d69e5e9e1UL,
+ /* 134 */ 0x472726eedda57debUL, 0xefb6c4ae10f41891UL, 0x2b1641917b307614UL, 0x117c554fc4f45b7cUL,
+ /* 135 */ 0xc07cf3118f9d8812UL, 0x01dbd82050017939UL, 0xd7e803f4171b2827UL, 0x1015e87487d225eaUL,
+ /* 136 */ 0xc58de3fed23acc4dUL, 0x50db91c294a7be2dUL, 0x0b94d43d1c9cf457UL, 0x6b1640fa6e37524aUL,
+ /* 137 */ 0x692f346c5fda0d09UL, 0x200b1c59fa4d3151UL, 0xb8c46f760777a296UL, 0x4b38395f3ffdfbcfUL,
+ /* 138 */ 0x18d25e00be54d671UL, 0x60d50582bec8aba6UL, 0x87ad8f263b78b982UL, 0x50fdf64e9cda0432UL,
+ /* 139 */ 0x90f567aac578dcf0UL, 0xef1e9b0ef2a3133bUL, 0x0eebba9242d9de71UL, 0x15473c9bf03101c7UL,
+ /* 140 */ 0x7c77e8ae56b78095UL, 0xb678e7666e6f078eUL, 0x2da0b9615348ba1fUL, 0x7cf931c1ff733f0bUL,
+ /* 141 */ 0x26b357f50a0a366cUL, 0xe9708cf42b87d732UL, 0xc13aeea5f91cb2c0UL, 0x35d90c991143bb4cUL,
+ /* 142 */ 0x47c1c404a9a0d9dcUL, 0x659e58451972d251UL, 0x3875a8c473b38c31UL, 0x1fbd9ed379561f24UL,
+ /* 143 */ 0x11fabc6fd41ec28dUL, 0x7ef8dfe3cd2a2dcaUL, 0x72e73b5d8c404595UL, 0x6135fa4954b72f27UL,
+ /* 144 */ 0xccfc32a2de24b69cUL, 0x3f55698c1f095d88UL, 0xbe3350ed5ac3f929UL, 0x5e9bf806ca477eebUL,
+ /* 145 */ 0xe9ce8fb63c309f68UL, 0x5376f63565e1f9f4UL, 0xd1afcfb35a6393f1UL, 0x6632a1ede5623506UL,
+ /* 146 */ 0x0b7d6c390c2ded4cUL, 0x56cb3281df04cb1fUL, 0x66305a1249ecc3c7UL, 0x5d588b60a38ca72aUL,
+ /* 147 */ 0xa6ecbf78e8e5f42dUL, 0x86eeb44b3c8a3eecUL, 0xec219c48fbd21604UL, 0x1aaf1af517c36731UL,
+ /* 148 */ 0xc306a2836769bde7UL, 0x208280622b1e2adbUL, 0x8027f51ffbff94a6UL, 0x76cfa1ce1124f26bUL,
+ /* 149 */ 0x18eb00562422abb6UL, 0xf377c4d58f8c29c3UL, 0x4dbbc207f531561aUL, 0x0253b7f082128a27UL,
+ /* 150 */ 0x3d1f091cb62c17e0UL, 0x4860e1abd64628a9UL, 0x52d17436309d4253UL, 0x356f97e13efae576UL,
+ /* 151 */ 0xd351e11aa150535bUL, 0x3e6b45bb1dd878ccUL, 0x0c776128bed92c98UL, 0x1d34ae93032885b8UL,
+ /* 152 */ 0x4ba0488ca85ba4c3UL, 0x985348c33c9ce6ceUL, 0x66124c6f97bda770UL, 0x0f81a0290654124aUL,
+ /* 153 */ 0x9ed09ca6569b86fdUL, 0x811009fd18af9a2dUL, 0xff08d03f93d8c20aUL, 0x52a148199faef26bUL,
+ /* 154 */ 0x3e03f9dc2d8d1b73UL, 0x4205801873961a70UL, 0xc0d987f041a35970UL, 0x07aa1f15a1c0d549UL,
+ /* 155 */ 0xdfd46ce08cd27224UL, 0x6d0a024f934e4239UL, 0x808a7a6399897b59UL, 0x0a4556e9e13d95a2UL,
+ /* 156 */ 0xd21a991fe9c13045UL, 0x9b0e8548fe7751b8UL, 0x5da643cb4bf30035UL, 0x77db28d63940f721UL,
+ /* 157 */ 0xfc5eeb614adc9011UL, 0x5229419ae8c411ebUL, 0x9ec3e7787d1dcf74UL, 0x340d053e216e4cb5UL,
+ /* 158 */ 0xcac7af39b48df2b4UL, 0xc0faec2871a10a94UL, 0x140a69245ca575edUL, 0x0cf1c37134273a4cUL,
+ /* 159 */ 0xc8ee306ac224b8a5UL, 0x57eaee7ccb4930b0UL, 0xa1e806bdaacbe74fUL, 0x7d9a62742eeb657dUL,
+ /* 160 */ 0x9eb6b6ef546c4830UL, 0x885cca1fddb36e2eUL, 0xe6b9f383ef0d7105UL, 0x58654fef9d2e0412UL,
+ /* 161 */ 0xa905c4ffbe0e8e26UL, 0x942de5df9b31816eUL, 0x497d723f802e88e1UL, 0x30684dea602f408dUL,
+ /* 162 */ 0x21e5a278a3e6cb34UL, 0xaefb6e6f5b151dc4UL, 0xb30b8e049d77ca15UL, 0x28c3c9cf53b98981UL,
+ /* 163 */ 0x287fb721556cdd2aUL, 0x0d317ca897022274UL, 0x7468c7423a543258UL, 0x4a7f11464eb5642fUL,
+ /* 164 */ 0xa237a4774d193aa6UL, 0xd865986ea92129a1UL, 0x24c515ecf87c1a88UL, 0x604003575f39f5ebUL,
+ /* 165 */ 0x47b9f189570a9b27UL, 0x2b98cede465e4b78UL, 0x026df551dbb85c20UL, 0x74fcd91047e21901UL,
+ /* 166 */ 0x13e2a90a23c1bfa3UL, 0x0cb0074e478519f6UL, 0x5ff1cbbe3af6cf44UL, 0x67fe5438be812dbeUL,
+ /* 167 */ 0xd13cf64fa40f05b0UL, 0x054dfb2f32283787UL, 0x4173915b7f0d2aeaUL, 0x482f144f1f610d4eUL,
+ /* 168 */ 0xf6210201b47f8234UL, 0x5d0ae1929e70b990UL, 0xdcd7f455b049567cUL, 0x7e93d0f1f0916f01UL,
+ /* 169 */ 0xdd79cbf18a7db4faUL, 0xbe8391bf6f74c62fUL, 0x027145d14b8291bdUL, 0x585a73ea2cbf1705UL,
+ /* 170 */ 0x485ca03e928a0db2UL, 0x10fc01a5742857e7UL, 0x2f482edbd6d551a7UL, 0x0f0433b5048fdb8aUL,
+ /* 171 */ 0x60da2e8dd7dc6247UL, 0x88b4c9d38cd4819aUL, 0x13033ac001f66697UL, 0x273b24fe3b367d75UL,
+ /* 172 */ 0xc6e8f66a31b3b9d4UL, 0x281514a494df49d5UL, 0xd1726fdfc8b23da7UL, 0x4b3ae7d103dee548UL,
+ /* 173 */ 0xc6256e19ce4b9d7eUL, 0xff5c5cf186e3c61cUL, 0xacc63ca34b8ec145UL, 0x74621888fee66574UL,
+ /* 174 */ 0x956f409645290a1eUL, 0xef0bf8e3263a962eUL, 0xed6a50eb5ec2647bUL, 0x0694283a9dca7502UL,
+ /* 175 */ 0x769b963643a2dcd1UL, 0x42b7c8ea09fc5353UL, 0x4f002aee13397eabUL, 0x63005e2c19b7d63aUL,
+ /* 176 */ 0xca6736da63023beaUL, 0x966c7f6db12a99b7UL, 0xace09390c537c5e1UL, 0x0b696063a1aa89eeUL,
+ /* 177 */ 0xebb03e97288c56e5UL, 0x432a9f9f938c8be8UL, 0xa6a5a93d5b717f71UL, 0x1a5fb4c3e18f9d97UL,
+ /* 178 */ 0x1c94e7ad1c60cdceUL, 0xee202a43fc02c4a0UL, 0x8dafe4d867c46a20UL, 0x0a10263c8ac27b58UL,
+ /* 179 */ 0xd0dea9dfe4432a4aUL, 0x856af87bbe9277c5UL, 0xce8472acc212c71aUL, 0x6f151b6d9bbb1e91UL,
+ /* 180 */ 0x26776c527ceed56aUL, 0x7d211cb7fbf8faecUL, 0x37ae66a6fd4609ccUL, 0x1f81b702d2770c42UL,
+ /* 181 */ 0x2fb0b057eac58392UL, 0xe1dd89fe29744e9dUL, 0xc964f8eb17beb4f8UL, 0x29571073c9a2d41eUL,
+ /* 182 */ 0xa948a18981c0e254UL, 0x2df6369b65b22830UL, 0xa33eb2d75fcfd3c6UL, 0x078cd6ec4199a01fUL,
+ /* 183 */ 0x4a584a41ad900d2fUL, 0x32142b78e2c74c52UL, 0x68c4e8338431c978UL, 0x7f69ea9008689fc2UL,
+ /* 184 */ 0x52f2c81e46a38265UL, 0xfd78072d04a832fdUL, 0x8cd7d5fa25359e94UL, 0x4de71b7454cc29d2UL,
+ /* 185 */ 0x42eb60ad1eda6ac9UL, 0x0aad37dfdbc09c3aUL, 0x81004b71e33cc191UL, 0x44e6be345122803cUL,
+ /* 186 */ 0x03fe8388ba1920dbUL, 0xf5d57c32150db008UL, 0x49c8c4281af60c29UL, 0x21edb518de701aeeUL,
+ /* 187 */ 0x7fb63e418f06dc99UL, 0xa4460d99c166d7b8UL, 0x24dd5248ce520a83UL, 0x5ec3ad712b928358UL,
+ /* 188 */ 0x15022a5fbd17930fUL, 0xa4f64a77d82570e3UL, 0x12bc8d6915783712UL, 0x498194c0fc620abbUL,
+ /* 189 */ 0x38a2d9d255686c82UL, 0x785c6bd9193e21f0UL, 0xe4d5c81ab24a5484UL, 0x56307860b2e20989UL,
+ /* 190 */ 0x429d55f78b4d74c4UL, 0x22f1834643350131UL, 0x1e60c24598c71fffUL, 0x59f2f014979983efUL,
+ /* 191 */ 0x46a47d56eb494a44UL, 0x3e22a854d636a18eUL, 0xb346e15274491c3bUL, 0x2ceafd4e5390cde7UL,
+ /* 192 */ 0xba8a8538be0d6675UL, 0x4b9074bb50818e23UL, 0xcbdab89085d304c3UL, 0x61a24fe0e56192c4UL,
+ /* 193 */ 0xcb7615e6db525bcbUL, 0xdd7d8c35a567e4caUL, 0xe6b4153acafcdd69UL, 0x2d668e097f3c9766UL,
+ /* 194 */ 0xa57e7e265ce55ef0UL, 0x5d9f4e527cd4b967UL, 0xfbc83606492fd1e5UL, 0x090d52beb7c3f7aeUL,
+ /* 195 */ 0x09b9515a1e7b4d7cUL, 0x1f266a2599da44c0UL, 0xa1c49548e2c55504UL, 0x7ef04287126f15ccUL,
+ /* 196 */ 0xfed1659dbd30ef15UL, 0x8b4ab9eec4e0277bUL, 0x884d6236a5df3291UL, 0x1fd96ea6bf5cf788UL,
+ /* 197 */ 0x42a161981f190d9aUL, 0x61d849507e6052c1UL, 0x9fe113bf285a2cd5UL, 0x7c22d676dbad85d8UL,
+ /* 198 */ 0x82e770ed2bfbd27dUL, 0x4c05b2ece996f5a5UL, 0xcd40a9c2b0900150UL, 0x5895319213d9bf64UL,
+ /* 199 */ 0xe7cc5d703fea2e08UL, 0xb50c491258e2188cUL, 0xcce30baa48205bf0UL, 0x537c659ccfa32d62UL,
+ /* 200 */ 0x37b6623a98cfc088UL, 0xfe9bed1fa4d6aca4UL, 0x04d29b8e56a8d1b0UL, 0x725f71c40b519575UL,
+ /* 201 */ 0x28c7f89cd0339ce6UL, 0x8367b14469ddc18bUL, 0x883ada83a6a1652cUL, 0x585f1974034d6c17UL,
+ /* 202 */ 0x89cfb266f1b19188UL, 0xe63b4863e7c35217UL, 0xd88c9da6b4c0526aUL, 0x3e035c9df0954635UL,
+ /* 203 */ 0xdd9d5412fb45de9dUL, 0xdd684532e4cff40dUL, 0x4b5c999b151d671cUL, 0x2d8c2cc811e7f690UL,
+ /* 204 */ 0x7f54be1d90055d40UL, 0xa464c5df464aaf40UL, 0x33979624f0e917beUL, 0x2c018dc527356b30UL,
+ /* 205 */ 0xa5415024e330b3d4UL, 0x73ff3d96691652d3UL, 0x94ec42c4ef9b59f1UL, 0x0747201618d08e5aUL,
+ /* 206 */ 0x4d6ca48aca411c53UL, 0x66415f2fcfa66119UL, 0x9c4dd40051e227ffUL, 0x59810bc09a02f7ebUL,
+ /* 207 */ 0x2a7eb171b3dc101dUL, 0x441c5ab99ffef68eUL, 0x32025c9b93b359eaUL, 0x5e8ce0a71e9d112fUL,
+ /* 208 */ 0xbfcccb92429503fdUL, 0xd271ba752f095d55UL, 0x345ead5e972d091eUL, 0x18c8df11a83103baUL,
+ /* 209 */ 0x90cd949a9aed0f4cUL, 0xc5d1f4cb6660e37eUL, 0xb8cac52d56c52e0bUL, 0x6e42e400c5808e0dUL,
+ /* 210 */ 0xa3b46966eeaefd23UL, 0x0c4f1f0be39ecdcaUL, 0x189dc8c9d683a51dUL, 0x51f27f054c09351bUL,
+ /* 211 */ 0x4c487ccd2a320682UL, 0x587ea95bb3df1c96UL, 0xc8ccf79e555cb8e8UL, 0x547dc829a206d73dUL,
+ /* 212 */ 0xb822a6cd80c39b06UL, 0xe96d54732000d4c6UL, 0x28535b6f91463b4dUL, 0x228f4660e2486e1dUL,
+ /* 213 */ 0x98799538de8d3abfUL, 0x8cd8330045ebca6eUL, 0x79952a008221e738UL, 0x4322e1a7535cd2bbUL,
+ /* 214 */ 0xb114c11819d1801cUL, 0x2016e4d84f3f5ec7UL, 0xdd0e2df409260f4cUL, 0x5ec362c0ae5f7266UL,
+ /* 215 */ 0xc0462b18b8b2b4eeUL, 0x7cc8d950274d1afbUL, 0xf25f7105436b02d2UL, 0x43bbf8dcbff9ccd3UL,
+ /* 216 */ 0xb6ad1767a039e9dfUL, 0xb0714da8f69d3583UL, 0x5e55fa18b42931f5UL, 0x4ed5558f33c60961UL,
+ /* 217 */ 0x1fe37901c647a5ddUL, 0x593ddf1f8081d357UL, 0x0249a4fd813fd7a6UL, 0x69acca274e9caf61UL,
+ /* 218 */ 0x047ba3ea330721c9UL, 0x83423fc20e7e1ea0UL, 0x1df4c0af01314a60UL, 0x09a62dab89289527UL,
+ /* 219 */ 0xa5b325a49cc6cb00UL, 0xe94b5dc654b56cb6UL, 0x3be28779adc994a0UL, 0x4296e8f8ba3a4aadUL,
+ /* 220 */ 0x328689761e451eabUL, 0x2e4d598bff59594aUL, 0x49b96853d7a7084aUL, 0x4980a319601420a8UL,
+ /* 221 */ 0x9565b9e12f552c42UL, 0x8a5318db7100fe96UL, 0x05c90b4d43add0d7UL, 0x538b4cd66a5d4edaUL,
+ /* 222 */ 0xf4e94fc3e89f039fUL, 0x592c9af26f618045UL, 0x08a36eb5fd4b9550UL, 0x25fffaf6c2ed1419UL,
+ /* 223 */ 0x34434459cc79d354UL, 0xeeecbfb4b1d5476bUL, 0xddeb34a061615d99UL, 0x5129cecceb64b773UL,
+ /* 224 */ 0xee43215894993520UL, 0x772f9c7cf14c0b3bUL, 0xd2e2fce306bedad5UL, 0x715f42b546f06a97UL,
+ /* 225 */ 0x434ecdceda5b5f1aUL, 0x0da17115a49741a9UL, 0x680bd77c73edad2eUL, 0x487c02354edd9041UL,
+ /* 226 */ 0xb8efeff3a70ed9c4UL, 0x56a32aa3e857e302UL, 0xdf3a68bd48a2a5a0UL, 0x07f650b73176c444UL,
+ /* 227 */ 0xe38b9b1626e0ccb1UL, 0x79e053c18b09fb36UL, 0x56d90319c9f94964UL, 0x1ca941e7ac9ff5c4UL,
+ /* 228 */ 0x49c4df29162fa0bbUL, 0x8488cf3282b33305UL, 0x95dfda14cabb437dUL, 0x3391f78264d5ad86UL,
+ /* 229 */ 0x729ae06ae2b5095dUL, 0xd58a58d73259a946UL, 0xe9834262d13921edUL, 0x27fedafaa54bb592UL,
+ /* 230 */ 0xa99dc5b829ad48bbUL, 0x5f025742499ee260UL, 0x802c8ecd5d7513fdUL, 0x78ceb3ef3f6dd938UL,
+ /* 231 */ 0xc342f44f8a135d94UL, 0x7b9edb44828cdda3UL, 0x9436d11a0537cfe7UL, 0x5064b164ec1ab4c8UL,
+ /* 232 */ 0x7020eccfd37eb2fcUL, 0x1f31ea3ed90d25fcUL, 0x1b930d7bdfa1bb34UL, 0x5344467a48113044UL,
+ /* 233 */ 0x70073170f25e6dfbUL, 0xe385dc1a50114cc8UL, 0x2348698ac8fc4f00UL, 0x2a77a55284dd40d8UL,
+ /* 234 */ 0xfe06afe0c98c6ce4UL, 0xc235df96dddfd6e4UL, 0x1428d01e33bf1ed3UL, 0x785768ec9300bdafUL,
+ /* 235 */ 0x9702e57a91deb63bUL, 0x61bdb8bfe5ce8b80UL, 0x645b426f3d1d58acUL, 0x4804a82227a557bcUL,
+ /* 236 */ 0x8e57048ab44d2601UL, 0x68d6501a4b3a6935UL, 0xc39c9ec3f9e1c293UL, 0x4172f257d4de63e2UL,
+ /* 237 */ 0xd368b450330c6401UL, 0x040d3017418f2391UL, 0x2c34bb6090b7d90dUL, 0x16f649228fdfd51fUL,
+ /* 238 */ 0xbea6818e2b928ef5UL, 0xe28ccf91cdc11e72UL, 0x594aaa68e77a36cdUL, 0x313034806c7ffd0fUL,
+ /* 239 */ 0x8a9d27ac2249bd65UL, 0x19a3b464018e9512UL, 0xc26ccff352b37ec7UL, 0x056f68341d797b21UL,
+ /* 240 */ 0x5e79d6757efd2327UL, 0xfabdbcb6553afe15UL, 0xd3e7222c6eaf5a60UL, 0x7046c76d4dae743bUL,
+ /* 241 */ 0x660be872b18d4a55UL, 0x19992518574e1496UL, 0xc103053a302bdcbbUL, 0x3ed8e9800b218e8eUL,
+ /* 242 */ 0x7b0b9239fa75e03eUL, 0xefe9fb684633c083UL, 0x98a35fbe391a7793UL, 0x6065510fe2d0fe34UL,
+ /* 243 */ 0x55cb668548abad0cUL, 0xb4584548da87e527UL, 0x2c43ecea0107c1ddUL, 0x526028809372de35UL,
+ /* 244 */ 0x3415c56af9213b1fUL, 0x5bee1a4d017e98dbUL, 0x13f6b105b5cf709bUL, 0x5ff20e3482b29ab6UL,
+ /* 245 */ 0x0aa29c75cc2e6c90UL, 0xfc7d73ca3a70e206UL, 0x899fc38fc4b5c515UL, 0x250386b124ffc207UL,
+ /* 246 */ 0x54ea28d5ae3d2b56UL, 0x9913149dd6de60ceUL, 0x16694fc58f06d6c1UL, 0x46b23975eb018fc7UL,
+ /* 247 */ 0x470a6a0fb4b7b4e2UL, 0x5d92475a8f7253deUL, 0xabeee5b52fbd3adbUL, 0x7fa20801a0806968UL,
+ /* 248 */ 0x76f3faf19f7714d2UL, 0xb3e840c12f4660c3UL, 0x0fb4cd8df212744eUL, 0x4b065a251d3a2dd2UL,
+ /* 249 */ 0x5cebde383d77cd4aUL, 0x6adf39df882c9cb1UL, 0xa2dd242eb09af759UL, 0x3147c0e50e5f6422UL,
+ /* 250 */ 0x164ca5101d1350dbUL, 0xf8d13479c33fc962UL, 0xe640ce4d13e5da08UL, 0x4bdee0c45061f8baUL,
+ /* 251 */ 0xd7c46dc1a4edb1c9UL, 0x5514d7b6437fd98aUL, 0x58942f6bb2a1c00bUL, 0x2dffb2ab1d70710eUL,
+ /* 252 */ 0xccdfcf2fc18b6d68UL, 0xa8ebcba8b7806167UL, 0x980697f95e2937e3UL, 0x02fbba1cd0126e8cUL
+};
+
+/* c is two 512-bit products: c0[0:7]=a0[0:3]*b0[0:3] and c1[8:15]=a1[4:7]*b1[4:7]
+ * a is two 256-bit integers: a0[0:3] and a1[4:7]
+ * b is two 256-bit integers: b0[0:3] and b1[4:7]
+ */
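+/* The ADX variant interleaves two independent carry chains using the
+ * adcx/adox instructions, keeping both additions in flight at once.
+ */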
+static void mul2_256x256_integer_adx(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ asm volatile(
+ "xorl %%r14d, %%r14d ;"
+ "movq (%1), %%rdx; " /* A[0] */
+ "mulx (%2), %%r8, %%r12; " /* A[0]*B[0] */
+ "xorl %%r10d, %%r10d ;"
+ "movq %%r8, (%0) ;"
+ "mulx 8(%2), %%r10, %%rax; " /* A[0]*B[1] */
+ "adox %%r10, %%r12 ;"
+ "mulx 16(%2), %%r8, %%rbx; " /* A[0]*B[2] */
+ "adox %%r8, %%rax ;"
+ "mulx 24(%2), %%r10, %%rcx; " /* A[0]*B[3] */
+ "adox %%r10, %%rbx ;"
+ /******************************************/
+ "adox %%r14, %%rcx ;"
+
+ "movq 8(%1), %%rdx; " /* A[1] */
+ "mulx (%2), %%r8, %%r9; " /* A[1]*B[0] */
+ "adox %%r12, %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[1]*B[1] */
+ "adox %%r10, %%r9 ;"
+ "adcx %%r9, %%rax ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[1]*B[2] */
+ "adox %%r8, %%r11 ;"
+ "adcx %%r11, %%rbx ;"
+ "mulx 24(%2), %%r10, %%r12; " /* A[1]*B[3] */
+ "adox %%r10, %%r13 ;"
+ "adcx %%r13, %%rcx ;"
+ /******************************************/
+ "adox %%r14, %%r12 ;"
+ "adcx %%r14, %%r12 ;"
+
+ "movq 16(%1), %%rdx; " /* A[2] */
+ "xorl %%r10d, %%r10d ;"
+ "mulx (%2), %%r8, %%r9; " /* A[2]*B[0] */
+ "adox %%rax, %%r8 ;"
+ "movq %%r8, 16(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[2]*B[1] */
+ "adox %%r10, %%r9 ;"
+ "adcx %%r9, %%rbx ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[2]*B[2] */
+ "adox %%r8, %%r11 ;"
+ "adcx %%r11, %%rcx ;"
+ "mulx 24(%2), %%r10, %%rax; " /* A[2]*B[3] */
+ "adox %%r10, %%r13 ;"
+ "adcx %%r13, %%r12 ;"
+ /******************************************/
+ "adox %%r14, %%rax ;"
+ "adcx %%r14, %%rax ;"
+
+ "movq 24(%1), %%rdx; " /* A[3] */
+ "xorl %%r10d, %%r10d ;"
+ "mulx (%2), %%r8, %%r9; " /* A[3]*B[0] */
+ "adox %%rbx, %%r8 ;"
+ "movq %%r8, 24(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[3]*B[1] */
+ "adox %%r10, %%r9 ;"
+ "adcx %%r9, %%rcx ;"
+ "movq %%rcx, 32(%0) ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[3]*B[2] */
+ "adox %%r8, %%r11 ;"
+ "adcx %%r11, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "mulx 24(%2), %%r10, %%rbx; " /* A[3]*B[3] */
+ "adox %%r10, %%r13 ;"
+ "adcx %%r13, %%rax ;"
+ "movq %%rax, 48(%0) ;"
+ /******************************************/
+ "adox %%r14, %%rbx ;"
+ "adcx %%r14, %%rbx ;"
+ "movq %%rbx, 56(%0) ;"
+
+ "movq 32(%1), %%rdx; " /* C[0] */
+ "mulx 32(%2), %%r8, %%r12; " /* C[0]*D[0] */
+ "xorl %%r10d, %%r10d ;"
+ "movq %%r8, 64(%0);"
+ "mulx 40(%2), %%r10, %%rax; " /* C[0]*D[1] */
+ "adox %%r10, %%r12 ;"
+ "mulx 48(%2), %%r8, %%rbx; " /* C[0]*D[2] */
+ "adox %%r8, %%rax ;"
+ "mulx 56(%2), %%r10, %%rcx; " /* C[0]*D[3] */
+ "adox %%r10, %%rbx ;"
+ /******************************************/
+ "adox %%r14, %%rcx ;"
+
+ "movq 40(%1), %%rdx; " /* C[1] */
+ "xorl %%r10d, %%r10d ;"
+ "mulx 32(%2), %%r8, %%r9; " /* C[1]*D[0] */
+ "adox %%r12, %%r8 ;"
+ "movq %%r8, 72(%0);"
+ "mulx 40(%2), %%r10, %%r11; " /* C[1]*D[1] */
+ "adox %%r10, %%r9 ;"
+ "adcx %%r9, %%rax ;"
+ "mulx 48(%2), %%r8, %%r13; " /* C[1]*D[2] */
+ "adox %%r8, %%r11 ;"
+ "adcx %%r11, %%rbx ;"
+ "mulx 56(%2), %%r10, %%r12; " /* C[1]*D[3] */
+ "adox %%r10, %%r13 ;"
+ "adcx %%r13, %%rcx ;"
+ /******************************************/
+ "adox %%r14, %%r12 ;"
+ "adcx %%r14, %%r12 ;"
+
+ "movq 48(%1), %%rdx; " /* C[2] */
+ "xorl %%r10d, %%r10d ;"
+ "mulx 32(%2), %%r8, %%r9; " /* C[2]*D[0] */
+ "adox %%rax, %%r8 ;"
+ "movq %%r8, 80(%0);"
+ "mulx 40(%2), %%r10, %%r11; " /* C[2]*D[1] */
+ "adox %%r10, %%r9 ;"
+ "adcx %%r9, %%rbx ;"
+ "mulx 48(%2), %%r8, %%r13; " /* C[2]*D[2] */
+ "adox %%r8, %%r11 ;"
+ "adcx %%r11, %%rcx ;"
+ "mulx 56(%2), %%r10, %%rax; " /* C[2]*D[3] */
+ "adox %%r10, %%r13 ;"
+ "adcx %%r13, %%r12 ;"
+ /******************************************/
+ "adox %%r14, %%rax ;"
+ "adcx %%r14, %%rax ;"
+
+ "movq 56(%1), %%rdx; " /* C[3] */
+ "xorl %%r10d, %%r10d ;"
+ "mulx 32(%2), %%r8, %%r9; " /* C[3]*D[0] */
+ "adox %%rbx, %%r8 ;"
+ "movq %%r8, 88(%0);"
+ "mulx 40(%2), %%r10, %%r11; " /* C[3]*D[1] */
+ "adox %%r10, %%r9 ;"
+ "adcx %%r9, %%rcx ;"
+ "movq %%rcx, 96(%0) ;"
+ "mulx 48(%2), %%r8, %%r13; " /* C[3]*D[2] */
+ "adox %%r8, %%r11 ;"
+ "adcx %%r11, %%r12 ;"
+ "movq %%r12, 104(%0) ;"
+ "mulx 56(%2), %%r10, %%rbx; " /* C[3]*D[3] */
+ "adox %%r10, %%r13 ;"
+ "adcx %%r13, %%rax ;"
+ "movq %%rax, 112(%0) ;"
+ /******************************************/
+ "adox %%r14, %%rbx ;"
+ "adcx %%r14, %%rbx ;"
+ "movq %%rbx, 120(%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(b)
+ : "memory", "cc", "%rax", "%rbx", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13", "%r14");
+}
+
+static void mul2_256x256_integer_bmi2(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ asm volatile(
+ "movq (%1), %%rdx; " /* A[0] */
+ "mulx (%2), %%r8, %%r12; " /* A[0]*B[0] */
+ "movq %%r8, (%0) ;"
+ "mulx 8(%2), %%r10, %%rax; " /* A[0]*B[1] */
+ "addq %%r10, %%r12 ;"
+ "mulx 16(%2), %%r8, %%rbx; " /* A[0]*B[2] */
+ "adcq %%r8, %%rax ;"
+ "mulx 24(%2), %%r10, %%rcx; " /* A[0]*B[3] */
+ "adcq %%r10, %%rbx ;"
+ /******************************************/
+ "adcq $0, %%rcx ;"
+
+ "movq 8(%1), %%rdx; " /* A[1] */
+ "mulx (%2), %%r8, %%r9; " /* A[1]*B[0] */
+ "addq %%r12, %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[1]*B[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[1]*B[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 24(%2), %%r10, %%r12; " /* A[1]*B[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%r12 ;"
+
+ "addq %%r9, %%rax ;"
+ "adcq %%r11, %%rbx ;"
+ "adcq %%r13, %%rcx ;"
+ "adcq $0, %%r12 ;"
+
+ "movq 16(%1), %%rdx; " /* A[2] */
+ "mulx (%2), %%r8, %%r9; " /* A[2]*B[0] */
+ "addq %%rax, %%r8 ;"
+ "movq %%r8, 16(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[2]*B[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[2]*B[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 24(%2), %%r10, %%rax; " /* A[2]*B[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%rax ;"
+
+ "addq %%r9, %%rbx ;"
+ "adcq %%r11, %%rcx ;"
+ "adcq %%r13, %%r12 ;"
+ "adcq $0, %%rax ;"
+
+ "movq 24(%1), %%rdx; " /* A[3] */
+ "mulx (%2), %%r8, %%r9; " /* A[3]*B[0] */
+ "addq %%rbx, %%r8 ;"
+ "movq %%r8, 24(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[3]*B[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[3]*B[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 24(%2), %%r10, %%rbx; " /* A[3]*B[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%rbx ;"
+
+ "addq %%r9, %%rcx ;"
+ "movq %%rcx, 32(%0) ;"
+ "adcq %%r11, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "adcq %%r13, %%rax ;"
+ "movq %%rax, 48(%0) ;"
+ "adcq $0, %%rbx ;"
+ "movq %%rbx, 56(%0) ;"
+
+ "movq 32(%1), %%rdx; " /* C[0] */
+ "mulx 32(%2), %%r8, %%r12; " /* C[0]*D[0] */
+ "movq %%r8, 64(%0) ;"
+ "mulx 40(%2), %%r10, %%rax; " /* C[0]*D[1] */
+ "addq %%r10, %%r12 ;"
+ "mulx 48(%2), %%r8, %%rbx; " /* C[0]*D[2] */
+ "adcq %%r8, %%rax ;"
+ "mulx 56(%2), %%r10, %%rcx; " /* C[0]*D[3] */
+ "adcq %%r10, %%rbx ;"
+ /******************************************/
+ "adcq $0, %%rcx ;"
+
+ "movq 40(%1), %%rdx; " /* C[1] */
+ "mulx 32(%2), %%r8, %%r9; " /* C[1]*D[0] */
+ "addq %%r12, %%r8 ;"
+ "movq %%r8, 72(%0) ;"
+ "mulx 40(%2), %%r10, %%r11; " /* C[1]*D[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 48(%2), %%r8, %%r13; " /* C[1]*D[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 56(%2), %%r10, %%r12; " /* C[1]*D[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%r12 ;"
+
+ "addq %%r9, %%rax ;"
+ "adcq %%r11, %%rbx ;"
+ "adcq %%r13, %%rcx ;"
+ "adcq $0, %%r12 ;"
+
+ "movq 48(%1), %%rdx; " /* C[2] */
+ "mulx 32(%2), %%r8, %%r9; " /* C[2]*D[0] */
+ "addq %%rax, %%r8 ;"
+ "movq %%r8, 80(%0) ;"
+ "mulx 40(%2), %%r10, %%r11; " /* C[2]*D[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 48(%2), %%r8, %%r13; " /* C[2]*D[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 56(%2), %%r10, %%rax; " /* C[2]*D[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%rax ;"
+
+ "addq %%r9, %%rbx ;"
+ "adcq %%r11, %%rcx ;"
+ "adcq %%r13, %%r12 ;"
+ "adcq $0, %%rax ;"
+
+ "movq 56(%1), %%rdx; " /* C[3] */
+ "mulx 32(%2), %%r8, %%r9; " /* C[3]*D[0] */
+ "addq %%rbx, %%r8 ;"
+ "movq %%r8, 88(%0) ;"
+ "mulx 40(%2), %%r10, %%r11; " /* C[3]*D[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 48(%2), %%r8, %%r13; " /* C[3]*D[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 56(%2), %%r10, %%rbx; " /* C[3]*D[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%rbx ;"
+
+ "addq %%r9, %%rcx ;"
+ "movq %%rcx, 96(%0) ;"
+ "adcq %%r11, %%r12 ;"
+ "movq %%r12, 104(%0) ;"
+ "adcq %%r13, %%rax ;"
+ "movq %%rax, 112(%0) ;"
+ "adcq $0, %%rbx ;"
+ "movq %%rbx, 120(%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(b)
+ : "memory", "cc", "%rax", "%rbx", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13");
+}
+
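+/* Two-way squaring: c[0:7] = a[0:3]^2 and c[8:15] = a[4:7]^2. Only the
+ * six cross products a[i]*a[j], i < j, are computed; they are then
+ * doubled (by an ADCX self-add chain here, and by an SHLD cascade in
+ * the _bmi2 variant below) before the four squares a[i]^2 are added in.
+ */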
+static void sqr2_256x256_integer_adx(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movq (%1), %%rdx ;" /* A[0] */
+ "mulx 8(%1), %%r8, %%r14 ;" /* A[1]*A[0] */
+ "xorl %%r15d, %%r15d;"
+ "mulx 16(%1), %%r9, %%r10 ;" /* A[2]*A[0] */
+ "adcx %%r14, %%r9 ;"
+ "mulx 24(%1), %%rax, %%rcx ;" /* A[3]*A[0] */
+ "adcx %%rax, %%r10 ;"
+ "movq 24(%1), %%rdx ;" /* A[3] */
+ "mulx 8(%1), %%r11, %%r12 ;" /* A[1]*A[3] */
+ "adcx %%rcx, %%r11 ;"
+ "mulx 16(%1), %%rax, %%r13 ;" /* A[2]*A[3] */
+ "adcx %%rax, %%r12 ;"
+ "movq 8(%1), %%rdx ;" /* A[1] */
+ "adcx %%r15, %%r13 ;"
+ "mulx 16(%1), %%rax, %%rcx ;" /* A[2]*A[1] */
+ "movq $0, %%r14 ;"
+ /******************************************/
+ "adcx %%r15, %%r14 ;"
+
+ "xorl %%r15d, %%r15d;"
+ "adox %%rax, %%r10 ;"
+ "adcx %%r8, %%r8 ;"
+ "adox %%rcx, %%r11 ;"
+ "adcx %%r9, %%r9 ;"
+ "adox %%r15, %%r12 ;"
+ "adcx %%r10, %%r10 ;"
+ "adox %%r15, %%r13 ;"
+ "adcx %%r11, %%r11 ;"
+ "adox %%r15, %%r14 ;"
+ "adcx %%r12, %%r12 ;"
+ "adcx %%r13, %%r13 ;"
+ "adcx %%r14, %%r14 ;"
+
+ "movq (%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[0]^2 */
+ /*******************/
+ "movq %%rax, 0(%0) ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "movq 8(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[1]^2 */
+ "adcq %%rax, %%r9 ;"
+ "movq %%r9, 16(%0) ;"
+ "adcq %%rcx, %%r10 ;"
+ "movq %%r10, 24(%0) ;"
+ "movq 16(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[2]^2 */
+ "adcq %%rax, %%r11 ;"
+ "movq %%r11, 32(%0) ;"
+ "adcq %%rcx, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "movq 24(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[3]^2 */
+ "adcq %%rax, %%r13 ;"
+ "movq %%r13, 48(%0) ;"
+ "adcq %%rcx, %%r14 ;"
+ "movq %%r14, 56(%0) ;"
+
+
+ "movq 32(%1), %%rdx ;" /* B[0] */
+ "mulx 40(%1), %%r8, %%r14 ;" /* B[1]*B[0] */
+ "xorl %%r15d, %%r15d;"
+ "mulx 48(%1), %%r9, %%r10 ;" /* B[2]*B[0] */
+ "adcx %%r14, %%r9 ;"
+ "mulx 56(%1), %%rax, %%rcx ;" /* B[3]*B[0] */
+ "adcx %%rax, %%r10 ;"
+ "movq 56(%1), %%rdx ;" /* B[3] */
+ "mulx 40(%1), %%r11, %%r12 ;" /* B[1]*B[3] */
+ "adcx %%rcx, %%r11 ;"
+ "mulx 48(%1), %%rax, %%r13 ;" /* B[2]*B[3] */
+ "adcx %%rax, %%r12 ;"
+ "movq 40(%1), %%rdx ;" /* B[1] */
+ "adcx %%r15, %%r13 ;"
+ "mulx 48(%1), %%rax, %%rcx ;" /* B[2]*B[1] */
+ "movq $0, %%r14 ;"
+ /******************************************/
+ "adcx %%r15, %%r14 ;"
+
+ "xorl %%r15d, %%r15d;"
+ "adox %%rax, %%r10 ;"
+ "adcx %%r8, %%r8 ;"
+ "adox %%rcx, %%r11 ;"
+ "adcx %%r9, %%r9 ;"
+ "adox %%r15, %%r12 ;"
+ "adcx %%r10, %%r10 ;"
+ "adox %%r15, %%r13 ;"
+ "adcx %%r11, %%r11 ;"
+ "adox %%r15, %%r14 ;"
+ "adcx %%r12, %%r12 ;"
+ "adcx %%r13, %%r13 ;"
+ "adcx %%r14, %%r14 ;"
+
+ "movq 32(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* B[0]^2 */
+ /*******************/
+ "movq %%rax, 64(%0) ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 72(%0) ;"
+ "movq 40(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* B[1]^2 */
+ "adcq %%rax, %%r9 ;"
+ "movq %%r9, 80(%0) ;"
+ "adcq %%rcx, %%r10 ;"
+ "movq %%r10, 88(%0) ;"
+ "movq 48(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* B[2]^2 */
+ "adcq %%rax, %%r11 ;"
+ "movq %%r11, 96(%0) ;"
+ "adcq %%rcx, %%r12 ;"
+ "movq %%r12, 104(%0) ;"
+ "movq 56(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* B[3]^2 */
+ "adcq %%rax, %%r13 ;"
+ "movq %%r13, 112(%0) ;"
+ "adcq %%rcx, %%r14 ;"
+ "movq %%r14, 120(%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13", "%r14", "%r15");
+}
+
+static void sqr2_256x256_integer_bmi2(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movq 8(%1), %%rdx ;" /* A[1] */
+ "mulx (%1), %%r8, %%r9 ;" /* A[0]*A[1] */
+ "mulx 16(%1), %%r10, %%r11 ;" /* A[2]*A[1] */
+ "mulx 24(%1), %%rcx, %%r14 ;" /* A[3]*A[1] */
+
+ "movq 16(%1), %%rdx ;" /* A[2] */
+ "mulx 24(%1), %%r12, %%r13 ;" /* A[3]*A[2] */
+ "mulx (%1), %%rax, %%rdx ;" /* A[0]*A[2] */
+
+ "addq %%rax, %%r9 ;"
+ "adcq %%rdx, %%r10 ;"
+ "adcq %%rcx, %%r11 ;"
+ "adcq %%r14, %%r12 ;"
+ "adcq $0, %%r13 ;"
+ "movq $0, %%r14 ;"
+ "adcq $0, %%r14 ;"
+
+ "movq (%1), %%rdx ;" /* A[0] */
+ "mulx 24(%1), %%rax, %%rcx ;" /* A[0]*A[3] */
+
+ "addq %%rax, %%r10 ;"
+ "adcq %%rcx, %%r11 ;"
+ "adcq $0, %%r12 ;"
+ "adcq $0, %%r13 ;"
+ "adcq $0, %%r14 ;"
+
+ "shldq $1, %%r13, %%r14 ;"
+ "shldq $1, %%r12, %%r13 ;"
+ "shldq $1, %%r11, %%r12 ;"
+ "shldq $1, %%r10, %%r11 ;"
+ "shldq $1, %%r9, %%r10 ;"
+ "shldq $1, %%r8, %%r9 ;"
+ "shlq $1, %%r8 ;"
+
+ /*******************/
+ "mulx %%rdx, %%rax, %%rcx ; " /* A[0]^2 */
+ /*******************/
+ "movq %%rax, 0(%0) ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "movq 8(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ; " /* A[1]^2 */
+ "adcq %%rax, %%r9 ;"
+ "movq %%r9, 16(%0) ;"
+ "adcq %%rcx, %%r10 ;"
+ "movq %%r10, 24(%0) ;"
+ "movq 16(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ; " /* A[2]^2 */
+ "adcq %%rax, %%r11 ;"
+ "movq %%r11, 32(%0) ;"
+ "adcq %%rcx, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "movq 24(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ; " /* A[3]^2 */
+ "adcq %%rax, %%r13 ;"
+ "movq %%r13, 48(%0) ;"
+ "adcq %%rcx, %%r14 ;"
+ "movq %%r14, 56(%0) ;"
+
+ "movq 40(%1), %%rdx ;" /* B[1] */
+ "mulx 32(%1), %%r8, %%r9 ;" /* B[0]*B[1] */
+ "mulx 48(%1), %%r10, %%r11 ;" /* B[2]*B[1] */
+ "mulx 56(%1), %%rcx, %%r14 ;" /* B[3]*B[1] */
+
+ "movq 48(%1), %%rdx ;" /* B[2] */
+ "mulx 56(%1), %%r12, %%r13 ;" /* B[3]*B[2] */
+ "mulx 32(%1), %%rax, %%rdx ;" /* B[0]*B[2] */
+
+ "addq %%rax, %%r9 ;"
+ "adcq %%rdx, %%r10 ;"
+ "adcq %%rcx, %%r11 ;"
+ "adcq %%r14, %%r12 ;"
+ "adcq $0, %%r13 ;"
+ "movq $0, %%r14 ;"
+ "adcq $0, %%r14 ;"
+
+ "movq 32(%1), %%rdx ;" /* B[0] */
+ "mulx 56(%1), %%rax, %%rcx ;" /* B[0]*B[3] */
+
+ "addq %%rax, %%r10 ;"
+ "adcq %%rcx, %%r11 ;"
+ "adcq $0, %%r12 ;"
+ "adcq $0, %%r13 ;"
+ "adcq $0, %%r14 ;"
+
+ "shldq $1, %%r13, %%r14 ;"
+ "shldq $1, %%r12, %%r13 ;"
+ "shldq $1, %%r11, %%r12 ;"
+ "shldq $1, %%r10, %%r11 ;"
+ "shldq $1, %%r9, %%r10 ;"
+ "shldq $1, %%r8, %%r9 ;"
+ "shlq $1, %%r8 ;"
+
+ /*******************/
+ "mulx %%rdx, %%rax, %%rcx ; " /* B[0]^2 */
+ /*******************/
+ "movq %%rax, 64(%0) ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 72(%0) ;"
+ "movq 40(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ; " /* B[1]^2 */
+ "adcq %%rax, %%r9 ;"
+ "movq %%r9, 80(%0) ;"
+ "adcq %%rcx, %%r10 ;"
+ "movq %%r10, 88(%0) ;"
+ "movq 48(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ; " /* B[2]^2 */
+ "adcq %%rax, %%r11 ;"
+ "movq %%r11, 96(%0) ;"
+ "adcq %%rcx, %%r12 ;"
+ "movq %%r12, 104(%0) ;"
+ "movq 56(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ; " /* B[3]^2 */
+ "adcq %%rax, %%r13 ;"
+ "movq %%r13, 112(%0) ;"
+ "adcq %%rcx, %%r14 ;"
+ "movq %%r14, 120(%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13", "%r14");
+}
+
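+/* Two-way reduction modulo 2^255-19: each 512-bit half of a is folded
+ * as lo + 38*hi, using 2^256 = 38 (mod 2^255-19) since 38 = 2*19; the
+ * carry out of that fold is folded back once more, so each 256-bit
+ * result is congruent to the input, though not necessarily fully
+ * reduced (see fred_eltfp25519_1w below).
+ */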
+static void red_eltfp25519_2w_adx(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movl $38, %%edx; " /* 2*c = 38 = 2^256 */
+ "mulx 32(%1), %%r8, %%r10; " /* c*C[4] */
+ "xorl %%ebx, %%ebx ;"
+ "adox (%1), %%r8 ;"
+ "mulx 40(%1), %%r9, %%r11; " /* c*C[5] */
+ "adcx %%r10, %%r9 ;"
+ "adox 8(%1), %%r9 ;"
+ "mulx 48(%1), %%r10, %%rax; " /* c*C[6] */
+ "adcx %%r11, %%r10 ;"
+ "adox 16(%1), %%r10 ;"
+ "mulx 56(%1), %%r11, %%rcx; " /* c*C[7] */
+ "adcx %%rax, %%r11 ;"
+ "adox 24(%1), %%r11 ;"
+ /***************************************/
+ "adcx %%rbx, %%rcx ;"
+ "adox %%rbx, %%rcx ;"
+ "imul %%rdx, %%rcx ;" /* c*C[4], cf=0, of=0 */
+ "adcx %%rcx, %%r8 ;"
+ "adcx %%rbx, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "adcx %%rbx, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "adcx %%rbx, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%edx, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, (%0) ;"
+
+ "mulx 96(%1), %%r8, %%r10; " /* c*C[4] */
+ "xorl %%ebx, %%ebx ;"
+ "adox 64(%1), %%r8 ;"
+ "mulx 104(%1), %%r9, %%r11; " /* c*C[5] */
+ "adcx %%r10, %%r9 ;"
+ "adox 72(%1), %%r9 ;"
+ "mulx 112(%1), %%r10, %%rax; " /* c*C[6] */
+ "adcx %%r11, %%r10 ;"
+ "adox 80(%1), %%r10 ;"
+ "mulx 120(%1), %%r11, %%rcx; " /* c*C[7] */
+ "adcx %%rax, %%r11 ;"
+ "adox 88(%1), %%r11 ;"
+ /****************************************/
+ "adcx %%rbx, %%rcx ;"
+ "adox %%rbx, %%rcx ;"
+ "imul %%rdx, %%rcx ;" /* c*C[4], cf=0, of=0 */
+ "adcx %%rcx, %%r8 ;"
+ "adcx %%rbx, %%r9 ;"
+ "movq %%r9, 40(%0) ;"
+ "adcx %%rbx, %%r10 ;"
+ "movq %%r10, 48(%0) ;"
+ "adcx %%rbx, %%r11 ;"
+ "movq %%r11, 56(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%edx, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 32(%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rbx", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11");
+}
+
+static void red_eltfp25519_2w_bmi2(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movl $38, %%edx ; " /* 2*c = 38 = 2^256 */
+ "mulx 32(%1), %%r8, %%r10 ;" /* c*C[4] */
+ "mulx 40(%1), %%r9, %%r11 ;" /* c*C[5] */
+ "addq %%r10, %%r9 ;"
+ "mulx 48(%1), %%r10, %%rax ;" /* c*C[6] */
+ "adcq %%r11, %%r10 ;"
+ "mulx 56(%1), %%r11, %%rcx ;" /* c*C[7] */
+ "adcq %%rax, %%r11 ;"
+ /***************************************/
+ "adcq $0, %%rcx ;"
+ "addq (%1), %%r8 ;"
+ "adcq 8(%1), %%r9 ;"
+ "adcq 16(%1), %%r10 ;"
+ "adcq 24(%1), %%r11 ;"
+ "adcq $0, %%rcx ;"
+ "imul %%rdx, %%rcx ;" /* c*C[4], cf=0 */
+ "addq %%rcx, %%r8 ;"
+ "adcq $0, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "adcq $0, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "adcq $0, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%edx, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, (%0) ;"
+
+ "mulx 96(%1), %%r8, %%r10 ;" /* c*C[4] */
+ "mulx 104(%1), %%r9, %%r11 ;" /* c*C[5] */
+ "addq %%r10, %%r9 ;"
+ "mulx 112(%1), %%r10, %%rax ;" /* c*C[6] */
+ "adcq %%r11, %%r10 ;"
+ "mulx 120(%1), %%r11, %%rcx ;" /* c*C[7] */
+ "adcq %%rax, %%r11 ;"
+ /****************************************/
+ "adcq $0, %%rcx ;"
+ "addq 64(%1), %%r8 ;"
+ "adcq 72(%1), %%r9 ;"
+ "adcq 80(%1), %%r10 ;"
+ "adcq 88(%1), %%r11 ;"
+ "adcq $0, %%rcx ;"
+ "imul %%rdx, %%rcx ;" /* c*C[4], cf=0 */
+ "addq %%rcx, %%r8 ;"
+ "adcq $0, %%r9 ;"
+ "movq %%r9, 40(%0) ;"
+ "adcq $0, %%r10 ;"
+ "movq %%r10, 48(%0) ;"
+ "adcq $0, %%r11 ;"
+ "movq %%r11, 56(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%edx, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 32(%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11");
+}
+
+static void mul_256x256_integer_adx(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ asm volatile(
+ "movq (%1), %%rdx; " /* A[0] */
+ "mulx (%2), %%r8, %%r9; " /* A[0]*B[0] */
+ "xorl %%r10d, %%r10d ;"
+ "movq %%r8, (%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[0]*B[1] */
+ "adox %%r9, %%r10 ;"
+ "movq %%r10, 8(%0) ;"
+ "mulx 16(%2), %%r12, %%r13; " /* A[0]*B[2] */
+ "adox %%r11, %%r12 ;"
+ "mulx 24(%2), %%r14, %%rdx; " /* A[0]*B[3] */
+ "adox %%r13, %%r14 ;"
+ "movq $0, %%rax ;"
+ /******************************************/
+ "adox %%rdx, %%rax ;"
+
+ "movq 8(%1), %%rdx; " /* A[1] */
+ "mulx (%2), %%r8, %%r9; " /* A[1]*B[0] */
+ "xorl %%r10d, %%r10d ;"
+ "adcx 8(%0), %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[1]*B[1] */
+ "adox %%r9, %%r10 ;"
+ "adcx %%r12, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "mulx 16(%2), %%r12, %%r13; " /* A[1]*B[2] */
+ "adox %%r11, %%r12 ;"
+ "adcx %%r14, %%r12 ;"
+ "movq $0, %%r8 ;"
+ "mulx 24(%2), %%r14, %%rdx; " /* A[1]*B[3] */
+ "adox %%r13, %%r14 ;"
+ "adcx %%rax, %%r14 ;"
+ "movq $0, %%rax ;"
+ /******************************************/
+ "adox %%rdx, %%rax ;"
+ "adcx %%r8, %%rax ;"
+
+ "movq 16(%1), %%rdx; " /* A[2] */
+ "mulx (%2), %%r8, %%r9; " /* A[2]*B[0] */
+ "xorl %%r10d, %%r10d ;"
+ "adcx 16(%0), %%r8 ;"
+ "movq %%r8, 16(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[2]*B[1] */
+ "adox %%r9, %%r10 ;"
+ "adcx %%r12, %%r10 ;"
+ "movq %%r10, 24(%0) ;"
+ "mulx 16(%2), %%r12, %%r13; " /* A[2]*B[2] */
+ "adox %%r11, %%r12 ;"
+ "adcx %%r14, %%r12 ;"
+ "movq $0, %%r8 ;"
+ "mulx 24(%2), %%r14, %%rdx; " /* A[2]*B[3] */
+ "adox %%r13, %%r14 ;"
+ "adcx %%rax, %%r14 ;"
+ "movq $0, %%rax ;"
+ /******************************************/
+ "adox %%rdx, %%rax ;"
+ "adcx %%r8, %%rax ;"
+
+ "movq 24(%1), %%rdx; " /* A[3] */
+ "mulx (%2), %%r8, %%r9; " /* A[3]*B[0] */
+ "xorl %%r10d, %%r10d ;"
+ "adcx 24(%0), %%r8 ;"
+ "movq %%r8, 24(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[3]*B[1] */
+ "adox %%r9, %%r10 ;"
+ "adcx %%r12, %%r10 ;"
+ "movq %%r10, 32(%0) ;"
+ "mulx 16(%2), %%r12, %%r13; " /* A[3]*B[2] */
+ "adox %%r11, %%r12 ;"
+ "adcx %%r14, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "movq $0, %%r8 ;"
+ "mulx 24(%2), %%r14, %%rdx; " /* A[3]*B[3] */
+ "adox %%r13, %%r14 ;"
+ "adcx %%rax, %%r14 ;"
+ "movq %%r14, 48(%0) ;"
+ "movq $0, %%rax ;"
+ /******************************************/
+ "adox %%rdx, %%rax ;"
+ "adcx %%r8, %%rax ;"
+ "movq %%rax, 56(%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(b)
+ : "memory", "cc", "%rax", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13", "%r14");
+}
+
+static void mul_256x256_integer_bmi2(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ asm volatile(
+ "movq (%1), %%rdx; " /* A[0] */
+ "mulx (%2), %%r8, %%r12; " /* A[0]*B[0] */
+ "movq %%r8, (%0) ;"
+ "mulx 8(%2), %%r10, %%rax; " /* A[0]*B[1] */
+ "addq %%r10, %%r12 ;"
+ "mulx 16(%2), %%r8, %%rbx; " /* A[0]*B[2] */
+ "adcq %%r8, %%rax ;"
+ "mulx 24(%2), %%r10, %%rcx; " /* A[0]*B[3] */
+ "adcq %%r10, %%rbx ;"
+ /******************************************/
+ "adcq $0, %%rcx ;"
+
+ "movq 8(%1), %%rdx; " /* A[1] */
+ "mulx (%2), %%r8, %%r9; " /* A[1]*B[0] */
+ "addq %%r12, %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[1]*B[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[1]*B[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 24(%2), %%r10, %%r12; " /* A[1]*B[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%r12 ;"
+
+ "addq %%r9, %%rax ;"
+ "adcq %%r11, %%rbx ;"
+ "adcq %%r13, %%rcx ;"
+ "adcq $0, %%r12 ;"
+
+ "movq 16(%1), %%rdx; " /* A[2] */
+ "mulx (%2), %%r8, %%r9; " /* A[2]*B[0] */
+ "addq %%rax, %%r8 ;"
+ "movq %%r8, 16(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[2]*B[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[2]*B[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 24(%2), %%r10, %%rax; " /* A[2]*B[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%rax ;"
+
+ "addq %%r9, %%rbx ;"
+ "adcq %%r11, %%rcx ;"
+ "adcq %%r13, %%r12 ;"
+ "adcq $0, %%rax ;"
+
+ "movq 24(%1), %%rdx; " /* A[3] */
+ "mulx (%2), %%r8, %%r9; " /* A[3]*B[0] */
+ "addq %%rbx, %%r8 ;"
+ "movq %%r8, 24(%0) ;"
+ "mulx 8(%2), %%r10, %%r11; " /* A[3]*B[1] */
+ "adcq %%r10, %%r9 ;"
+ "mulx 16(%2), %%r8, %%r13; " /* A[3]*B[2] */
+ "adcq %%r8, %%r11 ;"
+ "mulx 24(%2), %%r10, %%rbx; " /* A[3]*B[3] */
+ "adcq %%r10, %%r13 ;"
+ /******************************************/
+ "adcq $0, %%rbx ;"
+
+ "addq %%r9, %%rcx ;"
+ "movq %%rcx, 32(%0) ;"
+ "adcq %%r11, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "adcq %%r13, %%rax ;"
+ "movq %%rax, 48(%0) ;"
+ "adcq $0, %%rbx ;"
+ "movq %%rbx, 56(%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(b)
+ : "memory", "cc", "%rax", "%rbx", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13");
+}
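+
+/* For reference, a portable C sketch of the product computed by the two
+ * routines above, assuming the compiler provides unsigned __int128 (as
+ * the generic hacl64 fallback already requires); handy for
+ * cross-checking the carry chains in the asm:
+ */
+static __maybe_unused void mul_256x256_integer_ref(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ int i, j;
+
+ memset(c, 0, 8 * sizeof(u64));
+ for (i = 0; i < 4; ++i) {
+ u64 hi = 0;
+ for (j = 0; j < 4; ++j) {
+ unsigned __int128 t = (unsigned __int128)a[i] * b[j] + c[i + j] + hi;
+ c[i + j] = (u64)t;
+ hi = (u64)(t >> 64);
+ }
+ c[i + 4] = hi;
+ }
+}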
+
+static void sqr_256x256_integer_adx(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movq (%1), %%rdx ;" /* A[0] */
+ "mulx 8(%1), %%r8, %%r14 ;" /* A[1]*A[0] */
+ "xorl %%r15d, %%r15d;"
+ "mulx 16(%1), %%r9, %%r10 ;" /* A[2]*A[0] */
+ "adcx %%r14, %%r9 ;"
+ "mulx 24(%1), %%rax, %%rcx ;" /* A[3]*A[0] */
+ "adcx %%rax, %%r10 ;"
+ "movq 24(%1), %%rdx ;" /* A[3] */
+ "mulx 8(%1), %%r11, %%r12 ;" /* A[1]*A[3] */
+ "adcx %%rcx, %%r11 ;"
+ "mulx 16(%1), %%rax, %%r13 ;" /* A[2]*A[3] */
+ "adcx %%rax, %%r12 ;"
+ "movq 8(%1), %%rdx ;" /* A[1] */
+ "adcx %%r15, %%r13 ;"
+ "mulx 16(%1), %%rax, %%rcx ;" /* A[2]*A[1] */
+ "movq $0, %%r14 ;"
+ /******************************************/
+ "adcx %%r15, %%r14 ;"
+
+ "xorl %%r15d, %%r15d;"
+ "adox %%rax, %%r10 ;"
+ "adcx %%r8, %%r8 ;"
+ "adox %%rcx, %%r11 ;"
+ "adcx %%r9, %%r9 ;"
+ "adox %%r15, %%r12 ;"
+ "adcx %%r10, %%r10 ;"
+ "adox %%r15, %%r13 ;"
+ "adcx %%r11, %%r11 ;"
+ "adox %%r15, %%r14 ;"
+ "adcx %%r12, %%r12 ;"
+ "adcx %%r13, %%r13 ;"
+ "adcx %%r14, %%r14 ;"
+
+ "movq (%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[0]^2 */
+ /*******************/
+ "movq %%rax, 0(%0) ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "movq 8(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[1]^2 */
+ "adcq %%rax, %%r9 ;"
+ "movq %%r9, 16(%0) ;"
+ "adcq %%rcx, %%r10 ;"
+ "movq %%r10, 24(%0) ;"
+ "movq 16(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[2]^2 */
+ "adcq %%rax, %%r11 ;"
+ "movq %%r11, 32(%0) ;"
+ "adcq %%rcx, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "movq 24(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[3]^2 */
+ "adcq %%rax, %%r13 ;"
+ "movq %%r13, 48(%0) ;"
+ "adcq %%rcx, %%r14 ;"
+ "movq %%r14, 56(%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13", "%r14", "%r15");
+}
+
+static void sqr_256x256_integer_bmi2(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movq 8(%1), %%rdx ;" /* A[1] */
+ "mulx (%1), %%r8, %%r9 ;" /* A[0]*A[1] */
+ "mulx 16(%1), %%r10, %%r11 ;" /* A[2]*A[1] */
+ "mulx 24(%1), %%rcx, %%r14 ;" /* A[3]*A[1] */
+
+ "movq 16(%1), %%rdx ;" /* A[2] */
+ "mulx 24(%1), %%r12, %%r13 ;" /* A[3]*A[2] */
+ "mulx (%1), %%rax, %%rdx ;" /* A[0]*A[2] */
+
+ "addq %%rax, %%r9 ;"
+ "adcq %%rdx, %%r10 ;"
+ "adcq %%rcx, %%r11 ;"
+ "adcq %%r14, %%r12 ;"
+ "adcq $0, %%r13 ;"
+ "movq $0, %%r14 ;"
+ "adcq $0, %%r14 ;"
+
+ "movq (%1), %%rdx ;" /* A[0] */
+ "mulx 24(%1), %%rax, %%rcx ;" /* A[0]*A[3] */
+
+ "addq %%rax, %%r10 ;"
+ "adcq %%rcx, %%r11 ;"
+ "adcq $0, %%r12 ;"
+ "adcq $0, %%r13 ;"
+ "adcq $0, %%r14 ;"
+
+ "shldq $1, %%r13, %%r14 ;"
+ "shldq $1, %%r12, %%r13 ;"
+ "shldq $1, %%r11, %%r12 ;"
+ "shldq $1, %%r10, %%r11 ;"
+ "shldq $1, %%r9, %%r10 ;"
+ "shldq $1, %%r8, %%r9 ;"
+ "shlq $1, %%r8 ;"
+
+ /*******************/
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[0]^2 */
+ /*******************/
+ "movq %%rax, 0(%0) ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, 8(%0) ;"
+ "movq 8(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[1]^2 */
+ "adcq %%rax, %%r9 ;"
+ "movq %%r9, 16(%0) ;"
+ "adcq %%rcx, %%r10 ;"
+ "movq %%r10, 24(%0) ;"
+ "movq 16(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[2]^2 */
+ "adcq %%rax, %%r11 ;"
+ "movq %%r11, 32(%0) ;"
+ "adcq %%rcx, %%r12 ;"
+ "movq %%r12, 40(%0) ;"
+ "movq 24(%1), %%rdx ;"
+ "mulx %%rdx, %%rax, %%rcx ;" /* A[3]^2 */
+ "adcq %%rax, %%r13 ;"
+ "movq %%r13, 48(%0) ;"
+ "adcq %%rcx, %%r14 ;"
+ "movq %%r14, 56(%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11", "%r12", "%r13", "%r14");
+}
+
+static void red_eltfp25519_1w_adx(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movl $38, %%edx ;" /* 2*c = 38 = 2^256 */
+ "mulx 32(%1), %%r8, %%r10 ;" /* c*C[4] */
+ "xorl %%ebx, %%ebx ;"
+ "adox (%1), %%r8 ;"
+ "mulx 40(%1), %%r9, %%r11 ;" /* c*C[5] */
+ "adcx %%r10, %%r9 ;"
+ "adox 8(%1), %%r9 ;"
+ "mulx 48(%1), %%r10, %%rax ;" /* c*C[6] */
+ "adcx %%r11, %%r10 ;"
+ "adox 16(%1), %%r10 ;"
+ "mulx 56(%1), %%r11, %%rcx ;" /* c*C[7] */
+ "adcx %%rax, %%r11 ;"
+ "adox 24(%1), %%r11 ;"
+ /***************************************/
+ "adcx %%rbx, %%rcx ;"
+ "adox %%rbx, %%rcx ;"
+ "imul %%rdx, %%rcx ;" /* c*C[4], cf=0, of=0 */
+ "adcx %%rcx, %%r8 ;"
+ "adcx %%rbx, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "adcx %%rbx, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "adcx %%rbx, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%edx, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, (%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rbx", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11");
+}
+
+static void red_eltfp25519_1w_bmi2(u64 *const c, const u64 *const a)
+{
+ asm volatile(
+ "movl $38, %%edx ;" /* 2*c = 38 = 2^256 */
+ "mulx 32(%1), %%r8, %%r10 ;" /* c*C[4] */
+ "mulx 40(%1), %%r9, %%r11 ;" /* c*C[5] */
+ "addq %%r10, %%r9 ;"
+ "mulx 48(%1), %%r10, %%rax ;" /* c*C[6] */
+ "adcq %%r11, %%r10 ;"
+ "mulx 56(%1), %%r11, %%rcx ;" /* c*C[7] */
+ "adcq %%rax, %%r11 ;"
+ /***************************************/
+ "adcq $0, %%rcx ;"
+ "addq (%1), %%r8 ;"
+ "adcq 8(%1), %%r9 ;"
+ "adcq 16(%1), %%r10 ;"
+ "adcq 24(%1), %%r11 ;"
+ "adcq $0, %%rcx ;"
+ "imul %%rdx, %%rcx ;" /* c*C[4], cf=0 */
+ "addq %%rcx, %%r8 ;"
+ "adcq $0, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "adcq $0, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "adcq $0, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%edx, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, (%0) ;"
+ :
+ : "r"(c), "r"(a)
+ : "memory", "cc", "%rax", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11");
+}
+
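+/* Addition and subtraction in GF(2^255-19) on 4-limb operands. The
+ * carry (or borrow) out of bit 256 is folded back by adding (or
+ * subtracting) 38, again using 2^256 = 38 (mod 2^255-19); a second
+ * conditional correction handles the rare carry out of the fold itself.
+ */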
+static __always_inline void add_eltfp25519_1w_adx(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ asm volatile(
+ "mov $38, %%eax ;"
+ "xorl %%ecx, %%ecx ;"
+ "movq (%2), %%r8 ;"
+ "adcx (%1), %%r8 ;"
+ "movq 8(%2), %%r9 ;"
+ "adcx 8(%1), %%r9 ;"
+ "movq 16(%2), %%r10 ;"
+ "adcx 16(%1), %%r10 ;"
+ "movq 24(%2), %%r11 ;"
+ "adcx 24(%1), %%r11 ;"
+ "cmovc %%eax, %%ecx ;"
+ "xorl %%eax, %%eax ;"
+ "adcx %%rcx, %%r8 ;"
+ "adcx %%rax, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "adcx %%rax, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "adcx %%rax, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $38, %%ecx ;"
+ "cmovc %%ecx, %%eax ;"
+ "addq %%rax, %%r8 ;"
+ "movq %%r8, (%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(b)
+ : "memory", "cc", "%rax", "%rcx", "%r8", "%r9", "%r10", "%r11");
+}
+
+static __always_inline void add_eltfp25519_1w_bmi2(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ asm volatile(
+ "mov $38, %%eax ;"
+ "movq (%2), %%r8 ;"
+ "addq (%1), %%r8 ;"
+ "movq 8(%2), %%r9 ;"
+ "adcq 8(%1), %%r9 ;"
+ "movq 16(%2), %%r10 ;"
+ "adcq 16(%1), %%r10 ;"
+ "movq 24(%2), %%r11 ;"
+ "adcq 24(%1), %%r11 ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%eax, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "adcq $0, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "adcq $0, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "adcq $0, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%eax, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, (%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(b)
+ : "memory", "cc", "%rax", "%rcx", "%r8", "%r9", "%r10", "%r11");
+}
+
+static __always_inline void sub_eltfp25519_1w(u64 *const c, const u64 *const a, const u64 *const b)
+{
+ asm volatile(
+ "mov $38, %%eax ;"
+ "movq (%1), %%r8 ;"
+ "subq (%2), %%r8 ;"
+ "movq 8(%1), %%r9 ;"
+ "sbbq 8(%2), %%r9 ;"
+ "movq 16(%1), %%r10 ;"
+ "sbbq 16(%2), %%r10 ;"
+ "movq 24(%1), %%r11 ;"
+ "sbbq 24(%2), %%r11 ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%eax, %%ecx ;"
+ "subq %%rcx, %%r8 ;"
+ "sbbq $0, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "sbbq $0, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "sbbq $0, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%eax, %%ecx ;"
+ "subq %%rcx, %%r8 ;"
+ "movq %%r8, (%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(b)
+ : "memory", "cc", "%rax", "%rcx", "%r8", "%r9", "%r10", "%r11");
+}
+
+/* Multiplication by a24 = (A+2)/4 = (486662+2)/4 = 121666 */
+static __always_inline void mul_a24_eltfp25519_1w(u64 *const c, const u64 *const a)
+{
+ const u64 a24 = 121666;
+ asm volatile(
+ "movq %2, %%rdx ;"
+ "mulx (%1), %%r8, %%r10 ;"
+ "mulx 8(%1), %%r9, %%r11 ;"
+ "addq %%r10, %%r9 ;"
+ "mulx 16(%1), %%r10, %%rax ;"
+ "adcq %%r11, %%r10 ;"
+ "mulx 24(%1), %%r11, %%rcx ;"
+ "adcq %%rax, %%r11 ;"
+ /**************************/
+ "adcq $0, %%rcx ;"
+ "movl $38, %%edx ;" /* 2*c = 38 = 2^256 mod 2^255-19*/
+ "imul %%rdx, %%rcx ;"
+ "addq %%rcx, %%r8 ;"
+ "adcq $0, %%r9 ;"
+ "movq %%r9, 8(%0) ;"
+ "adcq $0, %%r10 ;"
+ "movq %%r10, 16(%0) ;"
+ "adcq $0, %%r11 ;"
+ "movq %%r11, 24(%0) ;"
+ "mov $0, %%ecx ;"
+ "cmovc %%edx, %%ecx ;"
+ "addq %%rcx, %%r8 ;"
+ "movq %%r8, (%0) ;"
+ :
+ : "r"(c), "r"(a), "r"(a24)
+ : "memory", "cc", "%rax", "%rcx", "%rdx", "%r8", "%r9", "%r10", "%r11");
+}
+
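+/* Field inversion via Fermat's little theorem: c = a^(p-2) mod p for
+ * p = 2^255-19, computed with the usual addition chain for the exponent
+ * 2^255-21, costing 254 squarings and 11 multiplications.
+ */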
+static void inv_eltfp25519_1w_adx(u64 *const c, const u64 *const a)
+{
+ struct {
+ eltfp25519_1w_buffer buffer;
+ eltfp25519_1w x0, x1, x2;
+ } __aligned(32) m;
+ u64 *T[4];
+
+ T[0] = m.x0;
+ T[1] = c; /* x^(-1) */
+ T[2] = m.x1;
+ T[3] = m.x2;
+
+ copy_eltfp25519_1w(T[1], a);
+ sqrn_eltfp25519_1w_adx(T[1], 1);
+ copy_eltfp25519_1w(T[2], T[1]);
+ sqrn_eltfp25519_1w_adx(T[2], 2);
+ mul_eltfp25519_1w_adx(T[0], a, T[2]);
+ mul_eltfp25519_1w_adx(T[1], T[1], T[0]);
+ copy_eltfp25519_1w(T[2], T[1]);
+ sqrn_eltfp25519_1w_adx(T[2], 1);
+ mul_eltfp25519_1w_adx(T[0], T[0], T[2]);
+ copy_eltfp25519_1w(T[2], T[0]);
+ sqrn_eltfp25519_1w_adx(T[2], 5);
+ mul_eltfp25519_1w_adx(T[0], T[0], T[2]);
+ copy_eltfp25519_1w(T[2], T[0]);
+ sqrn_eltfp25519_1w_adx(T[2], 10);
+ mul_eltfp25519_1w_adx(T[2], T[2], T[0]);
+ copy_eltfp25519_1w(T[3], T[2]);
+ sqrn_eltfp25519_1w_adx(T[3], 20);
+ mul_eltfp25519_1w_adx(T[3], T[3], T[2]);
+ sqrn_eltfp25519_1w_adx(T[3], 10);
+ mul_eltfp25519_1w_adx(T[3], T[3], T[0]);
+ copy_eltfp25519_1w(T[0], T[3]);
+ sqrn_eltfp25519_1w_adx(T[0], 50);
+ mul_eltfp25519_1w_adx(T[0], T[0], T[3]);
+ copy_eltfp25519_1w(T[2], T[0]);
+ sqrn_eltfp25519_1w_adx(T[2], 100);
+ mul_eltfp25519_1w_adx(T[2], T[2], T[0]);
+ sqrn_eltfp25519_1w_adx(T[2], 50);
+ mul_eltfp25519_1w_adx(T[2], T[2], T[3]);
+ sqrn_eltfp25519_1w_adx(T[2], 5);
+ mul_eltfp25519_1w_adx(T[1], T[1], T[2]);
+
+ memzero_explicit(&m, sizeof(m));
+}
+
+static void inv_eltfp25519_1w_bmi2(u64 *const c, const u64 *const a)
+{
+ struct {
+ eltfp25519_1w_buffer buffer;
+ eltfp25519_1w x0, x1, x2;
+ } __aligned(32) m;
+ u64 *T[4];
+
+ T[0] = m.x0;
+ T[1] = c; /* x^(-1) */
+ T[2] = m.x1;
+ T[3] = m.x2;
+
+ copy_eltfp25519_1w(T[1], a);
+ sqrn_eltfp25519_1w_bmi2(T[1], 1);
+ copy_eltfp25519_1w(T[2], T[1]);
+ sqrn_eltfp25519_1w_bmi2(T[2], 2);
+ mul_eltfp25519_1w_bmi2(T[0], a, T[2]);
+ mul_eltfp25519_1w_bmi2(T[1], T[1], T[0]);
+ copy_eltfp25519_1w(T[2], T[1]);
+ sqrn_eltfp25519_1w_bmi2(T[2], 1);
+ mul_eltfp25519_1w_bmi2(T[0], T[0], T[2]);
+ copy_eltfp25519_1w(T[2], T[0]);
+ sqrn_eltfp25519_1w_bmi2(T[2], 5);
+ mul_eltfp25519_1w_bmi2(T[0], T[0], T[2]);
+ copy_eltfp25519_1w(T[2], T[0]);
+ sqrn_eltfp25519_1w_bmi2(T[2], 10);
+ mul_eltfp25519_1w_bmi2(T[2], T[2], T[0]);
+ copy_eltfp25519_1w(T[3], T[2]);
+ sqrn_eltfp25519_1w_bmi2(T[3], 20);
+ mul_eltfp25519_1w_bmi2(T[3], T[3], T[2]);
+ sqrn_eltfp25519_1w_bmi2(T[3], 10);
+ mul_eltfp25519_1w_bmi2(T[3], T[3], T[0]);
+ copy_eltfp25519_1w(T[0], T[3]);
+ sqrn_eltfp25519_1w_bmi2(T[0], 50);
+ mul_eltfp25519_1w_bmi2(T[0], T[0], T[3]);
+ copy_eltfp25519_1w(T[2], T[0]);
+ sqrn_eltfp25519_1w_bmi2(T[2], 100);
+ mul_eltfp25519_1w_bmi2(T[2], T[2], T[0]);
+ sqrn_eltfp25519_1w_bmi2(T[2], 50);
+ mul_eltfp25519_1w_bmi2(T[2], T[2], T[3]);
+ sqrn_eltfp25519_1w_bmi2(T[2], 5);
+ mul_eltfp25519_1w_bmi2(T[1], T[1], T[2]);
+
+ memzero_explicit(&m, sizeof(m));
+}
+
+/* Given c, a 256-bit number, fred_eltfp25519_1w reduces c in place
+ * to the unique representative satisfying 0 <= c < 2^255-19.
+ */
+static __always_inline void fred_eltfp25519_1w(u64 *const c)
+{
+ u64 tmp0, tmp1;
+ asm volatile(
+ "movl $19, %k5 ;"
+ "movl $38, %k4 ;"
+
+ "btrq $63, %3 ;" /* Put bit 255 in carry flag and clear */
+ "cmovncl %k5, %k4 ;" /* c[255] ? 38 : 19 */
+
+ /* Add either 19 or 38 to c */
+ "addq %4, %0 ;"
+ "adcq $0, %1 ;"
+ "adcq $0, %2 ;"
+ "adcq $0, %3 ;"
+
+ /* Test for bit 255 again; only triggered on overflow modulo 2^255-19 */
+ "movl $0, %k4 ;"
+ "cmovnsl %k5, %k4 ;" /* c[255] ? 0 : 19 */
+ "btrq $63, %3 ;" /* Clear bit 255 */
+
+ /* Subtract 19 if necessary */
+ "subq %4, %0 ;"
+ "sbbq $0, %1 ;"
+ "sbbq $0, %2 ;"
+ "sbbq $0, %3 ;"
+
+ : "+r"(c[0]), "+r"(c[1]), "+r"(c[2]), "+r"(c[3]), "=r"(tmp0), "=r"(tmp1)
+ :
+ : "memory", "cc");
+}
+
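+/* Constant-time conditional swap (cswap) and conditional assignment
+ * (cselect) of 4-limb field elements, driven by a secret bit through
+ * CMOVNZ so that no branch depends on secret data.
+ */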
+static __always_inline void cswap(u8 bit, u64 *const px, u64 *const py)
+{
+ u64 temp;
+ asm volatile(
+ "test %9, %9 ;"
+ "movq %0, %8 ;"
+ "cmovnzq %4, %0 ;"
+ "cmovnzq %8, %4 ;"
+ "movq %1, %8 ;"
+ "cmovnzq %5, %1 ;"
+ "cmovnzq %8, %5 ;"
+ "movq %2, %8 ;"
+ "cmovnzq %6, %2 ;"
+ "cmovnzq %8, %6 ;"
+ "movq %3, %8 ;"
+ "cmovnzq %7, %3 ;"
+ "cmovnzq %8, %7 ;"
+ : "+r"(px[0]), "+r"(px[1]), "+r"(px[2]), "+r"(px[3]),
+ "+r"(py[0]), "+r"(py[1]), "+r"(py[2]), "+r"(py[3]),
+ "=r"(temp)
+ : "r"(bit)
+ : "cc"
+ );
+}
+
+static __always_inline void cselect(u8 bit, u64 *const px, const u64 *const py)
+{
+ asm volatile(
+ "test %4, %4 ;"
+ "cmovnzq %5, %0 ;"
+ "cmovnzq %6, %1 ;"
+ "cmovnzq %7, %2 ;"
+ "cmovnzq %8, %3 ;"
+ : "+r"(px[0]), "+r"(px[1]), "+r"(px[2]), "+r"(px[3])
+ : "r"(bit), "rm"(py[0]), "rm"(py[1]), "rm"(py[2]), "rm"(py[3])
+ : "cc"
+ );
+}
+
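+/* Variable-base scalar multiplication: a Montgomery ladder on the
+ * x-coordinate, one differential addition-and-doubling step per scalar
+ * bit, walking from bit 254 down to bit 0 with the conditional swap
+ * done branchlessly through cselect.
+ */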
+static void curve25519_adx(u8 shared[CURVE25519_POINT_SIZE], const u8 private_key[CURVE25519_POINT_SIZE], const u8 session_key[CURVE25519_POINT_SIZE])
+{
+ struct {
+ u64 buffer[4 * NUM_WORDS_ELTFP25519];
+ u64 coordinates[4 * NUM_WORDS_ELTFP25519];
+ u64 workspace[6 * NUM_WORDS_ELTFP25519];
+ u8 session[CURVE25519_POINT_SIZE];
+ u8 private[CURVE25519_POINT_SIZE];
+ } __aligned(32) m;
+
+ int i = 0, j = 0;
+ u64 prev = 0;
+ u64 *const X1 = (u64 *)m.session;
+ u64 *const key = (u64 *)m.private;
+ u64 *const Px = m.coordinates + 0;
+ u64 *const Pz = m.coordinates + 4;
+ u64 *const Qx = m.coordinates + 8;
+ u64 *const Qz = m.coordinates + 12;
+ u64 *const X2 = Qx;
+ u64 *const Z2 = Qz;
+ u64 *const X3 = Px;
+ u64 *const Z3 = Pz;
+ u64 *const X2Z2 = Qx;
+ u64 *const X3Z3 = Px;
+
+ u64 *const A = m.workspace + 0;
+ u64 *const B = m.workspace + 4;
+ u64 *const D = m.workspace + 8;
+ u64 *const C = m.workspace + 12;
+ u64 *const DA = m.workspace + 16;
+ u64 *const CB = m.workspace + 20;
+ u64 *const AB = A;
+ u64 *const DC = D;
+ u64 *const DACB = DA;
+
+ memcpy(m.private, private_key, sizeof(m.private));
+ memcpy(m.session, session_key, sizeof(m.session));
+
+ normalize_secret(m.private);
+
+ /* As in the draft:
+ * When receiving such an array, implementations of curve25519
+ * MUST mask the most-significant bit in the final byte. This
+ * is done to preserve compatibility with point formats which
+ * reserve the sign bit for use in other protocols and to
+ * increase resistance to implementation fingerprinting
+ */
+ m.session[CURVE25519_POINT_SIZE - 1] &= (1 << (255 % 8)) - 1;
+
+ copy_eltfp25519_1w(Px, X1);
+ setzero_eltfp25519_1w(Pz);
+ setzero_eltfp25519_1w(Qx);
+ setzero_eltfp25519_1w(Qz);
+
+ Pz[0] = 1;
+ Qx[0] = 1;
+
+ /* main-loop */
+ prev = 0;
+ j = 62;
+ for (i = 3; i >= 0; --i) {
+ while (j >= 0) {
+ u64 bit = (key[i] >> j) & 0x1;
+ u64 swap = bit ^ prev;
+ prev = bit;
+
+ add_eltfp25519_1w_adx(A, X2, Z2); /* A = (X2+Z2) */
+ sub_eltfp25519_1w(B, X2, Z2); /* B = (X2-Z2) */
+ add_eltfp25519_1w_adx(C, X3, Z3); /* C = (X3+Z3) */
+ sub_eltfp25519_1w(D, X3, Z3); /* D = (X3-Z3) */
+ mul_eltfp25519_2w_adx(DACB, AB, DC); /* [DA|CB] = [A|B]*[D|C] */
+
+ cselect(swap, A, C);
+ cselect(swap, B, D);
+
+ sqr_eltfp25519_2w_adx(AB); /* [AA|BB] = [A^2|B^2] */
+ add_eltfp25519_1w_adx(X3, DA, CB); /* X3 = (DA+CB) */
+ sub_eltfp25519_1w(Z3, DA, CB); /* Z3 = (DA-CB) */
+ sqr_eltfp25519_2w_adx(X3Z3); /* [X3|Z3] = [(DA+CB)|(DA-CB)]^2 */
+
+ copy_eltfp25519_1w(X2, B); /* X2 = B^2 */
+ sub_eltfp25519_1w(Z2, A, B); /* Z2 = E = AA-BB */
+
+ mul_a24_eltfp25519_1w(B, Z2); /* B = a24*E */
+ add_eltfp25519_1w_adx(B, B, X2); /* B = a24*E+B */
+ mul_eltfp25519_2w_adx(X2Z2, X2Z2, AB); /* [X2|Z2] = [B|E]*[A|a24*E+B] */
+ mul_eltfp25519_1w_adx(Z3, Z3, X1); /* Z3 = Z3*X1 */
+ --j;
+ }
+ j = 63;
+ }
+
+ inv_eltfp25519_1w_adx(A, Qz);
+ mul_eltfp25519_1w_adx((u64 *)shared, Qx, A);
+ fred_eltfp25519_1w((u64 *)shared);
+
+ memzero_explicit(&m, sizeof(m));
+}
+
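+/* Fixed-base scalar multiplication of the generator, using a
+ * right-to-left ladder with one precomputed multiple from
+ * table_ladder_8k per scalar bit; the q = 3 low bits cleared by
+ * clamping are accounted for by the doubling loop at the end.
+ */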
+static void curve25519_adx_base(u8 session_key[CURVE25519_POINT_SIZE], const u8 private_key[CURVE25519_POINT_SIZE])
+{
+ struct {
+ u64 buffer[4 * NUM_WORDS_ELTFP25519];
+ u64 coordinates[4 * NUM_WORDS_ELTFP25519];
+ u64 workspace[4 * NUM_WORDS_ELTFP25519];
+ u8 private[CURVE25519_POINT_SIZE];
+ } __aligned(32) m;
+
+ const int ite[4] = { 64, 64, 64, 63 };
+ const int q = 3;
+ u64 swap = 1;
+
+ int i = 0, j = 0, k = 0;
+ u64 *const key = (u64 *)m.private;
+ u64 *const Ur1 = m.coordinates + 0;
+ u64 *const Zr1 = m.coordinates + 4;
+ u64 *const Ur2 = m.coordinates + 8;
+ u64 *const Zr2 = m.coordinates + 12;
+
+ u64 *const UZr1 = m.coordinates + 0;
+ u64 *const ZUr2 = m.coordinates + 8;
+
+ u64 *const A = m.workspace + 0;
+ u64 *const B = m.workspace + 4;
+ u64 *const C = m.workspace + 8;
+ u64 *const D = m.workspace + 12;
+
+ u64 *const AB = m.workspace + 0;
+ u64 *const CD = m.workspace + 8;
+
+ const u64 *const P = table_ladder_8k;
+
+ memcpy(m.private, private_key, sizeof(m.private));
+
+ normalize_secret(m.private);
+
+ setzero_eltfp25519_1w(Ur1);
+ setzero_eltfp25519_1w(Zr1);
+ setzero_eltfp25519_1w(Zr2);
+ Ur1[0] = 1;
+ Zr1[0] = 1;
+ Zr2[0] = 1;
+
+ /* G-S */
+ Ur2[3] = 0x1eaecdeee27cab34UL;
+ Ur2[2] = 0xadc7a0b9235d48e2UL;
+ Ur2[1] = 0xbbf095ae14b2edf8UL;
+ Ur2[0] = 0x7e94e1fec82faabdUL;
+
+ /* main-loop */
+ j = q;
+ for (i = 0; i < NUM_WORDS_ELTFP25519; ++i) {
+ while (j < ite[i]) {
+ u64 bit = (key[i] >> j) & 0x1;
+ k = (64 * i + j - q);
+ swap = swap ^ bit;
+ cswap(swap, Ur1, Ur2);
+ cswap(swap, Zr1, Zr2);
+ swap = bit;
+ /* Addition */
+ sub_eltfp25519_1w(B, Ur1, Zr1); /* B = Ur1-Zr1 */
+ add_eltfp25519_1w_adx(A, Ur1, Zr1); /* A = Ur1+Zr1 */
+ mul_eltfp25519_1w_adx(C, &P[4 * k], B); /* C = M0*B */
+ sub_eltfp25519_1w(B, A, C); /* B = (Ur1+Zr1) - M*(Ur1-Zr1) */
+ add_eltfp25519_1w_adx(A, A, C); /* A = (Ur1+Zr1) + M*(Ur1-Zr1) */
+ sqr_eltfp25519_2w_adx(AB); /* A = A^2 | B = B^2 */
+ mul_eltfp25519_2w_adx(UZr1, ZUr2, AB); /* Ur1 = Zr2*A | Zr1 = Ur2*B */
+ ++j;
+ }
+ j = 0;
+ }
+
+ /* Doubling */
+ for (i = 0; i < q; ++i) {
+ add_eltfp25519_1w_adx(A, Ur1, Zr1); /* A = Ur1+Zr1 */
+ sub_eltfp25519_1w(B, Ur1, Zr1); /* B = Ur1-Zr1 */
+ sqr_eltfp25519_2w_adx(AB); /* A = A**2 B = B**2 */
+ copy_eltfp25519_1w(C, B); /* C = B */
+ sub_eltfp25519_1w(B, A, B); /* B = A-B */
+ mul_a24_eltfp25519_1w(D, B); /* D = a24*B */
+ add_eltfp25519_1w_adx(D, D, C); /* D = D+C */
+ mul_eltfp25519_2w_adx(UZr1, AB, CD); /* Ur1 = A*B Zr1 = Zr1*A */
+ }
+
+ /* Convert to affine coordinates */
+ inv_eltfp25519_1w_adx(A, Zr1);
+ mul_eltfp25519_1w_adx((u64 *)session_key, Ur1, A);
+ fred_eltfp25519_1w((u64 *)session_key);
+
+ memzero_explicit(&m, sizeof(m));
+}
+
+static void curve25519_bmi2(u8 shared[CURVE25519_POINT_SIZE], const u8 private_key[CURVE25519_POINT_SIZE], const u8 session_key[CURVE25519_POINT_SIZE])
+{
+ struct {
+ u64 buffer[4 * NUM_WORDS_ELTFP25519];
+ u64 coordinates[4 * NUM_WORDS_ELTFP25519];
+ u64 workspace[6 * NUM_WORDS_ELTFP25519];
+ u8 session[CURVE25519_POINT_SIZE];
+ u8 private[CURVE25519_POINT_SIZE];
+ } __aligned(32) m;
+
+ int i = 0, j = 0;
+ u64 prev = 0;
+ u64 *const X1 = (u64 *)m.session;
+ u64 *const key = (u64 *)m.private;
+ u64 *const Px = m.coordinates + 0;
+ u64 *const Pz = m.coordinates + 4;
+ u64 *const Qx = m.coordinates + 8;
+ u64 *const Qz = m.coordinates + 12;
+ u64 *const X2 = Qx;
+ u64 *const Z2 = Qz;
+ u64 *const X3 = Px;
+ u64 *const Z3 = Pz;
+ u64 *const X2Z2 = Qx;
+ u64 *const X3Z3 = Px;
+
+ u64 *const A = m.workspace + 0;
+ u64 *const B = m.workspace + 4;
+ u64 *const D = m.workspace + 8;
+ u64 *const C = m.workspace + 12;
+ u64 *const DA = m.workspace + 16;
+ u64 *const CB = m.workspace + 20;
+ u64 *const AB = A;
+ u64 *const DC = D;
+ u64 *const DACB = DA;
+
+ memcpy(m.private, private_key, sizeof(m.private));
+ memcpy(m.session, session_key, sizeof(m.session));
+
+ normalize_secret(m.private);
+
+ /* As in the draft:
+ * When receiving such an array, implementations of curve25519
+ * MUST mask the most-significant bit in the final byte. This
+ * is done to preserve compatibility with point formats which
+ * reserve the sign bit for use in other protocols and to
+ * increase resistance to implementation fingerprinting
+ */
+ m.session[CURVE25519_POINT_SIZE - 1] &= (1 << (255 % 8)) - 1;
+
+ copy_eltfp25519_1w(Px, X1);
+ setzero_eltfp25519_1w(Pz);
+ setzero_eltfp25519_1w(Qx);
+ setzero_eltfp25519_1w(Qz);
+
+ Pz[0] = 1;
+ Qx[0] = 1;
+
+ /* main-loop */
+ prev = 0;
+ j = 62;
+ for (i = 3; i >= 0; --i) {
+ while (j >= 0) {
+ u64 bit = (key[i] >> j) & 0x1;
+ u64 swap = bit ^ prev;
+ prev = bit;
+
+ add_eltfp25519_1w_bmi2(A, X2, Z2); /* A = (X2+Z2) */
+ sub_eltfp25519_1w(B, X2, Z2); /* B = (X2-Z2) */
+ add_eltfp25519_1w_bmi2(C, X3, Z3); /* C = (X3+Z3) */
+ sub_eltfp25519_1w(D, X3, Z3); /* D = (X3-Z3) */
+ mul_eltfp25519_2w_bmi2(DACB, AB, DC); /* [DA|CB] = [A|B]*[D|C] */
+
+ cselect(swap, A, C);
+ cselect(swap, B, D);
+
+ sqr_eltfp25519_2w_bmi2(AB); /* [AA|BB] = [A^2|B^2] */
+ add_eltfp25519_1w_bmi2(X3, DA, CB); /* X3 = (DA+CB) */
+ sub_eltfp25519_1w(Z3, DA, CB); /* Z3 = (DA-CB) */
+ sqr_eltfp25519_2w_bmi2(X3Z3); /* [X3|Z3] = [(DA+CB)|(DA-CB)]^2 */
+
+ copy_eltfp25519_1w(X2, B); /* X2 = B^2 */
+ sub_eltfp25519_1w(Z2, A, B); /* Z2 = E = AA-BB */
+
+ mul_a24_eltfp25519_1w(B, Z2); /* B = a24*E */
+ add_eltfp25519_1w_bmi2(B, B, X2); /* B = a24*E+B */
+ mul_eltfp25519_2w_bmi2(X2Z2, X2Z2, AB); /* [X2|Z2] = [B|E]*[A|a24*E+B] */
+ mul_eltfp25519_1w_bmi2(Z3, Z3, X1); /* Z3 = Z3*X1 */
+ --j;
+ }
+ j = 63;
+ }
+
+ inv_eltfp25519_1w_bmi2(A, Qz);
+ mul_eltfp25519_1w_bmi2((u64 *)shared, Qx, A);
+ fred_eltfp25519_1w((u64 *)shared);
+
+ memzero_explicit(&m, sizeof(m));
+}
+
+static void curve25519_bmi2_base(u8 session_key[CURVE25519_POINT_SIZE], const u8 private_key[CURVE25519_POINT_SIZE])
+{
+ struct {
+ u64 buffer[4 * NUM_WORDS_ELTFP25519];
+ u64 coordinates[4 * NUM_WORDS_ELTFP25519];
+ u64 workspace[4 * NUM_WORDS_ELTFP25519];
+ u8 private[CURVE25519_POINT_SIZE];
+ } __aligned(32) m;
+
+ const int ite[4] = { 64, 64, 64, 63 };
+ const int q = 3;
+ u64 swap = 1;
+
+ int i = 0, j = 0, k = 0;
+ u64 *const key = (u64 *)m.private;
+ u64 *const Ur1 = m.coordinates + 0;
+ u64 *const Zr1 = m.coordinates + 4;
+ u64 *const Ur2 = m.coordinates + 8;
+ u64 *const Zr2 = m.coordinates + 12;
+
+ u64 *const UZr1 = m.coordinates + 0;
+ u64 *const ZUr2 = m.coordinates + 8;
+
+ u64 *const A = m.workspace + 0;
+ u64 *const B = m.workspace + 4;
+ u64 *const C = m.workspace + 8;
+ u64 *const D = m.workspace + 12;
+
+ u64 *const AB = m.workspace + 0;
+ u64 *const CD = m.workspace + 8;
+
+ const u64 *const P = table_ladder_8k;
+
+ memcpy(m.private, private_key, sizeof(m.private));
+
+ normalize_secret(m.private);
+
+ setzero_eltfp25519_1w(Ur1);
+ setzero_eltfp25519_1w(Zr1);
+ setzero_eltfp25519_1w(Zr2);
+ Ur1[0] = 1;
+ Zr1[0] = 1;
+ Zr2[0] = 1;
+
+ /* G-S */
+ Ur2[3] = 0x1eaecdeee27cab34UL;
+ Ur2[2] = 0xadc7a0b9235d48e2UL;
+ Ur2[1] = 0xbbf095ae14b2edf8UL;
+ Ur2[0] = 0x7e94e1fec82faabdUL;
+
+ /* main-loop */
+ j = q;
+ for (i = 0; i < NUM_WORDS_ELTFP25519; ++i) {
+ while (j < ite[i]) {
+ u64 bit = (key[i] >> j) & 0x1;
+ k = (64 * i + j - q);
+ swap = swap ^ bit;
+ cswap(swap, Ur1, Ur2);
+ cswap(swap, Zr1, Zr2);
+ swap = bit;
+ /* Addition */
+ sub_eltfp25519_1w(B, Ur1, Zr1); /* B = Ur1-Zr1 */
+ add_eltfp25519_1w_bmi2(A, Ur1, Zr1); /* A = Ur1+Zr1 */
+ mul_eltfp25519_1w_bmi2(C, &P[4 * k], B); /* C = M0*B */
+ sub_eltfp25519_1w(B, A, C); /* B = (Ur1+Zr1) - M*(Ur1-Zr1) */
+ add_eltfp25519_1w_bmi2(A, A, C); /* A = (Ur1+Zr1) + M*(Ur1-Zr1) */
+ sqr_eltfp25519_2w_bmi2(AB); /* A = A^2 | B = B^2 */
+ mul_eltfp25519_2w_bmi2(UZr1, ZUr2, AB); /* Ur1 = Zr2*A | Zr1 = Ur2*B */
+ ++j;
+ }
+ j = 0;
+ }
+
+ /* Doubling */
+ for (i = 0; i < q; ++i) {
+ add_eltfp25519_1w_bmi2(A, Ur1, Zr1); /* A = Ur1+Zr1 */
+ sub_eltfp25519_1w(B, Ur1, Zr1); /* B = Ur1-Zr1 */
+ sqr_eltfp25519_2w_bmi2(AB); /* A = A**2 B = B**2 */
+ copy_eltfp25519_1w(C, B); /* C = B */
+ sub_eltfp25519_1w(B, A, B); /* B = A-B */
+ mul_a24_eltfp25519_1w(D, B); /* D = a24*B */
+ add_eltfp25519_1w_bmi2(D, D, C); /* D = D+C */
+ mul_eltfp25519_2w_bmi2(UZr1, AB, CD); /* Ur1 = A*B Zr1 = Zr1*A */
+ }
+
+ /* Convert to affine coordinates */
+ inv_eltfp25519_1w_bmi2(A, Zr1);
+ mul_eltfp25519_1w_bmi2((u64 *)session_key, Ur1, A);
+ fred_eltfp25519_1w((u64 *)session_key);
+
+ memzero_explicit(&m, sizeof(m));
+}
diff --git a/lib/zinc/curve25519/curve25519.c b/lib/zinc/curve25519/curve25519.c
new file mode 100644
index 000000000000..b70de02a4c07
--- /dev/null
+++ b/lib/zinc/curve25519/curve25519.c
@@ -0,0 +1,86 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#include <zinc/curve25519.h>
+
+#include <linux/version.h>
+#include <linux/string.h>
+#include <linux/random.h>
+#include <crypto/algapi.h>
+
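+/* Scalar "clamping" as specified for X25519: clear the three low bits
+ * (the cofactor-8 adjustment), clear the top bit, and set bit 254 so
+ * that all scalars have the same fixed length.
+ */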
+static __always_inline void normalize_secret(u8 secret[CURVE25519_POINT_SIZE])
+{
+ secret[0] &= 248;
+ secret[31] &= 127;
+ secret[31] |= 64;
+}
+
+#if defined(CONFIG_X86_64)
+#include "curve25519-x86_64.h"
+#elif IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && defined(CONFIG_ARM)
+#include "curve25519-arm.h"
+#else
+void __init curve25519_fpu_init(void) { }
+#endif
+
+#if defined(CONFIG_ARCH_SUPPORTS_INT128) && defined(__SIZEOF_INT128__)
+#include "curve25519-hacl64.h"
+#else
+#include "curve25519-fiat32.h"
+#endif
+
+static const u8 null_point[CURVE25519_POINT_SIZE] = { 0 };
+
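+/* Returns true if the resulting shared secret is nonzero. An all-zero
+ * output indicates a low-order input point (the all-zero check that
+ * RFC 7748 describes); crypto_memneq keeps the comparison
+ * constant-time.
+ */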
+bool curve25519(u8 mypublic[CURVE25519_POINT_SIZE], const u8 secret[CURVE25519_POINT_SIZE], const u8 basepoint[CURVE25519_POINT_SIZE])
+{
+#if defined(CONFIG_X86_64)
+ if (curve25519_use_adx)
+ curve25519_adx(mypublic, secret, basepoint);
+ else if (curve25519_use_bmi2)
+ curve25519_bmi2(mypublic, secret, basepoint);
+ else
+#elif IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && defined(CONFIG_ARM)
+ if (curve25519_use_neon && may_use_simd()) {
+ kernel_neon_begin();
+ curve25519_neon(mypublic, secret, basepoint);
+ kernel_neon_end();
+ } else
+#endif
+ curve25519_generic(mypublic, secret, basepoint);
+
+ return crypto_memneq(mypublic, null_point, CURVE25519_POINT_SIZE);
+}
+EXPORT_SYMBOL(curve25519);
+
+bool curve25519_generate_public(u8 pub[CURVE25519_POINT_SIZE], const u8 secret[CURVE25519_POINT_SIZE])
+{
+ static const u8 basepoint[CURVE25519_POINT_SIZE] __aligned(32) = { 9 };
+
+ if (unlikely(!crypto_memneq(secret, null_point, CURVE25519_POINT_SIZE)))
+ return false;
+
+#if defined(CONFIG_X86_64)
+ if (curve25519_use_adx) {
+ curve25519_adx_base(pub, secret);
+ return crypto_memneq(pub, null_point, CURVE25519_POINT_SIZE);
+ }
+ if (curve25519_use_bmi2) {
+ curve25519_bmi2_base(pub, secret);
+ return crypto_memneq(pub, null_point, CURVE25519_POINT_SIZE);
+ }
+#endif
+
+ return curve25519(pub, secret, basepoint);
+}
+EXPORT_SYMBOL(curve25519_generate_public);
+
+void curve25519_generate_secret(u8 secret[CURVE25519_POINT_SIZE])
+{
+ get_random_bytes_wait(secret, CURVE25519_POINT_SIZE);
+ normalize_secret(secret);
+}
+EXPORT_SYMBOL(curve25519_generate_secret);
+
+#include "selftest/curve25519.h"
diff --git a/lib/zinc/main.c b/lib/zinc/main.c
new file mode 100644
index 000000000000..8d29182e8df0
--- /dev/null
+++ b/lib/zinc/main.c
@@ -0,0 +1,36 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#include <zinc/chacha20poly1305.h>
+#include <zinc/chacha20.h>
+#include <zinc/poly1305.h>
+#include <zinc/blake2s.h>
+#include <zinc/curve25519.h>
+
+#include <linux/init.h>
+#include <linux/module.h>
+
+static int __init mod_init(void)
+{
+ chacha20_fpu_init();
+ poly1305_fpu_init();
+ blake2s_fpu_init();
+ curve25519_fpu_init();
+#ifdef CONFIG_ZINC_DEBUG
+ if (!curve25519_selftest() || !poly1305_selftest() || !chacha20poly1305_selftest() || !blake2s_selftest())
+ return -ENOTRECOVERABLE;
+#endif
+ return 0;
+}
+
+static void __exit mod_exit(void)
+{
+}
+
+module_init(mod_init);
+module_exit(mod_exit);
+MODULE_LICENSE("GPL v2");
+MODULE_DESCRIPTION("Zinc cryptography library");
+MODULE_AUTHOR("Jason A. Donenfeld <Jason@...c4.com>");
diff --git a/lib/zinc/poly1305/poly1305-arm.S b/lib/zinc/poly1305/poly1305-arm.S
new file mode 100644
index 000000000000..d6b9a80dcc8b
--- /dev/null
+++ b/lib/zinc/poly1305/poly1305-arm.S
@@ -0,0 +1,1115 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#include <linux/linkage.h>
+
+.text
+#if defined(__thumb2__)
+.syntax unified
+.thumb
+#else
+.code 32
+#endif
+
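+@ poly1305_init_arm(void *ctx, const u8 key[16]): zero the 130-bit
+@ accumulator and store the r half of the key, clamped to
+@ 0x0ffffffc0ffffffc0ffffffc0fffffff as Poly1305 requires.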
+.align 5
+ENTRY(poly1305_init_arm)
+ stmdb sp!,{r4-r11}
+
+ eor r3,r3,r3
+ cmp r1,#0
+ str r3,[r0,#0] @ zero hash value
+ str r3,[r0,#4]
+ str r3,[r0,#8]
+ str r3,[r0,#12]
+ str r3,[r0,#16]
+ str r3,[r0,#36] @ is_base2_26
+ add r0,r0,#20
+
+#ifdef __thumb2__
+ it eq
+#endif
+ moveq r0,#0
+ beq .Lno_key
+
+ ldrb r4,[r1,#0]
+ mov r10,#0x0fffffff
+ ldrb r5,[r1,#1]
+ and r3,r10,#-4 @ 0x0ffffffc
+ ldrb r6,[r1,#2]
+ ldrb r7,[r1,#3]
+ orr r4,r4,r5,lsl#8
+ ldrb r5,[r1,#4]
+ orr r4,r4,r6,lsl#16
+ ldrb r6,[r1,#5]
+ orr r4,r4,r7,lsl#24
+ ldrb r7,[r1,#6]
+ and r4,r4,r10
+
+ ldrb r8,[r1,#7]
+ orr r5,r5,r6,lsl#8
+ ldrb r6,[r1,#8]
+ orr r5,r5,r7,lsl#16
+ ldrb r7,[r1,#9]
+ orr r5,r5,r8,lsl#24
+ ldrb r8,[r1,#10]
+ and r5,r5,r3
+
+ ldrb r9,[r1,#11]
+ orr r6,r6,r7,lsl#8
+ ldrb r7,[r1,#12]
+ orr r6,r6,r8,lsl#16
+ ldrb r8,[r1,#13]
+ orr r6,r6,r9,lsl#24
+ ldrb r9,[r1,#14]
+ and r6,r6,r3
+
+ ldrb r10,[r1,#15]
+ orr r7,r7,r8,lsl#8
+ str r4,[r0,#0]
+ orr r7,r7,r9,lsl#16
+ str r5,[r0,#4]
+ orr r7,r7,r10,lsl#24
+ str r6,[r0,#8]
+ and r7,r7,r3
+ str r7,[r0,#12]
+.Lno_key:
+ ldmia sp!,{r4-r11}
+#if __LINUX_ARM_ARCH__ >= 5
+ bx lr @ bx lr
+#else
+ tst lr,#1
+ moveq pc,lr @ be binary compatible with V4, yet
+ .word 0xe12fff1e @ interoperable with Thumb ISA:-)
+#endif
+ENDPROC(poly1305_init_arm)
+
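+@ poly1305_blocks_arm(void *ctx, const u8 *inp, size_t len, u32 padbit):
+@ absorb len bytes (a multiple of 16) into the accumulator, computing
+@ h = (h + block + padbit*2^128) * r mod 2^130-5 for each 16-byte block.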
+.align 5
+ENTRY(poly1305_blocks_arm)
+.Lpoly1305_blocks_arm:
+ stmdb sp!,{r3-r11,lr}
+
+ ands r2,r2,#-16
+ beq .Lno_data
+
+ cmp r3,#0
+ add r2,r2,r1 @ end pointer
+ sub sp,sp,#32
+
+ ldmia r0,{r4-r12} @ load context
+
+ str r0,[sp,#12] @ offload stuff
+ mov lr,r1
+ str r2,[sp,#16]
+ str r10,[sp,#20]
+ str r11,[sp,#24]
+ str r12,[sp,#28]
+ b .Loop
+
+.Loop:
+#if __LINUX_ARM_ARCH__ < 7
+ ldrb r0,[lr],#16 @ load input
+#ifdef __thumb2__
+ it hi
+#endif
+ addhi r8,r8,#1 @ 1<<128
+ ldrb r1,[lr,#-15]
+ ldrb r2,[lr,#-14]
+ ldrb r3,[lr,#-13]
+ orr r1,r0,r1,lsl#8
+ ldrb r0,[lr,#-12]
+ orr r2,r1,r2,lsl#16
+ ldrb r1,[lr,#-11]
+ orr r3,r2,r3,lsl#24
+ ldrb r2,[lr,#-10]
+ adds r4,r4,r3 @ accumulate input
+
+ ldrb r3,[lr,#-9]
+ orr r1,r0,r1,lsl#8
+ ldrb r0,[lr,#-8]
+ orr r2,r1,r2,lsl#16
+ ldrb r1,[lr,#-7]
+ orr r3,r2,r3,lsl#24
+ ldrb r2,[lr,#-6]
+ adcs r5,r5,r3
+
+ ldrb r3,[lr,#-5]
+ orr r1,r0,r1,lsl#8
+ ldrb r0,[lr,#-4]
+ orr r2,r1,r2,lsl#16
+ ldrb r1,[lr,#-3]
+ orr r3,r2,r3,lsl#24
+ ldrb r2,[lr,#-2]
+ adcs r6,r6,r3
+
+ ldrb r3,[lr,#-1]
+ orr r1,r0,r1,lsl#8
+ str lr,[sp,#8] @ offload input pointer
+ orr r2,r1,r2,lsl#16
+ add r10,r10,r10,lsr#2
+ orr r3,r2,r3,lsl#24
+#else
+ ldr r0,[lr],#16 @ load input
+#ifdef __thumb2__
+ it hi
+#endif
+ addhi r8,r8,#1 @ padbit
+ ldr r1,[lr,#-12]
+ ldr r2,[lr,#-8]
+ ldr r3,[lr,#-4]
+#ifdef __ARMEB__
+ rev r0,r0
+ rev r1,r1
+ rev r2,r2
+ rev r3,r3
+#endif
+ adds r4,r4,r0 @ accumulate input
+ str lr,[sp,#8] @ offload input pointer
+ adcs r5,r5,r1
+ add r10,r10,r10,lsr#2
+ adcs r6,r6,r2
+#endif
+ add r11,r11,r11,lsr#2
+ adcs r7,r7,r3
+ add r12,r12,r12,lsr#2
+
+ umull r2,r3,r5,r9
+ adc r8,r8,#0
+ umull r0,r1,r4,r9
+ umlal r2,r3,r8,r10
+ umlal r0,r1,r7,r10
+ ldr r10,[sp,#20] @ reload r10
+ umlal r2,r3,r6,r12
+ umlal r0,r1,r5,r12
+ umlal r2,r3,r7,r11
+ umlal r0,r1,r6,r11
+ umlal r2,r3,r4,r10
+ str r0,[sp,#0] @ future r4
+ mul r0,r11,r8
+ ldr r11,[sp,#24] @ reload r11
+ adds r2,r2,r1 @ d1+=d0>>32
+ eor r1,r1,r1
+ adc lr,r3,#0 @ future r6
+ str r2,[sp,#4] @ future r5
+
+ mul r2,r12,r8
+ eor r3,r3,r3
+ umlal r0,r1,r7,r12
+ ldr r12,[sp,#28] @ reload r12
+ umlal r2,r3,r7,r9
+ umlal r0,r1,r6,r9
+ umlal r2,r3,r6,r10
+ umlal r0,r1,r5,r10
+ umlal r2,r3,r5,r11
+ umlal r0,r1,r4,r11
+ umlal r2,r3,r4,r12
+ ldr r4,[sp,#0]
+ mul r8,r9,r8
+ ldr r5,[sp,#4]
+
+ adds r6,lr,r0 @ d2+=d1>>32
+ ldr lr,[sp,#8] @ reload input pointer
+ adc r1,r1,#0
+ adds r7,r2,r1 @ d3+=d2>>32
+ ldr r0,[sp,#16] @ reload end pointer
+ adc r3,r3,#0
+ add r8,r8,r3 @ h4+=d3>>32
+
+ and r1,r8,#-4
+ and r8,r8,#3
+ add r1,r1,r1,lsr#2 @ *=5
+ adds r4,r4,r1
+ adcs r5,r5,#0
+ adcs r6,r6,#0
+ adcs r7,r7,#0
+ adc r8,r8,#0
+
+ cmp r0,lr @ done yet?
+ bhi .Loop
+
+ ldr r0,[sp,#12]
+ add sp,sp,#32
+ stmia r0,{r4-r8} @ store the result
+
+.Lno_data:
+#if __LINUX_ARM_ARCH__ >= 5
+ ldmia sp!,{r3-r11,pc}
+#else
+ ldmia sp!,{r3-r11,lr}
+ tst lr,#1
+ moveq pc,lr @ be binary compatible with V4, yet
+ .word 0xe12fff1e @ interoperable with Thumb ISA:-)
+#endif
+ENDPROC(poly1305_blocks_arm)
+
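+@ poly1305_emit_arm(void *ctx, u8 mac[16], const u32 nonce[4]): carry
+@ out the final reduction modulo 2^130-5 and add the 128-bit nonce to
+@ produce the little-endian 16-byte tag.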
+.align 5
+ENTRY(poly1305_emit_arm)
+ stmdb sp!,{r4-r11}
+.Lpoly1305_emit_enter:
+ ldmia r0,{r3-r7}
+ adds r8,r3,#5 @ compare to modulus
+ adcs r9,r4,#0
+ adcs r10,r5,#0
+ adcs r11,r6,#0
+ adc r7,r7,#0
+ tst r7,#4 @ did it carry/borrow?
+
+#ifdef __thumb2__
+ it ne
+#endif
+ movne r3,r8
+ ldr r8,[r2,#0]
+#ifdef __thumb2__
+ it ne
+#endif
+ movne r4,r9
+ ldr r9,[r2,#4]
+#ifdef __thumb2__
+ it ne
+#endif
+ movne r5,r10
+ ldr r10,[r2,#8]
+#ifdef __thumb2__
+ it ne
+#endif
+ movne r6,r11
+ ldr r11,[r2,#12]
+
+ adds r3,r3,r8
+ adcs r4,r4,r9
+ adcs r5,r5,r10
+ adc r6,r6,r11
+
+#if __LINUX_ARM_ARCH__ >= 7
+#ifdef __ARMEB__
+ rev r3,r3
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+#endif
+ str r3,[r1,#0]
+ str r4,[r1,#4]
+ str r5,[r1,#8]
+ str r6,[r1,#12]
+#else
+ strb r3,[r1,#0]
+ mov r3,r3,lsr#8
+ strb r4,[r1,#4]
+ mov r4,r4,lsr#8
+ strb r5,[r1,#8]
+ mov r5,r5,lsr#8
+ strb r6,[r1,#12]
+ mov r6,r6,lsr#8
+
+ strb r3,[r1,#1]
+ mov r3,r3,lsr#8
+ strb r4,[r1,#5]
+ mov r4,r4,lsr#8
+ strb r5,[r1,#9]
+ mov r5,r5,lsr#8
+ strb r6,[r1,#13]
+ mov r6,r6,lsr#8
+
+ strb r3,[r1,#2]
+ mov r3,r3,lsr#8
+ strb r4,[r1,#6]
+ mov r4,r4,lsr#8
+ strb r5,[r1,#10]
+ mov r5,r5,lsr#8
+ strb r6,[r1,#14]
+ mov r6,r6,lsr#8
+
+ strb r3,[r1,#3]
+ strb r4,[r1,#7]
+ strb r5,[r1,#11]
+ strb r6,[r1,#15]
+#endif
+ ldmia sp!,{r4-r11}
+#if __LINUX_ARM_ARCH__ >= 5
+ bx lr @ bx lr
+#else
+ tst lr,#1
+ moveq pc,lr @ be binary compatible with V4, yet
+ .word 0xe12fff1e @ interoperable with Thumb ISA:-)
+#endif
+ENDPROC(poly1305_emit_arm)
+
+
+#if __LINUX_ARM_ARCH__ >= 7
+.fpu neon
+
+.align 5
+ENTRY(poly1305_init_neon)
+.Lpoly1305_init_neon:
+ ldr r4,[r0,#20] @ load key base 2^32
+ ldr r5,[r0,#24]
+ ldr r6,[r0,#28]
+ ldr r7,[r0,#32]
+
+ and r2,r4,#0x03ffffff @ base 2^32 -> base 2^26
+ mov r3,r4,lsr#26
+ mov r4,r5,lsr#20
+ orr r3,r3,r5,lsl#6
+ mov r5,r6,lsr#14
+ orr r4,r4,r6,lsl#12
+ mov r6,r7,lsr#8
+ orr r5,r5,r7,lsl#18
+ and r3,r3,#0x03ffffff
+ and r4,r4,#0x03ffffff
+ and r5,r5,#0x03ffffff
+
+ vdup.32 d0,r2 @ r^1 in both lanes
+ add r2,r3,r3,lsl#2 @ *5
+ vdup.32 d1,r3
+ add r3,r4,r4,lsl#2
+ vdup.32 d2,r2
+ vdup.32 d3,r4
+ add r4,r5,r5,lsl#2
+ vdup.32 d4,r3
+ vdup.32 d5,r5
+ add r5,r6,r6,lsl#2
+ vdup.32 d6,r4
+ vdup.32 d7,r6
+ vdup.32 d8,r5
+
+ mov r5,#2 @ counter
+
+.Lsquare_neon:
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ d0 = h0*r0 + h4*5*r1 + h3*5*r2 + h2*5*r3 + h1*5*r4
+ @ d1 = h1*r0 + h0*r1 + h4*5*r2 + h3*5*r3 + h2*5*r4
+ @ d2 = h2*r0 + h1*r1 + h0*r2 + h4*5*r3 + h3*5*r4
+ @ d3 = h3*r0 + h2*r1 + h1*r2 + h0*r3 + h4*5*r4
+ @ d4 = h4*r0 + h3*r1 + h2*r2 + h1*r3 + h0*r4
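+	@
+	@ In C, with s[i] = 5*r[i], the first column is roughly (a
+	@ sketch; h[], r[], s[] name the 26-bit limbs):
+	@
+	@   u64 d0 = (u64)h0*r0 + (u64)h1*s4 + (u64)h2*s3
+	@          + (u64)h3*s2 + (u64)h4*s1;
+	@
+	@ and likewise for d1..d4; the vmull.u32/vmlal.u32 chains below
+	@ compute exactly these five columns, two lanes at a time.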
+
+ vmull.u32 q5,d0,d0[1]
+ vmull.u32 q6,d1,d0[1]
+ vmull.u32 q7,d3,d0[1]
+ vmull.u32 q8,d5,d0[1]
+ vmull.u32 q9,d7,d0[1]
+
+ vmlal.u32 q5,d7,d2[1]
+ vmlal.u32 q6,d0,d1[1]
+ vmlal.u32 q7,d1,d1[1]
+ vmlal.u32 q8,d3,d1[1]
+ vmlal.u32 q9,d5,d1[1]
+
+ vmlal.u32 q5,d5,d4[1]
+ vmlal.u32 q6,d7,d4[1]
+ vmlal.u32 q8,d1,d3[1]
+ vmlal.u32 q7,d0,d3[1]
+ vmlal.u32 q9,d3,d3[1]
+
+ vmlal.u32 q5,d3,d6[1]
+ vmlal.u32 q8,d0,d5[1]
+ vmlal.u32 q6,d5,d6[1]
+ vmlal.u32 q7,d7,d6[1]
+ vmlal.u32 q9,d1,d5[1]
+
+ vmlal.u32 q8,d7,d8[1]
+ vmlal.u32 q5,d1,d8[1]
+ vmlal.u32 q6,d3,d8[1]
+ vmlal.u32 q7,d5,d8[1]
+ vmlal.u32 q9,d0,d7[1]
+
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ lazy reduction as discussed in "NEON crypto" by D.J. Bernstein
+ @ and P. Schwabe
+ @
+ @ H0>>+H1>>+H2>>+H3>>+H4
+ @ H3>>+H4>>*5+H0>>+H1
+ @
+ @ Trivia.
+ @
+	@ The result of multiplying an n-bit number by an m-bit number
+	@ is n+m bits wide. However! Even though 2^n is an (n+1)-bit
+	@ number, an m-bit number multiplied by 2^n is still only n+m
+	@ bits wide.
+ @
+	@ The sum of two n-bit numbers is n+1 bits wide, and the sum of
+	@ three or four is n+2. The sum of 2^m (n-m)-bit numbers and one
+	@ n-bit number is n+1 bits wide.
+ @
+	@ >>+ denotes Hnext += Hn>>26, Hn &= 0x3ffffff. This means that
+	@ H0, H2, H3 are guaranteed to be 26 bits wide, while H1 and H4
+	@ can be 27 bits. However! In cases when their width exceeds 26
+	@ bits, they are bounded by 2^26+2^6. This in turn means that
+	@ the *sum* of the products with these values can still be
+	@ viewed as a sum of 52-bit numbers, as long as the number of
+	@ addends is not a power of 2. For example,
+ @
+ @ H4 = H4*R0 + H3*R1 + H2*R2 + H1*R3 + H0 * R4,
+ @
+	@ which can't be larger than 5 * (2^26 + 2^6) * (2^26 + 2^6), or
+	@ 5 * (2^52 + 2^33 + 2^12), which in turn is smaller than
+	@ 8 * (2^52) or 2^55. However, the value is then multiplied
+	@ by 5, so we should be looking at 5 * 5 * (2^52 + 2^33 + 2^12),
+	@ which is less than 32 * (2^52) or 2^57. And when processing
+	@ data we are looking at three times as many addends...
+ @
+	@ In the key setup procedure, pre-reduced H0 is limited by
+	@ 5*4+1 52-bit addends, and 5*H4 by 5*5 of them, or 57 bits. But
+	@ when hashing the input, H0 is limited by (5*4+1)*3 addends, or
+	@ 58 bits, while 5*H4 by 5*5*3, or 59[!] bits. How is this
+	@ relevant? The vmlal.u32 instruction accepts 2x32-bit inputs
+	@ and writes a 2x64-bit result. This means that the result of
+	@ the reduction has to be compressed upon loop wrap-around. This
+	@ can be done in the process of reduction to minimize the number
+	@ of instructions [as well as the number of 128-bit
+	@ instructions, which benefits low-end processors], but one has
+	@ to watch for H2 (which is narrower than H0) and 5*H4 not being
+	@ wider than 58 bits, so that the result of the right shift by
+	@ 26 bits fits in 32 bits. This is also useful on x86, because
+	@ it allows the use of paddd in place of paddq, which benefits
+	@ Atom, where paddq is ridiculously slow.
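+	@
+	@ In C, the reduction below corresponds roughly to (a sketch;
+	@ d0..d4 are the 64-bit column sums, everything kept in u64,
+	@ M = 0x3ffffff):
+	@
+	@   d4 += d3 >> 26; h3 = d3 & M;        @ h3 -> h4
+	@   d1 += d0 >> 26; h0 = d0 & M;        @ h0 -> h1
+	@   c4 = d4 >> 26;  h4 = d4 & M;
+	@   d2 += d1 >> 26; h1 = d1 & M;        @ h1 -> h2
+	@   h0 += c4 * 5;                       @ h4 -> h0
+	@   h3 += d2 >> 26; h2 = d2 & M;        @ h2 -> h3
+	@   h1 += h0 >> 26; h0 &= M;            @ h0 -> h1
+	@   h4 += h3 >> 26; h3 &= M;            @ h3 -> h4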
+
+ vshr.u64 q15,q8,#26
+ vmovn.i64 d16,q8
+ vshr.u64 q4,q5,#26
+ vmovn.i64 d10,q5
+ vadd.i64 q9,q9,q15 @ h3 -> h4
+ vbic.i32 d16,#0xfc000000 @ &=0x03ffffff
+ vadd.i64 q6,q6,q4 @ h0 -> h1
+ vbic.i32 d10,#0xfc000000
+
+ vshrn.u64 d30,q9,#26
+ vmovn.i64 d18,q9
+ vshr.u64 q4,q6,#26
+ vmovn.i64 d12,q6
+ vadd.i64 q7,q7,q4 @ h1 -> h2
+ vbic.i32 d18,#0xfc000000
+ vbic.i32 d12,#0xfc000000
+
+ vadd.i32 d10,d10,d30
+ vshl.u32 d30,d30,#2
+ vshrn.u64 d8,q7,#26
+ vmovn.i64 d14,q7
+ vadd.i32 d10,d10,d30 @ h4 -> h0
+ vadd.i32 d16,d16,d8 @ h2 -> h3
+ vbic.i32 d14,#0xfc000000
+
+ vshr.u32 d30,d10,#26
+ vbic.i32 d10,#0xfc000000
+ vshr.u32 d8,d16,#26
+ vbic.i32 d16,#0xfc000000
+ vadd.i32 d12,d12,d30 @ h0 -> h1
+ vadd.i32 d18,d18,d8 @ h3 -> h4
+
+ subs r5,r5,#1
+ beq .Lsquare_break_neon
+
+ add r6,r0,#(48+0*9*4)
+ add r7,r0,#(48+1*9*4)
+
+ vtrn.32 d0,d10 @ r^2:r^1
+ vtrn.32 d3,d14
+ vtrn.32 d5,d16
+ vtrn.32 d1,d12
+ vtrn.32 d7,d18
+
+ vshl.u32 d4,d3,#2 @ *5
+ vshl.u32 d6,d5,#2
+ vshl.u32 d2,d1,#2
+ vshl.u32 d8,d7,#2
+ vadd.i32 d4,d4,d3
+ vadd.i32 d2,d2,d1
+ vadd.i32 d6,d6,d5
+ vadd.i32 d8,d8,d7
+
+ vst4.32 {d0[0],d1[0],d2[0],d3[0]},[r6]!
+ vst4.32 {d0[1],d1[1],d2[1],d3[1]},[r7]!
+ vst4.32 {d4[0],d5[0],d6[0],d7[0]},[r6]!
+ vst4.32 {d4[1],d5[1],d6[1],d7[1]},[r7]!
+ vst1.32 {d8[0]},[r6,:32]
+ vst1.32 {d8[1]},[r7,:32]
+
+ b .Lsquare_neon
+
+.align 4
+.Lsquare_break_neon:
+ add r6,r0,#(48+2*4*9)
+ add r7,r0,#(48+3*4*9)
+
+ vmov d0,d10 @ r^4:r^3
+ vshl.u32 d2,d12,#2 @ *5
+ vmov d1,d12
+ vshl.u32 d4,d14,#2
+ vmov d3,d14
+ vshl.u32 d6,d16,#2
+ vmov d5,d16
+ vshl.u32 d8,d18,#2
+ vmov d7,d18
+ vadd.i32 d2,d2,d12
+ vadd.i32 d4,d4,d14
+ vadd.i32 d6,d6,d16
+ vadd.i32 d8,d8,d18
+
+ vst4.32 {d0[0],d1[0],d2[0],d3[0]},[r6]!
+ vst4.32 {d0[1],d1[1],d2[1],d3[1]},[r7]!
+ vst4.32 {d4[0],d5[0],d6[0],d7[0]},[r6]!
+ vst4.32 {d4[1],d5[1],d6[1],d7[1]},[r7]!
+ vst1.32 {d8[0]},[r6]
+ vst1.32 {d8[1]},[r7]
+
+	bx	lr
+ENDPROC(poly1305_init_neon)
+
+.align 5
+ENTRY(poly1305_blocks_neon)
+ ldr ip,[r0,#36] @ is_base2_26
+ ands r2,r2,#-16
+ beq .Lno_data_neon
+
+ cmp r2,#64
+ bhs .Lenter_neon
+ tst ip,ip @ is_base2_26?
+ beq .Lpoly1305_blocks_arm
+
+.Lenter_neon:
+ stmdb sp!,{r4-r7}
+ vstmdb sp!,{d8-d15} @ ABI specification says so
+
+ tst ip,ip @ is_base2_26?
+ bne .Lbase2_26_neon
+
+ stmdb sp!,{r1-r3,lr}
+ bl .Lpoly1305_init_neon
+
+ ldr r4,[r0,#0] @ load hash value base 2^32
+ ldr r5,[r0,#4]
+ ldr r6,[r0,#8]
+ ldr r7,[r0,#12]
+ ldr ip,[r0,#16]
+
+ and r2,r4,#0x03ffffff @ base 2^32 -> base 2^26
+ mov r3,r4,lsr#26
+ veor d10,d10,d10
+ mov r4,r5,lsr#20
+ orr r3,r3,r5,lsl#6
+ veor d12,d12,d12
+ mov r5,r6,lsr#14
+ orr r4,r4,r6,lsl#12
+ veor d14,d14,d14
+ mov r6,r7,lsr#8
+ orr r5,r5,r7,lsl#18
+ veor d16,d16,d16
+ and r3,r3,#0x03ffffff
+ orr r6,r6,ip,lsl#24
+ veor d18,d18,d18
+ and r4,r4,#0x03ffffff
+ mov r1,#1
+ and r5,r5,#0x03ffffff
+ str r1,[r0,#36] @ is_base2_26
+
+ vmov.32 d10[0],r2
+ vmov.32 d12[0],r3
+ vmov.32 d14[0],r4
+ vmov.32 d16[0],r5
+ vmov.32 d18[0],r6
+ adr r5,.Lzeros
+
+ ldmia sp!,{r1-r3,lr}
+ b .Lbase2_32_neon
+
+.align 4
+.Lbase2_26_neon:
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ load hash value
+
+ veor d10,d10,d10
+ veor d12,d12,d12
+ veor d14,d14,d14
+ veor d16,d16,d16
+ veor d18,d18,d18
+ vld4.32 {d10[0],d12[0],d14[0],d16[0]},[r0]!
+ adr r5,.Lzeros
+ vld1.32 {d18[0]},[r0]
+ sub r0,r0,#16 @ rewind
+
+.Lbase2_32_neon:
+ add r4,r1,#32
+ mov r3,r3,lsl#24
+ tst r2,#31
+ beq .Leven
+
+ vld4.32 {d20[0],d22[0],d24[0],d26[0]},[r1]!
+ vmov.32 d28[0],r3
+ sub r2,r2,#16
+ add r4,r1,#32
+
+#ifdef __ARMEB__
+ vrev32.8 q10,q10
+ vrev32.8 q13,q13
+ vrev32.8 q11,q11
+ vrev32.8 q12,q12
+#endif
+ vsri.u32 d28,d26,#8 @ base 2^32 -> base 2^26
+ vshl.u32 d26,d26,#18
+
+ vsri.u32 d26,d24,#14
+ vshl.u32 d24,d24,#12
+ vadd.i32 d29,d28,d18 @ add hash value and move to #hi
+
+ vbic.i32 d26,#0xfc000000
+ vsri.u32 d24,d22,#20
+ vshl.u32 d22,d22,#6
+
+ vbic.i32 d24,#0xfc000000
+ vsri.u32 d22,d20,#26
+ vadd.i32 d27,d26,d16
+
+ vbic.i32 d20,#0xfc000000
+ vbic.i32 d22,#0xfc000000
+ vadd.i32 d25,d24,d14
+
+ vadd.i32 d21,d20,d10
+ vadd.i32 d23,d22,d12
+
+ mov r7,r5
+ add r6,r0,#48
+
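+	@ comparing r2 against itself always sets the EQ flag, so the
+	@ "ne" steps in .Long_tail are skipped and the beq there goes
+	@ straight to .Lshort_tail after a single r^2:r^1 multiply pass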
+ cmp r2,r2
+ b .Long_tail
+
+.align 4
+.Leven:
+ subs r2,r2,#64
+ it lo
+ movlo r4,r5
+
+ vmov.i32 q14,#1<<24 @ padbit, yes, always
+ vld4.32 {d20,d22,d24,d26},[r1] @ inp[0:1]
+ add r1,r1,#64
+ vld4.32 {d21,d23,d25,d27},[r4] @ inp[2:3] (or 0)
+ add r4,r4,#64
+ itt hi
+ addhi r7,r0,#(48+1*9*4)
+ addhi r6,r0,#(48+3*9*4)
+
+#ifdef __ARMEB__
+ vrev32.8 q10,q10
+ vrev32.8 q13,q13
+ vrev32.8 q11,q11
+ vrev32.8 q12,q12
+#endif
+ vsri.u32 q14,q13,#8 @ base 2^32 -> base 2^26
+ vshl.u32 q13,q13,#18
+
+ vsri.u32 q13,q12,#14
+ vshl.u32 q12,q12,#12
+
+ vbic.i32 q13,#0xfc000000
+ vsri.u32 q12,q11,#20
+ vshl.u32 q11,q11,#6
+
+ vbic.i32 q12,#0xfc000000
+ vsri.u32 q11,q10,#26
+
+ vbic.i32 q10,#0xfc000000
+ vbic.i32 q11,#0xfc000000
+
+ bls .Lskip_loop
+
+ vld4.32 {d0[1],d1[1],d2[1],d3[1]},[r7]! @ load r^2
+ vld4.32 {d0[0],d1[0],d2[0],d3[0]},[r6]! @ load r^4
+ vld4.32 {d4[1],d5[1],d6[1],d7[1]},[r7]!
+ vld4.32 {d4[0],d5[0],d6[0],d7[0]},[r6]!
+ b .Loop_neon
+
+.align 5
+.Loop_neon:
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ ((inp[0]*r^4+inp[2]*r^2+inp[4])*r^4+inp[6]*r^2
+ @ ((inp[1]*r^4+inp[3]*r^2+inp[5])*r^3+inp[7]*r
+ @ ___________________/
+ @ ((inp[0]*r^4+inp[2]*r^2+inp[4])*r^4+inp[6]*r^2+inp[8])*r^2
+ @ ((inp[1]*r^4+inp[3]*r^2+inp[5])*r^4+inp[7]*r^2+inp[9])*r
+ @ ___________________/ ____________________/
+ @
+	@ Note that we start with inp[2:3]*r^2. This is because it
+	@ doesn't depend on the reduction in the previous iteration.
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ d4 = h4*r0 + h3*r1 + h2*r2 + h1*r3 + h0*r4
+ @ d3 = h3*r0 + h2*r1 + h1*r2 + h0*r3 + h4*5*r4
+ @ d2 = h2*r0 + h1*r1 + h0*r2 + h4*5*r3 + h3*5*r4
+ @ d1 = h1*r0 + h0*r1 + h4*5*r2 + h3*5*r3 + h2*5*r4
+ @ d0 = h0*r0 + h4*5*r1 + h3*5*r2 + h2*5*r3 + h1*5*r4
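+	@
+	@ Each 64-byte iteration therefore computes, per lane pair (a
+	@ sketch; h is the running hash, b0..b3 the four blocks):
+	@
+	@   h = (h + b0:b1)*r^4 + (b2:b3)*r^2
+	@
+	@ and the tail multiplies the two lanes by r^2:r^1 so that the
+	@ final horizontal addition folds them into one hash value.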
+
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ inp[2:3]*r^2
+
+ vadd.i32 d24,d24,d14 @ accumulate inp[0:1]
+ vmull.u32 q7,d25,d0[1]
+ vadd.i32 d20,d20,d10
+ vmull.u32 q5,d21,d0[1]
+ vadd.i32 d26,d26,d16
+ vmull.u32 q8,d27,d0[1]
+ vmlal.u32 q7,d23,d1[1]
+ vadd.i32 d22,d22,d12
+ vmull.u32 q6,d23,d0[1]
+
+ vadd.i32 d28,d28,d18
+ vmull.u32 q9,d29,d0[1]
+ subs r2,r2,#64
+ vmlal.u32 q5,d29,d2[1]
+ it lo
+ movlo r4,r5
+ vmlal.u32 q8,d25,d1[1]
+ vld1.32 d8[1],[r7,:32]
+ vmlal.u32 q6,d21,d1[1]
+ vmlal.u32 q9,d27,d1[1]
+
+ vmlal.u32 q5,d27,d4[1]
+ vmlal.u32 q8,d23,d3[1]
+ vmlal.u32 q9,d25,d3[1]
+ vmlal.u32 q6,d29,d4[1]
+ vmlal.u32 q7,d21,d3[1]
+
+ vmlal.u32 q8,d21,d5[1]
+ vmlal.u32 q5,d25,d6[1]
+ vmlal.u32 q9,d23,d5[1]
+ vmlal.u32 q6,d27,d6[1]
+ vmlal.u32 q7,d29,d6[1]
+
+ vmlal.u32 q8,d29,d8[1]
+ vmlal.u32 q5,d23,d8[1]
+ vmlal.u32 q9,d21,d7[1]
+ vmlal.u32 q6,d25,d8[1]
+ vmlal.u32 q7,d27,d8[1]
+
+ vld4.32 {d21,d23,d25,d27},[r4] @ inp[2:3] (or 0)
+ add r4,r4,#64
+
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ (hash+inp[0:1])*r^4 and accumulate
+
+ vmlal.u32 q8,d26,d0[0]
+ vmlal.u32 q5,d20,d0[0]
+ vmlal.u32 q9,d28,d0[0]
+ vmlal.u32 q6,d22,d0[0]
+ vmlal.u32 q7,d24,d0[0]
+ vld1.32 d8[0],[r6,:32]
+
+ vmlal.u32 q8,d24,d1[0]
+ vmlal.u32 q5,d28,d2[0]
+ vmlal.u32 q9,d26,d1[0]
+ vmlal.u32 q6,d20,d1[0]
+ vmlal.u32 q7,d22,d1[0]
+
+ vmlal.u32 q8,d22,d3[0]
+ vmlal.u32 q5,d26,d4[0]
+ vmlal.u32 q9,d24,d3[0]
+ vmlal.u32 q6,d28,d4[0]
+ vmlal.u32 q7,d20,d3[0]
+
+ vmlal.u32 q8,d20,d5[0]
+ vmlal.u32 q5,d24,d6[0]
+ vmlal.u32 q9,d22,d5[0]
+ vmlal.u32 q6,d26,d6[0]
+ vmlal.u32 q8,d28,d8[0]
+
+ vmlal.u32 q7,d28,d6[0]
+ vmlal.u32 q5,d22,d8[0]
+ vmlal.u32 q9,d20,d7[0]
+ vmov.i32 q14,#1<<24 @ padbit, yes, always
+ vmlal.u32 q6,d24,d8[0]
+ vmlal.u32 q7,d26,d8[0]
+
+ vld4.32 {d20,d22,d24,d26},[r1] @ inp[0:1]
+ add r1,r1,#64
+#ifdef __ARMEB__
+ vrev32.8 q10,q10
+ vrev32.8 q11,q11
+ vrev32.8 q12,q12
+ vrev32.8 q13,q13
+#endif
+
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ lazy reduction interleaved with base 2^32 -> base 2^26 of
+ @ inp[0:3] previously loaded to q10-q13 and smashed to q10-q14.
+
+ vshr.u64 q15,q8,#26
+ vmovn.i64 d16,q8
+ vshr.u64 q4,q5,#26
+ vmovn.i64 d10,q5
+ vadd.i64 q9,q9,q15 @ h3 -> h4
+ vbic.i32 d16,#0xfc000000
+ vsri.u32 q14,q13,#8 @ base 2^32 -> base 2^26
+ vadd.i64 q6,q6,q4 @ h0 -> h1
+ vshl.u32 q13,q13,#18
+ vbic.i32 d10,#0xfc000000
+
+ vshrn.u64 d30,q9,#26
+ vmovn.i64 d18,q9
+ vshr.u64 q4,q6,#26
+ vmovn.i64 d12,q6
+ vadd.i64 q7,q7,q4 @ h1 -> h2
+ vsri.u32 q13,q12,#14
+ vbic.i32 d18,#0xfc000000
+ vshl.u32 q12,q12,#12
+ vbic.i32 d12,#0xfc000000
+
+ vadd.i32 d10,d10,d30
+ vshl.u32 d30,d30,#2
+ vbic.i32 q13,#0xfc000000
+ vshrn.u64 d8,q7,#26
+ vmovn.i64 d14,q7
+ vaddl.u32 q5,d10,d30 @ h4 -> h0 [widen for a sec]
+ vsri.u32 q12,q11,#20
+ vadd.i32 d16,d16,d8 @ h2 -> h3
+ vshl.u32 q11,q11,#6
+ vbic.i32 d14,#0xfc000000
+ vbic.i32 q12,#0xfc000000
+
+ vshrn.u64 d30,q5,#26 @ re-narrow
+ vmovn.i64 d10,q5
+ vsri.u32 q11,q10,#26
+ vbic.i32 q10,#0xfc000000
+ vshr.u32 d8,d16,#26
+ vbic.i32 d16,#0xfc000000
+ vbic.i32 d10,#0xfc000000
+ vadd.i32 d12,d12,d30 @ h0 -> h1
+ vadd.i32 d18,d18,d8 @ h3 -> h4
+ vbic.i32 q11,#0xfc000000
+
+ bhi .Loop_neon
+
+.Lskip_loop:
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ multiply (inp[0:1]+hash) or inp[2:3] by r^2:r^1
+
+ add r7,r0,#(48+0*9*4)
+ add r6,r0,#(48+1*9*4)
+ adds r2,r2,#32
+ it ne
+ movne r2,#0
+ bne .Long_tail
+
+ vadd.i32 d25,d24,d14 @ add hash value and move to #hi
+ vadd.i32 d21,d20,d10
+ vadd.i32 d27,d26,d16
+ vadd.i32 d23,d22,d12
+ vadd.i32 d29,d28,d18
+
+.Long_tail:
+ vld4.32 {d0[1],d1[1],d2[1],d3[1]},[r7]! @ load r^1
+ vld4.32 {d0[0],d1[0],d2[0],d3[0]},[r6]! @ load r^2
+
+ vadd.i32 d24,d24,d14 @ can be redundant
+ vmull.u32 q7,d25,d0
+ vadd.i32 d20,d20,d10
+ vmull.u32 q5,d21,d0
+ vadd.i32 d26,d26,d16
+ vmull.u32 q8,d27,d0
+ vadd.i32 d22,d22,d12
+ vmull.u32 q6,d23,d0
+ vadd.i32 d28,d28,d18
+ vmull.u32 q9,d29,d0
+
+ vmlal.u32 q5,d29,d2
+ vld4.32 {d4[1],d5[1],d6[1],d7[1]},[r7]!
+ vmlal.u32 q8,d25,d1
+ vld4.32 {d4[0],d5[0],d6[0],d7[0]},[r6]!
+ vmlal.u32 q6,d21,d1
+ vmlal.u32 q9,d27,d1
+ vmlal.u32 q7,d23,d1
+
+ vmlal.u32 q8,d23,d3
+ vld1.32 d8[1],[r7,:32]
+ vmlal.u32 q5,d27,d4
+ vld1.32 d8[0],[r6,:32]
+ vmlal.u32 q9,d25,d3
+ vmlal.u32 q6,d29,d4
+ vmlal.u32 q7,d21,d3
+
+ vmlal.u32 q8,d21,d5
+ it ne
+ addne r7,r0,#(48+2*9*4)
+ vmlal.u32 q5,d25,d6
+ it ne
+ addne r6,r0,#(48+3*9*4)
+ vmlal.u32 q9,d23,d5
+ vmlal.u32 q6,d27,d6
+ vmlal.u32 q7,d29,d6
+
+ vmlal.u32 q8,d29,d8
+ vorn q0,q0,q0 @ all-ones, can be redundant
+ vmlal.u32 q5,d23,d8
+ vshr.u64 q0,q0,#38
+ vmlal.u32 q9,d21,d7
+ vmlal.u32 q6,d25,d8
+ vmlal.u32 q7,d27,d8
+
+ beq .Lshort_tail
+
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ (hash+inp[0:1])*r^4:r^3 and accumulate
+
+ vld4.32 {d0[1],d1[1],d2[1],d3[1]},[r7]! @ load r^3
+ vld4.32 {d0[0],d1[0],d2[0],d3[0]},[r6]! @ load r^4
+
+ vmlal.u32 q7,d24,d0
+ vmlal.u32 q5,d20,d0
+ vmlal.u32 q8,d26,d0
+ vmlal.u32 q6,d22,d0
+ vmlal.u32 q9,d28,d0
+
+ vmlal.u32 q5,d28,d2
+ vld4.32 {d4[1],d5[1],d6[1],d7[1]},[r7]!
+ vmlal.u32 q8,d24,d1
+ vld4.32 {d4[0],d5[0],d6[0],d7[0]},[r6]!
+ vmlal.u32 q6,d20,d1
+ vmlal.u32 q9,d26,d1
+ vmlal.u32 q7,d22,d1
+
+ vmlal.u32 q8,d22,d3
+ vld1.32 d8[1],[r7,:32]
+ vmlal.u32 q5,d26,d4
+ vld1.32 d8[0],[r6,:32]
+ vmlal.u32 q9,d24,d3
+ vmlal.u32 q6,d28,d4
+ vmlal.u32 q7,d20,d3
+
+ vmlal.u32 q8,d20,d5
+ vmlal.u32 q5,d24,d6
+ vmlal.u32 q9,d22,d5
+ vmlal.u32 q6,d26,d6
+ vmlal.u32 q7,d28,d6
+
+ vmlal.u32 q8,d28,d8
+ vorn q0,q0,q0 @ all-ones
+ vmlal.u32 q5,d22,d8
+ vshr.u64 q0,q0,#38
+ vmlal.u32 q9,d20,d7
+ vmlal.u32 q6,d24,d8
+ vmlal.u32 q7,d26,d8
+
+.Lshort_tail:
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ horizontal addition
+
+ vadd.i64 d16,d16,d17
+ vadd.i64 d10,d10,d11
+ vadd.i64 d18,d18,d19
+ vadd.i64 d12,d12,d13
+ vadd.i64 d14,d14,d15
+
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ lazy reduction, but without narrowing
+
+ vshr.u64 q15,q8,#26
+ vand.i64 q8,q8,q0
+ vshr.u64 q4,q5,#26
+ vand.i64 q5,q5,q0
+ vadd.i64 q9,q9,q15 @ h3 -> h4
+ vadd.i64 q6,q6,q4 @ h0 -> h1
+
+ vshr.u64 q15,q9,#26
+ vand.i64 q9,q9,q0
+ vshr.u64 q4,q6,#26
+ vand.i64 q6,q6,q0
+ vadd.i64 q7,q7,q4 @ h1 -> h2
+
+ vadd.i64 q5,q5,q15
+ vshl.u64 q15,q15,#2
+ vshr.u64 q4,q7,#26
+ vand.i64 q7,q7,q0
+ vadd.i64 q5,q5,q15 @ h4 -> h0
+ vadd.i64 q8,q8,q4 @ h2 -> h3
+
+ vshr.u64 q15,q5,#26
+ vand.i64 q5,q5,q0
+ vshr.u64 q4,q8,#26
+ vand.i64 q8,q8,q0
+ vadd.i64 q6,q6,q15 @ h0 -> h1
+ vadd.i64 q9,q9,q4 @ h3 -> h4
+
+ cmp r2,#0
+ bne .Leven
+
+ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
+ @ store hash value
+
+ vst4.32 {d10[0],d12[0],d14[0],d16[0]},[r0]!
+ vst1.32 {d18[0]},[r0]
+
+ vldmia sp!,{d8-d15} @ epilogue
+ ldmia sp!,{r4-r7}
+.Lno_data_neon:
+	bx	lr
+ENDPROC(poly1305_blocks_neon)
+
+.align 5
+ENTRY(poly1305_emit_neon)
+ ldr ip,[r0,#36] @ is_base2_26
+
+ stmdb sp!,{r4-r11}
+
+ tst ip,ip
+ beq .Lpoly1305_emit_enter
+
+ ldmia r0,{r3-r7}
+ eor r8,r8,r8
+
+ adds r3,r3,r4,lsl#26 @ base 2^26 -> base 2^32
+ mov r4,r4,lsr#6
+ adcs r4,r4,r5,lsl#20
+ mov r5,r5,lsr#12
+ adcs r5,r5,r6,lsl#14
+ mov r6,r6,lsr#18
+ adcs r6,r6,r7,lsl#8
+ adc r7,r8,r7,lsr#24 @ can be partially reduced ...
+
+ and r8,r7,#-4 @ ... so reduce
+ and r7,r6,#3
+ add r8,r8,r8,lsr#2 @ *= 5
+ adds r3,r3,r8
+ adcs r4,r4,#0
+ adcs r5,r5,#0
+ adcs r6,r6,#0
+ adc r7,r7,#0
+
+ adds r8,r3,#5 @ compare to modulus
+ adcs r9,r4,#0
+ adcs r10,r5,#0
+ adcs r11,r6,#0
+ adc r7,r7,#0
+ tst r7,#4 @ did it carry/borrow?
+
+ it ne
+ movne r3,r8
+ ldr r8,[r2,#0]
+ it ne
+ movne r4,r9
+ ldr r9,[r2,#4]
+ it ne
+ movne r5,r10
+ ldr r10,[r2,#8]
+ it ne
+ movne r6,r11
+ ldr r11,[r2,#12]
+
+ adds r3,r3,r8 @ accumulate nonce
+ adcs r4,r4,r9
+ adcs r5,r5,r10
+ adc r6,r6,r11
+
+#ifdef __ARMEB__
+ rev r3,r3
+ rev r4,r4
+ rev r5,r5
+ rev r6,r6
+#endif
+ str r3,[r1,#0] @ store the result
+ str r4,[r1,#4]
+ str r5,[r1,#8]
+ str r6,[r1,#12]
+
+ ldmia sp!,{r4-r11}
+	bx	lr
+ENDPROC(poly1305_emit_neon)
+
+.align 5
+.Lzeros:
+.long 0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
+#endif
diff --git a/lib/zinc/poly1305/poly1305-arm64.S b/lib/zinc/poly1305/poly1305-arm64.S
new file mode 100644
index 000000000000..911b57edeb20
--- /dev/null
+++ b/lib/zinc/poly1305/poly1305-arm64.S
@@ -0,0 +1,820 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#include <linux/linkage.h>
+.text
+
+.align 5
+ENTRY(poly1305_init_arm)
+ cmp x1,xzr
+ stp xzr,xzr,[x0] // zero hash value
+ stp xzr,xzr,[x0,#16] // [along with is_base2_26]
+
+ csel x0,xzr,x0,eq
+ b.eq .Lno_key
+
+ ldp x7,x8,[x1] // load key
+ mov x9,#0xfffffffc0fffffff
+ movk x9,#0x0fff,lsl#48
+#ifdef __ARMEB__
+ rev x7,x7 // flip bytes
+ rev x8,x8
+#endif
+ and x7,x7,x9 // &=0ffffffc0fffffff
+ and x9,x9,#-4
+ and x8,x8,x9 // &=0ffffffc0ffffffc
+ stp x7,x8,[x0,#32] // save key value
+
+.Lno_key:
+ ret
+ENDPROC(poly1305_init_arm)
+
+.align 5
+ENTRY(poly1305_blocks_arm)
+ ands x2,x2,#-16
+ b.eq .Lno_data
+
+ ldp x4,x5,[x0] // load hash value
+ ldp x7,x8,[x0,#32] // load key value
+ ldr x6,[x0,#16]
+ add x9,x8,x8,lsr#2 // s1 = r1 + (r1 >> 2)
+ b .Loop
+
+.align 5
+.Loop:
+ ldp x10,x11,[x1],#16 // load input
+ sub x2,x2,#16
+#ifdef __ARMEB__
+ rev x10,x10
+ rev x11,x11
+#endif
+ adds x4,x4,x10 // accumulate input
+ adcs x5,x5,x11
+
+ mul x12,x4,x7 // h0*r0
+ adc x6,x6,x3
+ umulh x13,x4,x7
+
+ mul x10,x5,x9 // h1*5*r1
+ umulh x11,x5,x9
+
+ adds x12,x12,x10
+ mul x10,x4,x8 // h0*r1
+ adc x13,x13,x11
+ umulh x14,x4,x8
+
+ adds x13,x13,x10
+ mul x10,x5,x7 // h1*r0
+ adc x14,x14,xzr
+ umulh x11,x5,x7
+
+ adds x13,x13,x10
+ mul x10,x6,x9 // h2*5*r1
+ adc x14,x14,x11
+ mul x11,x6,x7 // h2*r0
+
+ adds x13,x13,x10
+ adc x14,x14,x11
+
+ and x10,x14,#-4 // final reduction
+ and x6,x14,#3
+ add x10,x10,x14,lsr#2
+ adds x4,x12,x10
+ adcs x5,x13,xzr
+ adc x6,x6,xzr
+
+ cbnz x2,.Loop
+
+ stp x4,x5,[x0] // store hash value
+ str x6,[x0,#16]
+
+.Lno_data:
+ ret
+ENDPROC(poly1305_blocks_arm)
+
+.align 5
+ENTRY(poly1305_emit_arm)
+ ldp x4,x5,[x0] // load hash base 2^64
+ ldr x6,[x0,#16]
+ ldp x10,x11,[x2] // load nonce
+
+ adds x12,x4,#5 // compare to modulus
+ adcs x13,x5,xzr
+ adc x14,x6,xzr
+
+ tst x14,#-4 // see if it's carried/borrowed
+
+ csel x4,x4,x12,eq
+ csel x5,x5,x13,eq
+
+#ifdef __ARMEB__
+ ror x10,x10,#32 // flip nonce words
+ ror x11,x11,#32
+#endif
+ adds x4,x4,x10 // accumulate nonce
+ adc x5,x5,x11
+#ifdef __ARMEB__
+ rev x4,x4 // flip output bytes
+ rev x5,x5
+#endif
+ stp x4,x5,[x1] // write result
+
+ ret
+ENDPROC(poly1305_emit_arm)
+
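+// __poly1305_mult computes h = h*r mod 2^130-5 on the base 2^64
+// limbs in x4-x6, with r0 in x7, r1 in x8, and x9 = r1 + (r1 >> 2).
+// The latter works because the h1*r1 cross product sits at 2^128,
+// and 2^128 == 5/4 (mod 2^130-5); since r1's low two bits are
+// clamped to zero, 5*r1/4 = r1 + (r1 >> 2) exactly.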
+.align 5
+__poly1305_mult:
+ mul x12,x4,x7 // h0*r0
+ umulh x13,x4,x7
+
+ mul x10,x5,x9 // h1*5*r1
+ umulh x11,x5,x9
+
+ adds x12,x12,x10
+ mul x10,x4,x8 // h0*r1
+ adc x13,x13,x11
+ umulh x14,x4,x8
+
+ adds x13,x13,x10
+ mul x10,x5,x7 // h1*r0
+ adc x14,x14,xzr
+ umulh x11,x5,x7
+
+ adds x13,x13,x10
+ mul x10,x6,x9 // h2*5*r1
+ adc x14,x14,x11
+ mul x11,x6,x7 // h2*r0
+
+ adds x13,x13,x10
+ adc x14,x14,x11
+
+ and x10,x14,#-4 // final reduction
+ and x6,x14,#3
+ add x10,x10,x14,lsr#2
+ adds x4,x12,x10
+ adcs x5,x13,xzr
+ adc x6,x6,xzr
+
+ ret
+
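+// __poly1305_splat converts the base 2^64 value in x4-x6 into five
+// 26-bit limbs and stores them, along with their 5x multiples, at a
+// 16-byte stride. In C, the split is roughly (a sketch):
+//
+//   t0 = h0 & 0x3ffffff;          t1 = (h0 >> 26) & 0x3ffffff;
+//   t2 = ((h1 << 12) | (h0 >> 52)) & 0x3ffffff;
+//   t3 = (h1 >> 14) & 0x3ffffff;  t4 = (h2 << 24) | (h1 >> 40);
+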
+__poly1305_splat:
+ and x12,x4,#0x03ffffff // base 2^64 -> base 2^26
+ ubfx x13,x4,#26,#26
+ extr x14,x5,x4,#52
+ and x14,x14,#0x03ffffff
+ ubfx x15,x5,#14,#26
+ extr x16,x6,x5,#40
+
+ str w12,[x0,#16*0] // r0
+ add w12,w13,w13,lsl#2 // r1*5
+ str w13,[x0,#16*1] // r1
+ add w13,w14,w14,lsl#2 // r2*5
+ str w12,[x0,#16*2] // s1
+ str w14,[x0,#16*3] // r2
+ add w14,w15,w15,lsl#2 // r3*5
+ str w13,[x0,#16*4] // s2
+ str w15,[x0,#16*5] // r3
+ add w15,w16,w16,lsl#2 // r4*5
+ str w14,[x0,#16*6] // s3
+ str w16,[x0,#16*7] // r4
+ str w15,[x0,#16*8] // s4
+
+ ret
+
+.align 5
+ENTRY(poly1305_blocks_neon)
+ ldr x17,[x0,#24]
+ cmp x2,#128
+ b.hs .Lblocks_neon
+ cbz x17,poly1305_blocks_arm
+
+.Lblocks_neon:
+ stp x29,x30,[sp,#-80]!
+ add x29,sp,#0
+
+ ands x2,x2,#-16
+ b.eq .Lno_data_neon
+
+ cbz x17,.Lbase2_64_neon
+
+ ldp w10,w11,[x0] // load hash value base 2^26
+ ldp w12,w13,[x0,#8]
+ ldr w14,[x0,#16]
+
+ tst x2,#31
+ b.eq .Leven_neon
+
+ ldp x7,x8,[x0,#32] // load key value
+
+ add x4,x10,x11,lsl#26 // base 2^26 -> base 2^64
+ lsr x5,x12,#12
+ adds x4,x4,x12,lsl#52
+ add x5,x5,x13,lsl#14
+ adc x5,x5,xzr
+ lsr x6,x14,#24
+ adds x5,x5,x14,lsl#40
+ adc x14,x6,xzr // can be partially reduced...
+
+ ldp x12,x13,[x1],#16 // load input
+ sub x2,x2,#16
+ add x9,x8,x8,lsr#2 // s1 = r1 + (r1 >> 2)
+
+ and x10,x14,#-4 // ... so reduce
+ and x6,x14,#3
+ add x10,x10,x14,lsr#2
+ adds x4,x4,x10
+ adcs x5,x5,xzr
+ adc x6,x6,xzr
+
+#ifdef __ARMEB__
+ rev x12,x12
+ rev x13,x13
+#endif
+ adds x4,x4,x12 // accumulate input
+ adcs x5,x5,x13
+ adc x6,x6,x3
+
+ bl __poly1305_mult
+ ldr x30,[sp,#8]
+
+ cbz x3,.Lstore_base2_64_neon
+
+ and x10,x4,#0x03ffffff // base 2^64 -> base 2^26
+ ubfx x11,x4,#26,#26
+ extr x12,x5,x4,#52
+ and x12,x12,#0x03ffffff
+ ubfx x13,x5,#14,#26
+ extr x14,x6,x5,#40
+
+ cbnz x2,.Leven_neon
+
+ stp w10,w11,[x0] // store hash value base 2^26
+ stp w12,w13,[x0,#8]
+ str w14,[x0,#16]
+ b .Lno_data_neon
+
+.align 4
+.Lstore_base2_64_neon:
+ stp x4,x5,[x0] // store hash value base 2^64
+ stp x6,xzr,[x0,#16] // note that is_base2_26 is zeroed
+ b .Lno_data_neon
+
+.align 4
+.Lbase2_64_neon:
+ ldp x7,x8,[x0,#32] // load key value
+
+ ldp x4,x5,[x0] // load hash value base 2^64
+ ldr x6,[x0,#16]
+
+ tst x2,#31
+ b.eq .Linit_neon
+
+ ldp x12,x13,[x1],#16 // load input
+ sub x2,x2,#16
+ add x9,x8,x8,lsr#2 // s1 = r1 + (r1 >> 2)
+#ifdef __ARMEB__
+ rev x12,x12
+ rev x13,x13
+#endif
+ adds x4,x4,x12 // accumulate input
+ adcs x5,x5,x13
+ adc x6,x6,x3
+
+ bl __poly1305_mult
+
+.Linit_neon:
+ and x10,x4,#0x03ffffff // base 2^64 -> base 2^26
+ ubfx x11,x4,#26,#26
+ extr x12,x5,x4,#52
+ and x12,x12,#0x03ffffff
+ ubfx x13,x5,#14,#26
+ extr x14,x6,x5,#40
+
+ stp d8,d9,[sp,#16] // meet ABI requirements
+ stp d10,d11,[sp,#32]
+ stp d12,d13,[sp,#48]
+ stp d14,d15,[sp,#64]
+
+ fmov d24,x10
+ fmov d25,x11
+ fmov d26,x12
+ fmov d27,x13
+ fmov d28,x14
+
+ ////////////////////////////////// initialize r^n table
+ mov x4,x7 // r^1
+ add x9,x8,x8,lsr#2 // s1 = r1 + (r1 >> 2)
+ mov x5,x8
+ mov x6,xzr
+ add x0,x0,#48+12
+ bl __poly1305_splat
+
+ bl __poly1305_mult // r^2
+ sub x0,x0,#4
+ bl __poly1305_splat
+
+ bl __poly1305_mult // r^3
+ sub x0,x0,#4
+ bl __poly1305_splat
+
+ bl __poly1305_mult // r^4
+ sub x0,x0,#4
+ bl __poly1305_splat
+ ldr x30,[sp,#8]
+
+ add x16,x1,#32
+ adr x17,.Lzeros
+ subs x2,x2,#64
+ csel x16,x17,x16,lo
+
+ mov x4,#1
+ str x4,[x0,#-24] // set is_base2_26
+ sub x0,x0,#48 // restore original x0
+ b .Ldo_neon
+
+.align 4
+.Leven_neon:
+ add x16,x1,#32
+ adr x17,.Lzeros
+ subs x2,x2,#64
+ csel x16,x17,x16,lo
+
+ stp d8,d9,[sp,#16] // meet ABI requirements
+ stp d10,d11,[sp,#32]
+ stp d12,d13,[sp,#48]
+ stp d14,d15,[sp,#64]
+
+ fmov d24,x10
+ fmov d25,x11
+ fmov d26,x12
+ fmov d27,x13
+ fmov d28,x14
+
+.Ldo_neon:
+ ldp x8,x12,[x16],#16 // inp[2:3] (or zero)
+ ldp x9,x13,[x16],#48
+
+ lsl x3,x3,#24
+ add x15,x0,#48
+
+#ifdef __ARMEB__
+ rev x8,x8
+ rev x12,x12
+ rev x9,x9
+ rev x13,x13
+#endif
+ and x4,x8,#0x03ffffff // base 2^64 -> base 2^26
+ and x5,x9,#0x03ffffff
+ ubfx x6,x8,#26,#26
+ ubfx x7,x9,#26,#26
+ add x4,x4,x5,lsl#32 // bfi x4,x5,#32,#32
+ extr x8,x12,x8,#52
+ extr x9,x13,x9,#52
+ add x6,x6,x7,lsl#32 // bfi x6,x7,#32,#32
+ fmov d14,x4
+ and x8,x8,#0x03ffffff
+ and x9,x9,#0x03ffffff
+ ubfx x10,x12,#14,#26
+ ubfx x11,x13,#14,#26
+ add x12,x3,x12,lsr#40
+ add x13,x3,x13,lsr#40
+ add x8,x8,x9,lsl#32 // bfi x8,x9,#32,#32
+ fmov d15,x6
+ add x10,x10,x11,lsl#32 // bfi x10,x11,#32,#32
+ add x12,x12,x13,lsl#32 // bfi x12,x13,#32,#32
+ fmov d16,x8
+ fmov d17,x10
+ fmov d18,x12
+
+ ldp x8,x12,[x1],#16 // inp[0:1]
+ ldp x9,x13,[x1],#48
+
+ ld1 {v0.4s,v1.4s,v2.4s,v3.4s},[x15],#64
+ ld1 {v4.4s,v5.4s,v6.4s,v7.4s},[x15],#64
+ ld1 {v8.4s},[x15]
+
+#ifdef __ARMEB__
+ rev x8,x8
+ rev x12,x12
+ rev x9,x9
+ rev x13,x13
+#endif
+ and x4,x8,#0x03ffffff // base 2^64 -> base 2^26
+ and x5,x9,#0x03ffffff
+ ubfx x6,x8,#26,#26
+ ubfx x7,x9,#26,#26
+ add x4,x4,x5,lsl#32 // bfi x4,x5,#32,#32
+ extr x8,x12,x8,#52
+ extr x9,x13,x9,#52
+ add x6,x6,x7,lsl#32 // bfi x6,x7,#32,#32
+ fmov d9,x4
+ and x8,x8,#0x03ffffff
+ and x9,x9,#0x03ffffff
+ ubfx x10,x12,#14,#26
+ ubfx x11,x13,#14,#26
+ add x12,x3,x12,lsr#40
+ add x13,x3,x13,lsr#40
+ add x8,x8,x9,lsl#32 // bfi x8,x9,#32,#32
+ fmov d10,x6
+ add x10,x10,x11,lsl#32 // bfi x10,x11,#32,#32
+ add x12,x12,x13,lsl#32 // bfi x12,x13,#32,#32
+ movi v31.2d,#-1
+ fmov d11,x8
+ fmov d12,x10
+ fmov d13,x12
+ ushr v31.2d,v31.2d,#38
+
+ b.ls .Lskip_loop
+
+.align 4
+.Loop_neon:
+ ////////////////////////////////////////////////////////////////
+ // ((inp[0]*r^4+inp[2]*r^2+inp[4])*r^4+inp[6]*r^2
+ // ((inp[1]*r^4+inp[3]*r^2+inp[5])*r^3+inp[7]*r
+ // ___________________/
+ // ((inp[0]*r^4+inp[2]*r^2+inp[4])*r^4+inp[6]*r^2+inp[8])*r^2
+ // ((inp[1]*r^4+inp[3]*r^2+inp[5])*r^4+inp[7]*r^2+inp[9])*r
+ // ___________________/ ____________________/
+ //
+	// Note that we start with inp[2:3]*r^2. This is because it
+	// doesn't depend on the reduction in the previous iteration.
+ ////////////////////////////////////////////////////////////////
+ // d4 = h0*r4 + h1*r3 + h2*r2 + h3*r1 + h4*r0
+ // d3 = h0*r3 + h1*r2 + h2*r1 + h3*r0 + h4*5*r4
+ // d2 = h0*r2 + h1*r1 + h2*r0 + h3*5*r4 + h4*5*r3
+ // d1 = h0*r1 + h1*r0 + h2*5*r4 + h3*5*r3 + h4*5*r2
+ // d0 = h0*r0 + h1*5*r4 + h2*5*r3 + h3*5*r2 + h4*5*r1
+
+ subs x2,x2,#64
+ umull v23.2d,v14.2s,v7.s[2]
+ csel x16,x17,x16,lo
+ umull v22.2d,v14.2s,v5.s[2]
+ umull v21.2d,v14.2s,v3.s[2]
+ ldp x8,x12,[x16],#16 // inp[2:3] (or zero)
+ umull v20.2d,v14.2s,v1.s[2]
+ ldp x9,x13,[x16],#48
+ umull v19.2d,v14.2s,v0.s[2]
+#ifdef __ARMEB__
+ rev x8,x8
+ rev x12,x12
+ rev x9,x9
+ rev x13,x13
+#endif
+
+ umlal v23.2d,v15.2s,v5.s[2]
+ and x4,x8,#0x03ffffff // base 2^64 -> base 2^26
+ umlal v22.2d,v15.2s,v3.s[2]
+ and x5,x9,#0x03ffffff
+ umlal v21.2d,v15.2s,v1.s[2]
+ ubfx x6,x8,#26,#26
+ umlal v20.2d,v15.2s,v0.s[2]
+ ubfx x7,x9,#26,#26
+ umlal v19.2d,v15.2s,v8.s[2]
+ add x4,x4,x5,lsl#32 // bfi x4,x5,#32,#32
+
+ umlal v23.2d,v16.2s,v3.s[2]
+ extr x8,x12,x8,#52
+ umlal v22.2d,v16.2s,v1.s[2]
+ extr x9,x13,x9,#52
+ umlal v21.2d,v16.2s,v0.s[2]
+ add x6,x6,x7,lsl#32 // bfi x6,x7,#32,#32
+ umlal v20.2d,v16.2s,v8.s[2]
+ fmov d14,x4
+ umlal v19.2d,v16.2s,v6.s[2]
+ and x8,x8,#0x03ffffff
+
+ umlal v23.2d,v17.2s,v1.s[2]
+ and x9,x9,#0x03ffffff
+ umlal v22.2d,v17.2s,v0.s[2]
+ ubfx x10,x12,#14,#26
+ umlal v21.2d,v17.2s,v8.s[2]
+ ubfx x11,x13,#14,#26
+ umlal v20.2d,v17.2s,v6.s[2]
+ add x8,x8,x9,lsl#32 // bfi x8,x9,#32,#32
+ umlal v19.2d,v17.2s,v4.s[2]
+ fmov d15,x6
+
+ add v11.2s,v11.2s,v26.2s
+ add x12,x3,x12,lsr#40
+ umlal v23.2d,v18.2s,v0.s[2]
+ add x13,x3,x13,lsr#40
+ umlal v22.2d,v18.2s,v8.s[2]
+ add x10,x10,x11,lsl#32 // bfi x10,x11,#32,#32
+ umlal v21.2d,v18.2s,v6.s[2]
+ add x12,x12,x13,lsl#32 // bfi x12,x13,#32,#32
+ umlal v20.2d,v18.2s,v4.s[2]
+ fmov d16,x8
+ umlal v19.2d,v18.2s,v2.s[2]
+ fmov d17,x10
+
+ ////////////////////////////////////////////////////////////////
+ // (hash+inp[0:1])*r^4 and accumulate
+
+ add v9.2s,v9.2s,v24.2s
+ fmov d18,x12
+ umlal v22.2d,v11.2s,v1.s[0]
+ ldp x8,x12,[x1],#16 // inp[0:1]
+ umlal v19.2d,v11.2s,v6.s[0]
+ ldp x9,x13,[x1],#48
+ umlal v23.2d,v11.2s,v3.s[0]
+ umlal v20.2d,v11.2s,v8.s[0]
+ umlal v21.2d,v11.2s,v0.s[0]
+#ifdef __ARMEB__
+ rev x8,x8
+ rev x12,x12
+ rev x9,x9
+ rev x13,x13
+#endif
+
+ add v10.2s,v10.2s,v25.2s
+ umlal v22.2d,v9.2s,v5.s[0]
+ umlal v23.2d,v9.2s,v7.s[0]
+ and x4,x8,#0x03ffffff // base 2^64 -> base 2^26
+ umlal v21.2d,v9.2s,v3.s[0]
+ and x5,x9,#0x03ffffff
+ umlal v19.2d,v9.2s,v0.s[0]
+ ubfx x6,x8,#26,#26
+ umlal v20.2d,v9.2s,v1.s[0]
+ ubfx x7,x9,#26,#26
+
+ add v12.2s,v12.2s,v27.2s
+ add x4,x4,x5,lsl#32 // bfi x4,x5,#32,#32
+ umlal v22.2d,v10.2s,v3.s[0]
+ extr x8,x12,x8,#52
+ umlal v23.2d,v10.2s,v5.s[0]
+ extr x9,x13,x9,#52
+ umlal v19.2d,v10.2s,v8.s[0]
+ add x6,x6,x7,lsl#32 // bfi x6,x7,#32,#32
+ umlal v21.2d,v10.2s,v1.s[0]
+ fmov d9,x4
+ umlal v20.2d,v10.2s,v0.s[0]
+ and x8,x8,#0x03ffffff
+
+ add v13.2s,v13.2s,v28.2s
+ and x9,x9,#0x03ffffff
+ umlal v22.2d,v12.2s,v0.s[0]
+ ubfx x10,x12,#14,#26
+ umlal v19.2d,v12.2s,v4.s[0]
+ ubfx x11,x13,#14,#26
+ umlal v23.2d,v12.2s,v1.s[0]
+ add x8,x8,x9,lsl#32 // bfi x8,x9,#32,#32
+ umlal v20.2d,v12.2s,v6.s[0]
+ fmov d10,x6
+ umlal v21.2d,v12.2s,v8.s[0]
+ add x12,x3,x12,lsr#40
+
+ umlal v22.2d,v13.2s,v8.s[0]
+ add x13,x3,x13,lsr#40
+ umlal v19.2d,v13.2s,v2.s[0]
+ add x10,x10,x11,lsl#32 // bfi x10,x11,#32,#32
+ umlal v23.2d,v13.2s,v0.s[0]
+ add x12,x12,x13,lsl#32 // bfi x12,x13,#32,#32
+ umlal v20.2d,v13.2s,v4.s[0]
+ fmov d11,x8
+ umlal v21.2d,v13.2s,v6.s[0]
+ fmov d12,x10
+ fmov d13,x12
+
+ /////////////////////////////////////////////////////////////////
+ // lazy reduction as discussed in "NEON crypto" by D.J. Bernstein
+ // and P. Schwabe
+ //
+ // [see discussion in poly1305-armv4 module]
+
+ ushr v29.2d,v22.2d,#26
+ xtn v27.2s,v22.2d
+ ushr v30.2d,v19.2d,#26
+ and v19.16b,v19.16b,v31.16b
+ add v23.2d,v23.2d,v29.2d // h3 -> h4
+ bic v27.2s,#0xfc,lsl#24 // &=0x03ffffff
+ add v20.2d,v20.2d,v30.2d // h0 -> h1
+
+ ushr v29.2d,v23.2d,#26
+ xtn v28.2s,v23.2d
+ ushr v30.2d,v20.2d,#26
+ xtn v25.2s,v20.2d
+ bic v28.2s,#0xfc,lsl#24
+ add v21.2d,v21.2d,v30.2d // h1 -> h2
+
+ add v19.2d,v19.2d,v29.2d
+ shl v29.2d,v29.2d,#2
+ shrn v30.2s,v21.2d,#26
+ xtn v26.2s,v21.2d
+ add v19.2d,v19.2d,v29.2d // h4 -> h0
+ bic v25.2s,#0xfc,lsl#24
+ add v27.2s,v27.2s,v30.2s // h2 -> h3
+ bic v26.2s,#0xfc,lsl#24
+
+ shrn v29.2s,v19.2d,#26
+ xtn v24.2s,v19.2d
+ ushr v30.2s,v27.2s,#26
+ bic v27.2s,#0xfc,lsl#24
+ bic v24.2s,#0xfc,lsl#24
+ add v25.2s,v25.2s,v29.2s // h0 -> h1
+ add v28.2s,v28.2s,v30.2s // h3 -> h4
+
+ b.hi .Loop_neon
+
+.Lskip_loop:
+ dup v16.2d,v16.d[0]
+ add v11.2s,v11.2s,v26.2s
+
+ ////////////////////////////////////////////////////////////////
+ // multiply (inp[0:1]+hash) or inp[2:3] by r^2:r^1
+
+ adds x2,x2,#32
+ b.ne .Long_tail
+
+ dup v16.2d,v11.d[0]
+ add v14.2s,v9.2s,v24.2s
+ add v17.2s,v12.2s,v27.2s
+ add v15.2s,v10.2s,v25.2s
+ add v18.2s,v13.2s,v28.2s
+
+.Long_tail:
+ dup v14.2d,v14.d[0]
+ umull2 v19.2d,v16.4s,v6.4s
+ umull2 v22.2d,v16.4s,v1.4s
+ umull2 v23.2d,v16.4s,v3.4s
+ umull2 v21.2d,v16.4s,v0.4s
+ umull2 v20.2d,v16.4s,v8.4s
+
+ dup v15.2d,v15.d[0]
+ umlal2 v19.2d,v14.4s,v0.4s
+ umlal2 v21.2d,v14.4s,v3.4s
+ umlal2 v22.2d,v14.4s,v5.4s
+ umlal2 v23.2d,v14.4s,v7.4s
+ umlal2 v20.2d,v14.4s,v1.4s
+
+ dup v17.2d,v17.d[0]
+ umlal2 v19.2d,v15.4s,v8.4s
+ umlal2 v22.2d,v15.4s,v3.4s
+ umlal2 v21.2d,v15.4s,v1.4s
+ umlal2 v23.2d,v15.4s,v5.4s
+ umlal2 v20.2d,v15.4s,v0.4s
+
+ dup v18.2d,v18.d[0]
+ umlal2 v22.2d,v17.4s,v0.4s
+ umlal2 v23.2d,v17.4s,v1.4s
+ umlal2 v19.2d,v17.4s,v4.4s
+ umlal2 v20.2d,v17.4s,v6.4s
+ umlal2 v21.2d,v17.4s,v8.4s
+
+ umlal2 v22.2d,v18.4s,v8.4s
+ umlal2 v19.2d,v18.4s,v2.4s
+ umlal2 v23.2d,v18.4s,v0.4s
+ umlal2 v20.2d,v18.4s,v4.4s
+ umlal2 v21.2d,v18.4s,v6.4s
+
+ b.eq .Lshort_tail
+
+ ////////////////////////////////////////////////////////////////
+ // (hash+inp[0:1])*r^4:r^3 and accumulate
+
+ add v9.2s,v9.2s,v24.2s
+ umlal v22.2d,v11.2s,v1.2s
+ umlal v19.2d,v11.2s,v6.2s
+ umlal v23.2d,v11.2s,v3.2s
+ umlal v20.2d,v11.2s,v8.2s
+ umlal v21.2d,v11.2s,v0.2s
+
+ add v10.2s,v10.2s,v25.2s
+ umlal v22.2d,v9.2s,v5.2s
+ umlal v19.2d,v9.2s,v0.2s
+ umlal v23.2d,v9.2s,v7.2s
+ umlal v20.2d,v9.2s,v1.2s
+ umlal v21.2d,v9.2s,v3.2s
+
+ add v12.2s,v12.2s,v27.2s
+ umlal v22.2d,v10.2s,v3.2s
+ umlal v19.2d,v10.2s,v8.2s
+ umlal v23.2d,v10.2s,v5.2s
+ umlal v20.2d,v10.2s,v0.2s
+ umlal v21.2d,v10.2s,v1.2s
+
+ add v13.2s,v13.2s,v28.2s
+ umlal v22.2d,v12.2s,v0.2s
+ umlal v19.2d,v12.2s,v4.2s
+ umlal v23.2d,v12.2s,v1.2s
+ umlal v20.2d,v12.2s,v6.2s
+ umlal v21.2d,v12.2s,v8.2s
+
+ umlal v22.2d,v13.2s,v8.2s
+ umlal v19.2d,v13.2s,v2.2s
+ umlal v23.2d,v13.2s,v0.2s
+ umlal v20.2d,v13.2s,v4.2s
+ umlal v21.2d,v13.2s,v6.2s
+
+.Lshort_tail:
+ ////////////////////////////////////////////////////////////////
+ // horizontal add
+
+ addp v22.2d,v22.2d,v22.2d
+ ldp d8,d9,[sp,#16] // meet ABI requirements
+ addp v19.2d,v19.2d,v19.2d
+ ldp d10,d11,[sp,#32]
+ addp v23.2d,v23.2d,v23.2d
+ ldp d12,d13,[sp,#48]
+ addp v20.2d,v20.2d,v20.2d
+ ldp d14,d15,[sp,#64]
+ addp v21.2d,v21.2d,v21.2d
+
+ ////////////////////////////////////////////////////////////////
+ // lazy reduction, but without narrowing
+
+ ushr v29.2d,v22.2d,#26
+ and v22.16b,v22.16b,v31.16b
+ ushr v30.2d,v19.2d,#26
+ and v19.16b,v19.16b,v31.16b
+
+ add v23.2d,v23.2d,v29.2d // h3 -> h4
+ add v20.2d,v20.2d,v30.2d // h0 -> h1
+
+ ushr v29.2d,v23.2d,#26
+ and v23.16b,v23.16b,v31.16b
+ ushr v30.2d,v20.2d,#26
+ and v20.16b,v20.16b,v31.16b
+ add v21.2d,v21.2d,v30.2d // h1 -> h2
+
+ add v19.2d,v19.2d,v29.2d
+ shl v29.2d,v29.2d,#2
+ ushr v30.2d,v21.2d,#26
+ and v21.16b,v21.16b,v31.16b
+ add v19.2d,v19.2d,v29.2d // h4 -> h0
+ add v22.2d,v22.2d,v30.2d // h2 -> h3
+
+ ushr v29.2d,v19.2d,#26
+ and v19.16b,v19.16b,v31.16b
+ ushr v30.2d,v22.2d,#26
+ and v22.16b,v22.16b,v31.16b
+ add v20.2d,v20.2d,v29.2d // h0 -> h1
+ add v23.2d,v23.2d,v30.2d // h3 -> h4
+
+ ////////////////////////////////////////////////////////////////
+ // write the result, can be partially reduced
+
+ st4 {v19.s,v20.s,v21.s,v22.s}[0],[x0],#16
+ st1 {v23.s}[0],[x0]
+
+.Lno_data_neon:
+ ldr x29,[sp],#80
+ ret
+ENDPROC(poly1305_blocks_neon)
+
+.align 5
+ENTRY(poly1305_emit_neon)
+ ldr x17,[x0,#24]
+ cbz x17,poly1305_emit_arm
+
+ ldp w10,w11,[x0] // load hash value base 2^26
+ ldp w12,w13,[x0,#8]
+ ldr w14,[x0,#16]
+
+ add x4,x10,x11,lsl#26 // base 2^26 -> base 2^64
+ lsr x5,x12,#12
+ adds x4,x4,x12,lsl#52
+ add x5,x5,x13,lsl#14
+ adc x5,x5,xzr
+ lsr x6,x14,#24
+ adds x5,x5,x14,lsl#40
+ adc x6,x6,xzr // can be partially reduced...
+
+ ldp x10,x11,[x2] // load nonce
+
+ and x12,x6,#-4 // ... so reduce
+ add x12,x12,x6,lsr#2
+ and x6,x6,#3
+ adds x4,x4,x12
+ adcs x5,x5,xzr
+ adc x6,x6,xzr
+
+ adds x12,x4,#5 // compare to modulus
+ adcs x13,x5,xzr
+ adc x14,x6,xzr
+
+ tst x14,#-4 // see if it's carried/borrowed
+
+ csel x4,x4,x12,eq
+ csel x5,x5,x13,eq
+
+#ifdef __ARMEB__
+ ror x10,x10,#32 // flip nonce words
+ ror x11,x11,#32
+#endif
+ adds x4,x4,x10 // accumulate nonce
+ adc x5,x5,x11
+#ifdef __ARMEB__
+ rev x4,x4 // flip output bytes
+ rev x5,x5
+#endif
+ stp x4,x5,[x1] // write result
+
+ ret
+ENDPROC(poly1305_emit_neon)
+
+.align 5
+.Lzeros:
+.long 0,0,0,0,0,0,0,0
diff --git a/lib/zinc/poly1305/poly1305-mips.S b/lib/zinc/poly1305/poly1305-mips.S
new file mode 100644
index 000000000000..32d8558d8601
--- /dev/null
+++ b/lib/zinc/poly1305/poly1305-mips.S
@@ -0,0 +1,417 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2016-2018 René van Dorst <opensource@...rst.com>. All Rights Reserved.
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#if __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+#define MSB 0
+#define LSB 3
+#else
+#define MSB 3
+#define LSB 0
+#endif
+
+#define POLY1305_BLOCK_SIZE 16
+.text
+#define H0 $t0
+#define H1 $t1
+#define H2 $t2
+#define H3 $t3
+#define H4 $t4
+
+#define R0 $t5
+#define R1 $t6
+#define R2 $t7
+#define R3 $t8
+
+#define O0 $s0
+#define O1 $s4
+#define O2 $v1
+#define O3 $t9
+#define O4 $s5
+
+#define S1 $s1
+#define S2 $s2
+#define S3 $s3
+
+#define SC $at
+#define CA $v0
+
+/* Input arguments */
+#define poly $a0
+#define src $a1
+#define srclen $a2
+#define hibit $a3
+
+/* Location in the opaque buffer
+ * R[0..3], CA, H[0..4]
+ */
+#define PTR_POLY1305_R(n) ( 0 + (n*4)) ## ($a0)
+#define PTR_POLY1305_CA (16 ) ## ($a0)
+#define PTR_POLY1305_H(n) (20 + (n*4)) ## ($a0)
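+
+/* Viewed from C, that layout corresponds roughly to the following
+ * (a sketch; the struct name is hypothetical):
+ *
+ *     struct poly1305_mips_state {
+ *             u32 r[4];   (clamped key limbs)
+ *             u32 ca;     (carry left over from the last block)
+ *             u32 h[5];   (accumulator limbs)
+ *     };
+ */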
+
+#define POLY1305_STACK_SIZE 8 * 4
+
+.set reorder
+.set noat
+.align 4
+.globl poly1305_blocks_mips
+.ent poly1305_blocks_mips
+poly1305_blocks_mips:
+ .frame $sp,POLY1305_STACK_SIZE,$31
+ /* srclen &= 0xFFFFFFF0 */
+ ins srclen, $zero, 0, 4
+
+ .set noreorder
+ /* check srclen >= 16 bytes */
+ beqz srclen, .Lpoly1305_blocks_mips_end
+ addiu $sp, -(POLY1305_STACK_SIZE)
+ .set reorder
+
+	/* Calculate the end-of-input pointer from the src address:
+	 * end src ptr (srclen) = src + (srclen & 0xFFFFFFF0)
+	 */
+ addu srclen, src
+
+ lw R0, PTR_POLY1305_R(0)
+ lw R1, PTR_POLY1305_R(1)
+ lw R2, PTR_POLY1305_R(2)
+ lw R3, PTR_POLY1305_R(3)
+
+ /* store the used save registers. */
+ sw $s0, 0($sp)
+ sw $s1, 4($sp)
+ sw $s2, 8($sp)
+ sw $s3, 12($sp)
+ sw $s4, 16($sp)
+ sw $s5, 20($sp)
+
+ /* load Hx and Carry */
+ lw CA, PTR_POLY1305_CA
+ lw H0, PTR_POLY1305_H(0)
+ lw H1, PTR_POLY1305_H(1)
+ lw H2, PTR_POLY1305_H(2)
+ lw H3, PTR_POLY1305_H(3)
+ lw H4, PTR_POLY1305_H(4)
+
+ /* Sx = Rx + (Rx >> 2) */
+ srl S1, R1, 2
+ srl S2, R2, 2
+ srl S3, R3, 2
+ addu S1, R1
+ addu S2, R2
+ addu S3, R3
+
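+	/* SC = 1, so that "maddu CA, SC" in the loop below adds the
+	 * previous column's carry into the HI:LO accumulator pair. */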
+ addiu SC, $zero, 1
+
+.Lpoly1305_loop:
+ lwl O0, 0+MSB(src)
+ lwl O1, 4+MSB(src)
+ lwl O2, 8+MSB(src)
+ lwl O3,12+MSB(src)
+ lwr O0, 0+LSB(src)
+ lwr O1, 4+LSB(src)
+ lwr O2, 8+LSB(src)
+ lwr O3,12+LSB(src)
+
+#if __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+ wsbh O0
+ wsbh O1
+ wsbh O2
+ wsbh O3
+ rotr O0, 16
+ rotr O1, 16
+ rotr O2, 16
+ rotr O3, 16
+#endif
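+
+	/* The lwl/lwr pairs above load each 32-bit word without
+	 * requiring src to be aligned, and on big-endian wsbh plus
+	 * rotr 16 byte-swaps each word to little-endian. */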
+
+	/* h0 = (u32)(d0 = (u64)h0 + inp[0] + c); c = carry from previous cycle */
+ addu H0, CA
+ sltu CA, H0, CA
+ addu O0, H0
+ sltu H0, O0, H0
+ addu CA, H0
+
+ /* h1 = (u32)(d1 = (u64)h1 + (d0 >> 32) + inp[4]); */
+ addu H1, CA
+ sltu CA, H1, CA
+ addu O1, H1
+ sltu H1, O1, H1
+ addu CA, H1
+
+ /* h2 = (u32)(d2 = (u64)h2 + (d1 >> 32) + inp[8]); */
+ addu H2, CA
+ sltu CA, H2, CA
+ addu O2, H2
+ sltu H2, O2, H2
+ addu CA, H2
+
+ /* h3 = (u32)(d3 = (u64)h3 + (d2 >> 32) + inp[12]); */
+ addu H3, CA
+ sltu CA, H3, CA
+ addu O3, H3
+ sltu H3, O3, H3
+ addu CA, H3
+
+ /* h4 += (u32)(d3 >> 32) + padbit; */
+ addu H4, hibit
+ addu O4, H4, CA
+
+ /* D0 */
+ multu O0, R0
+ maddu O1, S3
+ maddu O2, S2
+ maddu O3, S1
+ mfhi CA
+ mflo H0
+
+ /* D1 */
+ multu O0, R1
+ maddu O1, R0
+ maddu O2, S3
+ maddu O3, S2
+ maddu O4, S1
+ maddu CA, SC
+ mfhi CA
+ mflo H1
+
+ /* D2 */
+ multu O0, R2
+ maddu O1, R1
+ maddu O2, R0
+ maddu O3, S3
+ maddu O4, S2
+ maddu CA, SC
+ mfhi CA
+ mflo H2
+
+ /* D4 */
+ mul H4, O4, R0
+
+ /* D3 */
+ multu O0, R3
+ maddu O1, R2
+ maddu O2, R1
+ maddu O3, R0
+ maddu O4, S3
+ maddu CA, SC
+ mfhi CA
+ mflo H3
+
+ addiu src, POLY1305_BLOCK_SIZE
+
+ /* h4 += (u32)(d3 >> 32); */
+ addu O4, H4, CA
+ /* h4 &= 3 */
+ andi H4, O4, 3
+ /* c = (h4 >> 2) + (h4 & ~3U); */
+ srl CA, O4, 2
+ ins O4, $zero, 0, 2
+
+	/* loop while another 16-byte block is available. */
+ .set noreorder
+ bne src, srclen, .Lpoly1305_loop
+ /* Delay slot is always executed. */
+ addu CA, O4
+ .set reorder
+
+ /* restore the used save registers. */
+ lw $s0, 0($sp)
+ lw $s1, 4($sp)
+ lw $s2, 8($sp)
+ lw $s3, 12($sp)
+ lw $s4, 16($sp)
+ lw $s5, 20($sp)
+
+ /* store Hx and Carry */
+ sw CA, PTR_POLY1305_CA
+ sw H0, PTR_POLY1305_H(0)
+ sw H1, PTR_POLY1305_H(1)
+ sw H2, PTR_POLY1305_H(2)
+ sw H3, PTR_POLY1305_H(3)
+ sw H4, PTR_POLY1305_H(4)
+
+.Lpoly1305_blocks_mips_end:
+ /* Jump Back */
+ .set noreorder
+ jr $ra
+ addiu $sp, POLY1305_STACK_SIZE
+ .set reorder
+.end poly1305_blocks_mips
+.set at
+.set reorder
+
+/* Input arguments CTX=$a0, MAC=$a1, NONCE=$a2 */
+#define MAC $a1
+#define NONCE $a2
+
+#define G0 $t5
+#define G1 $t6
+#define G2 $t7
+#define G3 $t8
+#define G4 $t9
+
+.set reorder
+.set noat
+.align 4
+.globl poly1305_emit_mips
+.ent poly1305_emit_mips
+poly1305_emit_mips:
+ /* load Hx and Carry */
+ lw CA, PTR_POLY1305_CA
+ lw H0, PTR_POLY1305_H(0)
+ lw H1, PTR_POLY1305_H(1)
+ lw H2, PTR_POLY1305_H(2)
+ lw H3, PTR_POLY1305_H(3)
+ lw H4, PTR_POLY1305_H(4)
+
+ /* Add left over carry */
+ addu H0, CA
+ sltu CA, H0, CA
+ addu H1, CA
+ sltu CA, H1, CA
+ addu H2, CA
+ sltu CA, H2, CA
+ addu H3, CA
+ sltu CA, H3, CA
+ addu H4, CA
+
+ /* compare to modulus by computing h + -p */
+ addiu G0, H0, 5
+ sltu CA, G0, H0
+ addu G1, H1, CA
+ sltu CA, G1, H1
+ addu G2, H2, CA
+ sltu CA, G2, H2
+ addu G3, H3, CA
+ sltu CA, G3, H3
+ addu G4, H4, CA
+
+ srl SC, G4, 2
+
+	/* if there was a carry into the 131st bit, h3:h0 = g3:g0 */
+ movn H0, G0, SC
+ movn H1, G1, SC
+ movn H2, G2, SC
+ movn H3, G3, SC
+
+ lwl G0, 0+MSB(NONCE)
+ lwl G1, 4+MSB(NONCE)
+ lwl G2, 8+MSB(NONCE)
+ lwl G3,12+MSB(NONCE)
+ lwr G0, 0+LSB(NONCE)
+ lwr G1, 4+LSB(NONCE)
+ lwr G2, 8+LSB(NONCE)
+ lwr G3,12+LSB(NONCE)
+
+ /* mac = (h + nonce) % (2^128) */
+ addu H0, G0
+ sltu CA, H0, G0
+
+ /* H1 */
+ addu H1, CA
+ sltu CA, H1, CA
+ addu H1, G1
+ sltu G1, H1, G1
+ addu CA, G1
+
+ /* H2 */
+ addu H2, CA
+ sltu CA, H2, CA
+ addu H2, G2
+ sltu G2, H2, G2
+ addu CA, G2
+
+ /* H3 */
+ addu H3, CA
+ addu H3, G3
+
+#if __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+ wsbh H0
+ wsbh H1
+ wsbh H2
+ wsbh H3
+ rotr H0, 16
+ rotr H1, 16
+ rotr H2, 16
+ rotr H3, 16
+#endif
+
+ /* store MAC */
+ swl H0, 0+MSB(MAC)
+ swl H1, 4+MSB(MAC)
+ swl H2, 8+MSB(MAC)
+ swl H3,12+MSB(MAC)
+ swr H0, 0+LSB(MAC)
+ swr H1, 4+LSB(MAC)
+ swr H2, 8+LSB(MAC)
+ .set noreorder
+ jr $ra
+ swr H3,12+LSB(MAC)
+ .set reorder
+.end poly1305_emit_mips
+
+#define PR0 $t0
+#define PR1 $t1
+#define PR2 $t2
+#define PR3 $t3
+#define PT0 $t4
+
+/* Input arguments CTX=$a0, KEY=$a1 */
+
+.align 4
+.globl poly1305_init_mips
+.ent poly1305_init_mips
+poly1305_init_mips:
+ lwl PR0, 0+MSB($a1)
+ lwl PR1, 4+MSB($a1)
+ lwl PR2, 8+MSB($a1)
+ lwl PR3,12+MSB($a1)
+ lwr PR0, 0+LSB($a1)
+ lwr PR1, 4+LSB($a1)
+ lwr PR2, 8+LSB($a1)
+ lwr PR3,12+LSB($a1)
+
+ /* store Hx and Carry */
+ sw $zero, PTR_POLY1305_CA
+ sw $zero, PTR_POLY1305_H(0)
+ sw $zero, PTR_POLY1305_H(1)
+ sw $zero, PTR_POLY1305_H(2)
+ sw $zero, PTR_POLY1305_H(3)
+ sw $zero, PTR_POLY1305_H(4)
+
+#if __BYTE_ORDER__ == __ORDER_BIG_ENDIAN__
+ wsbh PR0
+ wsbh PR1
+ wsbh PR2
+ wsbh PR3
+ rotr PR0, 16
+ rotr PR1, 16
+ rotr PR2, 16
+ rotr PR3, 16
+#endif
+
+ lui PT0, 0x0FFF
+ ori PT0, 0xFFFC
+
+ /* AND 0x0fffffff; */
+ ext PR0, PR0, 0, (32-4)
+
+ /* AND 0x0ffffffc; */
+ and PR1, PT0
+ and PR2, PT0
+ and PR3, PT0
+
+ /* store Rx */
+ sw PR0, PTR_POLY1305_R(0)
+ sw PR1, PTR_POLY1305_R(1)
+ sw PR2, PTR_POLY1305_R(2)
+
+ .set noreorder
+ /* Jump Back */
+ jr $ra
+ sw PR3, PTR_POLY1305_R(3)
+ .set reorder
+.end poly1305_init_mips
diff --git a/lib/zinc/poly1305/poly1305-mips64.S b/lib/zinc/poly1305/poly1305-mips64.S
new file mode 100644
index 000000000000..1a45fbee3c7f
--- /dev/null
+++ b/lib/zinc/poly1305/poly1305-mips64.S
@@ -0,0 +1,357 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#if !defined(CONFIG_64BIT)
+#error "This is only for 64-bit kernels."
+#endif
+
+#ifdef __MIPSEB__
+#define MSB 0
+#define LSB 7
+#else
+#define MSB 7
+#define LSB 0
+#endif
+
+#if defined(CONFIG_CPU_MIPS64_R6) || defined(CONFIG_CPU_MIPSR6)
+#define dmultu(rs,rt)
+#define mflo(rd,rs,rt) dmulu rd,rs,rt
+#define mfhi(rd,rs,rt) dmuhu rd,rs,rt
+#else
+#define dmultu(rs,rt) dmultu rs,rt
+#define multu(rs,rt) multu rs,rt
+#define mflo(rd,rs,rt) mflo rd
+#define mfhi(rd,rs,rt) mfhi rd
+#endif
+
+.text
+.set noat
+.set noreorder
+
+/* While most of the assembly in the kernel prefers ENTRY() and ENDPROC(),
+ * there is no existing MIPS assembly that uses them, and the MIPS assembler
+ * seems to prefer its own .ent/.end notation, which the MIPS include files
+ * don't provide in a MIPS-specific ENTRY/ENDPROC definition. So, we skip
+ * these for now, until somebody complains. */
+
+.align 5
+.globl poly1305_init_mips
+.ent poly1305_init_mips
+poly1305_init_mips:
+ .frame $29,0,$31
+ .set reorder
+
+ sd $0,0($4)
+ sd $0,8($4)
+ sd $0,16($4)
+
+ beqz $5,.Lno_key
+
+#if defined(CONFIG_CPU_MIPS64_R6) || defined(CONFIG_CPU_MIPSR6)
+ ld $8,0($5)
+ ld $9,8($5)
+#else
+ ldl $8,0+MSB($5)
+ ldl $9,8+MSB($5)
+ ldr $8,0+LSB($5)
+ ldr $9,8+LSB($5)
+#endif
+#ifdef __MIPSEB__
+#if defined(CONFIG_CPU_MIPS64_R2) || defined(CONFIG_CPU_MIPSR2) || defined(CONFIG_CPU_MIPS64_R6) || defined(CONFIG_CPU_MIPSR6)
+ dsbh $8,$8 # byte swap
+ dsbh $9,$9
+ dshd $8,$8
+ dshd $9,$9
+#else
+ ori $10,$0,0xFF
+ dsll $1,$10,32
+ or $10,$1 # 0x000000FF000000FF
+
+ and $11,$8,$10 # byte swap
+ and $2,$9,$10
+ dsrl $1,$8,24
+ dsrl $24,$9,24
+ dsll $11,24
+ dsll $2,24
+ and $1,$10
+ and $24,$10
+ dsll $10,8 # 0x0000FF000000FF00
+ or $11,$1
+ or $2,$24
+ and $1,$8,$10
+ and $24,$9,$10
+ dsrl $8,8
+ dsrl $9,8
+ dsll $1,8
+ dsll $24,8
+ and $8,$10
+ and $9,$10
+ or $11,$1
+ or $2,$24
+ or $8,$11
+ or $9,$2
+ dsrl $11,$8,32
+ dsrl $2,$9,32
+ dsll $8,32
+ dsll $9,32
+ or $8,$11
+ or $9,$2
+#endif
+#endif
+ li $10,1
+ dsll $10,32
+ daddiu $10,-63
+ dsll $10,28
+ daddiu $10,-1 # 0ffffffc0fffffff
+
+ and $8,$10
+ daddiu $10,-3 # 0ffffffc0ffffffc
+ and $9,$10
+
+ sd $8,24($4)
+ dsrl $10,$9,2
+ sd $9,32($4)
+ daddu $10,$9 # s1 = r1 + (r1 >> 2)
+ sd $10,40($4)
+
+.Lno_key:
+ li $2,0 # return 0
+ jr $31
+.end poly1305_init_mips
+
+.align 5
+.globl poly1305_blocks_mips
+.ent poly1305_blocks_mips
+poly1305_blocks_mips:
+ .set noreorder
+ dsrl $6,4 # number of complete blocks
+ bnez $6,poly1305_blocks_internal
+ nop
+ jr $31
+ nop
+.end poly1305_blocks_mips
+
+.align 5
+.ent poly1305_blocks_internal
+poly1305_blocks_internal:
+ .frame $29,6*8,$31
+ .mask 0x00030000,-8
+ .set noreorder
+ dsubu $29,6*8
+ sd $17,40($29)
+ sd $16,32($29)
+ .set reorder
+
+ ld $12,0($4) # load hash value
+ ld $13,8($4)
+ ld $14,16($4)
+
+ ld $15,24($4) # load key
+ ld $16,32($4)
+ ld $17,40($4)
+
+.Loop:
+#if defined(CONFIG_CPU_MIPS64_R6) || defined(CONFIG_CPU_MIPSR6)
+ ld $8,0($5) # load input
+ ld $9,8($5)
+#else
+ ldl $8,0+MSB($5) # load input
+ ldl $9,8+MSB($5)
+ ldr $8,0+LSB($5)
+ ldr $9,8+LSB($5)
+#endif
+ daddiu $6,-1
+ daddiu $5,16
+#ifdef __MIPSEB__
+#if defined(CONFIG_CPU_MIPS64_R2) || defined(CONFIG_CPU_MIPSR2) || defined(CONFIG_CPU_MIPS64_R6) || defined(CONFIG_CPU_MIPSR6)
+ dsbh $8,$8 # byte swap
+ dsbh $9,$9
+ dshd $8,$8
+ dshd $9,$9
+#else
+ ori $10,$0,0xFF
+ dsll $1,$10,32
+ or $10,$1 # 0x000000FF000000FF
+
+ and $11,$8,$10 # byte swap
+ and $2,$9,$10
+ dsrl $1,$8,24
+ dsrl $24,$9,24
+ dsll $11,24
+ dsll $2,24
+ and $1,$10
+ and $24,$10
+ dsll $10,8 # 0x0000FF000000FF00
+ or $11,$1
+ or $2,$24
+ and $1,$8,$10
+ and $24,$9,$10
+ dsrl $8,8
+ dsrl $9,8
+ dsll $1,8
+ dsll $24,8
+ and $8,$10
+ and $9,$10
+ or $11,$1
+ or $2,$24
+ or $8,$11
+ or $9,$2
+ dsrl $11,$8,32
+ dsrl $2,$9,32
+ dsll $8,32
+ dsll $9,32
+ or $8,$11
+ or $9,$2
+#endif
+#endif
+ daddu $12,$8 # accumulate input
+ daddu $13,$9
+ sltu $10,$12,$8
+ sltu $11,$13,$9
+ daddu $13,$10
+
+ dmultu ($15,$12) # h0*r0
+ daddu $14,$7
+ sltu $10,$13,$10
+ mflo ($8,$15,$12)
+ mfhi ($9,$15,$12)
+
+ dmultu ($17,$13) # h1*5*r1
+ daddu $10,$11
+ daddu $14,$10
+ mflo ($10,$17,$13)
+ mfhi ($11,$17,$13)
+
+ dmultu ($16,$12) # h0*r1
+ daddu $8,$10
+ daddu $9,$11
+ mflo ($1,$16,$12)
+ mfhi ($25,$16,$12)
+ sltu $10,$8,$10
+ daddu $9,$10
+
+ dmultu ($15,$13) # h1*r0
+ daddu $9,$1
+ sltu $1,$9,$1
+ mflo ($10,$15,$13)
+ mfhi ($11,$15,$13)
+ daddu $25,$1
+
+ dmultu ($17,$14) # h2*5*r1
+ daddu $9,$10
+ daddu $25,$11
+ mflo ($1,$17,$14)
+
+ dmultu ($15,$14) # h2*r0
+ sltu $10,$9,$10
+ daddu $25,$10
+ mflo ($2,$15,$14)
+
+ daddu $9,$1
+ daddu $25,$2
+ sltu $1,$9,$1
+ daddu $25,$1
+
+ li $10,-4 # final reduction
+ and $10,$25
+ dsrl $11,$25,2
+ andi $14,$25,3
+ daddu $10,$11
+ daddu $12,$8,$10
+ sltu $10,$12,$10
+ daddu $13,$9,$10
+ sltu $10,$13,$10
+ daddu $14,$14,$10
+
+ bnez $6,.Loop
+
+ sd $12,0($4) # store hash value
+ sd $13,8($4)
+ sd $14,16($4)
+
+ .set noreorder
+ ld $17,40($29) # epilogue
+ ld $16,32($29)
+ jr $31
+ daddu $29,6*8
+.end poly1305_blocks_internal
+
+.align 5
+.globl poly1305_emit_mips
+.ent poly1305_emit_mips
+poly1305_emit_mips:
+ .frame $29,0,$31
+ .set reorder
+
+ ld $10,0($4)
+ ld $11,8($4)
+ ld $1,16($4)
+
+ daddiu $8,$10,5 # compare to modulus
+ sltiu $2,$8,5
+ daddu $9,$11,$2
+ sltu $2,$9,$2
+ daddu $1,$1,$2
+
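+	# $1 becomes all-ones if the add carried into bit 130 (h >= p)
+	# and all-zeros otherwise; the two masks then select either g
+	# (the reduced value) or h without branching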
+ dsrl $1,2 # see if it carried/borrowed
+ dsubu $1,$0,$1
+ nor $2,$0,$1
+
+ and $8,$1
+ and $10,$2
+ and $9,$1
+ and $11,$2
+ or $8,$10
+ or $9,$11
+
+ lwu $10,0($6) # load nonce
+ lwu $11,4($6)
+ lwu $1,8($6)
+ lwu $2,12($6)
+ dsll $11,32
+ dsll $2,32
+ or $10,$11
+ or $1,$2
+
+ daddu $8,$10 # accumulate nonce
+ daddu $9,$1
+ sltu $10,$8,$10
+ daddu $9,$10
+
+ dsrl $10,$8,8 # write mac value
+ dsrl $11,$8,16
+ dsrl $1,$8,24
+ sb $8,0($5)
+ dsrl $2,$8,32
+ sb $10,1($5)
+ dsrl $10,$8,40
+ sb $11,2($5)
+ dsrl $11,$8,48
+ sb $1,3($5)
+ dsrl $1,$8,56
+ sb $2,4($5)
+ dsrl $2,$9,8
+ sb $10,5($5)
+ dsrl $10,$9,16
+ sb $11,6($5)
+ dsrl $11,$9,24
+ sb $1,7($5)
+
+ sb $9,8($5)
+ dsrl $1,$9,32
+ sb $2,9($5)
+ dsrl $2,$9,40
+ sb $10,10($5)
+ dsrl $10,$9,48
+ sb $11,11($5)
+ dsrl $11,$9,56
+ sb $1,12($5)
+ sb $2,13($5)
+ sb $10,14($5)
+ sb $11,15($5)
+
+ jr $31
+.end poly1305_emit_mips
diff --git a/lib/zinc/poly1305/poly1305-x86_64.S b/lib/zinc/poly1305/poly1305-x86_64.S
new file mode 100644
index 000000000000..7c1b7363f487
--- /dev/null
+++ b/lib/zinc/poly1305/poly1305-x86_64.S
@@ -0,0 +1,2790 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2017 Samuel Neves <sneves@....uc.pt>. All Rights Reserved.
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#include <linux/linkage.h>
+
+.section .rodata.cst192.Lconst, "aM", @progbits, 192
+.align 64
+.Lconst:
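+# In order of appearance: 2^24-1 masks, 2^24 (the padbit position
+# in the base 2^26 representation), 2^26-1 limb masks, and what
+# serve as lane-permutation indices for the wider vector paths.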
+.long 0x0ffffff,0,0x0ffffff,0,0x0ffffff,0,0x0ffffff,0
+.long 16777216,0,16777216,0,16777216,0,16777216,0
+.long 0x3ffffff,0,0x3ffffff,0,0x3ffffff,0,0x3ffffff,0
+.long 2,2,2,3,2,0,2,1
+.long 0,0,0,1, 0,2,0,3, 0,4,0,5, 0,6,0,7
+
+.text
+
+.align 32
+ENTRY(poly1305_init_x86_64)
+ xorq %rax,%rax
+ movq %rax,0(%rdi)
+ movq %rax,8(%rdi)
+ movq %rax,16(%rdi)
+
+ cmpq $0,%rsi
+ je .Lno_key
+
+ movq $0x0ffffffc0fffffff,%rax
+ movq $0x0ffffffc0ffffffc,%rcx
+ andq 0(%rsi),%rax
+ andq 8(%rsi),%rcx
+ movq %rax,24(%rdi)
+ movq %rcx,32(%rdi)
+ movl $1,%eax
+.Lno_key:
+ ret
+ENDPROC(poly1305_init_x86_64)
+
+.align 32
+ENTRY(poly1305_blocks_x86_64)
+.Lblocks:
+ shrq $4,%rdx
+ jz .Lno_data
+
+ pushq %rbx
+ pushq %r12
+ pushq %r13
+ pushq %r14
+ pushq %r15
+ pushq %rdi
+
+.Lblocks_body:
+
+ movq %rdx,%r15
+
+ movq 24(%rdi),%r11
+ movq 32(%rdi),%r13
+
+ movq 0(%rdi),%r14
+ movq 8(%rdi),%rbx
+ movq 16(%rdi),%r10
+
+ movq %r13,%r12
+ shrq $2,%r13
+ movq %r12,%rax
+ addq %r12,%r13
+ jmp .Loop
+
+.align 32
+.Loop:
+
+ addq 0(%rsi),%r14
+ adcq 8(%rsi),%rbx
+ leaq 16(%rsi),%rsi
+ adcq %rcx,%r10
+ mulq %r14
+ movq %rax,%r9
+ movq %r11,%rax
+ movq %rdx,%rdi
+
+ mulq %r14
+ movq %rax,%r14
+ movq %r11,%rax
+ movq %rdx,%r8
+
+ mulq %rbx
+ addq %rax,%r9
+ movq %r13,%rax
+ adcq %rdx,%rdi
+
+ mulq %rbx
+ movq %r10,%rbx
+ addq %rax,%r14
+ adcq %rdx,%r8
+
+ imulq %r13,%rbx
+ addq %rbx,%r9
+ movq %r8,%rbx
+ adcq $0,%rdi
+
+ imulq %r11,%r10
+ addq %r9,%rbx
+ movq $-4,%rax
+ adcq %r10,%rdi
+
+ andq %rdi,%rax
+ movq %rdi,%r10
+ shrq $2,%rdi
+ andq $3,%r10
+ addq %rdi,%rax
+ addq %rax,%r14
+ adcq $0,%rbx
+ adcq $0,%r10
+
+ movq %r12,%rax
+ decq %r15
+ jnz .Loop
+
+ movq 0(%rsp),%rdi
+
+ movq %r14,0(%rdi)
+ movq %rbx,8(%rdi)
+ movq %r10,16(%rdi)
+
+ movq 8(%rsp),%r15
+ movq 16(%rsp),%r14
+ movq 24(%rsp),%r13
+ movq 32(%rsp),%r12
+ movq 40(%rsp),%rbx
+ leaq 48(%rsp),%rsp
+.Lno_data:
+.Lblocks_epilogue:
+ ret
+ENDPROC(poly1305_blocks_x86_64)
+
+.align 32
+ENTRY(poly1305_emit_x86_64)
+.Lemit:
+ movq 0(%rdi),%r8
+ movq 8(%rdi),%r9
+ movq 16(%rdi),%r10
+
+ movq %r8,%rax
+ addq $5,%r8
+ movq %r9,%rcx
+ adcq $0,%r9
+ adcq $0,%r10
+ shrq $2,%r10
+ cmovnzq %r8,%rax
+ cmovnzq %r9,%rcx
+
+ addq 0(%rdx),%rax
+ adcq 8(%rdx),%rcx
+ movq %rax,0(%rsi)
+ movq %rcx,8(%rsi)
+
+ ret
+ENDPROC(poly1305_emit_x86_64)
+
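+# __poly1305_block multiplies the base 2^64 hash limbs in
+# %r14/%rbx/%r10 by r and reduces mod 2^130 - 5; %r11 holds r0,
+# %rax holds r1 on entry, %r13 holds r1 + (r1 >> 2) = 5*r1/4, and
+# %rdi is clobbered (callers save it around the macro).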
+.macro __poly1305_block
+ mulq %r14
+ movq %rax,%r9
+ movq %r11,%rax
+ movq %rdx,%rdi
+
+ mulq %r14
+ movq %rax,%r14
+ movq %r11,%rax
+ movq %rdx,%r8
+
+ mulq %rbx
+ addq %rax,%r9
+ movq %r13,%rax
+ adcq %rdx,%rdi
+
+ mulq %rbx
+ movq %r10,%rbx
+ addq %rax,%r14
+ adcq %rdx,%r8
+
+ imulq %r13,%rbx
+ addq %rbx,%r9
+ movq %r8,%rbx
+ adcq $0,%rdi
+
+ imulq %r11,%r10
+ addq %r9,%rbx
+ movq $-4,%rax
+ adcq %r10,%rdi
+
+ andq %rdi,%rax
+ movq %rdi,%r10
+ shrq $2,%rdi
+ andq $3,%r10
+ addq %rdi,%rax
+ addq %rax,%r14
+ adcq $0,%rbx
+ adcq $0,%r10
+.endm
+
+.macro __poly1305_init_avx
+ movq %r11,%r14
+ movq %r12,%rbx
+ xorq %r10,%r10
+
+ leaq 48+64(%rdi),%rdi
+
+ movq %r12,%rax
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+
+ movl $0x3ffffff,%eax
+ movl $0x3ffffff,%edx
+ movq %r14,%r8
+ andl %r14d,%eax
+ movq %r11,%r9
+ andl %r11d,%edx
+ movl %eax,-64(%rdi)
+ shrq $26,%r8
+ movl %edx,-60(%rdi)
+ shrq $26,%r9
+
+ movl $0x3ffffff,%eax
+ movl $0x3ffffff,%edx
+ andl %r8d,%eax
+ andl %r9d,%edx
+ movl %eax,-48(%rdi)
+ leal (%rax,%rax,4),%eax
+ movl %edx,-44(%rdi)
+ leal (%rdx,%rdx,4),%edx
+ movl %eax,-32(%rdi)
+ shrq $26,%r8
+ movl %edx,-28(%rdi)
+ shrq $26,%r9
+
+ movq %rbx,%rax
+ movq %r12,%rdx
+ shlq $12,%rax
+ shlq $12,%rdx
+ orq %r8,%rax
+ orq %r9,%rdx
+ andl $0x3ffffff,%eax
+ andl $0x3ffffff,%edx
+ movl %eax,-16(%rdi)
+ leal (%rax,%rax,4),%eax
+ movl %edx,-12(%rdi)
+ leal (%rdx,%rdx,4),%edx
+ movl %eax,0(%rdi)
+ movq %rbx,%r8
+ movl %edx,4(%rdi)
+ movq %r12,%r9
+
+ movl $0x3ffffff,%eax
+ movl $0x3ffffff,%edx
+ shrq $14,%r8
+ shrq $14,%r9
+ andl %r8d,%eax
+ andl %r9d,%edx
+ movl %eax,16(%rdi)
+ leal (%rax,%rax,4),%eax
+ movl %edx,20(%rdi)
+ leal (%rdx,%rdx,4),%edx
+ movl %eax,32(%rdi)
+ shrq $26,%r8
+ movl %edx,36(%rdi)
+ shrq $26,%r9
+
+ movq %r10,%rax
+ shlq $24,%rax
+ orq %rax,%r8
+ movl %r8d,48(%rdi)
+ leaq (%r8,%r8,4),%r8
+ movl %r9d,52(%rdi)
+ leaq (%r9,%r9,4),%r9
+ movl %r8d,64(%rdi)
+ movl %r9d,68(%rdi)
+
+ movq %r12,%rax
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+
+ movl $0x3ffffff,%eax
+ movq %r14,%r8
+ andl %r14d,%eax
+ shrq $26,%r8
+ movl %eax,-52(%rdi)
+
+ movl $0x3ffffff,%edx
+ andl %r8d,%edx
+ movl %edx,-36(%rdi)
+ leal (%rdx,%rdx,4),%edx
+ shrq $26,%r8
+ movl %edx,-20(%rdi)
+
+ movq %rbx,%rax
+ shlq $12,%rax
+ orq %r8,%rax
+ andl $0x3ffffff,%eax
+ movl %eax,-4(%rdi)
+ leal (%rax,%rax,4),%eax
+ movq %rbx,%r8
+ movl %eax,12(%rdi)
+
+ movl $0x3ffffff,%edx
+ shrq $14,%r8
+ andl %r8d,%edx
+ movl %edx,28(%rdi)
+ leal (%rdx,%rdx,4),%edx
+ shrq $26,%r8
+ movl %edx,44(%rdi)
+
+ movq %r10,%rax
+ shlq $24,%rax
+ orq %rax,%r8
+ movl %r8d,60(%rdi)
+ leaq (%r8,%r8,4),%r8
+ movl %r8d,76(%rdi)
+
+ movq %r12,%rax
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+
+ movl $0x3ffffff,%eax
+ movq %r14,%r8
+ andl %r14d,%eax
+ shrq $26,%r8
+ movl %eax,-56(%rdi)
+
+ movl $0x3ffffff,%edx
+ andl %r8d,%edx
+ movl %edx,-40(%rdi)
+ leal (%rdx,%rdx,4),%edx
+ shrq $26,%r8
+ movl %edx,-24(%rdi)
+
+ movq %rbx,%rax
+ shlq $12,%rax
+ orq %r8,%rax
+ andl $0x3ffffff,%eax
+ movl %eax,-8(%rdi)
+ leal (%rax,%rax,4),%eax
+ movq %rbx,%r8
+ movl %eax,8(%rdi)
+
+ movl $0x3ffffff,%edx
+ shrq $14,%r8
+ andl %r8d,%edx
+ movl %edx,24(%rdi)
+ leal (%rdx,%rdx,4),%edx
+ shrq $26,%r8
+ movl %edx,40(%rdi)
+
+ movq %r10,%rax
+ shlq $24,%rax
+ orq %rax,%r8
+ movl %r8d,56(%rdi)
+ leaq (%r8,%r8,4),%r8
+ movl %r8d,72(%rdi)
+
+ leaq -48-64(%rdi),%rdi
+.endm
+
+#ifdef CONFIG_AS_AVX
+.align 32
+ENTRY(poly1305_blocks_avx)
+
+ movl 20(%rdi),%r8d
+ cmpq $128,%rdx
+ jae .Lblocks_avx
+ testl %r8d,%r8d
+ jz .Lblocks
+
+.Lblocks_avx:
+ andq $-16,%rdx
+ jz .Lno_data_avx
+
+ vzeroupper
+
+ testl %r8d,%r8d
+ jz .Lbase2_64_avx
+
+ testq $31,%rdx
+ jz .Leven_avx
+
+ pushq %rbx
+ pushq %r12
+ pushq %r13
+ pushq %r14
+ pushq %r15
+ pushq %rdi
+
+.Lblocks_avx_body:
+
+ movq %rdx,%r15
+
+ movq 0(%rdi),%r8
+ movq 8(%rdi),%r9
+ movl 16(%rdi),%r10d
+
+ movq 24(%rdi),%r11
+ movq 32(%rdi),%r13
+
+
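+	/*
+	 * The state was saved as five 26-bit limbs packed into 32-bit
+	 * slots; recombine them into base 2^64 (%r14:%rbx plus the top
+	 * bits in %r10) so the odd leading block can be processed with
+	 * the scalar __poly1305_block macro, folding the bits above
+	 * 2^130 back in multiplied by 5, since 2^130 == 5 mod 2^130-5.
+	 */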
+ movl %r8d,%r14d
+ andq $-2147483648,%r8
+ movq %r9,%r12
+ movl %r9d,%ebx
+ andq $-2147483648,%r9
+
+ shrq $6,%r8
+ shlq $52,%r12
+ addq %r8,%r14
+ shrq $12,%rbx
+ shrq $18,%r9
+ addq %r12,%r14
+ adcq %r9,%rbx
+
+ movq %r10,%r8
+ shlq $40,%r8
+ shrq $24,%r10
+ addq %r8,%rbx
+ adcq $0,%r10
+
+ movq $-4,%r9
+ movq %r10,%r8
+ andq %r10,%r9
+ shrq $2,%r8
+ andq $3,%r10
+ addq %r9,%r8
+ addq %r8,%r14
+ adcq $0,%rbx
+ adcq $0,%r10
+
+ movq %r13,%r12
+ movq %r13,%rax
+ shrq $2,%r13
+ addq %r12,%r13
+
+ addq 0(%rsi),%r14
+ adcq 8(%rsi),%rbx
+ leaq 16(%rsi),%rsi
+ adcq %rcx,%r10
+
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+
+ testq %rcx,%rcx
+ jz .Lstore_base2_64_avx
+
+
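+	/*
+	 * Convert the accumulator back from base 2^64 to five 26-bit
+	 * limbs: %rax, %rdx, %r14, %rbx and %r10 end up holding one
+	 * limb each, ready for the vector loop.
+	 */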
+ movq %r14,%rax
+ movq %r14,%rdx
+ shrq $52,%r14
+ movq %rbx,%r11
+ movq %rbx,%r12
+ shrq $26,%rdx
+ andq $0x3ffffff,%rax
+ shlq $12,%r11
+ andq $0x3ffffff,%rdx
+ shrq $14,%rbx
+ orq %r11,%r14
+ shlq $24,%r10
+ andq $0x3ffffff,%r14
+ shrq $40,%r12
+ andq $0x3ffffff,%rbx
+ orq %r12,%r10
+
+ subq $16,%r15
+ jz .Lstore_base2_26_avx
+
+ vmovd %eax,%xmm0
+ vmovd %edx,%xmm1
+ vmovd %r14d,%xmm2
+ vmovd %ebx,%xmm3
+ vmovd %r10d,%xmm4
+ jmp .Lproceed_avx
+
+.align 32
+.Lstore_base2_64_avx:
+ movq %r14,0(%rdi)
+ movq %rbx,8(%rdi)
+ movq %r10,16(%rdi)
+ jmp .Ldone_avx
+
+.align 16
+.Lstore_base2_26_avx:
+ movl %eax,0(%rdi)
+ movl %edx,4(%rdi)
+ movl %r14d,8(%rdi)
+ movl %ebx,12(%rdi)
+ movl %r10d,16(%rdi)
+.align 16
+.Ldone_avx:
+ movq 8(%rsp),%r15
+ movq 16(%rsp),%r14
+ movq 24(%rsp),%r13
+ movq 32(%rsp),%r12
+ movq 40(%rsp),%rbx
+ leaq 48(%rsp),%rsp
+
+.Lno_data_avx:
+.Lblocks_avx_epilogue:
+ ret
+
+.align 32
+.Lbase2_64_avx:
+
+ pushq %rbx
+ pushq %r12
+ pushq %r13
+ pushq %r14
+ pushq %r15
+ pushq %rdi
+
+.Lbase2_64_avx_body:
+
+ movq %rdx,%r15
+
+ movq 24(%rdi),%r11
+ movq 32(%rdi),%r13
+
+ movq 0(%rdi),%r14
+ movq 8(%rdi),%rbx
+ movl 16(%rdi),%r10d
+
+ movq %r13,%r12
+ movq %r13,%rax
+ shrq $2,%r13
+ addq %r12,%r13
+
+ testq $31,%rdx
+ jz .Linit_avx
+
+ addq 0(%rsi),%r14
+ adcq 8(%rsi),%rbx
+ leaq 16(%rsi),%rsi
+ adcq %rcx,%r10
+ subq $16,%r15
+
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+
+.Linit_avx:
+
+ movq %r14,%rax
+ movq %r14,%rdx
+ shrq $52,%r14
+ movq %rbx,%r8
+ movq %rbx,%r9
+ shrq $26,%rdx
+ andq $0x3ffffff,%rax
+ shlq $12,%r8
+ andq $0x3ffffff,%rdx
+ shrq $14,%rbx
+ orq %r8,%r14
+ shlq $24,%r10
+ andq $0x3ffffff,%r14
+ shrq $40,%r9
+ andq $0x3ffffff,%rbx
+ orq %r9,%r10
+
+ vmovd %eax,%xmm0
+ vmovd %edx,%xmm1
+ vmovd %r14d,%xmm2
+ vmovd %ebx,%xmm3
+ vmovd %r10d,%xmm4
+ movl $1,20(%rdi)
+
+ __poly1305_init_avx
+
+.Lproceed_avx:
+ movq %r15,%rdx
+
+ movq 8(%rsp),%r15
+ movq 16(%rsp),%r14
+ movq 24(%rsp),%r13
+ movq 32(%rsp),%r12
+ movq 40(%rsp),%rbx
+ leaq 48(%rsp),%rax
+ leaq 48(%rsp),%rsp
+
+.Lbase2_64_avx_epilogue:
+ jmp .Ldo_avx
+
+
+.align 32
+.Leven_avx:
+ vmovd 0(%rdi),%xmm0
+ vmovd 4(%rdi),%xmm1
+ vmovd 8(%rdi),%xmm2
+ vmovd 12(%rdi),%xmm3
+ vmovd 16(%rdi),%xmm4
+
+.Ldo_avx:
+ leaq 8(%rsp),%r10
+ andq $-32,%rsp
+ subq $8,%rsp
+ leaq -88(%rsp),%r11
+ subq $0x178,%rsp
+ subq $64,%rdx
+ leaq -32(%rsi),%rax
+ cmovcq %rax,%rsi
+
+ vmovdqu 48(%rdi),%xmm14
+ leaq 112(%rdi),%rdi
+ leaq .Lconst(%rip),%rcx
+
+ vmovdqu 32(%rsi),%xmm5
+ vmovdqu 48(%rsi),%xmm6
+ vmovdqa 64(%rcx),%xmm15
+
+ vpsrldq $6,%xmm5,%xmm7
+ vpsrldq $6,%xmm6,%xmm8
+ vpunpckhqdq %xmm6,%xmm5,%xmm9
+ vpunpcklqdq %xmm6,%xmm5,%xmm5
+ vpunpcklqdq %xmm8,%xmm7,%xmm8
+
+ vpsrlq $40,%xmm9,%xmm9
+ vpsrlq $26,%xmm5,%xmm6
+ vpand %xmm15,%xmm5,%xmm5
+ vpsrlq $4,%xmm8,%xmm7
+ vpand %xmm15,%xmm6,%xmm6
+ vpsrlq $30,%xmm8,%xmm8
+ vpand %xmm15,%xmm7,%xmm7
+ vpand %xmm15,%xmm8,%xmm8
+ vpor 32(%rcx),%xmm9,%xmm9
+
+ jbe .Lskip_loop_avx
+
+
+ vmovdqu -48(%rdi),%xmm11
+ vmovdqu -32(%rdi),%xmm12
+ vpshufd $0xEE,%xmm14,%xmm13
+ vpshufd $0x44,%xmm14,%xmm10
+ vmovdqa %xmm13,-144(%r11)
+ vmovdqa %xmm10,0(%rsp)
+ vpshufd $0xEE,%xmm11,%xmm14
+ vmovdqu -16(%rdi),%xmm10
+ vpshufd $0x44,%xmm11,%xmm11
+ vmovdqa %xmm14,-128(%r11)
+ vmovdqa %xmm11,16(%rsp)
+ vpshufd $0xEE,%xmm12,%xmm13
+ vmovdqu 0(%rdi),%xmm11
+ vpshufd $0x44,%xmm12,%xmm12
+ vmovdqa %xmm13,-112(%r11)
+ vmovdqa %xmm12,32(%rsp)
+ vpshufd $0xEE,%xmm10,%xmm14
+ vmovdqu 16(%rdi),%xmm12
+ vpshufd $0x44,%xmm10,%xmm10
+ vmovdqa %xmm14,-96(%r11)
+ vmovdqa %xmm10,48(%rsp)
+ vpshufd $0xEE,%xmm11,%xmm13
+ vmovdqu 32(%rdi),%xmm10
+ vpshufd $0x44,%xmm11,%xmm11
+ vmovdqa %xmm13,-80(%r11)
+ vmovdqa %xmm11,64(%rsp)
+ vpshufd $0xEE,%xmm12,%xmm14
+ vmovdqu 48(%rdi),%xmm11
+ vpshufd $0x44,%xmm12,%xmm12
+ vmovdqa %xmm14,-64(%r11)
+ vmovdqa %xmm12,80(%rsp)
+ vpshufd $0xEE,%xmm10,%xmm13
+ vmovdqu 64(%rdi),%xmm12
+ vpshufd $0x44,%xmm10,%xmm10
+ vmovdqa %xmm13,-48(%r11)
+ vmovdqa %xmm10,96(%rsp)
+ vpshufd $0xEE,%xmm11,%xmm14
+ vpshufd $0x44,%xmm11,%xmm11
+ vmovdqa %xmm14,-32(%r11)
+ vmovdqa %xmm11,112(%rsp)
+ vpshufd $0xEE,%xmm12,%xmm13
+ vmovdqa 0(%rsp),%xmm14
+ vpshufd $0x44,%xmm12,%xmm12
+ vmovdqa %xmm13,-16(%r11)
+ vmovdqa %xmm12,128(%rsp)
+
+ jmp .Loop_avx
+
+.align 32
+.Loop_avx:
+
+ vpmuludq %xmm5,%xmm14,%xmm10
+ vpmuludq %xmm6,%xmm14,%xmm11
+ vmovdqa %xmm2,32(%r11)
+ vpmuludq %xmm7,%xmm14,%xmm12
+ vmovdqa 16(%rsp),%xmm2
+ vpmuludq %xmm8,%xmm14,%xmm13
+ vpmuludq %xmm9,%xmm14,%xmm14
+
+ vmovdqa %xmm0,0(%r11)
+ vpmuludq 32(%rsp),%xmm9,%xmm0
+ vmovdqa %xmm1,16(%r11)
+ vpmuludq %xmm8,%xmm2,%xmm1
+ vpaddq %xmm0,%xmm10,%xmm10
+ vpaddq %xmm1,%xmm14,%xmm14
+ vmovdqa %xmm3,48(%r11)
+ vpmuludq %xmm7,%xmm2,%xmm0
+ vpmuludq %xmm6,%xmm2,%xmm1
+ vpaddq %xmm0,%xmm13,%xmm13
+ vmovdqa 48(%rsp),%xmm3
+ vpaddq %xmm1,%xmm12,%xmm12
+ vmovdqa %xmm4,64(%r11)
+ vpmuludq %xmm5,%xmm2,%xmm2
+ vpmuludq %xmm7,%xmm3,%xmm0
+ vpaddq %xmm2,%xmm11,%xmm11
+
+ vmovdqa 64(%rsp),%xmm4
+ vpaddq %xmm0,%xmm14,%xmm14
+ vpmuludq %xmm6,%xmm3,%xmm1
+ vpmuludq %xmm5,%xmm3,%xmm3
+ vpaddq %xmm1,%xmm13,%xmm13
+ vmovdqa 80(%rsp),%xmm2
+ vpaddq %xmm3,%xmm12,%xmm12
+ vpmuludq %xmm9,%xmm4,%xmm0
+ vpmuludq %xmm8,%xmm4,%xmm4
+ vpaddq %xmm0,%xmm11,%xmm11
+ vmovdqa 96(%rsp),%xmm3
+ vpaddq %xmm4,%xmm10,%xmm10
+
+ vmovdqa 128(%rsp),%xmm4
+ vpmuludq %xmm6,%xmm2,%xmm1
+ vpmuludq %xmm5,%xmm2,%xmm2
+ vpaddq %xmm1,%xmm14,%xmm14
+ vpaddq %xmm2,%xmm13,%xmm13
+ vpmuludq %xmm9,%xmm3,%xmm0
+ vpmuludq %xmm8,%xmm3,%xmm1
+ vpaddq %xmm0,%xmm12,%xmm12
+ vmovdqu 0(%rsi),%xmm0
+ vpaddq %xmm1,%xmm11,%xmm11
+ vpmuludq %xmm7,%xmm3,%xmm3
+ vpmuludq %xmm7,%xmm4,%xmm7
+ vpaddq %xmm3,%xmm10,%xmm10
+
+ vmovdqu 16(%rsi),%xmm1
+ vpaddq %xmm7,%xmm11,%xmm11
+ vpmuludq %xmm8,%xmm4,%xmm8
+ vpmuludq %xmm9,%xmm4,%xmm9
+ vpsrldq $6,%xmm0,%xmm2
+ vpaddq %xmm8,%xmm12,%xmm12
+ vpaddq %xmm9,%xmm13,%xmm13
+ vpsrldq $6,%xmm1,%xmm3
+ vpmuludq 112(%rsp),%xmm5,%xmm9
+ vpmuludq %xmm6,%xmm4,%xmm5
+ vpunpckhqdq %xmm1,%xmm0,%xmm4
+ vpaddq %xmm9,%xmm14,%xmm14
+ vmovdqa -144(%r11),%xmm9
+ vpaddq %xmm5,%xmm10,%xmm10
+
+ vpunpcklqdq %xmm1,%xmm0,%xmm0
+ vpunpcklqdq %xmm3,%xmm2,%xmm3
+
+
+ vpsrldq $5,%xmm4,%xmm4
+ vpsrlq $26,%xmm0,%xmm1
+ vpand %xmm15,%xmm0,%xmm0
+ vpsrlq $4,%xmm3,%xmm2
+ vpand %xmm15,%xmm1,%xmm1
+ vpand 0(%rcx),%xmm4,%xmm4
+ vpsrlq $30,%xmm3,%xmm3
+ vpand %xmm15,%xmm2,%xmm2
+ vpand %xmm15,%xmm3,%xmm3
+ vpor 32(%rcx),%xmm4,%xmm4
+
+ vpaddq 0(%r11),%xmm0,%xmm0
+ vpaddq 16(%r11),%xmm1,%xmm1
+ vpaddq 32(%r11),%xmm2,%xmm2
+ vpaddq 48(%r11),%xmm3,%xmm3
+ vpaddq 64(%r11),%xmm4,%xmm4
+
+ leaq 32(%rsi),%rax
+ leaq 64(%rsi),%rsi
+ subq $64,%rdx
+ cmovcq %rax,%rsi
+
+ vpmuludq %xmm0,%xmm9,%xmm5
+ vpmuludq %xmm1,%xmm9,%xmm6
+ vpaddq %xmm5,%xmm10,%xmm10
+ vpaddq %xmm6,%xmm11,%xmm11
+ vmovdqa -128(%r11),%xmm7
+ vpmuludq %xmm2,%xmm9,%xmm5
+ vpmuludq %xmm3,%xmm9,%xmm6
+ vpaddq %xmm5,%xmm12,%xmm12
+ vpaddq %xmm6,%xmm13,%xmm13
+ vpmuludq %xmm4,%xmm9,%xmm9
+ vpmuludq -112(%r11),%xmm4,%xmm5
+ vpaddq %xmm9,%xmm14,%xmm14
+
+ vpaddq %xmm5,%xmm10,%xmm10
+ vpmuludq %xmm2,%xmm7,%xmm6
+ vpmuludq %xmm3,%xmm7,%xmm5
+ vpaddq %xmm6,%xmm13,%xmm13
+ vmovdqa -96(%r11),%xmm8
+ vpaddq %xmm5,%xmm14,%xmm14
+ vpmuludq %xmm1,%xmm7,%xmm6
+ vpmuludq %xmm0,%xmm7,%xmm7
+ vpaddq %xmm6,%xmm12,%xmm12
+ vpaddq %xmm7,%xmm11,%xmm11
+
+ vmovdqa -80(%r11),%xmm9
+ vpmuludq %xmm2,%xmm8,%xmm5
+ vpmuludq %xmm1,%xmm8,%xmm6
+ vpaddq %xmm5,%xmm14,%xmm14
+ vpaddq %xmm6,%xmm13,%xmm13
+ vmovdqa -64(%r11),%xmm7
+ vpmuludq %xmm0,%xmm8,%xmm8
+ vpmuludq %xmm4,%xmm9,%xmm5
+ vpaddq %xmm8,%xmm12,%xmm12
+ vpaddq %xmm5,%xmm11,%xmm11
+ vmovdqa -48(%r11),%xmm8
+ vpmuludq %xmm3,%xmm9,%xmm9
+ vpmuludq %xmm1,%xmm7,%xmm6
+ vpaddq %xmm9,%xmm10,%xmm10
+
+ vmovdqa -16(%r11),%xmm9
+ vpaddq %xmm6,%xmm14,%xmm14
+ vpmuludq %xmm0,%xmm7,%xmm7
+ vpmuludq %xmm4,%xmm8,%xmm5
+ vpaddq %xmm7,%xmm13,%xmm13
+ vpaddq %xmm5,%xmm12,%xmm12
+ vmovdqu 32(%rsi),%xmm5
+ vpmuludq %xmm3,%xmm8,%xmm7
+ vpmuludq %xmm2,%xmm8,%xmm8
+ vpaddq %xmm7,%xmm11,%xmm11
+ vmovdqu 48(%rsi),%xmm6
+ vpaddq %xmm8,%xmm10,%xmm10
+
+ vpmuludq %xmm2,%xmm9,%xmm2
+ vpmuludq %xmm3,%xmm9,%xmm3
+ vpsrldq $6,%xmm5,%xmm7
+ vpaddq %xmm2,%xmm11,%xmm11
+ vpmuludq %xmm4,%xmm9,%xmm4
+ vpsrldq $6,%xmm6,%xmm8
+ vpaddq %xmm3,%xmm12,%xmm2
+ vpaddq %xmm4,%xmm13,%xmm3
+ vpmuludq -32(%r11),%xmm0,%xmm4
+ vpmuludq %xmm1,%xmm9,%xmm0
+ vpunpckhqdq %xmm6,%xmm5,%xmm9
+ vpaddq %xmm4,%xmm14,%xmm4
+ vpaddq %xmm0,%xmm10,%xmm0
+
+ vpunpcklqdq %xmm6,%xmm5,%xmm5
+ vpunpcklqdq %xmm8,%xmm7,%xmm8
+
+
+ vpsrldq $5,%xmm9,%xmm9
+ vpsrlq $26,%xmm5,%xmm6
+ vmovdqa 0(%rsp),%xmm14
+ vpand %xmm15,%xmm5,%xmm5
+ vpsrlq $4,%xmm8,%xmm7
+ vpand %xmm15,%xmm6,%xmm6
+ vpand 0(%rcx),%xmm9,%xmm9
+ vpsrlq $30,%xmm8,%xmm8
+ vpand %xmm15,%xmm7,%xmm7
+ vpand %xmm15,%xmm8,%xmm8
+ vpor 32(%rcx),%xmm9,%xmm9
+
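+	/*
+	 * Lazy reduction: carry each limb's bits above 2^26 into the
+	 * next limb.  The carry out of the top limb re-enters at the
+	 * bottom multiplied by 5 (the vpsllq $2 plus two vpaddq
+	 * instructions), since 2^130 == 5 mod 2^130-5.
+	 */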
+ vpsrlq $26,%xmm3,%xmm13
+ vpand %xmm15,%xmm3,%xmm3
+ vpaddq %xmm13,%xmm4,%xmm4
+
+ vpsrlq $26,%xmm0,%xmm10
+ vpand %xmm15,%xmm0,%xmm0
+ vpaddq %xmm10,%xmm11,%xmm1
+
+ vpsrlq $26,%xmm4,%xmm10
+ vpand %xmm15,%xmm4,%xmm4
+
+ vpsrlq $26,%xmm1,%xmm11
+ vpand %xmm15,%xmm1,%xmm1
+ vpaddq %xmm11,%xmm2,%xmm2
+
+ vpaddq %xmm10,%xmm0,%xmm0
+ vpsllq $2,%xmm10,%xmm10
+ vpaddq %xmm10,%xmm0,%xmm0
+
+ vpsrlq $26,%xmm2,%xmm12
+ vpand %xmm15,%xmm2,%xmm2
+ vpaddq %xmm12,%xmm3,%xmm3
+
+ vpsrlq $26,%xmm0,%xmm10
+ vpand %xmm15,%xmm0,%xmm0
+ vpaddq %xmm10,%xmm1,%xmm1
+
+ vpsrlq $26,%xmm3,%xmm13
+ vpand %xmm15,%xmm3,%xmm3
+ vpaddq %xmm13,%xmm4,%xmm4
+
+ ja .Loop_avx
+
+.Lskip_loop_avx:
+ vpshufd $0x10,%xmm14,%xmm14
+ addq $32,%rdx
+ jnz .Long_tail_avx
+
+ vpaddq %xmm2,%xmm7,%xmm7
+ vpaddq %xmm0,%xmm5,%xmm5
+ vpaddq %xmm1,%xmm6,%xmm6
+ vpaddq %xmm3,%xmm8,%xmm8
+ vpaddq %xmm4,%xmm9,%xmm9
+
+.Long_tail_avx:
+ vmovdqa %xmm2,32(%r11)
+ vmovdqa %xmm0,0(%r11)
+ vmovdqa %xmm1,16(%r11)
+ vmovdqa %xmm3,48(%r11)
+ vmovdqa %xmm4,64(%r11)
+
+ vpmuludq %xmm7,%xmm14,%xmm12
+ vpmuludq %xmm5,%xmm14,%xmm10
+ vpshufd $0x10,-48(%rdi),%xmm2
+ vpmuludq %xmm6,%xmm14,%xmm11
+ vpmuludq %xmm8,%xmm14,%xmm13
+ vpmuludq %xmm9,%xmm14,%xmm14
+
+ vpmuludq %xmm8,%xmm2,%xmm0
+ vpaddq %xmm0,%xmm14,%xmm14
+ vpshufd $0x10,-32(%rdi),%xmm3
+ vpmuludq %xmm7,%xmm2,%xmm1
+ vpaddq %xmm1,%xmm13,%xmm13
+ vpshufd $0x10,-16(%rdi),%xmm4
+ vpmuludq %xmm6,%xmm2,%xmm0
+ vpaddq %xmm0,%xmm12,%xmm12
+ vpmuludq %xmm5,%xmm2,%xmm2
+ vpaddq %xmm2,%xmm11,%xmm11
+ vpmuludq %xmm9,%xmm3,%xmm3
+ vpaddq %xmm3,%xmm10,%xmm10
+
+ vpshufd $0x10,0(%rdi),%xmm2
+ vpmuludq %xmm7,%xmm4,%xmm1
+ vpaddq %xmm1,%xmm14,%xmm14
+ vpmuludq %xmm6,%xmm4,%xmm0
+ vpaddq %xmm0,%xmm13,%xmm13
+ vpshufd $0x10,16(%rdi),%xmm3
+ vpmuludq %xmm5,%xmm4,%xmm4
+ vpaddq %xmm4,%xmm12,%xmm12
+ vpmuludq %xmm9,%xmm2,%xmm1
+ vpaddq %xmm1,%xmm11,%xmm11
+ vpshufd $0x10,32(%rdi),%xmm4
+ vpmuludq %xmm8,%xmm2,%xmm2
+ vpaddq %xmm2,%xmm10,%xmm10
+
+ vpmuludq %xmm6,%xmm3,%xmm0
+ vpaddq %xmm0,%xmm14,%xmm14
+ vpmuludq %xmm5,%xmm3,%xmm3
+ vpaddq %xmm3,%xmm13,%xmm13
+ vpshufd $0x10,48(%rdi),%xmm2
+ vpmuludq %xmm9,%xmm4,%xmm1
+ vpaddq %xmm1,%xmm12,%xmm12
+ vpshufd $0x10,64(%rdi),%xmm3
+ vpmuludq %xmm8,%xmm4,%xmm0
+ vpaddq %xmm0,%xmm11,%xmm11
+ vpmuludq %xmm7,%xmm4,%xmm4
+ vpaddq %xmm4,%xmm10,%xmm10
+
+ vpmuludq %xmm5,%xmm2,%xmm2
+ vpaddq %xmm2,%xmm14,%xmm14
+ vpmuludq %xmm9,%xmm3,%xmm1
+ vpaddq %xmm1,%xmm13,%xmm13
+ vpmuludq %xmm8,%xmm3,%xmm0
+ vpaddq %xmm0,%xmm12,%xmm12
+ vpmuludq %xmm7,%xmm3,%xmm1
+ vpaddq %xmm1,%xmm11,%xmm11
+ vpmuludq %xmm6,%xmm3,%xmm3
+ vpaddq %xmm3,%xmm10,%xmm10
+
+ jz .Lshort_tail_avx
+
+ vmovdqu 0(%rsi),%xmm0
+ vmovdqu 16(%rsi),%xmm1
+
+ vpsrldq $6,%xmm0,%xmm2
+ vpsrldq $6,%xmm1,%xmm3
+ vpunpckhqdq %xmm1,%xmm0,%xmm4
+ vpunpcklqdq %xmm1,%xmm0,%xmm0
+ vpunpcklqdq %xmm3,%xmm2,%xmm3
+
+ vpsrlq $40,%xmm4,%xmm4
+ vpsrlq $26,%xmm0,%xmm1
+ vpand %xmm15,%xmm0,%xmm0
+ vpsrlq $4,%xmm3,%xmm2
+ vpand %xmm15,%xmm1,%xmm1
+ vpsrlq $30,%xmm3,%xmm3
+ vpand %xmm15,%xmm2,%xmm2
+ vpand %xmm15,%xmm3,%xmm3
+ vpor 32(%rcx),%xmm4,%xmm4
+
+ vpshufd $0x32,-64(%rdi),%xmm9
+ vpaddq 0(%r11),%xmm0,%xmm0
+ vpaddq 16(%r11),%xmm1,%xmm1
+ vpaddq 32(%r11),%xmm2,%xmm2
+ vpaddq 48(%r11),%xmm3,%xmm3
+ vpaddq 64(%r11),%xmm4,%xmm4
+
+ vpmuludq %xmm0,%xmm9,%xmm5
+ vpaddq %xmm5,%xmm10,%xmm10
+ vpmuludq %xmm1,%xmm9,%xmm6
+ vpaddq %xmm6,%xmm11,%xmm11
+ vpmuludq %xmm2,%xmm9,%xmm5
+ vpaddq %xmm5,%xmm12,%xmm12
+ vpshufd $0x32,-48(%rdi),%xmm7
+ vpmuludq %xmm3,%xmm9,%xmm6
+ vpaddq %xmm6,%xmm13,%xmm13
+ vpmuludq %xmm4,%xmm9,%xmm9
+ vpaddq %xmm9,%xmm14,%xmm14
+
+ vpmuludq %xmm3,%xmm7,%xmm5
+ vpaddq %xmm5,%xmm14,%xmm14
+ vpshufd $0x32,-32(%rdi),%xmm8
+ vpmuludq %xmm2,%xmm7,%xmm6
+ vpaddq %xmm6,%xmm13,%xmm13
+ vpshufd $0x32,-16(%rdi),%xmm9
+ vpmuludq %xmm1,%xmm7,%xmm5
+ vpaddq %xmm5,%xmm12,%xmm12
+ vpmuludq %xmm0,%xmm7,%xmm7
+ vpaddq %xmm7,%xmm11,%xmm11
+ vpmuludq %xmm4,%xmm8,%xmm8
+ vpaddq %xmm8,%xmm10,%xmm10
+
+ vpshufd $0x32,0(%rdi),%xmm7
+ vpmuludq %xmm2,%xmm9,%xmm6
+ vpaddq %xmm6,%xmm14,%xmm14
+ vpmuludq %xmm1,%xmm9,%xmm5
+ vpaddq %xmm5,%xmm13,%xmm13
+ vpshufd $0x32,16(%rdi),%xmm8
+ vpmuludq %xmm0,%xmm9,%xmm9
+ vpaddq %xmm9,%xmm12,%xmm12
+ vpmuludq %xmm4,%xmm7,%xmm6
+ vpaddq %xmm6,%xmm11,%xmm11
+ vpshufd $0x32,32(%rdi),%xmm9
+ vpmuludq %xmm3,%xmm7,%xmm7
+ vpaddq %xmm7,%xmm10,%xmm10
+
+ vpmuludq %xmm1,%xmm8,%xmm5
+ vpaddq %xmm5,%xmm14,%xmm14
+ vpmuludq %xmm0,%xmm8,%xmm8
+ vpaddq %xmm8,%xmm13,%xmm13
+ vpshufd $0x32,48(%rdi),%xmm7
+ vpmuludq %xmm4,%xmm9,%xmm6
+ vpaddq %xmm6,%xmm12,%xmm12
+ vpshufd $0x32,64(%rdi),%xmm8
+ vpmuludq %xmm3,%xmm9,%xmm5
+ vpaddq %xmm5,%xmm11,%xmm11
+ vpmuludq %xmm2,%xmm9,%xmm9
+ vpaddq %xmm9,%xmm10,%xmm10
+
+ vpmuludq %xmm0,%xmm7,%xmm7
+ vpaddq %xmm7,%xmm14,%xmm14
+ vpmuludq %xmm4,%xmm8,%xmm6
+ vpaddq %xmm6,%xmm13,%xmm13
+ vpmuludq %xmm3,%xmm8,%xmm5
+ vpaddq %xmm5,%xmm12,%xmm12
+ vpmuludq %xmm2,%xmm8,%xmm6
+ vpaddq %xmm6,%xmm11,%xmm11
+ vpmuludq %xmm1,%xmm8,%xmm8
+ vpaddq %xmm8,%xmm10,%xmm10
+
+.Lshort_tail_avx:
+
+ vpsrldq $8,%xmm14,%xmm9
+ vpsrldq $8,%xmm13,%xmm8
+ vpsrldq $8,%xmm11,%xmm6
+ vpsrldq $8,%xmm10,%xmm5
+ vpsrldq $8,%xmm12,%xmm7
+ vpaddq %xmm8,%xmm13,%xmm13
+ vpaddq %xmm9,%xmm14,%xmm14
+ vpaddq %xmm5,%xmm10,%xmm10
+ vpaddq %xmm6,%xmm11,%xmm11
+ vpaddq %xmm7,%xmm12,%xmm12
+
+ vpsrlq $26,%xmm13,%xmm3
+ vpand %xmm15,%xmm13,%xmm13
+ vpaddq %xmm3,%xmm14,%xmm14
+
+ vpsrlq $26,%xmm10,%xmm0
+ vpand %xmm15,%xmm10,%xmm10
+ vpaddq %xmm0,%xmm11,%xmm11
+
+ vpsrlq $26,%xmm14,%xmm4
+ vpand %xmm15,%xmm14,%xmm14
+
+ vpsrlq $26,%xmm11,%xmm1
+ vpand %xmm15,%xmm11,%xmm11
+ vpaddq %xmm1,%xmm12,%xmm12
+
+ vpaddq %xmm4,%xmm10,%xmm10
+ vpsllq $2,%xmm4,%xmm4
+ vpaddq %xmm4,%xmm10,%xmm10
+
+ vpsrlq $26,%xmm12,%xmm2
+ vpand %xmm15,%xmm12,%xmm12
+ vpaddq %xmm2,%xmm13,%xmm13
+
+ vpsrlq $26,%xmm10,%xmm0
+ vpand %xmm15,%xmm10,%xmm10
+ vpaddq %xmm0,%xmm11,%xmm11
+
+ vpsrlq $26,%xmm13,%xmm3
+ vpand %xmm15,%xmm13,%xmm13
+ vpaddq %xmm3,%xmm14,%xmm14
+
+ vmovd %xmm10,-112(%rdi)
+ vmovd %xmm11,-108(%rdi)
+ vmovd %xmm12,-104(%rdi)
+ vmovd %xmm13,-100(%rdi)
+ vmovd %xmm14,-96(%rdi)
+ leaq -8(%r10),%rsp
+
+ vzeroupper
+ ret
+ENDPROC(poly1305_blocks_avx)
+
+.align 32
+ENTRY(poly1305_emit_avx)
+ cmpl $0,20(%rdi)
+ je .Lemit
+
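+	/*
+	 * The state is in base 2^26: recombine the five limbs into
+	 * 130 bits, perform the final conditional subtraction of
+	 * p = 2^130 - 5 (the addq $5 / cmovnzq sequence), then add the
+	 * nonce passed in (%rdx) and store the 16-byte tag at (%rsi).
+	 */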
+ movl 0(%rdi),%eax
+ movl 4(%rdi),%ecx
+ movl 8(%rdi),%r8d
+ movl 12(%rdi),%r11d
+ movl 16(%rdi),%r10d
+
+ shlq $26,%rcx
+ movq %r8,%r9
+ shlq $52,%r8
+ addq %rcx,%rax
+ shrq $12,%r9
+ addq %rax,%r8
+ adcq $0,%r9
+
+ shlq $14,%r11
+ movq %r10,%rax
+ shrq $24,%r10
+ addq %r11,%r9
+ shlq $40,%rax
+ addq %rax,%r9
+ adcq $0,%r10
+
+ movq %r10,%rax
+ movq %r10,%rcx
+ andq $3,%r10
+ shrq $2,%rax
+ andq $-4,%rcx
+ addq %rcx,%rax
+ addq %rax,%r8
+ adcq $0,%r9
+ adcq $0,%r10
+
+ movq %r8,%rax
+ addq $5,%r8
+ movq %r9,%rcx
+ adcq $0,%r9
+ adcq $0,%r10
+ shrq $2,%r10
+ cmovnzq %r8,%rax
+ cmovnzq %r9,%rcx
+
+ addq 0(%rdx),%rax
+ adcq 8(%rdx),%rcx
+ movq %rax,0(%rsi)
+ movq %rcx,8(%rsi)
+
+ ret
+ENDPROC(poly1305_emit_avx)
+#endif /* CONFIG_AS_AVX */
+
+#ifdef CONFIG_AS_AVX2
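+/*
+ * The AVX2 flavor follows the same structure as poly1305_blocks_avx
+ * but processes four blocks per iteration in 256-bit registers, so the
+ * length tests use 64-byte granularity (testq $63).
+ */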
+.align 32
+ENTRY(poly1305_blocks_avx2)
+
+ movl 20(%rdi),%r8d
+ cmpq $128,%rdx
+ jae .Lblocks_avx2
+ testl %r8d,%r8d
+ jz .Lblocks
+
+.Lblocks_avx2:
+ andq $-16,%rdx
+ jz .Lno_data_avx2
+
+ vzeroupper
+
+ testl %r8d,%r8d
+ jz .Lbase2_64_avx2
+
+ testq $63,%rdx
+ jz .Leven_avx2
+
+ pushq %rbx
+ pushq %r12
+ pushq %r13
+ pushq %r14
+ pushq %r15
+ pushq %rdi
+
+.Lblocks_avx2_body:
+
+ movq %rdx,%r15
+
+ movq 0(%rdi),%r8
+ movq 8(%rdi),%r9
+ movl 16(%rdi),%r10d
+
+ movq 24(%rdi),%r11
+ movq 32(%rdi),%r13
+
+
+ movl %r8d,%r14d
+ andq $-2147483648,%r8
+ movq %r9,%r12
+ movl %r9d,%ebx
+ andq $-2147483648,%r9
+
+ shrq $6,%r8
+ shlq $52,%r12
+ addq %r8,%r14
+ shrq $12,%rbx
+ shrq $18,%r9
+ addq %r12,%r14
+ adcq %r9,%rbx
+
+ movq %r10,%r8
+ shlq $40,%r8
+ shrq $24,%r10
+ addq %r8,%rbx
+ adcq $0,%r10
+
+ movq $-4,%r9
+ movq %r10,%r8
+ andq %r10,%r9
+ shrq $2,%r8
+ andq $3,%r10
+ addq %r9,%r8
+ addq %r8,%r14
+ adcq $0,%rbx
+ adcq $0,%r10
+
+ movq %r13,%r12
+ movq %r13,%rax
+ shrq $2,%r13
+ addq %r12,%r13
+
+.Lbase2_26_pre_avx2:
+ addq 0(%rsi),%r14
+ adcq 8(%rsi),%rbx
+ leaq 16(%rsi),%rsi
+ adcq %rcx,%r10
+ subq $16,%r15
+
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+ movq %r12,%rax
+
+ testq $63,%r15
+ jnz .Lbase2_26_pre_avx2
+
+ testq %rcx,%rcx
+ jz .Lstore_base2_64_avx2
+
+
+ movq %r14,%rax
+ movq %r14,%rdx
+ shrq $52,%r14
+ movq %rbx,%r11
+ movq %rbx,%r12
+ shrq $26,%rdx
+ andq $0x3ffffff,%rax
+ shlq $12,%r11
+ andq $0x3ffffff,%rdx
+ shrq $14,%rbx
+ orq %r11,%r14
+ shlq $24,%r10
+ andq $0x3ffffff,%r14
+ shrq $40,%r12
+ andq $0x3ffffff,%rbx
+ orq %r12,%r10
+
+ testq %r15,%r15
+ jz .Lstore_base2_26_avx2
+
+ vmovd %eax,%xmm0
+ vmovd %edx,%xmm1
+ vmovd %r14d,%xmm2
+ vmovd %ebx,%xmm3
+ vmovd %r10d,%xmm4
+ jmp .Lproceed_avx2
+
+.align 32
+.Lstore_base2_64_avx2:
+ movq %r14,0(%rdi)
+ movq %rbx,8(%rdi)
+ movq %r10,16(%rdi)
+ jmp .Ldone_avx2
+
+.align 16
+.Lstore_base2_26_avx2:
+ movl %eax,0(%rdi)
+ movl %edx,4(%rdi)
+ movl %r14d,8(%rdi)
+ movl %ebx,12(%rdi)
+ movl %r10d,16(%rdi)
+.align 16
+.Ldone_avx2:
+ movq 8(%rsp),%r15
+ movq 16(%rsp),%r14
+ movq 24(%rsp),%r13
+ movq 32(%rsp),%r12
+ movq 40(%rsp),%rbx
+ leaq 48(%rsp),%rsp
+
+.Lno_data_avx2:
+.Lblocks_avx2_epilogue:
+ ret
+
+
+.align 32
+.Lbase2_64_avx2:
+
+
+ pushq %rbx
+ pushq %r12
+ pushq %r13
+ pushq %r14
+ pushq %r15
+ pushq %rdi
+
+.Lbase2_64_avx2_body:
+
+ movq %rdx,%r15
+
+ movq 24(%rdi),%r11
+ movq 32(%rdi),%r13
+
+ movq 0(%rdi),%r14
+ movq 8(%rdi),%rbx
+ movl 16(%rdi),%r10d
+
+ movq %r13,%r12
+ movq %r13,%rax
+ shrq $2,%r13
+ addq %r12,%r13
+
+ testq $63,%rdx
+ jz .Linit_avx2
+
+.Lbase2_64_pre_avx2:
+ addq 0(%rsi),%r14
+ adcq 8(%rsi),%rbx
+ leaq 16(%rsi),%rsi
+ adcq %rcx,%r10
+ subq $16,%r15
+
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+ movq %r12,%rax
+
+ testq $63,%r15
+ jnz .Lbase2_64_pre_avx2
+
+.Linit_avx2:
+
+ movq %r14,%rax
+ movq %r14,%rdx
+ shrq $52,%r14
+ movq %rbx,%r8
+ movq %rbx,%r9
+ shrq $26,%rdx
+ andq $0x3ffffff,%rax
+ shlq $12,%r8
+ andq $0x3ffffff,%rdx
+ shrq $14,%rbx
+ orq %r8,%r14
+ shlq $24,%r10
+ andq $0x3ffffff,%r14
+ shrq $40,%r9
+ andq $0x3ffffff,%rbx
+ orq %r9,%r10
+
+ vmovd %eax,%xmm0
+ vmovd %edx,%xmm1
+ vmovd %r14d,%xmm2
+ vmovd %ebx,%xmm3
+ vmovd %r10d,%xmm4
+ movl $1,20(%rdi)
+
+ __poly1305_init_avx
+
+.Lproceed_avx2:
+ movq %r15,%rdx
+
+ movq 8(%rsp),%r15
+ movq 16(%rsp),%r14
+ movq 24(%rsp),%r13
+ movq 32(%rsp),%r12
+ movq 40(%rsp),%rbx
+ leaq 48(%rsp),%rax
+ leaq 48(%rsp),%rsp
+
+.Lbase2_64_avx2_epilogue:
+ jmp .Ldo_avx2
+
+
+.align 32
+.Leven_avx2:
+
+ vmovd 0(%rdi),%xmm0
+ vmovd 4(%rdi),%xmm1
+ vmovd 8(%rdi),%xmm2
+ vmovd 12(%rdi),%xmm3
+ vmovd 16(%rdi),%xmm4
+
+.Ldo_avx2:
+ leaq 8(%rsp),%r10
+ subq $0x128,%rsp
+ leaq .Lconst(%rip),%rcx
+ leaq 48+64(%rdi),%rdi
+ vmovdqa 96(%rcx),%ymm7
+
+
+ vmovdqu -64(%rdi),%xmm9
+ andq $-512,%rsp
+ vmovdqu -48(%rdi),%xmm10
+ vmovdqu -32(%rdi),%xmm6
+ vmovdqu -16(%rdi),%xmm11
+ vmovdqu 0(%rdi),%xmm12
+ vmovdqu 16(%rdi),%xmm13
+ leaq 144(%rsp),%rax
+ vmovdqu 32(%rdi),%xmm14
+ vpermd %ymm9,%ymm7,%ymm9
+ vmovdqu 48(%rdi),%xmm15
+ vpermd %ymm10,%ymm7,%ymm10
+ vmovdqu 64(%rdi),%xmm5
+ vpermd %ymm6,%ymm7,%ymm6
+ vmovdqa %ymm9,0(%rsp)
+ vpermd %ymm11,%ymm7,%ymm11
+ vmovdqa %ymm10,32-144(%rax)
+ vpermd %ymm12,%ymm7,%ymm12
+ vmovdqa %ymm6,64-144(%rax)
+ vpermd %ymm13,%ymm7,%ymm13
+ vmovdqa %ymm11,96-144(%rax)
+ vpermd %ymm14,%ymm7,%ymm14
+ vmovdqa %ymm12,128-144(%rax)
+ vpermd %ymm15,%ymm7,%ymm15
+ vmovdqa %ymm13,160-144(%rax)
+ vpermd %ymm5,%ymm7,%ymm5
+ vmovdqa %ymm14,192-144(%rax)
+ vmovdqa %ymm15,224-144(%rax)
+ vmovdqa %ymm5,256-144(%rax)
+ vmovdqa 64(%rcx),%ymm5
+
+
+
+ vmovdqu 0(%rsi),%xmm7
+ vmovdqu 16(%rsi),%xmm8
+ vinserti128 $1,32(%rsi),%ymm7,%ymm7
+ vinserti128 $1,48(%rsi),%ymm8,%ymm8
+ leaq 64(%rsi),%rsi
+
+ vpsrldq $6,%ymm7,%ymm9
+ vpsrldq $6,%ymm8,%ymm10
+ vpunpckhqdq %ymm8,%ymm7,%ymm6
+ vpunpcklqdq %ymm10,%ymm9,%ymm9
+ vpunpcklqdq %ymm8,%ymm7,%ymm7
+
+ vpsrlq $30,%ymm9,%ymm10
+ vpsrlq $4,%ymm9,%ymm9
+ vpsrlq $26,%ymm7,%ymm8
+ vpsrlq $40,%ymm6,%ymm6
+ vpand %ymm5,%ymm9,%ymm9
+ vpand %ymm5,%ymm7,%ymm7
+ vpand %ymm5,%ymm8,%ymm8
+ vpand %ymm5,%ymm10,%ymm10
+ vpor 32(%rcx),%ymm6,%ymm6
+
+ vpaddq %ymm2,%ymm9,%ymm2
+ subq $64,%rdx
+ jz .Ltail_avx2
+ jmp .Loop_avx2
+
+.align 32
+.Loop_avx2:
+
+ vpaddq %ymm0,%ymm7,%ymm0
+ vmovdqa 0(%rsp),%ymm7
+ vpaddq %ymm1,%ymm8,%ymm1
+ vmovdqa 32(%rsp),%ymm8
+ vpaddq %ymm3,%ymm10,%ymm3
+ vmovdqa 96(%rsp),%ymm9
+ vpaddq %ymm4,%ymm6,%ymm4
+ vmovdqa 48(%rax),%ymm10
+ vmovdqa 112(%rax),%ymm5
+
+ vpmuludq %ymm2,%ymm7,%ymm13
+ vpmuludq %ymm2,%ymm8,%ymm14
+ vpmuludq %ymm2,%ymm9,%ymm15
+ vpmuludq %ymm2,%ymm10,%ymm11
+ vpmuludq %ymm2,%ymm5,%ymm12
+
+ vpmuludq %ymm0,%ymm8,%ymm6
+ vpmuludq %ymm1,%ymm8,%ymm2
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq 64(%rsp),%ymm4,%ymm2
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm11,%ymm11
+ vmovdqa -16(%rax),%ymm8
+
+ vpmuludq %ymm0,%ymm7,%ymm6
+ vpmuludq %ymm1,%ymm7,%ymm2
+ vpaddq %ymm6,%ymm11,%ymm11
+ vpaddq %ymm2,%ymm12,%ymm12
+ vpmuludq %ymm3,%ymm7,%ymm6
+ vpmuludq %ymm4,%ymm7,%ymm2
+ vmovdqu 0(%rsi),%xmm7
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm2,%ymm15,%ymm15
+ vinserti128 $1,32(%rsi),%ymm7,%ymm7
+
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq %ymm4,%ymm8,%ymm2
+ vmovdqu 16(%rsi),%xmm8
+ vpaddq %ymm6,%ymm11,%ymm11
+ vpaddq %ymm2,%ymm12,%ymm12
+ vmovdqa 16(%rax),%ymm2
+ vpmuludq %ymm1,%ymm9,%ymm6
+ vpmuludq %ymm0,%ymm9,%ymm9
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm9,%ymm13,%ymm13
+ vinserti128 $1,48(%rsi),%ymm8,%ymm8
+ leaq 64(%rsi),%rsi
+
+ vpmuludq %ymm1,%ymm2,%ymm6
+ vpmuludq %ymm0,%ymm2,%ymm2
+ vpsrldq $6,%ymm7,%ymm9
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm14,%ymm14
+ vpmuludq %ymm3,%ymm10,%ymm6
+ vpmuludq %ymm4,%ymm10,%ymm2
+ vpsrldq $6,%ymm8,%ymm10
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+ vpunpckhqdq %ymm8,%ymm7,%ymm6
+
+ vpmuludq %ymm3,%ymm5,%ymm3
+ vpmuludq %ymm4,%ymm5,%ymm4
+ vpunpcklqdq %ymm8,%ymm7,%ymm7
+ vpaddq %ymm3,%ymm13,%ymm2
+ vpaddq %ymm4,%ymm14,%ymm3
+ vpunpcklqdq %ymm10,%ymm9,%ymm10
+ vpmuludq 80(%rax),%ymm0,%ymm4
+ vpmuludq %ymm1,%ymm5,%ymm0
+ vmovdqa 64(%rcx),%ymm5
+ vpaddq %ymm4,%ymm15,%ymm4
+ vpaddq %ymm0,%ymm11,%ymm0
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm12,%ymm1
+
+ vpsrlq $26,%ymm4,%ymm15
+ vpand %ymm5,%ymm4,%ymm4
+
+ vpsrlq $4,%ymm10,%ymm9
+
+ vpsrlq $26,%ymm1,%ymm12
+ vpand %ymm5,%ymm1,%ymm1
+ vpaddq %ymm12,%ymm2,%ymm2
+
+ vpaddq %ymm15,%ymm0,%ymm0
+ vpsllq $2,%ymm15,%ymm15
+ vpaddq %ymm15,%ymm0,%ymm0
+
+ vpand %ymm5,%ymm9,%ymm9
+ vpsrlq $26,%ymm7,%ymm8
+
+ vpsrlq $26,%ymm2,%ymm13
+ vpand %ymm5,%ymm2,%ymm2
+ vpaddq %ymm13,%ymm3,%ymm3
+
+ vpaddq %ymm9,%ymm2,%ymm2
+ vpsrlq $30,%ymm10,%ymm10
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm1,%ymm1
+
+ vpsrlq $40,%ymm6,%ymm6
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vpand %ymm5,%ymm7,%ymm7
+ vpand %ymm5,%ymm8,%ymm8
+ vpand %ymm5,%ymm10,%ymm10
+ vpor 32(%rcx),%ymm6,%ymm6
+
+ subq $64,%rdx
+ jnz .Loop_avx2
+
+.byte 0x66,0x90
+.Ltail_avx2:
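+	/*
+	 * Final group of up to four blocks.  The power-of-r table was
+	 * permuted at setup so that the aligned loads in the main loop
+	 * yield r^4 in every lane, while the 4-byte-shifted loads used
+	 * here (4(%rsp), 36(%rsp), ...) yield r^4, r^3, r^2, r^1,
+	 * scaling each lane by the power that merges the four parallel
+	 * accumulators into one.
+	 */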
+
+ vpaddq %ymm0,%ymm7,%ymm0
+ vmovdqu 4(%rsp),%ymm7
+ vpaddq %ymm1,%ymm8,%ymm1
+ vmovdqu 36(%rsp),%ymm8
+ vpaddq %ymm3,%ymm10,%ymm3
+ vmovdqu 100(%rsp),%ymm9
+ vpaddq %ymm4,%ymm6,%ymm4
+ vmovdqu 52(%rax),%ymm10
+ vmovdqu 116(%rax),%ymm5
+
+ vpmuludq %ymm2,%ymm7,%ymm13
+ vpmuludq %ymm2,%ymm8,%ymm14
+ vpmuludq %ymm2,%ymm9,%ymm15
+ vpmuludq %ymm2,%ymm10,%ymm11
+ vpmuludq %ymm2,%ymm5,%ymm12
+
+ vpmuludq %ymm0,%ymm8,%ymm6
+ vpmuludq %ymm1,%ymm8,%ymm2
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq 68(%rsp),%ymm4,%ymm2
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm11,%ymm11
+
+ vpmuludq %ymm0,%ymm7,%ymm6
+ vpmuludq %ymm1,%ymm7,%ymm2
+ vpaddq %ymm6,%ymm11,%ymm11
+ vmovdqu -12(%rax),%ymm8
+ vpaddq %ymm2,%ymm12,%ymm12
+ vpmuludq %ymm3,%ymm7,%ymm6
+ vpmuludq %ymm4,%ymm7,%ymm2
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm2,%ymm15,%ymm15
+
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq %ymm4,%ymm8,%ymm2
+ vpaddq %ymm6,%ymm11,%ymm11
+ vpaddq %ymm2,%ymm12,%ymm12
+ vmovdqu 20(%rax),%ymm2
+ vpmuludq %ymm1,%ymm9,%ymm6
+ vpmuludq %ymm0,%ymm9,%ymm9
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm9,%ymm13,%ymm13
+
+ vpmuludq %ymm1,%ymm2,%ymm6
+ vpmuludq %ymm0,%ymm2,%ymm2
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm14,%ymm14
+ vpmuludq %ymm3,%ymm10,%ymm6
+ vpmuludq %ymm4,%ymm10,%ymm2
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+
+ vpmuludq %ymm3,%ymm5,%ymm3
+ vpmuludq %ymm4,%ymm5,%ymm4
+ vpaddq %ymm3,%ymm13,%ymm2
+ vpaddq %ymm4,%ymm14,%ymm3
+ vpmuludq 84(%rax),%ymm0,%ymm4
+ vpmuludq %ymm1,%ymm5,%ymm0
+ vmovdqa 64(%rcx),%ymm5
+ vpaddq %ymm4,%ymm15,%ymm4
+ vpaddq %ymm0,%ymm11,%ymm0
+
+ vpsrldq $8,%ymm12,%ymm8
+ vpsrldq $8,%ymm2,%ymm9
+ vpsrldq $8,%ymm3,%ymm10
+ vpsrldq $8,%ymm4,%ymm6
+ vpsrldq $8,%ymm0,%ymm7
+ vpaddq %ymm8,%ymm12,%ymm12
+ vpaddq %ymm9,%ymm2,%ymm2
+ vpaddq %ymm10,%ymm3,%ymm3
+ vpaddq %ymm6,%ymm4,%ymm4
+ vpaddq %ymm7,%ymm0,%ymm0
+
+ vpermq $0x2,%ymm3,%ymm10
+ vpermq $0x2,%ymm4,%ymm6
+ vpermq $0x2,%ymm0,%ymm7
+ vpermq $0x2,%ymm12,%ymm8
+ vpermq $0x2,%ymm2,%ymm9
+ vpaddq %ymm10,%ymm3,%ymm3
+ vpaddq %ymm6,%ymm4,%ymm4
+ vpaddq %ymm7,%ymm0,%ymm0
+ vpaddq %ymm8,%ymm12,%ymm12
+ vpaddq %ymm9,%ymm2,%ymm2
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm12,%ymm1
+
+ vpsrlq $26,%ymm4,%ymm15
+ vpand %ymm5,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm1,%ymm12
+ vpand %ymm5,%ymm1,%ymm1
+ vpaddq %ymm12,%ymm2,%ymm2
+
+ vpaddq %ymm15,%ymm0,%ymm0
+ vpsllq $2,%ymm15,%ymm15
+ vpaddq %ymm15,%ymm0,%ymm0
+
+ vpsrlq $26,%ymm2,%ymm13
+ vpand %ymm5,%ymm2,%ymm2
+ vpaddq %ymm13,%ymm3,%ymm3
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm1,%ymm1
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vmovd %xmm0,-112(%rdi)
+ vmovd %xmm1,-108(%rdi)
+ vmovd %xmm2,-104(%rdi)
+ vmovd %xmm3,-100(%rdi)
+ vmovd %xmm4,-96(%rdi)
+ leaq -8(%r10),%rsp
+
+ vzeroupper
+ ret
+
+ENDPROC(poly1305_blocks_avx2)
+#endif /* CONFIG_AS_AVX2 */
+
+#ifdef CONFIG_AS_AVX512
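+/*
+ * The AVX-512 entry point reuses the AVX2-width code (the _512-suffixed
+ * labels) for bookkeeping and shorter inputs; only once at least 512
+ * bytes remain does .Ldo_avx2_512 branch to .Lblocks_avx512, which runs
+ * eight blocks per iteration in zmm registers, using mask registers for
+ * the partial loads and stores of the power table.
+ */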
+.align 32
+ENTRY(poly1305_blocks_avx512)
+
+ movl 20(%rdi),%r8d
+ cmpq $128,%rdx
+ jae .Lblocks_avx2_512
+ testl %r8d,%r8d
+ jz .Lblocks
+
+.Lblocks_avx2_512:
+ andq $-16,%rdx
+ jz .Lno_data_avx2_512
+
+ vzeroupper
+
+ testl %r8d,%r8d
+ jz .Lbase2_64_avx2_512
+
+ testq $63,%rdx
+ jz .Leven_avx2_512
+
+ pushq %rbx
+ pushq %r12
+ pushq %r13
+ pushq %r14
+ pushq %r15
+ pushq %rdi
+
+.Lblocks_avx2_body_512:
+
+ movq %rdx,%r15
+
+ movq 0(%rdi),%r8
+ movq 8(%rdi),%r9
+ movl 16(%rdi),%r10d
+
+ movq 24(%rdi),%r11
+ movq 32(%rdi),%r13
+
+
+ movl %r8d,%r14d
+ andq $-2147483648,%r8
+ movq %r9,%r12
+ movl %r9d,%ebx
+ andq $-2147483648,%r9
+
+ shrq $6,%r8
+ shlq $52,%r12
+ addq %r8,%r14
+ shrq $12,%rbx
+ shrq $18,%r9
+ addq %r12,%r14
+ adcq %r9,%rbx
+
+ movq %r10,%r8
+ shlq $40,%r8
+ shrq $24,%r10
+ addq %r8,%rbx
+ adcq $0,%r10
+
+ movq $-4,%r9
+ movq %r10,%r8
+ andq %r10,%r9
+ shrq $2,%r8
+ andq $3,%r10
+ addq %r9,%r8
+ addq %r8,%r14
+ adcq $0,%rbx
+ adcq $0,%r10
+
+ movq %r13,%r12
+ movq %r13,%rax
+ shrq $2,%r13
+ addq %r12,%r13
+
+.Lbase2_26_pre_avx2_512:
+ addq 0(%rsi),%r14
+ adcq 8(%rsi),%rbx
+ leaq 16(%rsi),%rsi
+ adcq %rcx,%r10
+ subq $16,%r15
+
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+ movq %r12,%rax
+
+ testq $63,%r15
+ jnz .Lbase2_26_pre_avx2_512
+
+ testq %rcx,%rcx
+ jz .Lstore_base2_64_avx2_512
+
+
+ movq %r14,%rax
+ movq %r14,%rdx
+ shrq $52,%r14
+ movq %rbx,%r11
+ movq %rbx,%r12
+ shrq $26,%rdx
+ andq $0x3ffffff,%rax
+ shlq $12,%r11
+ andq $0x3ffffff,%rdx
+ shrq $14,%rbx
+ orq %r11,%r14
+ shlq $24,%r10
+ andq $0x3ffffff,%r14
+ shrq $40,%r12
+ andq $0x3ffffff,%rbx
+ orq %r12,%r10
+
+ testq %r15,%r15
+ jz .Lstore_base2_26_avx2_512
+
+ vmovd %eax,%xmm0
+ vmovd %edx,%xmm1
+ vmovd %r14d,%xmm2
+ vmovd %ebx,%xmm3
+ vmovd %r10d,%xmm4
+ jmp .Lproceed_avx2_512
+
+.align 32
+.Lstore_base2_64_avx2_512:
+ movq %r14,0(%rdi)
+ movq %rbx,8(%rdi)
+ movq %r10,16(%rdi)
+ jmp .Ldone_avx2_512
+
+.align 16
+.Lstore_base2_26_avx2_512:
+ movl %eax,0(%rdi)
+ movl %edx,4(%rdi)
+ movl %r14d,8(%rdi)
+ movl %ebx,12(%rdi)
+ movl %r10d,16(%rdi)
+.align 16
+.Ldone_avx2_512:
+ movq 8(%rsp),%r15
+ movq 16(%rsp),%r14
+ movq 24(%rsp),%r13
+ movq 32(%rsp),%r12
+ movq 40(%rsp),%rbx
+ leaq 48(%rsp),%rsp
+
+.Lno_data_avx2_512:
+.Lblocks_avx2_epilogue_512:
+ ret
+
+
+.align 32
+.Lbase2_64_avx2_512:
+
+ pushq %rbx
+ pushq %r12
+ pushq %r13
+ pushq %r14
+ pushq %r15
+ pushq %rdi
+
+.Lbase2_64_avx2_body_512:
+
+ movq %rdx,%r15
+
+ movq 24(%rdi),%r11
+ movq 32(%rdi),%r13
+
+ movq 0(%rdi),%r14
+ movq 8(%rdi),%rbx
+ movl 16(%rdi),%r10d
+
+ movq %r13,%r12
+ movq %r13,%rax
+ shrq $2,%r13
+ addq %r12,%r13
+
+ testq $63,%rdx
+ jz .Linit_avx2_512
+
+.Lbase2_64_pre_avx2_512:
+ addq 0(%rsi),%r14
+ adcq 8(%rsi),%rbx
+ leaq 16(%rsi),%rsi
+ adcq %rcx,%r10
+ subq $16,%r15
+
+ movq %rdi,0(%rsp)
+ __poly1305_block
+ movq 0(%rsp),%rdi
+ movq %r12,%rax
+
+ testq $63,%r15
+ jnz .Lbase2_64_pre_avx2_512
+
+.Linit_avx2_512:
+
+ movq %r14,%rax
+ movq %r14,%rdx
+ shrq $52,%r14
+ movq %rbx,%r8
+ movq %rbx,%r9
+ shrq $26,%rdx
+ andq $0x3ffffff,%rax
+ shlq $12,%r8
+ andq $0x3ffffff,%rdx
+ shrq $14,%rbx
+ orq %r8,%r14
+ shlq $24,%r10
+ andq $0x3ffffff,%r14
+ shrq $40,%r9
+ andq $0x3ffffff,%rbx
+ orq %r9,%r10
+
+ vmovd %eax,%xmm0
+ vmovd %edx,%xmm1
+ vmovd %r14d,%xmm2
+ vmovd %ebx,%xmm3
+ vmovd %r10d,%xmm4
+ movl $1,20(%rdi)
+
+ __poly1305_init_avx
+
+.Lproceed_avx2_512:
+ movq %r15,%rdx
+
+ movq 8(%rsp),%r15
+ movq 16(%rsp),%r14
+ movq 24(%rsp),%r13
+ movq 32(%rsp),%r12
+ movq 40(%rsp),%rbx
+ leaq 48(%rsp),%rax
+ leaq 48(%rsp),%rsp
+
+.Lbase2_64_avx2_epilogue_512:
+ jmp .Ldo_avx2_512
+
+
+.align 32
+.Leven_avx2_512:
+
+ vmovd 0(%rdi),%xmm0
+ vmovd 4(%rdi),%xmm1
+ vmovd 8(%rdi),%xmm2
+ vmovd 12(%rdi),%xmm3
+ vmovd 16(%rdi),%xmm4
+
+.Ldo_avx2_512:
+ cmpq $512,%rdx
+ jae .Lblocks_avx512
+.Lskip_avx512:
+ leaq 8(%rsp),%r10
+
+ subq $0x128,%rsp
+ leaq .Lconst(%rip),%rcx
+ leaq 48+64(%rdi),%rdi
+ vmovdqa 96(%rcx),%ymm7
+
+
+ vmovdqu -64(%rdi),%xmm9
+ andq $-512,%rsp
+ vmovdqu -48(%rdi),%xmm10
+ vmovdqu -32(%rdi),%xmm6
+ vmovdqu -16(%rdi),%xmm11
+ vmovdqu 0(%rdi),%xmm12
+ vmovdqu 16(%rdi),%xmm13
+ leaq 144(%rsp),%rax
+ vmovdqu 32(%rdi),%xmm14
+ vpermd %ymm9,%ymm7,%ymm9
+ vmovdqu 48(%rdi),%xmm15
+ vpermd %ymm10,%ymm7,%ymm10
+ vmovdqu 64(%rdi),%xmm5
+ vpermd %ymm6,%ymm7,%ymm6
+ vmovdqa %ymm9,0(%rsp)
+ vpermd %ymm11,%ymm7,%ymm11
+ vmovdqa %ymm10,32-144(%rax)
+ vpermd %ymm12,%ymm7,%ymm12
+ vmovdqa %ymm6,64-144(%rax)
+ vpermd %ymm13,%ymm7,%ymm13
+ vmovdqa %ymm11,96-144(%rax)
+ vpermd %ymm14,%ymm7,%ymm14
+ vmovdqa %ymm12,128-144(%rax)
+ vpermd %ymm15,%ymm7,%ymm15
+ vmovdqa %ymm13,160-144(%rax)
+ vpermd %ymm5,%ymm7,%ymm5
+ vmovdqa %ymm14,192-144(%rax)
+ vmovdqa %ymm15,224-144(%rax)
+ vmovdqa %ymm5,256-144(%rax)
+ vmovdqa 64(%rcx),%ymm5
+
+
+
+ vmovdqu 0(%rsi),%xmm7
+ vmovdqu 16(%rsi),%xmm8
+ vinserti128 $1,32(%rsi),%ymm7,%ymm7
+ vinserti128 $1,48(%rsi),%ymm8,%ymm8
+ leaq 64(%rsi),%rsi
+
+ vpsrldq $6,%ymm7,%ymm9
+ vpsrldq $6,%ymm8,%ymm10
+ vpunpckhqdq %ymm8,%ymm7,%ymm6
+ vpunpcklqdq %ymm10,%ymm9,%ymm9
+ vpunpcklqdq %ymm8,%ymm7,%ymm7
+
+ vpsrlq $30,%ymm9,%ymm10
+ vpsrlq $4,%ymm9,%ymm9
+ vpsrlq $26,%ymm7,%ymm8
+ vpsrlq $40,%ymm6,%ymm6
+ vpand %ymm5,%ymm9,%ymm9
+ vpand %ymm5,%ymm7,%ymm7
+ vpand %ymm5,%ymm8,%ymm8
+ vpand %ymm5,%ymm10,%ymm10
+ vpor 32(%rcx),%ymm6,%ymm6
+
+ vpaddq %ymm2,%ymm9,%ymm2
+ subq $64,%rdx
+ jz .Ltail_avx2_512
+ jmp .Loop_avx2_512
+
+.align 32
+.Loop_avx2_512:
+
+ vpaddq %ymm0,%ymm7,%ymm0
+ vmovdqa 0(%rsp),%ymm7
+ vpaddq %ymm1,%ymm8,%ymm1
+ vmovdqa 32(%rsp),%ymm8
+ vpaddq %ymm3,%ymm10,%ymm3
+ vmovdqa 96(%rsp),%ymm9
+ vpaddq %ymm4,%ymm6,%ymm4
+ vmovdqa 48(%rax),%ymm10
+ vmovdqa 112(%rax),%ymm5
+
+ vpmuludq %ymm2,%ymm7,%ymm13
+ vpmuludq %ymm2,%ymm8,%ymm14
+ vpmuludq %ymm2,%ymm9,%ymm15
+ vpmuludq %ymm2,%ymm10,%ymm11
+ vpmuludq %ymm2,%ymm5,%ymm12
+
+ vpmuludq %ymm0,%ymm8,%ymm6
+ vpmuludq %ymm1,%ymm8,%ymm2
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq 64(%rsp),%ymm4,%ymm2
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm11,%ymm11
+ vmovdqa -16(%rax),%ymm8
+
+ vpmuludq %ymm0,%ymm7,%ymm6
+ vpmuludq %ymm1,%ymm7,%ymm2
+ vpaddq %ymm6,%ymm11,%ymm11
+ vpaddq %ymm2,%ymm12,%ymm12
+ vpmuludq %ymm3,%ymm7,%ymm6
+ vpmuludq %ymm4,%ymm7,%ymm2
+ vmovdqu 0(%rsi),%xmm7
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm2,%ymm15,%ymm15
+ vinserti128 $1,32(%rsi),%ymm7,%ymm7
+
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq %ymm4,%ymm8,%ymm2
+ vmovdqu 16(%rsi),%xmm8
+ vpaddq %ymm6,%ymm11,%ymm11
+ vpaddq %ymm2,%ymm12,%ymm12
+ vmovdqa 16(%rax),%ymm2
+ vpmuludq %ymm1,%ymm9,%ymm6
+ vpmuludq %ymm0,%ymm9,%ymm9
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm9,%ymm13,%ymm13
+ vinserti128 $1,48(%rsi),%ymm8,%ymm8
+ leaq 64(%rsi),%rsi
+
+ vpmuludq %ymm1,%ymm2,%ymm6
+ vpmuludq %ymm0,%ymm2,%ymm2
+ vpsrldq $6,%ymm7,%ymm9
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm14,%ymm14
+ vpmuludq %ymm3,%ymm10,%ymm6
+ vpmuludq %ymm4,%ymm10,%ymm2
+ vpsrldq $6,%ymm8,%ymm10
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+ vpunpckhqdq %ymm8,%ymm7,%ymm6
+
+ vpmuludq %ymm3,%ymm5,%ymm3
+ vpmuludq %ymm4,%ymm5,%ymm4
+ vpunpcklqdq %ymm8,%ymm7,%ymm7
+ vpaddq %ymm3,%ymm13,%ymm2
+ vpaddq %ymm4,%ymm14,%ymm3
+ vpunpcklqdq %ymm10,%ymm9,%ymm10
+ vpmuludq 80(%rax),%ymm0,%ymm4
+ vpmuludq %ymm1,%ymm5,%ymm0
+ vmovdqa 64(%rcx),%ymm5
+ vpaddq %ymm4,%ymm15,%ymm4
+ vpaddq %ymm0,%ymm11,%ymm0
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm12,%ymm1
+
+ vpsrlq $26,%ymm4,%ymm15
+ vpand %ymm5,%ymm4,%ymm4
+
+ vpsrlq $4,%ymm10,%ymm9
+
+ vpsrlq $26,%ymm1,%ymm12
+ vpand %ymm5,%ymm1,%ymm1
+ vpaddq %ymm12,%ymm2,%ymm2
+
+ vpaddq %ymm15,%ymm0,%ymm0
+ vpsllq $2,%ymm15,%ymm15
+ vpaddq %ymm15,%ymm0,%ymm0
+
+ vpand %ymm5,%ymm9,%ymm9
+ vpsrlq $26,%ymm7,%ymm8
+
+ vpsrlq $26,%ymm2,%ymm13
+ vpand %ymm5,%ymm2,%ymm2
+ vpaddq %ymm13,%ymm3,%ymm3
+
+ vpaddq %ymm9,%ymm2,%ymm2
+ vpsrlq $30,%ymm10,%ymm10
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm1,%ymm1
+
+ vpsrlq $40,%ymm6,%ymm6
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vpand %ymm5,%ymm7,%ymm7
+ vpand %ymm5,%ymm8,%ymm8
+ vpand %ymm5,%ymm10,%ymm10
+ vpor 32(%rcx),%ymm6,%ymm6
+
+ subq $64,%rdx
+ jnz .Loop_avx2_512
+
+.byte 0x66,0x90
+.Ltail_avx2_512:
+
+ vpaddq %ymm0,%ymm7,%ymm0
+ vmovdqu 4(%rsp),%ymm7
+ vpaddq %ymm1,%ymm8,%ymm1
+ vmovdqu 36(%rsp),%ymm8
+ vpaddq %ymm3,%ymm10,%ymm3
+ vmovdqu 100(%rsp),%ymm9
+ vpaddq %ymm4,%ymm6,%ymm4
+ vmovdqu 52(%rax),%ymm10
+ vmovdqu 116(%rax),%ymm5
+
+ vpmuludq %ymm2,%ymm7,%ymm13
+ vpmuludq %ymm2,%ymm8,%ymm14
+ vpmuludq %ymm2,%ymm9,%ymm15
+ vpmuludq %ymm2,%ymm10,%ymm11
+ vpmuludq %ymm2,%ymm5,%ymm12
+
+ vpmuludq %ymm0,%ymm8,%ymm6
+ vpmuludq %ymm1,%ymm8,%ymm2
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq 68(%rsp),%ymm4,%ymm2
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm11,%ymm11
+
+ vpmuludq %ymm0,%ymm7,%ymm6
+ vpmuludq %ymm1,%ymm7,%ymm2
+ vpaddq %ymm6,%ymm11,%ymm11
+ vmovdqu -12(%rax),%ymm8
+ vpaddq %ymm2,%ymm12,%ymm12
+ vpmuludq %ymm3,%ymm7,%ymm6
+ vpmuludq %ymm4,%ymm7,%ymm2
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm2,%ymm15,%ymm15
+
+ vpmuludq %ymm3,%ymm8,%ymm6
+ vpmuludq %ymm4,%ymm8,%ymm2
+ vpaddq %ymm6,%ymm11,%ymm11
+ vpaddq %ymm2,%ymm12,%ymm12
+ vmovdqu 20(%rax),%ymm2
+ vpmuludq %ymm1,%ymm9,%ymm6
+ vpmuludq %ymm0,%ymm9,%ymm9
+ vpaddq %ymm6,%ymm14,%ymm14
+ vpaddq %ymm9,%ymm13,%ymm13
+
+ vpmuludq %ymm1,%ymm2,%ymm6
+ vpmuludq %ymm0,%ymm2,%ymm2
+ vpaddq %ymm6,%ymm15,%ymm15
+ vpaddq %ymm2,%ymm14,%ymm14
+ vpmuludq %ymm3,%ymm10,%ymm6
+ vpmuludq %ymm4,%ymm10,%ymm2
+ vpaddq %ymm6,%ymm12,%ymm12
+ vpaddq %ymm2,%ymm13,%ymm13
+
+ vpmuludq %ymm3,%ymm5,%ymm3
+ vpmuludq %ymm4,%ymm5,%ymm4
+ vpaddq %ymm3,%ymm13,%ymm2
+ vpaddq %ymm4,%ymm14,%ymm3
+ vpmuludq 84(%rax),%ymm0,%ymm4
+ vpmuludq %ymm1,%ymm5,%ymm0
+ vmovdqa 64(%rcx),%ymm5
+ vpaddq %ymm4,%ymm15,%ymm4
+ vpaddq %ymm0,%ymm11,%ymm0
+
+ vpsrldq $8,%ymm12,%ymm8
+ vpsrldq $8,%ymm2,%ymm9
+ vpsrldq $8,%ymm3,%ymm10
+ vpsrldq $8,%ymm4,%ymm6
+ vpsrldq $8,%ymm0,%ymm7
+ vpaddq %ymm8,%ymm12,%ymm12
+ vpaddq %ymm9,%ymm2,%ymm2
+ vpaddq %ymm10,%ymm3,%ymm3
+ vpaddq %ymm6,%ymm4,%ymm4
+ vpaddq %ymm7,%ymm0,%ymm0
+
+ vpermq $0x2,%ymm3,%ymm10
+ vpermq $0x2,%ymm4,%ymm6
+ vpermq $0x2,%ymm0,%ymm7
+ vpermq $0x2,%ymm12,%ymm8
+ vpermq $0x2,%ymm2,%ymm9
+ vpaddq %ymm10,%ymm3,%ymm3
+ vpaddq %ymm6,%ymm4,%ymm4
+ vpaddq %ymm7,%ymm0,%ymm0
+ vpaddq %ymm8,%ymm12,%ymm12
+ vpaddq %ymm9,%ymm2,%ymm2
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm12,%ymm1
+
+ vpsrlq $26,%ymm4,%ymm15
+ vpand %ymm5,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm1,%ymm12
+ vpand %ymm5,%ymm1,%ymm1
+ vpaddq %ymm12,%ymm2,%ymm2
+
+ vpaddq %ymm15,%ymm0,%ymm0
+ vpsllq $2,%ymm15,%ymm15
+ vpaddq %ymm15,%ymm0,%ymm0
+
+ vpsrlq $26,%ymm2,%ymm13
+ vpand %ymm5,%ymm2,%ymm2
+ vpaddq %ymm13,%ymm3,%ymm3
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm11,%ymm1,%ymm1
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vmovd %xmm0,-112(%rdi)
+ vmovd %xmm1,-108(%rdi)
+ vmovd %xmm2,-104(%rdi)
+ vmovd %xmm3,-100(%rdi)
+ vmovd %xmm4,-96(%rdi)
+ leaq -8(%r10),%rsp
+
+ vzeroupper
+ ret
+
+.Lblocks_avx512:
+
+ movl $15,%eax
+ kmovw %eax,%k2
+ leaq 8(%rsp),%r10
+
+ subq $0x128,%rsp
+ leaq .Lconst(%rip),%rcx
+ leaq 48+64(%rdi),%rdi
+ vmovdqa 96(%rcx),%ymm9
+
+ vmovdqu32 -64(%rdi),%zmm16{%k2}{z}
+ andq $-512,%rsp
+ vmovdqu32 -48(%rdi),%zmm17{%k2}{z}
+ movq $0x20,%rax
+ vmovdqu32 -32(%rdi),%zmm21{%k2}{z}
+ vmovdqu32 -16(%rdi),%zmm18{%k2}{z}
+ vmovdqu32 0(%rdi),%zmm22{%k2}{z}
+ vmovdqu32 16(%rdi),%zmm19{%k2}{z}
+ vmovdqu32 32(%rdi),%zmm23{%k2}{z}
+ vmovdqu32 48(%rdi),%zmm20{%k2}{z}
+ vmovdqu32 64(%rdi),%zmm24{%k2}{z}
+ vpermd %zmm16,%zmm9,%zmm16
+ vpbroadcastq 64(%rcx),%zmm5
+ vpermd %zmm17,%zmm9,%zmm17
+ vpermd %zmm21,%zmm9,%zmm21
+ vpermd %zmm18,%zmm9,%zmm18
+ vmovdqa64 %zmm16,0(%rsp){%k2}
+ vpsrlq $32,%zmm16,%zmm7
+ vpermd %zmm22,%zmm9,%zmm22
+ vmovdqu64 %zmm17,0(%rsp,%rax,1){%k2}
+ vpsrlq $32,%zmm17,%zmm8
+ vpermd %zmm19,%zmm9,%zmm19
+ vmovdqa64 %zmm21,64(%rsp){%k2}
+ vpermd %zmm23,%zmm9,%zmm23
+ vpermd %zmm20,%zmm9,%zmm20
+ vmovdqu64 %zmm18,64(%rsp,%rax,1){%k2}
+ vpermd %zmm24,%zmm9,%zmm24
+ vmovdqa64 %zmm22,128(%rsp){%k2}
+ vmovdqu64 %zmm19,128(%rsp,%rax,1){%k2}
+ vmovdqa64 %zmm23,192(%rsp){%k2}
+ vmovdqu64 %zmm20,192(%rsp,%rax,1){%k2}
+ vmovdqa64 %zmm24,256(%rsp){%k2}
+
+ vpmuludq %zmm7,%zmm16,%zmm11
+ vpmuludq %zmm7,%zmm17,%zmm12
+ vpmuludq %zmm7,%zmm18,%zmm13
+ vpmuludq %zmm7,%zmm19,%zmm14
+ vpmuludq %zmm7,%zmm20,%zmm15
+ vpsrlq $32,%zmm18,%zmm9
+
+ vpmuludq %zmm8,%zmm24,%zmm25
+ vpmuludq %zmm8,%zmm16,%zmm26
+ vpmuludq %zmm8,%zmm17,%zmm27
+ vpmuludq %zmm8,%zmm18,%zmm28
+ vpmuludq %zmm8,%zmm19,%zmm29
+ vpsrlq $32,%zmm19,%zmm10
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm27,%zmm13,%zmm13
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+
+ vpmuludq %zmm9,%zmm23,%zmm25
+ vpmuludq %zmm9,%zmm24,%zmm26
+ vpmuludq %zmm9,%zmm17,%zmm28
+ vpmuludq %zmm9,%zmm18,%zmm29
+ vpmuludq %zmm9,%zmm16,%zmm27
+ vpsrlq $32,%zmm20,%zmm6
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vpmuludq %zmm10,%zmm22,%zmm25
+ vpmuludq %zmm10,%zmm16,%zmm28
+ vpmuludq %zmm10,%zmm17,%zmm29
+ vpmuludq %zmm10,%zmm23,%zmm26
+ vpmuludq %zmm10,%zmm24,%zmm27
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vpmuludq %zmm6,%zmm24,%zmm28
+ vpmuludq %zmm6,%zmm16,%zmm29
+ vpmuludq %zmm6,%zmm21,%zmm25
+ vpmuludq %zmm6,%zmm22,%zmm26
+ vpmuludq %zmm6,%zmm23,%zmm27
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vmovdqu64 0(%rsi),%zmm10
+ vmovdqu64 64(%rsi),%zmm6
+ leaq 128(%rsi),%rsi
+
+ vpsrlq $26,%zmm14,%zmm28
+ vpandq %zmm5,%zmm14,%zmm14
+ vpaddq %zmm28,%zmm15,%zmm15
+
+ vpsrlq $26,%zmm11,%zmm25
+ vpandq %zmm5,%zmm11,%zmm11
+ vpaddq %zmm25,%zmm12,%zmm12
+
+ vpsrlq $26,%zmm15,%zmm29
+ vpandq %zmm5,%zmm15,%zmm15
+
+ vpsrlq $26,%zmm12,%zmm26
+ vpandq %zmm5,%zmm12,%zmm12
+ vpaddq %zmm26,%zmm13,%zmm13
+
+ vpaddq %zmm29,%zmm11,%zmm11
+ vpsllq $2,%zmm29,%zmm29
+ vpaddq %zmm29,%zmm11,%zmm11
+
+ vpsrlq $26,%zmm13,%zmm27
+ vpandq %zmm5,%zmm13,%zmm13
+ vpaddq %zmm27,%zmm14,%zmm14
+
+ vpsrlq $26,%zmm11,%zmm25
+ vpandq %zmm5,%zmm11,%zmm11
+ vpaddq %zmm25,%zmm12,%zmm12
+
+ vpsrlq $26,%zmm14,%zmm28
+ vpandq %zmm5,%zmm14,%zmm14
+ vpaddq %zmm28,%zmm15,%zmm15
+
+ vpunpcklqdq %zmm6,%zmm10,%zmm7
+ vpunpckhqdq %zmm6,%zmm10,%zmm6
+
+ vmovdqa32 128(%rcx),%zmm25
+ movl $0x7777,%eax
+ kmovw %eax,%k1
+
+ vpermd %zmm16,%zmm25,%zmm16
+ vpermd %zmm17,%zmm25,%zmm17
+ vpermd %zmm18,%zmm25,%zmm18
+ vpermd %zmm19,%zmm25,%zmm19
+ vpermd %zmm20,%zmm25,%zmm20
+
+ vpermd %zmm11,%zmm25,%zmm16{%k1}
+ vpermd %zmm12,%zmm25,%zmm17{%k1}
+ vpermd %zmm13,%zmm25,%zmm18{%k1}
+ vpermd %zmm14,%zmm25,%zmm19{%k1}
+ vpermd %zmm15,%zmm25,%zmm20{%k1}
+
+ vpslld $2,%zmm17,%zmm21
+ vpslld $2,%zmm18,%zmm22
+ vpslld $2,%zmm19,%zmm23
+ vpslld $2,%zmm20,%zmm24
+ vpaddd %zmm17,%zmm21,%zmm21
+ vpaddd %zmm18,%zmm22,%zmm22
+ vpaddd %zmm19,%zmm23,%zmm23
+ vpaddd %zmm20,%zmm24,%zmm24
+
+ vpbroadcastq 32(%rcx),%zmm30
+
+ vpsrlq $52,%zmm7,%zmm9
+ vpsllq $12,%zmm6,%zmm10
+ vporq %zmm10,%zmm9,%zmm9
+ vpsrlq $26,%zmm7,%zmm8
+ vpsrlq $14,%zmm6,%zmm10
+ vpsrlq $40,%zmm6,%zmm6
+ vpandq %zmm5,%zmm9,%zmm9
+ vpandq %zmm5,%zmm7,%zmm7
+
+ vpaddq %zmm2,%zmm9,%zmm2
+ subq $192,%rdx
+ jbe .Ltail_avx512
+ jmp .Loop_avx512
+
+.align 32
+.Loop_avx512:
+
+ vpmuludq %zmm2,%zmm17,%zmm14
+ vpaddq %zmm0,%zmm7,%zmm0
+ vpmuludq %zmm2,%zmm18,%zmm15
+ vpandq %zmm5,%zmm8,%zmm8
+ vpmuludq %zmm2,%zmm23,%zmm11
+ vpandq %zmm5,%zmm10,%zmm10
+ vpmuludq %zmm2,%zmm24,%zmm12
+ vporq %zmm30,%zmm6,%zmm6
+ vpmuludq %zmm2,%zmm16,%zmm13
+ vpaddq %zmm1,%zmm8,%zmm1
+ vpaddq %zmm3,%zmm10,%zmm3
+ vpaddq %zmm4,%zmm6,%zmm4
+
+ vmovdqu64 0(%rsi),%zmm10
+ vmovdqu64 64(%rsi),%zmm6
+ leaq 128(%rsi),%rsi
+ vpmuludq %zmm0,%zmm19,%zmm28
+ vpmuludq %zmm0,%zmm20,%zmm29
+ vpmuludq %zmm0,%zmm16,%zmm25
+ vpmuludq %zmm0,%zmm17,%zmm26
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm26,%zmm12,%zmm12
+
+ vpmuludq %zmm1,%zmm18,%zmm28
+ vpmuludq %zmm1,%zmm19,%zmm29
+ vpmuludq %zmm1,%zmm24,%zmm25
+ vpmuludq %zmm0,%zmm18,%zmm27
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vpunpcklqdq %zmm6,%zmm10,%zmm7
+ vpunpckhqdq %zmm6,%zmm10,%zmm6
+
+ vpmuludq %zmm3,%zmm16,%zmm28
+ vpmuludq %zmm3,%zmm17,%zmm29
+ vpmuludq %zmm1,%zmm16,%zmm26
+ vpmuludq %zmm1,%zmm17,%zmm27
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vpmuludq %zmm4,%zmm24,%zmm28
+ vpmuludq %zmm4,%zmm16,%zmm29
+ vpmuludq %zmm3,%zmm22,%zmm25
+ vpmuludq %zmm3,%zmm23,%zmm26
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpmuludq %zmm3,%zmm24,%zmm27
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vpmuludq %zmm4,%zmm21,%zmm25
+ vpmuludq %zmm4,%zmm22,%zmm26
+ vpmuludq %zmm4,%zmm23,%zmm27
+ vpaddq %zmm25,%zmm11,%zmm0
+ vpaddq %zmm26,%zmm12,%zmm1
+ vpaddq %zmm27,%zmm13,%zmm2
+
+ vpsrlq $52,%zmm7,%zmm9
+ vpsllq $12,%zmm6,%zmm10
+
+ vpsrlq $26,%zmm14,%zmm3
+ vpandq %zmm5,%zmm14,%zmm14
+ vpaddq %zmm3,%zmm15,%zmm4
+
+ vporq %zmm10,%zmm9,%zmm9
+
+ vpsrlq $26,%zmm0,%zmm11
+ vpandq %zmm5,%zmm0,%zmm0
+ vpaddq %zmm11,%zmm1,%zmm1
+
+ vpandq %zmm5,%zmm9,%zmm9
+
+ vpsrlq $26,%zmm4,%zmm15
+ vpandq %zmm5,%zmm4,%zmm4
+
+ vpsrlq $26,%zmm1,%zmm12
+ vpandq %zmm5,%zmm1,%zmm1
+ vpaddq %zmm12,%zmm2,%zmm2
+
+ vpaddq %zmm15,%zmm0,%zmm0
+ vpsllq $2,%zmm15,%zmm15
+ vpaddq %zmm15,%zmm0,%zmm0
+
+ vpaddq %zmm9,%zmm2,%zmm2
+ vpsrlq $26,%zmm7,%zmm8
+
+ vpsrlq $26,%zmm2,%zmm13
+ vpandq %zmm5,%zmm2,%zmm2
+ vpaddq %zmm13,%zmm14,%zmm3
+
+ vpsrlq $14,%zmm6,%zmm10
+
+ vpsrlq $26,%zmm0,%zmm11
+ vpandq %zmm5,%zmm0,%zmm0
+ vpaddq %zmm11,%zmm1,%zmm1
+
+ vpsrlq $40,%zmm6,%zmm6
+
+ vpsrlq $26,%zmm3,%zmm14
+ vpandq %zmm5,%zmm3,%zmm3
+ vpaddq %zmm14,%zmm4,%zmm4
+
+ vpandq %zmm5,%zmm7,%zmm7
+
+ subq $128,%rdx
+ ja .Loop_avx512
+
+.Ltail_avx512:
+
+ vpsrlq $32,%zmm16,%zmm16
+ vpsrlq $32,%zmm17,%zmm17
+ vpsrlq $32,%zmm18,%zmm18
+ vpsrlq $32,%zmm23,%zmm23
+ vpsrlq $32,%zmm24,%zmm24
+ vpsrlq $32,%zmm19,%zmm19
+ vpsrlq $32,%zmm20,%zmm20
+ vpsrlq $32,%zmm21,%zmm21
+ vpsrlq $32,%zmm22,%zmm22
+
+ leaq (%rsi,%rdx,1),%rsi
+
+ vpaddq %zmm0,%zmm7,%zmm0
+
+ vpmuludq %zmm2,%zmm17,%zmm14
+ vpmuludq %zmm2,%zmm18,%zmm15
+ vpmuludq %zmm2,%zmm23,%zmm11
+ vpandq %zmm5,%zmm8,%zmm8
+ vpmuludq %zmm2,%zmm24,%zmm12
+ vpandq %zmm5,%zmm10,%zmm10
+ vpmuludq %zmm2,%zmm16,%zmm13
+ vporq %zmm30,%zmm6,%zmm6
+ vpaddq %zmm1,%zmm8,%zmm1
+ vpaddq %zmm3,%zmm10,%zmm3
+ vpaddq %zmm4,%zmm6,%zmm4
+
+ vmovdqu 0(%rsi),%xmm7
+ vpmuludq %zmm0,%zmm19,%zmm28
+ vpmuludq %zmm0,%zmm20,%zmm29
+ vpmuludq %zmm0,%zmm16,%zmm25
+ vpmuludq %zmm0,%zmm17,%zmm26
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm26,%zmm12,%zmm12
+
+ vmovdqu 16(%rsi),%xmm8
+ vpmuludq %zmm1,%zmm18,%zmm28
+ vpmuludq %zmm1,%zmm19,%zmm29
+ vpmuludq %zmm1,%zmm24,%zmm25
+ vpmuludq %zmm0,%zmm18,%zmm27
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vinserti128 $1,32(%rsi),%ymm7,%ymm7
+ vpmuludq %zmm3,%zmm16,%zmm28
+ vpmuludq %zmm3,%zmm17,%zmm29
+ vpmuludq %zmm1,%zmm16,%zmm26
+ vpmuludq %zmm1,%zmm17,%zmm27
+ vpaddq %zmm28,%zmm14,%zmm14
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vinserti128 $1,48(%rsi),%ymm8,%ymm8
+ vpmuludq %zmm4,%zmm24,%zmm28
+ vpmuludq %zmm4,%zmm16,%zmm29
+ vpmuludq %zmm3,%zmm22,%zmm25
+ vpmuludq %zmm3,%zmm23,%zmm26
+ vpmuludq %zmm3,%zmm24,%zmm27
+ vpaddq %zmm28,%zmm14,%zmm3
+ vpaddq %zmm29,%zmm15,%zmm15
+ vpaddq %zmm25,%zmm11,%zmm11
+ vpaddq %zmm26,%zmm12,%zmm12
+ vpaddq %zmm27,%zmm13,%zmm13
+
+ vpmuludq %zmm4,%zmm21,%zmm25
+ vpmuludq %zmm4,%zmm22,%zmm26
+ vpmuludq %zmm4,%zmm23,%zmm27
+ vpaddq %zmm25,%zmm11,%zmm0
+ vpaddq %zmm26,%zmm12,%zmm1
+ vpaddq %zmm27,%zmm13,%zmm2
+
+ movl $1,%eax
+ vpermq $0xb1,%zmm3,%zmm14
+ vpermq $0xb1,%zmm15,%zmm4
+ vpermq $0xb1,%zmm0,%zmm11
+ vpermq $0xb1,%zmm1,%zmm12
+ vpermq $0xb1,%zmm2,%zmm13
+ vpaddq %zmm14,%zmm3,%zmm3
+ vpaddq %zmm15,%zmm4,%zmm4
+ vpaddq %zmm11,%zmm0,%zmm0
+ vpaddq %zmm12,%zmm1,%zmm1
+ vpaddq %zmm13,%zmm2,%zmm2
+
+ kmovw %eax,%k3
+ vpermq $0x2,%zmm3,%zmm14
+ vpermq $0x2,%zmm4,%zmm15
+ vpermq $0x2,%zmm0,%zmm11
+ vpermq $0x2,%zmm1,%zmm12
+ vpermq $0x2,%zmm2,%zmm13
+ vpaddq %zmm14,%zmm3,%zmm3
+ vpaddq %zmm15,%zmm4,%zmm4
+ vpaddq %zmm11,%zmm0,%zmm0
+ vpaddq %zmm12,%zmm1,%zmm1
+ vpaddq %zmm13,%zmm2,%zmm2
+
+ vextracti64x4 $0x1,%zmm3,%ymm14
+ vextracti64x4 $0x1,%zmm4,%ymm15
+ vextracti64x4 $0x1,%zmm0,%ymm11
+ vextracti64x4 $0x1,%zmm1,%ymm12
+ vextracti64x4 $0x1,%zmm2,%ymm13
+ vpaddq %zmm14,%zmm3,%zmm3{%k3}{z}
+ vpaddq %zmm15,%zmm4,%zmm4{%k3}{z}
+ vpaddq %zmm11,%zmm0,%zmm0{%k3}{z}
+ vpaddq %zmm12,%zmm1,%zmm1{%k3}{z}
+ vpaddq %zmm13,%zmm2,%zmm2{%k3}{z}
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpsrldq $6,%ymm7,%ymm9
+ vpsrldq $6,%ymm8,%ymm10
+ vpunpckhqdq %ymm8,%ymm7,%ymm6
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpunpcklqdq %ymm10,%ymm9,%ymm9
+ vpunpcklqdq %ymm8,%ymm7,%ymm7
+ vpaddq %ymm11,%ymm1,%ymm1
+
+ vpsrlq $26,%ymm4,%ymm15
+ vpand %ymm5,%ymm4,%ymm4
+
+ vpsrlq $26,%ymm1,%ymm12
+ vpand %ymm5,%ymm1,%ymm1
+ vpsrlq $30,%ymm9,%ymm10
+ vpsrlq $4,%ymm9,%ymm9
+ vpaddq %ymm12,%ymm2,%ymm2
+
+ vpaddq %ymm15,%ymm0,%ymm0
+ vpsllq $2,%ymm15,%ymm15
+ vpsrlq $26,%ymm7,%ymm8
+ vpsrlq $40,%ymm6,%ymm6
+ vpaddq %ymm15,%ymm0,%ymm0
+
+ vpsrlq $26,%ymm2,%ymm13
+ vpand %ymm5,%ymm2,%ymm2
+ vpand %ymm5,%ymm9,%ymm9
+ vpand %ymm5,%ymm7,%ymm7
+ vpaddq %ymm13,%ymm3,%ymm3
+
+ vpsrlq $26,%ymm0,%ymm11
+ vpand %ymm5,%ymm0,%ymm0
+ vpaddq %ymm2,%ymm9,%ymm2
+ vpand %ymm5,%ymm8,%ymm8
+ vpaddq %ymm11,%ymm1,%ymm1
+
+ vpsrlq $26,%ymm3,%ymm14
+ vpand %ymm5,%ymm3,%ymm3
+ vpand %ymm5,%ymm10,%ymm10
+ vpor 32(%rcx),%ymm6,%ymm6
+ vpaddq %ymm14,%ymm4,%ymm4
+
+ leaq 144(%rsp),%rax
+ addq $64,%rdx
+ jnz .Ltail_avx2_512
+
+ vpsubq %ymm9,%ymm2,%ymm2
+ vmovd %xmm0,-112(%rdi)
+ vmovd %xmm1,-108(%rdi)
+ vmovd %xmm2,-104(%rdi)
+ vmovd %xmm3,-100(%rdi)
+ vmovd %xmm4,-96(%rdi)
+ vzeroall
+ leaq -8(%r10),%rsp
+
+ ret
+
+ENDPROC(poly1305_blocks_avx512)
+#endif /* CONFIG_AS_AVX512 */
diff --git a/lib/zinc/poly1305/poly1305.c b/lib/zinc/poly1305/poly1305.c
new file mode 100644
index 000000000000..f9ffd8bbb6b0
--- /dev/null
+++ b/lib/zinc/poly1305/poly1305.c
@@ -0,0 +1,377 @@
+/* SPDX-License-Identifier: OpenSSL OR (BSD-3-Clause OR GPL-2.0)
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright 2016 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#include <zinc/poly1305.h>
+
+#include <linux/kernel.h>
+#include <linux/string.h>
+
+#if defined(CONFIG_X86_64)
+#include <asm/fpu/api.h>
+#include <asm/cpufeature.h>
+#include <asm/processor.h>
+#include <asm/intel-family.h>
+asmlinkage void poly1305_init_x86_64(void *ctx, const u8 key[POLY1305_KEY_SIZE]);
+asmlinkage void poly1305_blocks_x86_64(void *ctx, const u8 *inp, const size_t len, const u32 padbit);
+asmlinkage void poly1305_emit_x86_64(void *ctx, u8 mac[POLY1305_MAC_SIZE], const u32 nonce[4]);
+#ifdef CONFIG_AS_AVX
+asmlinkage void poly1305_emit_avx(void *ctx, u8 mac[POLY1305_MAC_SIZE], const u32 nonce[4]);
+asmlinkage void poly1305_blocks_avx(void *ctx, const u8 *inp, const size_t len, const u32 padbit);
+#endif
+#ifdef CONFIG_AS_AVX2
+asmlinkage void poly1305_blocks_avx2(void *ctx, const u8 *inp, const size_t len, const u32 padbit);
+#endif
+#ifdef CONFIG_AS_AVX512
+asmlinkage void poly1305_blocks_avx512(void *ctx, const u8 *inp, const size_t len, const u32 padbit);
+#endif
+
+static bool poly1305_use_avx __ro_after_init;
+static bool poly1305_use_avx2 __ro_after_init;
+static bool poly1305_use_avx512 __ro_after_init;
+
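+/*
+ * Runtime feature detection: each flag requires both the CPUID feature
+ * bits and the corresponding XSAVE state to be enabled by the kernel.
+ * AVX-512 is additionally kept off on Skylake-X, presumably to avoid
+ * the heavy frequency throttling its 512-bit units incur there.
+ */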
+void __init poly1305_fpu_init(void)
+{
+#ifndef CONFIG_UML
+ poly1305_use_avx = boot_cpu_has(X86_FEATURE_AVX) &&
+ cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM, NULL);
+ poly1305_use_avx2 = boot_cpu_has(X86_FEATURE_AVX) && boot_cpu_has(X86_FEATURE_AVX2) &&
+ cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM, NULL);
+ poly1305_use_avx512 = boot_cpu_has(X86_FEATURE_AVX) && boot_cpu_has(X86_FEATURE_AVX2) && boot_cpu_has(X86_FEATURE_AVX512F) &&
+ cpu_has_xfeatures(XFEATURE_MASK_SSE | XFEATURE_MASK_YMM | XFEATURE_MASK_AVX512, NULL) &&
+ boot_cpu_data.x86_model != INTEL_FAM6_SKYLAKE_X;
+#endif
+}
+#elif defined(CONFIG_ARM) || defined(CONFIG_ARM64)
+asmlinkage void poly1305_init_arm(void *ctx, const u8 key[16]);
+asmlinkage void poly1305_blocks_arm(void *ctx, const u8 *inp, const size_t len, const u32 padbit);
+asmlinkage void poly1305_emit_arm(void *ctx, u8 mac[16], const u32 nonce[4]);
+#if IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && (!defined(__LINUX_ARM_ARCH__) || __LINUX_ARM_ARCH__ >= 7)
+#define ARM_USE_NEON
+#include <asm/hwcap.h>
+#include <asm/neon.h>
+asmlinkage void poly1305_blocks_neon(void *ctx, const u8 *inp, const size_t len, const u32 padbit);
+asmlinkage void poly1305_emit_neon(void *ctx, u8 mac[16], const u32 nonce[4]);
+#endif
+static bool poly1305_use_neon __ro_after_init;
+void __init poly1305_fpu_init(void)
+{
+#if defined(CONFIG_ARM64)
+ poly1305_use_neon = elf_hwcap & HWCAP_ASIMD;
+#elif defined(CONFIG_ARM)
+ poly1305_use_neon = elf_hwcap & HWCAP_NEON;
+#endif
+}
+#elif defined(CONFIG_MIPS) && (defined(CONFIG_64BIT) || defined(CONFIG_CPU_MIPS32_R2))
+asmlinkage void poly1305_init_mips(void *ctx, const u8 key[16]);
+asmlinkage void poly1305_blocks_mips(void *ctx, const u8 *inp, const size_t len, const u32 padbit);
+asmlinkage void poly1305_emit_mips(void *ctx, u8 mac[16], const u32 nonce[4]);
+void __init poly1305_fpu_init(void) { }
+#else
+void __init poly1305_fpu_init(void) { }
+#endif
+
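+/*
+ * Generic portable fallback, compiled only when no assembly
+ * implementation exists for the architecture.  It mirrors OpenSSL's
+ * base 2^32 code: the 130-bit accumulator lives in five 32-bit words
+ * (four full words plus the small h4 overflow word) and limb products
+ * are accumulated in 64-bit arithmetic.
+ */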
+#if !(defined(CONFIG_X86_64) || defined(CONFIG_ARM) || defined(CONFIG_ARM64) || (defined(CONFIG_MIPS) && (defined(CONFIG_64BIT) || defined(CONFIG_CPU_MIPS32_R2))))
+struct poly1305_internal {
+ u32 h[5];
+ u32 r[4];
+};
+
+static void poly1305_init_generic(void *ctx, const u8 key[16])
+{
+ struct poly1305_internal *st = (struct poly1305_internal *)ctx;
+
+ /* h = 0 */
+ st->h[0] = 0;
+ st->h[1] = 0;
+ st->h[2] = 0;
+ st->h[3] = 0;
+ st->h[4] = 0;
+
+ /* r &= 0xffffffc0ffffffc0ffffffc0fffffff */
+ st->r[0] = le32_to_cpup((__le32 *)&key[ 0]) & 0x0fffffff;
+ st->r[1] = le32_to_cpup((__le32 *)&key[ 4]) & 0x0ffffffc;
+ st->r[2] = le32_to_cpup((__le32 *)&key[ 8]) & 0x0ffffffc;
+ st->r[3] = le32_to_cpup((__le32 *)&key[12]) & 0x0ffffffc;
+}
+
+static void poly1305_blocks_generic(void *ctx, const u8 *inp, size_t len, const u32 padbit)
+{
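+/*
+ * Evaluates to the carry out of the addition that produced a (with
+ * addend b), extracted from the sign bit without a data-dependent
+ * branch, so the carry chain below runs in constant time.
+ */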
+#define CONSTANT_TIME_CARRY(a,b) ((a ^ ((a ^ b) | ((a - b) ^ b))) >> (sizeof(a) * 8 - 1))
+ struct poly1305_internal *st = (struct poly1305_internal *)ctx;
+ u32 r0, r1, r2, r3;
+ u32 s1, s2, s3;
+ u32 h0, h1, h2, h3, h4, c;
+ u64 d0, d1, d2, d3;
+
+ r0 = st->r[0];
+ r1 = st->r[1];
+ r2 = st->r[2];
+ r3 = st->r[3];
+
+ s1 = r1 + (r1 >> 2);
+ s2 = r2 + (r2 >> 2);
+ s3 = r3 + (r3 >> 2);
+
+ h0 = st->h[0];
+ h1 = st->h[1];
+ h2 = st->h[2];
+ h3 = st->h[3];
+ h4 = st->h[4];
+
+ while (len >= POLY1305_BLOCK_SIZE) {
+ /* h += m[i] */
+ h0 = (u32)(d0 = (u64)h0 + le32_to_cpup((__le32 *)(inp + 0)));
+ h1 = (u32)(d1 = (u64)h1 + (d0 >> 32) + le32_to_cpup((__le32 *)(inp + 4)));
+ h2 = (u32)(d2 = (u64)h2 + (d1 >> 32) + le32_to_cpup((__le32 *)(inp + 8)));
+ h3 = (u32)(d3 = (u64)h3 + (d2 >> 32) + le32_to_cpup((__le32 *)(inp + 12)));
+ h4 += (u32)(d3 >> 32) + padbit;
+
+ /* h *= r "%" p, where "%" stands for "partial remainder" */
+ d0 = ((u64)h0 * r0) +
+ ((u64)h1 * s3) +
+ ((u64)h2 * s2) +
+ ((u64)h3 * s1);
+ d1 = ((u64)h0 * r1) +
+ ((u64)h1 * r0) +
+ ((u64)h2 * s3) +
+ ((u64)h3 * s2) +
+ (h4 * s1);
+ d2 = ((u64)h0 * r2) +
+ ((u64)h1 * r1) +
+ ((u64)h2 * r0) +
+ ((u64)h3 * s3) +
+ (h4 * s2);
+ d3 = ((u64)h0 * r3) +
+ ((u64)h1 * r2) +
+ ((u64)h2 * r1) +
+ ((u64)h3 * r0) +
+ (h4 * s3);
+ h4 = (h4 * r0);
+
+ /* last reduction step: */
+ /* a) h4:h0 = h4<<128 + d3<<96 + d2<<64 + d1<<32 + d0 */
+ h0 = (u32)d0;
+ h1 = (u32)(d1 += d0 >> 32);
+ h2 = (u32)(d2 += d1 >> 32);
+ h3 = (u32)(d3 += d2 >> 32);
+ h4 += (u32)(d3 >> 32);
+ /* b) (h4:h0 += (h4:h0>>130) * 5) %= 2^130 */
+ c = (h4 >> 2) + (h4 & ~3U);
+ h4 &= 3;
+ h0 += c;
+ h1 += (c = CONSTANT_TIME_CARRY(h0, c));
+ h2 += (c = CONSTANT_TIME_CARRY(h1, c));
+ h3 += (c = CONSTANT_TIME_CARRY(h2, c));
+ h4 += CONSTANT_TIME_CARRY(h3, c);
+ /*
+ * Occasional overflows into the 3rd bit of h4 are taken care
+ * of "naturally". If after this point we end up back at the
+ * top of this loop, the overflow bit will be accounted for
+ * in the next iteration. If we end up in poly1305_emit, the
+ * comparison to the modulus below will still count as a
+ * "carry into the 131st bit", so that the properly reduced
+ * value will be picked by the conditional move.
+ */
+
+ inp += POLY1305_BLOCK_SIZE;
+ len -= POLY1305_BLOCK_SIZE;
+ }
+
+ st->h[0] = h0;
+ st->h[1] = h1;
+ st->h[2] = h2;
+ st->h[3] = h3;
+ st->h[4] = h4;
+#undef CONSTANT_TIME_CARRY
+}
+
+static void poly1305_emit_generic(void *ctx, u8 mac[16], const u32 nonce[4])
+{
+ struct poly1305_internal *st = (struct poly1305_internal *)ctx;
+ __le32 *omac = (__force __le32 *)mac;
+ u32 h0, h1, h2, h3, h4;
+ u32 g0, g1, g2, g3, g4;
+ u64 t;
+ u32 mask;
+
+ h0 = st->h[0];
+ h1 = st->h[1];
+ h2 = st->h[2];
+ h3 = st->h[3];
+ h4 = st->h[4];
+
+ /* compare to modulus by computing h + -p */
+ g0 = (u32)(t = (u64)h0 + 5);
+ g1 = (u32)(t = (u64)h1 + (t >> 32));
+ g2 = (u32)(t = (u64)h2 + (t >> 32));
+ g3 = (u32)(t = (u64)h3 + (t >> 32));
+ g4 = h4 + (u32)(t >> 32);
+
+ /* if there was carry into 131st bit, h3:h0 = g3:g0 */
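+ /* mask is all-ones when bit 2 of g4 is set (i.e. h >= p), zero otherwise */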
+ mask = 0 - (g4 >> 2);
+ g0 &= mask;
+ g1 &= mask;
+ g2 &= mask;
+ g3 &= mask;
+ mask = ~mask;
+ h0 = (h0 & mask) | g0;
+ h1 = (h1 & mask) | g1;
+ h2 = (h2 & mask) | g2;
+ h3 = (h3 & mask) | g3;
+
+ /* mac = (h + nonce) % (2^128) */
+ h0 = (u32)(t = (u64)h0 + nonce[0]);
+ h1 = (u32)(t = (u64)h1 + (t >> 32) + nonce[1]);
+ h2 = (u32)(t = (u64)h2 + (t >> 32) + nonce[2]);
+ h3 = (u32)(t = (u64)h3 + (t >> 32) + nonce[3]);
+
+ omac[0] = cpu_to_le32(h0);
+ omac[1] = cpu_to_le32(h1);
+ omac[2] = cpu_to_le32(h2);
+ omac[3] = cpu_to_le32(h3);
+}
+#endif
+
+void poly1305_init(struct poly1305_ctx *ctx, const u8 key[POLY1305_KEY_SIZE], bool have_simd)
+{
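+ /*
+ * The second half of the 32-byte one-time key is the value s: it does
+ * not enter the polynomial evaluation and is only added into the final
+ * accumulator by poly1305_emit.
+ */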
+ ctx->nonce[0] = le32_to_cpup((__le32 *)&key[16]);
+ ctx->nonce[1] = le32_to_cpup((__le32 *)&key[20]);
+ ctx->nonce[2] = le32_to_cpup((__le32 *)&key[24]);
+ ctx->nonce[3] = le32_to_cpup((__le32 *)&key[28]);
+
+#if defined(CONFIG_X86_64)
+ poly1305_init_x86_64(ctx->opaque, key);
+#elif defined(CONFIG_ARM) || defined(CONFIG_ARM64)
+ poly1305_init_arm(ctx->opaque, key);
+#elif defined(CONFIG_MIPS) && (defined(CONFIG_64BIT) || defined(CONFIG_CPU_MIPS32_R2))
+ poly1305_init_mips(ctx->opaque, key);
+#else
+ poly1305_init_generic(ctx->opaque, key);
+#endif
+ ctx->num = 0;
+}
+EXPORT_SYMBOL(poly1305_init);
+
+static inline void poly1305_blocks(void *ctx, const u8 *inp, const size_t len, const u32 padbit, bool have_simd)
+{
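+ /*
+ * Each conditionally compiled branch below ends in a dangling "else",
+ * so whichever SIMD paths are configured in, the chain always
+ * terminates in the architecture's base implementation.
+ */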
+#if defined(CONFIG_X86_64)
+#ifdef CONFIG_AS_AVX512
+ if (poly1305_use_avx512 && have_simd)
+ poly1305_blocks_avx512(ctx, inp, len, padbit);
+ else
+#endif
+#ifdef CONFIG_AS_AVX2
+ if (poly1305_use_avx2 && have_simd)
+ poly1305_blocks_avx2(ctx, inp, len, padbit);
+ else
+#endif
+#ifdef CONFIG_AS_AVX
+ if (poly1305_use_avx && have_simd)
+ poly1305_blocks_avx(ctx, inp, len, padbit);
+ else
+#endif
+ poly1305_blocks_x86_64(ctx, inp, len, padbit);
+#elif defined(CONFIG_ARM) || defined(CONFIG_ARM64)
+#if defined(ARM_USE_NEON)
+ if (poly1305_use_neon && have_simd)
+ poly1305_blocks_neon(ctx, inp, len, padbit);
+ else
+#endif
+ poly1305_blocks_arm(ctx, inp, len, padbit);
+#elif defined(CONFIG_MIPS) && (defined(CONFIG_64BIT) || defined(CONFIG_CPU_MIPS32_R2))
+ poly1305_blocks_mips(ctx, inp, len, padbit);
+#else
+ poly1305_blocks_generic(ctx, inp, len, padbit);
+#endif
+}
+
+static inline void poly1305_emit(void *ctx, u8 mac[POLY1305_MAC_SIZE], const u32 nonce[4], bool have_simd)
+{
+#if defined(CONFIG_X86_64)
+#ifdef CONFIG_AS_AVX512
+ if (poly1305_use_avx512 && have_simd)
+ poly1305_emit_avx(ctx, mac, nonce);
+ else
+#endif
+#ifdef CONFIG_AS_AVX2
+ if (poly1305_use_avx2 && have_simd)
+ poly1305_emit_avx(ctx, mac, nonce);
+ else
+#endif
+#ifdef CONFIG_AS_AVX
+ if (poly1305_use_avx && have_simd)
+ poly1305_emit_avx(ctx, mac, nonce);
+ else
+#endif
+ poly1305_emit_x86_64(ctx, mac, nonce);
+#elif defined(CONFIG_ARM) || defined(CONFIG_ARM64)
+#if defined(ARM_USE_NEON)
+ if (poly1305_use_neon && have_simd)
+ poly1305_emit_neon(ctx, mac, nonce);
+ else
+#endif
+ poly1305_emit_arm(ctx, mac, nonce);
+#elif defined(CONFIG_MIPS) && (defined(CONFIG_64BIT) || defined(CONFIG_CPU_MIPS32_R2))
+ poly1305_emit_mips(ctx, mac, nonce);
+#else
+ poly1305_emit_generic(ctx, mac, nonce);
+#endif
+}
+
+void poly1305_update(struct poly1305_ctx *ctx, const u8 *inp, size_t len, bool have_simd)
+{
+ const size_t num = ctx->num % POLY1305_BLOCK_SIZE;
+ size_t rem;
+
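+ /* Top up and flush any partial block buffered by a previous call. */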
+ if (num) {
+ rem = POLY1305_BLOCK_SIZE - num;
+ if (len >= rem) {
+ memcpy(ctx->data + num, inp, rem);
+ poly1305_blocks(ctx->opaque, ctx->data, POLY1305_BLOCK_SIZE, 1, have_simd);
+ inp += rem;
+ len -= rem;
+ } else {
+ /* Still not enough data to process a block. */
+ memcpy(ctx->data + num, inp, len);
+ ctx->num = num + len;
+ return;
+ }
+ }
+
+ rem = len % POLY1305_BLOCK_SIZE;
+ len -= rem;
+
+ if (len >= POLY1305_BLOCK_SIZE) {
+ poly1305_blocks(ctx->opaque, inp, len, 1, have_simd);
+ inp += len;
+ }
+
+ if (rem)
+ memcpy(ctx->data, inp, rem);
+
+ ctx->num = rem;
+}
+EXPORT_SYMBOL(poly1305_update);
+
+void poly1305_finish(struct poly1305_ctx *ctx, u8 mac[POLY1305_MAC_SIZE], bool have_simd)
+{
+ size_t num = ctx->num % POLY1305_BLOCK_SIZE;
+
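+ /*
+ * A trailing partial block is padded with a single 1 byte followed by
+ * zeros and processed with padbit = 0, since the pad byte itself
+ * supplies the high bit for this shorter block.
+ */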
+ if (num) {
+ ctx->data[num++] = 1; /* pad bit */
+ while (num < POLY1305_BLOCK_SIZE)
+ ctx->data[num++] = 0;
+ poly1305_blocks(ctx->opaque, ctx->data, POLY1305_BLOCK_SIZE, 0, have_simd);
+ }
+
+ poly1305_emit(ctx->opaque, mac, ctx->nonce, have_simd);
+
+ /* zero out the state */
+ memzero_explicit(ctx, sizeof(*ctx));
+}
+EXPORT_SYMBOL(poly1305_finish);
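+
+/*
+ * A minimal usage sketch (editor's illustration; the helper name is
+ * hypothetical and not part of the patch's exported API): a one-shot MAC
+ * over a contiguous buffer using the three-call interface above.
+ */
+static inline void poly1305_mac_example(u8 mac[POLY1305_MAC_SIZE], const u8 *data, size_t len, const u8 key[POLY1305_KEY_SIZE], bool have_simd)
+{
+ struct poly1305_ctx ctx;
+
+ poly1305_init(&ctx, key, have_simd);
+ poly1305_update(&ctx, data, len, have_simd);
+ poly1305_finish(&ctx, mac, have_simd);
+}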
+
+#include "selftest/poly1305.h"
diff --git a/lib/zinc/selftest/blake2s.h b/lib/zinc/selftest/blake2s.h
new file mode 100644
index 000000000000..d2286f73d20c
--- /dev/null
+++ b/lib/zinc/selftest/blake2s.h
@@ -0,0 +1,559 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifdef CONFIG_ZINC_DEBUG
+static const u8 blake2s_testvecs[][BLAKE2S_OUTBYTES] __initconst = {
+ { 0x69, 0x21, 0x7A, 0x30, 0x79, 0x90, 0x80, 0x94, 0xE1, 0x11, 0x21, 0xD0, 0x42, 0x35, 0x4A, 0x7C, 0x1F, 0x55, 0xB6, 0x48, 0x2C, 0xA1, 0xA5, 0x1E, 0x1B, 0x25, 0x0D, 0xFD, 0x1E, 0xD0, 0xEE, 0xF9 },
+ { 0xE3, 0x4D, 0x74, 0xDB, 0xAF, 0x4F, 0xF4, 0xC6, 0xAB, 0xD8, 0x71, 0xCC, 0x22, 0x04, 0x51, 0xD2, 0xEA, 0x26, 0x48, 0x84, 0x6C, 0x77, 0x57, 0xFB, 0xAA, 0xC8, 0x2F, 0xE5, 0x1A, 0xD6, 0x4B, 0xEA },
+ { 0xDD, 0xAD, 0x9A, 0xB1, 0x5D, 0xAC, 0x45, 0x49, 0xBA, 0x42, 0xF4, 0x9D, 0x26, 0x24, 0x96, 0xBE, 0xF6, 0xC0, 0xBA, 0xE1, 0xDD, 0x34, 0x2A, 0x88, 0x08, 0xF8, 0xEA, 0x26, 0x7C, 0x6E, 0x21, 0x0C },
+ { 0xE8, 0xF9, 0x1C, 0x6E, 0xF2, 0x32, 0xA0, 0x41, 0x45, 0x2A, 0xB0, 0xE1, 0x49, 0x07, 0x0C, 0xDD, 0x7D, 0xD1, 0x76, 0x9E, 0x75, 0xB3, 0xA5, 0x92, 0x1B, 0xE3, 0x78, 0x76, 0xC4, 0x5C, 0x99, 0x00 },
+ { 0x0C, 0xC7, 0x0E, 0x00, 0x34, 0x8B, 0x86, 0xBA, 0x29, 0x44, 0xD0, 0xC3, 0x20, 0x38, 0xB2, 0x5C, 0x55, 0x58, 0x4F, 0x90, 0xDF, 0x23, 0x04, 0xF5, 0x5F, 0xA3, 0x32, 0xAF, 0x5F, 0xB0, 0x1E, 0x20 },
+ { 0xEC, 0x19, 0x64, 0x19, 0x10, 0x87, 0xA4, 0xFE, 0x9D, 0xF1, 0xC7, 0x95, 0x34, 0x2A, 0x02, 0xFF, 0xC1, 0x91, 0xA5, 0xB2, 0x51, 0x76, 0x48, 0x56, 0xAE, 0x5B, 0x8B, 0x57, 0x69, 0xF0, 0xC6, 0xCD },
+ { 0xE1, 0xFA, 0x51, 0x61, 0x8D, 0x7D, 0xF4, 0xEB, 0x70, 0xCF, 0x0D, 0x5A, 0x9E, 0x90, 0x6F, 0x80, 0x6E, 0x9D, 0x19, 0xF7, 0xF4, 0xF0, 0x1E, 0x3B, 0x62, 0x12, 0x88, 0xE4, 0x12, 0x04, 0x05, 0xD6 },
+ { 0x59, 0x80, 0x01, 0xFA, 0xFB, 0xE8, 0xF9, 0x4E, 0xC6, 0x6D, 0xC8, 0x27, 0xD0, 0x12, 0xCF, 0xCB, 0xBA, 0x22, 0x28, 0x56, 0x9F, 0x44, 0x8E, 0x89, 0xEA, 0x22, 0x08, 0xC8, 0xBF, 0x76, 0x92, 0x93 },
+ { 0xC7, 0xE8, 0x87, 0xB5, 0x46, 0x62, 0x36, 0x35, 0xE9, 0x3E, 0x04, 0x95, 0x59, 0x8F, 0x17, 0x26, 0x82, 0x19, 0x96, 0xC2, 0x37, 0x77, 0x05, 0xB9, 0x3A, 0x1F, 0x63, 0x6F, 0x87, 0x2B, 0xFA, 0x2D },
+ { 0xC3, 0x15, 0xA4, 0x37, 0xDD, 0x28, 0x06, 0x2A, 0x77, 0x0D, 0x48, 0x19, 0x67, 0x13, 0x6B, 0x1B, 0x5E, 0xB8, 0x8B, 0x21, 0xEE, 0x53, 0xD0, 0x32, 0x9C, 0x58, 0x97, 0x12, 0x6E, 0x9D, 0xB0, 0x2C },
+ { 0xBB, 0x47, 0x3D, 0xED, 0xDC, 0x05, 0x5F, 0xEA, 0x62, 0x28, 0xF2, 0x07, 0xDA, 0x57, 0x53, 0x47, 0xBB, 0x00, 0x40, 0x4C, 0xD3, 0x49, 0xD3, 0x8C, 0x18, 0x02, 0x63, 0x07, 0xA2, 0x24, 0xCB, 0xFF },
+ { 0x68, 0x7E, 0x18, 0x73, 0xA8, 0x27, 0x75, 0x91, 0xBB, 0x33, 0xD9, 0xAD, 0xF9, 0xA1, 0x39, 0x12, 0xEF, 0xEF, 0xE5, 0x57, 0xCA, 0xFC, 0x39, 0xA7, 0x95, 0x26, 0x23, 0xE4, 0x72, 0x55, 0xF1, 0x6D },
+ { 0x1A, 0xC7, 0xBA, 0x75, 0x4D, 0x6E, 0x2F, 0x94, 0xE0, 0xE8, 0x6C, 0x46, 0xBF, 0xB2, 0x62, 0xAB, 0xBB, 0x74, 0xF4, 0x50, 0xEF, 0x45, 0x6D, 0x6B, 0x4D, 0x97, 0xAA, 0x80, 0xCE, 0x6D, 0xA7, 0x67 },
+ { 0x01, 0x2C, 0x97, 0x80, 0x96, 0x14, 0x81, 0x6B, 0x5D, 0x94, 0x94, 0x47, 0x7D, 0x4B, 0x68, 0x7D, 0x15, 0xB9, 0x6E, 0xB6, 0x9C, 0x0E, 0x80, 0x74, 0xA8, 0x51, 0x6F, 0x31, 0x22, 0x4B, 0x5C, 0x98 },
+ { 0x91, 0xFF, 0xD2, 0x6C, 0xFA, 0x4D, 0xA5, 0x13, 0x4C, 0x7E, 0xA2, 0x62, 0xF7, 0x88, 0x9C, 0x32, 0x9F, 0x61, 0xF6, 0xA6, 0x57, 0x22, 0x5C, 0xC2, 0x12, 0xF4, 0x00, 0x56, 0xD9, 0x86, 0xB3, 0xF4 },
+ { 0xD9, 0x7C, 0x82, 0x8D, 0x81, 0x82, 0xA7, 0x21, 0x80, 0xA0, 0x6A, 0x78, 0x26, 0x83, 0x30, 0x67, 0x3F, 0x7C, 0x4E, 0x06, 0x35, 0x94, 0x7C, 0x04, 0xC0, 0x23, 0x23, 0xFD, 0x45, 0xC0, 0xA5, 0x2D },
+ { 0xEF, 0xC0, 0x4C, 0xDC, 0x39, 0x1C, 0x7E, 0x91, 0x19, 0xBD, 0x38, 0x66, 0x8A, 0x53, 0x4E, 0x65, 0xFE, 0x31, 0x03, 0x6D, 0x6A, 0x62, 0x11, 0x2E, 0x44, 0xEB, 0xEB, 0x11, 0xF9, 0xC5, 0x70, 0x80 },
+ { 0x99, 0x2C, 0xF5, 0xC0, 0x53, 0x44, 0x2A, 0x5F, 0xBC, 0x4F, 0xAF, 0x58, 0x3E, 0x04, 0xE5, 0x0B, 0xB7, 0x0D, 0x2F, 0x39, 0xFB, 0xB6, 0xA5, 0x03, 0xF8, 0x9E, 0x56, 0xA6, 0x3E, 0x18, 0x57, 0x8A },
+ { 0x38, 0x64, 0x0E, 0x9F, 0x21, 0x98, 0x3E, 0x67, 0xB5, 0x39, 0xCA, 0xCC, 0xAE, 0x5E, 0xCF, 0x61, 0x5A, 0xE2, 0x76, 0x4F, 0x75, 0xA0, 0x9C, 0x9C, 0x59, 0xB7, 0x64, 0x83, 0xC1, 0xFB, 0xC7, 0x35 },
+ { 0x21, 0x3D, 0xD3, 0x4C, 0x7E, 0xFE, 0x4F, 0xB2, 0x7A, 0x6B, 0x35, 0xF6, 0xB4, 0x00, 0x0D, 0x1F, 0xE0, 0x32, 0x81, 0xAF, 0x3C, 0x72, 0x3E, 0x5C, 0x9F, 0x94, 0x74, 0x7A, 0x5F, 0x31, 0xCD, 0x3B },
+ { 0xEC, 0x24, 0x6E, 0xEE, 0xB9, 0xCE, 0xD3, 0xF7, 0xAD, 0x33, 0xED, 0x28, 0x66, 0x0D, 0xD9, 0xBB, 0x07, 0x32, 0x51, 0x3D, 0xB4, 0xE2, 0xFA, 0x27, 0x8B, 0x60, 0xCD, 0xE3, 0x68, 0x2A, 0x4C, 0xCD },
+ { 0xAC, 0x9B, 0x61, 0xD4, 0x46, 0x64, 0x8C, 0x30, 0x05, 0xD7, 0x89, 0x2B, 0xF3, 0xA8, 0x71, 0x9F, 0x4C, 0x81, 0x81, 0xCF, 0xDC, 0xBC, 0x2B, 0x79, 0xFE, 0xF1, 0x0A, 0x27, 0x9B, 0x91, 0x10, 0x95 },
+ { 0x7B, 0xF8, 0xB2, 0x29, 0x59, 0xE3, 0x4E, 0x3A, 0x43, 0xF7, 0x07, 0x92, 0x23, 0xE8, 0x3A, 0x97, 0x54, 0x61, 0x7D, 0x39, 0x1E, 0x21, 0x3D, 0xFD, 0x80, 0x8E, 0x41, 0xB9, 0xBE, 0xAD, 0x4C, 0xE7 },
+ { 0x68, 0xD4, 0xB5, 0xD4, 0xFA, 0x0E, 0x30, 0x2B, 0x64, 0xCC, 0xC5, 0xAF, 0x79, 0x29, 0x13, 0xAC, 0x4C, 0x88, 0xEC, 0x95, 0xC0, 0x7D, 0xDF, 0x40, 0x69, 0x42, 0x56, 0xEB, 0x88, 0xCE, 0x9F, 0x3D },
+ { 0xB2, 0xC2, 0x42, 0x0F, 0x05, 0xF9, 0xAB, 0xE3, 0x63, 0x15, 0x91, 0x93, 0x36, 0xB3, 0x7E, 0x4E, 0x0F, 0xA3, 0x3F, 0xF7, 0xE7, 0x6A, 0x49, 0x27, 0x67, 0x00, 0x6F, 0xDB, 0x5D, 0x93, 0x54, 0x62 },
+ { 0x13, 0x4F, 0x61, 0xBB, 0xD0, 0xBB, 0xB6, 0x9A, 0xED, 0x53, 0x43, 0x90, 0x45, 0x51, 0xA3, 0xE6, 0xC1, 0xAA, 0x7D, 0xCD, 0xD7, 0x7E, 0x90, 0x3E, 0x70, 0x23, 0xEB, 0x7C, 0x60, 0x32, 0x0A, 0xA7 },
+ { 0x46, 0x93, 0xF9, 0xBF, 0xF7, 0xD4, 0xF3, 0x98, 0x6A, 0x7D, 0x17, 0x6E, 0x6E, 0x06, 0xF7, 0x2A, 0xD1, 0x49, 0x0D, 0x80, 0x5C, 0x99, 0xE2, 0x53, 0x47, 0xB8, 0xDE, 0x77, 0xB4, 0xDB, 0x6D, 0x9B },
+ { 0x85, 0x3E, 0x26, 0xF7, 0x41, 0x95, 0x3B, 0x0F, 0xD5, 0xBD, 0xB4, 0x24, 0xE8, 0xAB, 0x9E, 0x8B, 0x37, 0x50, 0xEA, 0xA8, 0xEF, 0x61, 0xE4, 0x79, 0x02, 0xC9, 0x1E, 0x55, 0x4E, 0x9C, 0x73, 0xB9 },
+ { 0xF7, 0xDE, 0x53, 0x63, 0x61, 0xAB, 0xAA, 0x0E, 0x15, 0x81, 0x56, 0xCF, 0x0E, 0xA4, 0xF6, 0x3A, 0x99, 0xB5, 0xE4, 0x05, 0x4F, 0x8F, 0xA4, 0xC9, 0xD4, 0x5F, 0x62, 0x85, 0xCA, 0xD5, 0x56, 0x94 },
+ { 0x4C, 0x23, 0x06, 0x08, 0x86, 0x0A, 0x99, 0xAE, 0x8D, 0x7B, 0xD5, 0xC2, 0xCC, 0x17, 0xFA, 0x52, 0x09, 0x6B, 0x9A, 0x61, 0xBE, 0xDB, 0x17, 0xCB, 0x76, 0x17, 0x86, 0x4A, 0xD2, 0x9C, 0xA7, 0xA6 },
+ { 0xAE, 0xB9, 0x20, 0xEA, 0x87, 0x95, 0x2D, 0xAD, 0xB1, 0xFB, 0x75, 0x92, 0x91, 0xE3, 0x38, 0x81, 0x39, 0xA8, 0x72, 0x86, 0x50, 0x01, 0x88, 0x6E, 0xD8, 0x47, 0x52, 0xE9, 0x3C, 0x25, 0x0C, 0x2A },
+ { 0xAB, 0xA4, 0xAD, 0x9B, 0x48, 0x0B, 0x9D, 0xF3, 0xD0, 0x8C, 0xA5, 0xE8, 0x7B, 0x0C, 0x24, 0x40, 0xD4, 0xE4, 0xEA, 0x21, 0x22, 0x4C, 0x2E, 0xB4, 0x2C, 0xBA, 0xE4, 0x69, 0xD0, 0x89, 0xB9, 0x31 },
+ { 0x05, 0x82, 0x56, 0x07, 0xD7, 0xFD, 0xF2, 0xD8, 0x2E, 0xF4, 0xC3, 0xC8, 0xC2, 0xAE, 0xA9, 0x61, 0xAD, 0x98, 0xD6, 0x0E, 0xDF, 0xF7, 0xD0, 0x18, 0x98, 0x3E, 0x21, 0x20, 0x4C, 0x0D, 0x93, 0xD1 },
+ { 0xA7, 0x42, 0xF8, 0xB6, 0xAF, 0x82, 0xD8, 0xA6, 0xCA, 0x23, 0x57, 0xC5, 0xF1, 0xCF, 0x91, 0xDE, 0xFB, 0xD0, 0x66, 0x26, 0x7D, 0x75, 0xC0, 0x48, 0xB3, 0x52, 0x36, 0x65, 0x85, 0x02, 0x59, 0x62 },
+ { 0x2B, 0xCA, 0xC8, 0x95, 0x99, 0x00, 0x0B, 0x42, 0xC9, 0x5A, 0xE2, 0x38, 0x35, 0xA7, 0x13, 0x70, 0x4E, 0xD7, 0x97, 0x89, 0xC8, 0x4F, 0xEF, 0x14, 0x9A, 0x87, 0x4F, 0xF7, 0x33, 0xF0, 0x17, 0xA2 },
+ { 0xAC, 0x1E, 0xD0, 0x7D, 0x04, 0x8F, 0x10, 0x5A, 0x9E, 0x5B, 0x7A, 0xB8, 0x5B, 0x09, 0xA4, 0x92, 0xD5, 0xBA, 0xFF, 0x14, 0xB8, 0xBF, 0xB0, 0xE9, 0xFD, 0x78, 0x94, 0x86, 0xEE, 0xA2, 0xB9, 0x74 },
+ { 0xE4, 0x8D, 0x0E, 0xCF, 0xAF, 0x49, 0x7D, 0x5B, 0x27, 0xC2, 0x5D, 0x99, 0xE1, 0x56, 0xCB, 0x05, 0x79, 0xD4, 0x40, 0xD6, 0xE3, 0x1F, 0xB6, 0x24, 0x73, 0x69, 0x6D, 0xBF, 0x95, 0xE0, 0x10, 0xE4 },
+ { 0x12, 0xA9, 0x1F, 0xAD, 0xF8, 0xB2, 0x16, 0x44, 0xFD, 0x0F, 0x93, 0x4F, 0x3C, 0x4A, 0x8F, 0x62, 0xBA, 0x86, 0x2F, 0xFD, 0x20, 0xE8, 0xE9, 0x61, 0x15, 0x4C, 0x15, 0xC1, 0x38, 0x84, 0xED, 0x3D },
+ { 0x7C, 0xBE, 0xE9, 0x6E, 0x13, 0x98, 0x97, 0xDC, 0x98, 0xFB, 0xEF, 0x3B, 0xE8, 0x1A, 0xD4, 0xD9, 0x64, 0xD2, 0x35, 0xCB, 0x12, 0x14, 0x1F, 0xB6, 0x67, 0x27, 0xE6, 0xE5, 0xDF, 0x73, 0xA8, 0x78 },
+ { 0xEB, 0xF6, 0x6A, 0xBB, 0x59, 0x7A, 0xE5, 0x72, 0xA7, 0x29, 0x7C, 0xB0, 0x87, 0x1E, 0x35, 0x5A, 0xCC, 0xAF, 0xAD, 0x83, 0x77, 0xB8, 0xE7, 0x8B, 0xF1, 0x64, 0xCE, 0x2A, 0x18, 0xDE, 0x4B, 0xAF },
+ { 0x71, 0xB9, 0x33, 0xB0, 0x7E, 0x4F, 0xF7, 0x81, 0x8C, 0xE0, 0x59, 0xD0, 0x08, 0x82, 0x9E, 0x45, 0x3C, 0x6F, 0xF0, 0x2E, 0xC0, 0xA7, 0xDB, 0x39, 0x3F, 0xC2, 0xD8, 0x70, 0xF3, 0x7A, 0x72, 0x86 },
+ { 0x7C, 0xF7, 0xC5, 0x13, 0x31, 0x22, 0x0B, 0x8D, 0x3E, 0xBA, 0xED, 0x9C, 0x29, 0x39, 0x8A, 0x16, 0xD9, 0x81, 0x56, 0xE2, 0x61, 0x3C, 0xB0, 0x88, 0xF2, 0xB0, 0xE0, 0x8A, 0x1B, 0xE4, 0xCF, 0x4F },
+ { 0x3E, 0x41, 0xA1, 0x08, 0xE0, 0xF6, 0x4A, 0xD2, 0x76, 0xB9, 0x79, 0xE1, 0xCE, 0x06, 0x82, 0x79, 0xE1, 0x6F, 0x7B, 0xC7, 0xE4, 0xAA, 0x1D, 0x21, 0x1E, 0x17, 0xB8, 0x11, 0x61, 0xDF, 0x16, 0x02 },
+ { 0x88, 0x65, 0x02, 0xA8, 0x2A, 0xB4, 0x7B, 0xA8, 0xD8, 0x67, 0x10, 0xAA, 0x9D, 0xE3, 0xD4, 0x6E, 0xA6, 0x5C, 0x47, 0xAF, 0x6E, 0xE8, 0xDE, 0x45, 0x0C, 0xCE, 0xB8, 0xB1, 0x1B, 0x04, 0x5F, 0x50 },
+ { 0xC0, 0x21, 0xBC, 0x5F, 0x09, 0x54, 0xFE, 0xE9, 0x4F, 0x46, 0xEA, 0x09, 0x48, 0x7E, 0x10, 0xA8, 0x48, 0x40, 0xD0, 0x2F, 0x64, 0x81, 0x0B, 0xC0, 0x8D, 0x9E, 0x55, 0x1F, 0x7D, 0x41, 0x68, 0x14 },
+ { 0x20, 0x30, 0x51, 0x6E, 0x8A, 0x5F, 0xE1, 0x9A, 0xE7, 0x9C, 0x33, 0x6F, 0xCE, 0x26, 0x38, 0x2A, 0x74, 0x9D, 0x3F, 0xD0, 0xEC, 0x91, 0xE5, 0x37, 0xD4, 0xBD, 0x23, 0x58, 0xC1, 0x2D, 0xFB, 0x22 },
+ { 0x55, 0x66, 0x98, 0xDA, 0xC8, 0x31, 0x7F, 0xD3, 0x6D, 0xFB, 0xDF, 0x25, 0xA7, 0x9C, 0xB1, 0x12, 0xD5, 0x42, 0x58, 0x60, 0x60, 0x5C, 0xBA, 0xF5, 0x07, 0xF2, 0x3B, 0xF7, 0xE9, 0xF4, 0x2A, 0xFE },
+ { 0x2F, 0x86, 0x7B, 0xA6, 0x77, 0x73, 0xFD, 0xC3, 0xE9, 0x2F, 0xCE, 0xD9, 0x9A, 0x64, 0x09, 0xAD, 0x39, 0xD0, 0xB8, 0x80, 0xFD, 0xE8, 0xF1, 0x09, 0xA8, 0x17, 0x30, 0xC4, 0x45, 0x1D, 0x01, 0x78 },
+ { 0x17, 0x2E, 0xC2, 0x18, 0xF1, 0x19, 0xDF, 0xAE, 0x98, 0x89, 0x6D, 0xFF, 0x29, 0xDD, 0x98, 0x76, 0xC9, 0x4A, 0xF8, 0x74, 0x17, 0xF9, 0xAE, 0x4C, 0x70, 0x14, 0xBB, 0x4E, 0x4B, 0x96, 0xAF, 0xC7 },
+ { 0x3F, 0x85, 0x81, 0x4A, 0x18, 0x19, 0x5F, 0x87, 0x9A, 0xA9, 0x62, 0xF9, 0x5D, 0x26, 0xBD, 0x82, 0xA2, 0x78, 0xF2, 0xB8, 0x23, 0x20, 0x21, 0x8F, 0x6B, 0x3B, 0xD6, 0xF7, 0xF6, 0x67, 0xA6, 0xD9 },
+ { 0x1B, 0x61, 0x8F, 0xBA, 0xA5, 0x66, 0xB3, 0xD4, 0x98, 0xC1, 0x2E, 0x98, 0x2C, 0x9E, 0xC5, 0x2E, 0x4D, 0xA8, 0x5A, 0x8C, 0x54, 0xF3, 0x8F, 0x34, 0xC0, 0x90, 0x39, 0x4F, 0x23, 0xC1, 0x84, 0xC1 },
+ { 0x0C, 0x75, 0x8F, 0xB5, 0x69, 0x2F, 0xFD, 0x41, 0xA3, 0x57, 0x5D, 0x0A, 0xF0, 0x0C, 0xC7, 0xFB, 0xF2, 0xCB, 0xE5, 0x90, 0x5A, 0x58, 0x32, 0x3A, 0x88, 0xAE, 0x42, 0x44, 0xF6, 0xE4, 0xC9, 0x93 },
+ { 0xA9, 0x31, 0x36, 0x0C, 0xAD, 0x62, 0x8C, 0x7F, 0x12, 0xA6, 0xC1, 0xC4, 0xB7, 0x53, 0xB0, 0xF4, 0x06, 0x2A, 0xEF, 0x3C, 0xE6, 0x5A, 0x1A, 0xE3, 0xF1, 0x93, 0x69, 0xDA, 0xDF, 0x3A, 0xE2, 0x3D },
+ { 0xCB, 0xAC, 0x7D, 0x77, 0x3B, 0x1E, 0x3B, 0x3C, 0x66, 0x91, 0xD7, 0xAB, 0xB7, 0xE9, 0xDF, 0x04, 0x5C, 0x8B, 0xA1, 0x92, 0x68, 0xDE, 0xD1, 0x53, 0x20, 0x7F, 0x5E, 0x80, 0x43, 0x52, 0xEC, 0x5D },
+ { 0x23, 0xA1, 0x96, 0xD3, 0x80, 0x2E, 0xD3, 0xC1, 0xB3, 0x84, 0x01, 0x9A, 0x82, 0x32, 0x58, 0x40, 0xD3, 0x2F, 0x71, 0x95, 0x0C, 0x45, 0x80, 0xB0, 0x34, 0x45, 0xE0, 0x89, 0x8E, 0x14, 0x05, 0x3C },
+ { 0xF4, 0x49, 0x54, 0x70, 0xF2, 0x26, 0xC8, 0xC2, 0x14, 0xBE, 0x08, 0xFD, 0xFA, 0xD4, 0xBC, 0x4A, 0x2A, 0x9D, 0xBE, 0xA9, 0x13, 0x6A, 0x21, 0x0D, 0xF0, 0xD4, 0xB6, 0x49, 0x29, 0xE6, 0xFC, 0x14 },
+ { 0xE2, 0x90, 0xDD, 0x27, 0x0B, 0x46, 0x7F, 0x34, 0xAB, 0x1C, 0x00, 0x2D, 0x34, 0x0F, 0xA0, 0x16, 0x25, 0x7F, 0xF1, 0x9E, 0x58, 0x33, 0xFD, 0xBB, 0xF2, 0xCB, 0x40, 0x1C, 0x3B, 0x28, 0x17, 0xDE },
+ { 0x9F, 0xC7, 0xB5, 0xDE, 0xD3, 0xC1, 0x50, 0x42, 0xB2, 0xA6, 0x58, 0x2D, 0xC3, 0x9B, 0xE0, 0x16, 0xD2, 0x4A, 0x68, 0x2D, 0x5E, 0x61, 0xAD, 0x1E, 0xFF, 0x9C, 0x63, 0x30, 0x98, 0x48, 0xF7, 0x06 },
+ { 0x8C, 0xCA, 0x67, 0xA3, 0x6D, 0x17, 0xD5, 0xE6, 0x34, 0x1C, 0xB5, 0x92, 0xFD, 0x7B, 0xEF, 0x99, 0x26, 0xC9, 0xE3, 0xAA, 0x10, 0x27, 0xEA, 0x11, 0xA7, 0xD8, 0xBD, 0x26, 0x0B, 0x57, 0x6E, 0x04 },
+ { 0x40, 0x93, 0x92, 0xF5, 0x60, 0xF8, 0x68, 0x31, 0xDA, 0x43, 0x73, 0xEE, 0x5E, 0x00, 0x74, 0x26, 0x05, 0x95, 0xD7, 0xBC, 0x24, 0x18, 0x3B, 0x60, 0xED, 0x70, 0x0D, 0x45, 0x83, 0xD3, 0xF6, 0xF0 },
+ { 0x28, 0x02, 0x16, 0x5D, 0xE0, 0x90, 0x91, 0x55, 0x46, 0xF3, 0x39, 0x8C, 0xD8, 0x49, 0x16, 0x4A, 0x19, 0xF9, 0x2A, 0xDB, 0xC3, 0x61, 0xAD, 0xC9, 0x9B, 0x0F, 0x20, 0xC8, 0xEA, 0x07, 0x10, 0x54 },
+ { 0xAD, 0x83, 0x91, 0x68, 0xD9, 0xF8, 0xA4, 0xBE, 0x95, 0xBA, 0x9E, 0xF9, 0xA6, 0x92, 0xF0, 0x72, 0x56, 0xAE, 0x43, 0xFE, 0x6F, 0x98, 0x64, 0xE2, 0x90, 0x69, 0x1B, 0x02, 0x56, 0xCE, 0x50, 0xA9 },
+ { 0x75, 0xFD, 0xAA, 0x50, 0x38, 0xC2, 0x84, 0xB8, 0x6D, 0x6E, 0x8A, 0xFF, 0xE8, 0xB2, 0x80, 0x7E, 0x46, 0x7B, 0x86, 0x60, 0x0E, 0x79, 0xAF, 0x36, 0x89, 0xFB, 0xC0, 0x63, 0x28, 0xCB, 0xF8, 0x94 },
+ { 0xE5, 0x7C, 0xB7, 0x94, 0x87, 0xDD, 0x57, 0x90, 0x24, 0x32, 0xB2, 0x50, 0x73, 0x38, 0x13, 0xBD, 0x96, 0xA8, 0x4E, 0xFC, 0xE5, 0x9F, 0x65, 0x0F, 0xAC, 0x26, 0xE6, 0x69, 0x6A, 0xEF, 0xAF, 0xC3 },
+ { 0x56, 0xF3, 0x4E, 0x8B, 0x96, 0x55, 0x7E, 0x90, 0xC1, 0xF2, 0x4B, 0x52, 0xD0, 0xC8, 0x9D, 0x51, 0x08, 0x6A, 0xCF, 0x1B, 0x00, 0xF6, 0x34, 0xCF, 0x1D, 0xDE, 0x92, 0x33, 0xB8, 0xEA, 0xAA, 0x3E },
+ { 0x1B, 0x53, 0xEE, 0x94, 0xAA, 0xF3, 0x4E, 0x4B, 0x15, 0x9D, 0x48, 0xDE, 0x35, 0x2C, 0x7F, 0x06, 0x61, 0xD0, 0xA4, 0x0E, 0xDF, 0xF9, 0x5A, 0x0B, 0x16, 0x39, 0xB4, 0x09, 0x0E, 0x97, 0x44, 0x72 },
+ { 0x05, 0x70, 0x5E, 0x2A, 0x81, 0x75, 0x7C, 0x14, 0xBD, 0x38, 0x3E, 0xA9, 0x8D, 0xDA, 0x54, 0x4E, 0xB1, 0x0E, 0x6B, 0xC0, 0x7B, 0xAE, 0x43, 0x5E, 0x25, 0x18, 0xDB, 0xE1, 0x33, 0x52, 0x53, 0x75 },
+ { 0xD8, 0xB2, 0x86, 0x6E, 0x8A, 0x30, 0x9D, 0xB5, 0x3E, 0x52, 0x9E, 0xC3, 0x29, 0x11, 0xD8, 0x2F, 0x5C, 0xA1, 0x6C, 0xFF, 0x76, 0x21, 0x68, 0x91, 0xA9, 0x67, 0x6A, 0xA3, 0x1A, 0xAA, 0x6C, 0x42 },
+ { 0xF5, 0x04, 0x1C, 0x24, 0x12, 0x70, 0xEB, 0x04, 0xC7, 0x1E, 0xC2, 0xC9, 0x5D, 0x4C, 0x38, 0xD8, 0x03, 0xB1, 0x23, 0x7B, 0x0F, 0x29, 0xFD, 0x4D, 0xB3, 0xEB, 0x39, 0x76, 0x69, 0xE8, 0x86, 0x99 },
+ { 0x9A, 0x4C, 0xE0, 0x77, 0xC3, 0x49, 0x32, 0x2F, 0x59, 0x5E, 0x0E, 0xE7, 0x9E, 0xD0, 0xDA, 0x5F, 0xAB, 0x66, 0x75, 0x2C, 0xBF, 0xEF, 0x8F, 0x87, 0xD0, 0xE9, 0xD0, 0x72, 0x3C, 0x75, 0x30, 0xDD },
+ { 0x65, 0x7B, 0x09, 0xF3, 0xD0, 0xF5, 0x2B, 0x5B, 0x8F, 0x2F, 0x97, 0x16, 0x3A, 0x0E, 0xDF, 0x0C, 0x04, 0xF0, 0x75, 0x40, 0x8A, 0x07, 0xBB, 0xEB, 0x3A, 0x41, 0x01, 0xA8, 0x91, 0x99, 0x0D, 0x62 },
+ { 0x1E, 0x3F, 0x7B, 0xD5, 0xA5, 0x8F, 0xA5, 0x33, 0x34, 0x4A, 0xA8, 0xED, 0x3A, 0xC1, 0x22, 0xBB, 0x9E, 0x70, 0xD4, 0xEF, 0x50, 0xD0, 0x04, 0x53, 0x08, 0x21, 0x94, 0x8F, 0x5F, 0xE6, 0x31, 0x5A },
+ { 0x80, 0xDC, 0xCF, 0x3F, 0xD8, 0x3D, 0xFD, 0x0D, 0x35, 0xAA, 0x28, 0x58, 0x59, 0x22, 0xAB, 0x89, 0xD5, 0x31, 0x39, 0x97, 0x67, 0x3E, 0xAF, 0x90, 0x5C, 0xEA, 0x9C, 0x0B, 0x22, 0x5C, 0x7B, 0x5F },
+ { 0x8A, 0x0D, 0x0F, 0xBF, 0x63, 0x77, 0xD8, 0x3B, 0xB0, 0x8B, 0x51, 0x4B, 0x4B, 0x1C, 0x43, 0xAC, 0xC9, 0x5D, 0x75, 0x17, 0x14, 0xF8, 0x92, 0x56, 0x45, 0xCB, 0x6B, 0xC8, 0x56, 0xCA, 0x15, 0x0A },
+ { 0x9F, 0xA5, 0xB4, 0x87, 0x73, 0x8A, 0xD2, 0x84, 0x4C, 0xC6, 0x34, 0x8A, 0x90, 0x19, 0x18, 0xF6, 0x59, 0xA3, 0xB8, 0x9E, 0x9C, 0x0D, 0xFE, 0xEA, 0xD3, 0x0D, 0xD9, 0x4B, 0xCF, 0x42, 0xEF, 0x8E },
+ { 0x80, 0x83, 0x2C, 0x4A, 0x16, 0x77, 0xF5, 0xEA, 0x25, 0x60, 0xF6, 0x68, 0xE9, 0x35, 0x4D, 0xD3, 0x69, 0x97, 0xF0, 0x37, 0x28, 0xCF, 0xA5, 0x5E, 0x1B, 0x38, 0x33, 0x7C, 0x0C, 0x9E, 0xF8, 0x18 },
+ { 0xAB, 0x37, 0xDD, 0xB6, 0x83, 0x13, 0x7E, 0x74, 0x08, 0x0D, 0x02, 0x6B, 0x59, 0x0B, 0x96, 0xAE, 0x9B, 0xB4, 0x47, 0x72, 0x2F, 0x30, 0x5A, 0x5A, 0xC5, 0x70, 0xEC, 0x1D, 0xF9, 0xB1, 0x74, 0x3C },
+ { 0x3E, 0xE7, 0x35, 0xA6, 0x94, 0xC2, 0x55, 0x9B, 0x69, 0x3A, 0xA6, 0x86, 0x29, 0x36, 0x1E, 0x15, 0xD1, 0x22, 0x65, 0xAD, 0x6A, 0x3D, 0xED, 0xF4, 0x88, 0xB0, 0xB0, 0x0F, 0xAC, 0x97, 0x54, 0xBA },
+ { 0xD6, 0xFC, 0xD2, 0x32, 0x19, 0xB6, 0x47, 0xE4, 0xCB, 0xD5, 0xEB, 0x2D, 0x0A, 0xD0, 0x1E, 0xC8, 0x83, 0x8A, 0x4B, 0x29, 0x01, 0xFC, 0x32, 0x5C, 0xC3, 0x70, 0x19, 0x81, 0xCA, 0x6C, 0x88, 0x8B },
+ { 0x05, 0x20, 0xEC, 0x2F, 0x5B, 0xF7, 0xA7, 0x55, 0xDA, 0xCB, 0x50, 0xC6, 0xBF, 0x23, 0x3E, 0x35, 0x15, 0x43, 0x47, 0x63, 0xDB, 0x01, 0x39, 0xCC, 0xD9, 0xFA, 0xEF, 0xBB, 0x82, 0x07, 0x61, 0x2D },
+ { 0xAF, 0xF3, 0xB7, 0x5F, 0x3F, 0x58, 0x12, 0x64, 0xD7, 0x66, 0x16, 0x62, 0xB9, 0x2F, 0x5A, 0xD3, 0x7C, 0x1D, 0x32, 0xBD, 0x45, 0xFF, 0x81, 0xA4, 0xED, 0x8A, 0xDC, 0x9E, 0xF3, 0x0D, 0xD9, 0x89 },
+ { 0xD0, 0xDD, 0x65, 0x0B, 0xEF, 0xD3, 0xBA, 0x63, 0xDC, 0x25, 0x10, 0x2C, 0x62, 0x7C, 0x92, 0x1B, 0x9C, 0xBE, 0xB0, 0xB1, 0x30, 0x68, 0x69, 0x35, 0xB5, 0xC9, 0x27, 0xCB, 0x7C, 0xCD, 0x5E, 0x3B },
+ { 0xE1, 0x14, 0x98, 0x16, 0xB1, 0x0A, 0x85, 0x14, 0xFB, 0x3E, 0x2C, 0xAB, 0x2C, 0x08, 0xBE, 0xE9, 0xF7, 0x3C, 0xE7, 0x62, 0x21, 0x70, 0x12, 0x46, 0xA5, 0x89, 0xBB, 0xB6, 0x73, 0x02, 0xD8, 0xA9 },
+ { 0x7D, 0xA3, 0xF4, 0x41, 0xDE, 0x90, 0x54, 0x31, 0x7E, 0x72, 0xB5, 0xDB, 0xF9, 0x79, 0xDA, 0x01, 0xE6, 0xBC, 0xEE, 0xBB, 0x84, 0x78, 0xEA, 0xE6, 0xA2, 0x28, 0x49, 0xD9, 0x02, 0x92, 0x63, 0x5C },
+ { 0x12, 0x30, 0xB1, 0xFC, 0x8A, 0x7D, 0x92, 0x15, 0xED, 0xC2, 0xD4, 0xA2, 0xDE, 0xCB, 0xDD, 0x0A, 0x6E, 0x21, 0x6C, 0x92, 0x42, 0x78, 0xC9, 0x1F, 0xC5, 0xD1, 0x0E, 0x7D, 0x60, 0x19, 0x2D, 0x94 },
+ { 0x57, 0x50, 0xD7, 0x16, 0xB4, 0x80, 0x8F, 0x75, 0x1F, 0xEB, 0xC3, 0x88, 0x06, 0xBA, 0x17, 0x0B, 0xF6, 0xD5, 0x19, 0x9A, 0x78, 0x16, 0xBE, 0x51, 0x4E, 0x3F, 0x93, 0x2F, 0xBE, 0x0C, 0xB8, 0x71 },
+ { 0x6F, 0xC5, 0x9B, 0x2F, 0x10, 0xFE, 0xBA, 0x95, 0x4A, 0xA6, 0x82, 0x0B, 0x3C, 0xA9, 0x87, 0xEE, 0x81, 0xD5, 0xCC, 0x1D, 0xA3, 0xC6, 0x3C, 0xE8, 0x27, 0x30, 0x1C, 0x56, 0x9D, 0xFB, 0x39, 0xCE },
+ { 0xC7, 0xC3, 0xFE, 0x1E, 0xEB, 0xDC, 0x7B, 0x5A, 0x93, 0x93, 0x26, 0xE8, 0xDD, 0xB8, 0x3E, 0x8B, 0xF2, 0xB7, 0x80, 0xB6, 0x56, 0x78, 0xCB, 0x62, 0xF2, 0x08, 0xB0, 0x40, 0xAB, 0xDD, 0x35, 0xE2 },
+ { 0x0C, 0x75, 0xC1, 0xA1, 0x5C, 0xF3, 0x4A, 0x31, 0x4E, 0xE4, 0x78, 0xF4, 0xA5, 0xCE, 0x0B, 0x8A, 0x6B, 0x36, 0x52, 0x8E, 0xF7, 0xA8, 0x20, 0x69, 0x6C, 0x3E, 0x42, 0x46, 0xC5, 0xA1, 0x58, 0x64 },
+ { 0x21, 0x6D, 0xC1, 0x2A, 0x10, 0x85, 0x69, 0xA3, 0xC7, 0xCD, 0xDE, 0x4A, 0xED, 0x43, 0xA6, 0xC3, 0x30, 0x13, 0x9D, 0xDA, 0x3C, 0xCC, 0x4A, 0x10, 0x89, 0x05, 0xDB, 0x38, 0x61, 0x89, 0x90, 0x50 },
+ { 0xA5, 0x7B, 0xE6, 0xAE, 0x67, 0x56, 0xF2, 0x8B, 0x02, 0xF5, 0x9D, 0xAD, 0xF7, 0xE0, 0xD7, 0xD8, 0x80, 0x7F, 0x10, 0xFA, 0x15, 0xCE, 0xD1, 0xAD, 0x35, 0x85, 0x52, 0x1A, 0x1D, 0x99, 0x5A, 0x89 },
+ { 0x81, 0x6A, 0xEF, 0x87, 0x59, 0x53, 0x71, 0x6C, 0xD7, 0xA5, 0x81, 0xF7, 0x32, 0xF5, 0x3D, 0xD4, 0x35, 0xDA, 0xB6, 0x6D, 0x09, 0xC3, 0x61, 0xD2, 0xD6, 0x59, 0x2D, 0xE1, 0x77, 0x55, 0xD8, 0xA8 },
+ { 0x9A, 0x76, 0x89, 0x32, 0x26, 0x69, 0x3B, 0x6E, 0xA9, 0x7E, 0x6A, 0x73, 0x8F, 0x9D, 0x10, 0xFB, 0x3D, 0x0B, 0x43, 0xAE, 0x0E, 0x8B, 0x7D, 0x81, 0x23, 0xEA, 0x76, 0xCE, 0x97, 0x98, 0x9C, 0x7E },
+ { 0x8D, 0xAE, 0xDB, 0x9A, 0x27, 0x15, 0x29, 0xDB, 0xB7, 0xDC, 0x3B, 0x60, 0x7F, 0xE5, 0xEB, 0x2D, 0x32, 0x11, 0x77, 0x07, 0x58, 0xDD, 0x3B, 0x0A, 0x35, 0x93, 0xD2, 0xD7, 0x95, 0x4E, 0x2D, 0x5B },
+ { 0x16, 0xDB, 0xC0, 0xAA, 0x5D, 0xD2, 0xC7, 0x74, 0xF5, 0x05, 0x10, 0x0F, 0x73, 0x37, 0x86, 0xD8, 0xA1, 0x75, 0xFC, 0xBB, 0xB5, 0x9C, 0x43, 0xE1, 0xFB, 0xFF, 0x3E, 0x1E, 0xAF, 0x31, 0xCB, 0x4A },
+ { 0x86, 0x06, 0xCB, 0x89, 0x9C, 0x6A, 0xEA, 0xF5, 0x1B, 0x9D, 0xB0, 0xFE, 0x49, 0x24, 0xA9, 0xFD, 0x5D, 0xAB, 0xC1, 0x9F, 0x88, 0x26, 0xF2, 0xBC, 0x1C, 0x1D, 0x7D, 0xA1, 0x4D, 0x2C, 0x2C, 0x99 },
+ { 0x84, 0x79, 0x73, 0x1A, 0xED, 0xA5, 0x7B, 0xD3, 0x7E, 0xAD, 0xB5, 0x1A, 0x50, 0x7E, 0x30, 0x7F, 0x3B, 0xD9, 0x5E, 0x69, 0xDB, 0xCA, 0x94, 0xF3, 0xBC, 0x21, 0x72, 0x60, 0x66, 0xAD, 0x6D, 0xFD },
+ { 0x58, 0x47, 0x3A, 0x9E, 0xA8, 0x2E, 0xFA, 0x3F, 0x3B, 0x3D, 0x8F, 0xC8, 0x3E, 0xD8, 0x86, 0x31, 0x27, 0xB3, 0x3A, 0xE8, 0xDE, 0xAE, 0x63, 0x07, 0x20, 0x1E, 0xDB, 0x6D, 0xDE, 0x61, 0xDE, 0x29 },
+ { 0x9A, 0x92, 0x55, 0xD5, 0x3A, 0xF1, 0x16, 0xDE, 0x8B, 0xA2, 0x7C, 0xE3, 0x5B, 0x4C, 0x7E, 0x15, 0x64, 0x06, 0x57, 0xA0, 0xFC, 0xB8, 0x88, 0xC7, 0x0D, 0x95, 0x43, 0x1D, 0xAC, 0xD8, 0xF8, 0x30 },
+ { 0x9E, 0xB0, 0x5F, 0xFB, 0xA3, 0x9F, 0xD8, 0x59, 0x6A, 0x45, 0x49, 0x3E, 0x18, 0xD2, 0x51, 0x0B, 0xF3, 0xEF, 0x06, 0x5C, 0x51, 0xD6, 0xE1, 0x3A, 0xBE, 0x66, 0xAA, 0x57, 0xE0, 0x5C, 0xFD, 0xB7 },
+ { 0x81, 0xDC, 0xC3, 0xA5, 0x05, 0xEA, 0xCE, 0x3F, 0x87, 0x9D, 0x8F, 0x70, 0x27, 0x76, 0x77, 0x0F, 0x9D, 0xF5, 0x0E, 0x52, 0x1D, 0x14, 0x28, 0xA8, 0x5D, 0xAF, 0x04, 0xF9, 0xAD, 0x21, 0x50, 0xE0 },
+ { 0xE3, 0xE3, 0xC4, 0xAA, 0x3A, 0xCB, 0xBC, 0x85, 0x33, 0x2A, 0xF9, 0xD5, 0x64, 0xBC, 0x24, 0x16, 0x5E, 0x16, 0x87, 0xF6, 0xB1, 0xAD, 0xCB, 0xFA, 0xE7, 0x7A, 0x8F, 0x03, 0xC7, 0x2A, 0xC2, 0x8C },
+ { 0x67, 0x46, 0xC8, 0x0B, 0x4E, 0xB5, 0x6A, 0xEA, 0x45, 0xE6, 0x4E, 0x72, 0x89, 0xBB, 0xA3, 0xED, 0xBF, 0x45, 0xEC, 0xF8, 0x20, 0x64, 0x81, 0xFF, 0x63, 0x02, 0x12, 0x29, 0x84, 0xCD, 0x52, 0x6A },
+ { 0x2B, 0x62, 0x8E, 0x52, 0x76, 0x4D, 0x7D, 0x62, 0xC0, 0x86, 0x8B, 0x21, 0x23, 0x57, 0xCD, 0xD1, 0x2D, 0x91, 0x49, 0x82, 0x2F, 0x4E, 0x98, 0x45, 0xD9, 0x18, 0xA0, 0x8D, 0x1A, 0xE9, 0x90, 0xC0 },
+ { 0xE4, 0xBF, 0xE8, 0x0D, 0x58, 0xC9, 0x19, 0x94, 0x61, 0x39, 0x09, 0xDC, 0x4B, 0x1A, 0x12, 0x49, 0x68, 0x96, 0xC0, 0x04, 0xAF, 0x7B, 0x57, 0x01, 0x48, 0x3D, 0xE4, 0x5D, 0x28, 0x23, 0xD7, 0x8E },
+ { 0xEB, 0xB4, 0xBA, 0x15, 0x0C, 0xEF, 0x27, 0x34, 0x34, 0x5B, 0x5D, 0x64, 0x1B, 0xBE, 0xD0, 0x3A, 0x21, 0xEA, 0xFA, 0xE9, 0x33, 0xC9, 0x9E, 0x00, 0x92, 0x12, 0xEF, 0x04, 0x57, 0x4A, 0x85, 0x30 },
+ { 0x39, 0x66, 0xEC, 0x73, 0xB1, 0x54, 0xAC, 0xC6, 0x97, 0xAC, 0x5C, 0xF5, 0xB2, 0x4B, 0x40, 0xBD, 0xB0, 0xDB, 0x9E, 0x39, 0x88, 0x36, 0xD7, 0x6D, 0x4B, 0x88, 0x0E, 0x3B, 0x2A, 0xF1, 0xAA, 0x27 },
+ { 0xEF, 0x7E, 0x48, 0x31, 0xB3, 0xA8, 0x46, 0x36, 0x51, 0x8D, 0x6E, 0x4B, 0xFC, 0xE6, 0x4A, 0x43, 0xDB, 0x2A, 0x5D, 0xDA, 0x9C, 0xCA, 0x2B, 0x44, 0xF3, 0x90, 0x33, 0xBD, 0xC4, 0x0D, 0x62, 0x43 },
+ { 0x7A, 0xBF, 0x6A, 0xCF, 0x5C, 0x8E, 0x54, 0x9D, 0xDB, 0xB1, 0x5A, 0xE8, 0xD8, 0xB3, 0x88, 0xC1, 0xC1, 0x97, 0xE6, 0x98, 0x73, 0x7C, 0x97, 0x85, 0x50, 0x1E, 0xD1, 0xF9, 0x49, 0x30, 0xB7, 0xD9 },
+ { 0x88, 0x01, 0x8D, 0xED, 0x66, 0x81, 0x3F, 0x0C, 0xA9, 0x5D, 0xEF, 0x47, 0x4C, 0x63, 0x06, 0x92, 0x01, 0x99, 0x67, 0xB9, 0xE3, 0x68, 0x88, 0xDA, 0xDD, 0x94, 0x12, 0x47, 0x19, 0xB6, 0x82, 0xF6 },
+ { 0x39, 0x30, 0x87, 0x6B, 0x9F, 0xC7, 0x52, 0x90, 0x36, 0xB0, 0x08, 0xB1, 0xB8, 0xBB, 0x99, 0x75, 0x22, 0xA4, 0x41, 0x63, 0x5A, 0x0C, 0x25, 0xEC, 0x02, 0xFB, 0x6D, 0x90, 0x26, 0xE5, 0x5A, 0x97 },
+ { 0x0A, 0x40, 0x49, 0xD5, 0x7E, 0x83, 0x3B, 0x56, 0x95, 0xFA, 0xC9, 0x3D, 0xD1, 0xFB, 0xEF, 0x31, 0x66, 0xB4, 0x4B, 0x12, 0xAD, 0x11, 0x24, 0x86, 0x62, 0x38, 0x3A, 0xE0, 0x51, 0xE1, 0x58, 0x27 },
+ { 0x81, 0xDC, 0xC0, 0x67, 0x8B, 0xB6, 0xA7, 0x65, 0xE4, 0x8C, 0x32, 0x09, 0x65, 0x4F, 0xE9, 0x00, 0x89, 0xCE, 0x44, 0xFF, 0x56, 0x18, 0x47, 0x7E, 0x39, 0xAB, 0x28, 0x64, 0x76, 0xDF, 0x05, 0x2B },
+ { 0xE6, 0x9B, 0x3A, 0x36, 0xA4, 0x46, 0x19, 0x12, 0xDC, 0x08, 0x34, 0x6B, 0x11, 0xDD, 0xCB, 0x9D, 0xB7, 0x96, 0xF8, 0x85, 0xFD, 0x01, 0x93, 0x6E, 0x66, 0x2F, 0xE2, 0x92, 0x97, 0xB0, 0x99, 0xA4 },
+ { 0x5A, 0xC6, 0x50, 0x3B, 0x0D, 0x8D, 0xA6, 0x91, 0x76, 0x46, 0xE6, 0xDC, 0xC8, 0x7E, 0xDC, 0x58, 0xE9, 0x42, 0x45, 0x32, 0x4C, 0xC2, 0x04, 0xF4, 0xDD, 0x4A, 0xF0, 0x15, 0x63, 0xAC, 0xD4, 0x27 },
+ { 0xDF, 0x6D, 0xDA, 0x21, 0x35, 0x9A, 0x30, 0xBC, 0x27, 0x17, 0x80, 0x97, 0x1C, 0x1A, 0xBD, 0x56, 0xA6, 0xEF, 0x16, 0x7E, 0x48, 0x08, 0x87, 0x88, 0x8E, 0x73, 0xA8, 0x6D, 0x3B, 0xF6, 0x05, 0xE9 },
+ { 0xE8, 0xE6, 0xE4, 0x70, 0x71, 0xE7, 0xB7, 0xDF, 0x25, 0x80, 0xF2, 0x25, 0xCF, 0xBB, 0xED, 0xF8, 0x4C, 0xE6, 0x77, 0x46, 0x62, 0x66, 0x28, 0xD3, 0x30, 0x97, 0xE4, 0xB7, 0xDC, 0x57, 0x11, 0x07 },
+ { 0x53, 0xE4, 0x0E, 0xAD, 0x62, 0x05, 0x1E, 0x19, 0xCB, 0x9B, 0xA8, 0x13, 0x3E, 0x3E, 0x5C, 0x1C, 0xE0, 0x0D, 0xDC, 0xAD, 0x8A, 0xCF, 0x34, 0x2A, 0x22, 0x43, 0x60, 0xB0, 0xAC, 0xC1, 0x47, 0x77 },
+ { 0x9C, 0xCD, 0x53, 0xFE, 0x80, 0xBE, 0x78, 0x6A, 0xA9, 0x84, 0x63, 0x84, 0x62, 0xFB, 0x28, 0xAF, 0xDF, 0x12, 0x2B, 0x34, 0xD7, 0x8F, 0x46, 0x87, 0xEC, 0x63, 0x2B, 0xB1, 0x9D, 0xE2, 0x37, 0x1A },
+ { 0xCB, 0xD4, 0x80, 0x52, 0xC4, 0x8D, 0x78, 0x84, 0x66, 0xA3, 0xE8, 0x11, 0x8C, 0x56, 0xC9, 0x7F, 0xE1, 0x46, 0xE5, 0x54, 0x6F, 0xAA, 0xF9, 0x3E, 0x2B, 0xC3, 0xC4, 0x7E, 0x45, 0x93, 0x97, 0x53 },
+ { 0x25, 0x68, 0x83, 0xB1, 0x4E, 0x2A, 0xF4, 0x4D, 0xAD, 0xB2, 0x8E, 0x1B, 0x34, 0xB2, 0xAC, 0x0F, 0x0F, 0x4C, 0x91, 0xC3, 0x4E, 0xC9, 0x16, 0x9E, 0x29, 0x03, 0x61, 0x58, 0xAC, 0xAA, 0x95, 0xB9 },
+ { 0x44, 0x71, 0xB9, 0x1A, 0xB4, 0x2D, 0xB7, 0xC4, 0xDD, 0x84, 0x90, 0xAB, 0x95, 0xA2, 0xEE, 0x8D, 0x04, 0xE3, 0xEF, 0x5C, 0x3D, 0x6F, 0xC7, 0x1A, 0xC7, 0x4B, 0x2B, 0x26, 0x91, 0x4D, 0x16, 0x41 },
+ { 0xA5, 0xEB, 0x08, 0x03, 0x8F, 0x8F, 0x11, 0x55, 0xED, 0x86, 0xE6, 0x31, 0x90, 0x6F, 0xC1, 0x30, 0x95, 0xF6, 0xBB, 0xA4, 0x1D, 0xE5, 0xD4, 0xE7, 0x95, 0x75, 0x8E, 0xC8, 0xC8, 0xDF, 0x8A, 0xF1 },
+ { 0xDC, 0x1D, 0xB6, 0x4E, 0xD8, 0xB4, 0x8A, 0x91, 0x0E, 0x06, 0x0A, 0x6B, 0x86, 0x63, 0x74, 0xC5, 0x78, 0x78, 0x4E, 0x9A, 0xC4, 0x9A, 0xB2, 0x77, 0x40, 0x92, 0xAC, 0x71, 0x50, 0x19, 0x34, 0xAC },
+ { 0x28, 0x54, 0x13, 0xB2, 0xF2, 0xEE, 0x87, 0x3D, 0x34, 0x31, 0x9E, 0xE0, 0xBB, 0xFB, 0xB9, 0x0F, 0x32, 0xDA, 0x43, 0x4C, 0xC8, 0x7E, 0x3D, 0xB5, 0xED, 0x12, 0x1B, 0xB3, 0x98, 0xED, 0x96, 0x4B },
+ { 0x02, 0x16, 0xE0, 0xF8, 0x1F, 0x75, 0x0F, 0x26, 0xF1, 0x99, 0x8B, 0xC3, 0x93, 0x4E, 0x3E, 0x12, 0x4C, 0x99, 0x45, 0xE6, 0x85, 0xA6, 0x0B, 0x25, 0xE8, 0xFB, 0xD9, 0x62, 0x5A, 0xB6, 0xB5, 0x99 },
+ { 0x38, 0xC4, 0x10, 0xF5, 0xB9, 0xD4, 0x07, 0x20, 0x50, 0x75, 0x5B, 0x31, 0xDC, 0xA8, 0x9F, 0xD5, 0x39, 0x5C, 0x67, 0x85, 0xEE, 0xB3, 0xD7, 0x90, 0xF3, 0x20, 0xFF, 0x94, 0x1C, 0x5A, 0x93, 0xBF },
+ { 0xF1, 0x84, 0x17, 0xB3, 0x9D, 0x61, 0x7A, 0xB1, 0xC1, 0x8F, 0xDF, 0x91, 0xEB, 0xD0, 0xFC, 0x6D, 0x55, 0x16, 0xBB, 0x34, 0xCF, 0x39, 0x36, 0x40, 0x37, 0xBC, 0xE8, 0x1F, 0xA0, 0x4C, 0xEC, 0xB1 },
+ { 0x1F, 0xA8, 0x77, 0xDE, 0x67, 0x25, 0x9D, 0x19, 0x86, 0x3A, 0x2A, 0x34, 0xBC, 0xC6, 0x96, 0x2A, 0x2B, 0x25, 0xFC, 0xBF, 0x5C, 0xBE, 0xCD, 0x7E, 0xDE, 0x8F, 0x1F, 0xA3, 0x66, 0x88, 0xA7, 0x96 },
+ { 0x5B, 0xD1, 0x69, 0xE6, 0x7C, 0x82, 0xC2, 0xC2, 0xE9, 0x8E, 0xF7, 0x00, 0x8B, 0xDF, 0x26, 0x1F, 0x2D, 0xDF, 0x30, 0xB1, 0xC0, 0x0F, 0x9E, 0x7F, 0x27, 0x5B, 0xB3, 0xE8, 0xA2, 0x8D, 0xC9, 0xA2 },
+ { 0xC8, 0x0A, 0xBE, 0xEB, 0xB6, 0x69, 0xAD, 0x5D, 0xEE, 0xB5, 0xF5, 0xEC, 0x8E, 0xA6, 0xB7, 0xA0, 0x5D, 0xDF, 0x7D, 0x31, 0xEC, 0x4C, 0x0A, 0x2E, 0xE2, 0x0B, 0x0B, 0x98, 0xCA, 0xEC, 0x67, 0x46 },
+ { 0xE7, 0x6D, 0x3F, 0xBD, 0xA5, 0xBA, 0x37, 0x4E, 0x6B, 0xF8, 0xE5, 0x0F, 0xAD, 0xC3, 0xBB, 0xB9, 0xBA, 0x5C, 0x20, 0x6E, 0xBD, 0xEC, 0x89, 0xA3, 0xA5, 0x4C, 0xF3, 0xDD, 0x84, 0xA0, 0x70, 0x16 },
+ { 0x7B, 0xBA, 0x9D, 0xC5, 0xB5, 0xDB, 0x20, 0x71, 0xD1, 0x77, 0x52, 0xB1, 0x04, 0x4C, 0x1E, 0xCE, 0xD9, 0x6A, 0xAF, 0x2D, 0xD4, 0x6E, 0x9B, 0x43, 0x37, 0x50, 0xE8, 0xEA, 0x0D, 0xCC, 0x18, 0x70 },
+ { 0xF2, 0x9B, 0x1B, 0x1A, 0xB9, 0xBA, 0xB1, 0x63, 0x01, 0x8E, 0xE3, 0xDA, 0x15, 0x23, 0x2C, 0xCA, 0x78, 0xEC, 0x52, 0xDB, 0xC3, 0x4E, 0xDA, 0x5B, 0x82, 0x2E, 0xC1, 0xD8, 0x0F, 0xC2, 0x1B, 0xD0 },
+ { 0x9E, 0xE3, 0xE3, 0xE7, 0xE9, 0x00, 0xF1, 0xE1, 0x1D, 0x30, 0x8C, 0x4B, 0x2B, 0x30, 0x76, 0xD2, 0x72, 0xCF, 0x70, 0x12, 0x4F, 0x9F, 0x51, 0xE1, 0xDA, 0x60, 0xF3, 0x78, 0x46, 0xCD, 0xD2, 0xF4 },
+ { 0x70, 0xEA, 0x3B, 0x01, 0x76, 0x92, 0x7D, 0x90, 0x96, 0xA1, 0x85, 0x08, 0xCD, 0x12, 0x3A, 0x29, 0x03, 0x25, 0x92, 0x0A, 0x9D, 0x00, 0xA8, 0x9B, 0x5D, 0xE0, 0x42, 0x73, 0xFB, 0xC7, 0x6B, 0x85 },
+ { 0x67, 0xDE, 0x25, 0xC0, 0x2A, 0x4A, 0xAB, 0xA2, 0x3B, 0xDC, 0x97, 0x3C, 0x8B, 0xB0, 0xB5, 0x79, 0x6D, 0x47, 0xCC, 0x06, 0x59, 0xD4, 0x3D, 0xFF, 0x1F, 0x97, 0xDE, 0x17, 0x49, 0x63, 0xB6, 0x8E },
+ { 0xB2, 0x16, 0x8E, 0x4E, 0x0F, 0x18, 0xB0, 0xE6, 0x41, 0x00, 0xB5, 0x17, 0xED, 0x95, 0x25, 0x7D, 0x73, 0xF0, 0x62, 0x0D, 0xF8, 0x85, 0xC1, 0x3D, 0x2E, 0xCF, 0x79, 0x36, 0x7B, 0x38, 0x4C, 0xEE },
+ { 0x2E, 0x7D, 0xEC, 0x24, 0x28, 0x85, 0x3B, 0x2C, 0x71, 0x76, 0x07, 0x45, 0x54, 0x1F, 0x7A, 0xFE, 0x98, 0x25, 0xB5, 0xDD, 0x77, 0xDF, 0x06, 0x51, 0x1D, 0x84, 0x41, 0xA9, 0x4B, 0xAC, 0xC9, 0x27 },
+ { 0xCA, 0x9F, 0xFA, 0xC4, 0xC4, 0x3F, 0x0B, 0x48, 0x46, 0x1D, 0xC5, 0xC2, 0x63, 0xBE, 0xA3, 0xF6, 0xF0, 0x06, 0x11, 0xCE, 0xAC, 0xAB, 0xF6, 0xF8, 0x95, 0xBA, 0x2B, 0x01, 0x01, 0xDB, 0xB6, 0x8D },
+ { 0x74, 0x10, 0xD4, 0x2D, 0x8F, 0xD1, 0xD5, 0xE9, 0xD2, 0xF5, 0x81, 0x5C, 0xB9, 0x34, 0x17, 0x99, 0x88, 0x28, 0xEF, 0x3C, 0x42, 0x30, 0xBF, 0xBD, 0x41, 0x2D, 0xF0, 0xA4, 0xA7, 0xA2, 0x50, 0x7A },
+ { 0x50, 0x10, 0xF6, 0x84, 0x51, 0x6D, 0xCC, 0xD0, 0xB6, 0xEE, 0x08, 0x52, 0xC2, 0x51, 0x2B, 0x4D, 0xC0, 0x06, 0x6C, 0xF0, 0xD5, 0x6F, 0x35, 0x30, 0x29, 0x78, 0xDB, 0x8A, 0xE3, 0x2C, 0x6A, 0x81 },
+ { 0xAC, 0xAA, 0xB5, 0x85, 0xF7, 0xB7, 0x9B, 0x71, 0x99, 0x35, 0xCE, 0xB8, 0x95, 0x23, 0xDD, 0xC5, 0x48, 0x27, 0xF7, 0x5C, 0x56, 0x88, 0x38, 0x56, 0x15, 0x4A, 0x56, 0xCD, 0xCD, 0x5E, 0xE9, 0x88 },
+ { 0x66, 0x6D, 0xE5, 0xD1, 0x44, 0x0F, 0xEE, 0x73, 0x31, 0xAA, 0xF0, 0x12, 0x3A, 0x62, 0xEF, 0x2D, 0x8B, 0xA5, 0x74, 0x53, 0xA0, 0x76, 0x96, 0x35, 0xAC, 0x6C, 0xD0, 0x1E, 0x63, 0x3F, 0x77, 0x12 },
+ { 0xA6, 0xF9, 0x86, 0x58, 0xF6, 0xEA, 0xBA, 0xF9, 0x02, 0xD8, 0xB3, 0x87, 0x1A, 0x4B, 0x10, 0x1D, 0x16, 0x19, 0x6E, 0x8A, 0x4B, 0x24, 0x1E, 0x15, 0x58, 0xFE, 0x29, 0x96, 0x6E, 0x10, 0x3E, 0x8D },
+ { 0x89, 0x15, 0x46, 0xA8, 0xB2, 0x9F, 0x30, 0x47, 0xDD, 0xCF, 0xE5, 0xB0, 0x0E, 0x45, 0xFD, 0x55, 0x75, 0x63, 0x73, 0x10, 0x5E, 0xA8, 0x63, 0x7D, 0xFC, 0xFF, 0x54, 0x7B, 0x6E, 0xA9, 0x53, 0x5F },
+ { 0x18, 0xDF, 0xBC, 0x1A, 0xC5, 0xD2, 0x5B, 0x07, 0x61, 0x13, 0x7D, 0xBD, 0x22, 0xC1, 0x7C, 0x82, 0x9D, 0x0F, 0x0E, 0xF1, 0xD8, 0x23, 0x44, 0xE9, 0xC8, 0x9C, 0x28, 0x66, 0x94, 0xDA, 0x24, 0xE8 },
+ { 0xB5, 0x4B, 0x9B, 0x67, 0xF8, 0xFE, 0xD5, 0x4B, 0xBF, 0x5A, 0x26, 0x66, 0xDB, 0xDF, 0x4B, 0x23, 0xCF, 0xF1, 0xD1, 0xB6, 0xF4, 0xAF, 0xC9, 0x85, 0xB2, 0xE6, 0xD3, 0x30, 0x5A, 0x9F, 0xF8, 0x0F },
+ { 0x7D, 0xB4, 0x42, 0xE1, 0x32, 0xBA, 0x59, 0xBC, 0x12, 0x89, 0xAA, 0x98, 0xB0, 0xD3, 0xE8, 0x06, 0x00, 0x4F, 0x8E, 0xC1, 0x28, 0x11, 0xAF, 0x1E, 0x2E, 0x33, 0xC6, 0x9B, 0xFD, 0xE7, 0x29, 0xE1 },
+ { 0x25, 0x0F, 0x37, 0xCD, 0xC1, 0x5E, 0x81, 0x7D, 0x2F, 0x16, 0x0D, 0x99, 0x56, 0xC7, 0x1F, 0xE3, 0xEB, 0x5D, 0xB7, 0x45, 0x56, 0xE4, 0xAD, 0xF9, 0xA4, 0xFF, 0xAF, 0xBA, 0x74, 0x01, 0x03, 0x96 },
+ { 0x4A, 0xB8, 0xA3, 0xDD, 0x1D, 0xDF, 0x8A, 0xD4, 0x3D, 0xAB, 0x13, 0xA2, 0x7F, 0x66, 0xA6, 0x54, 0x4F, 0x29, 0x05, 0x97, 0xFA, 0x96, 0x04, 0x0E, 0x0E, 0x1D, 0xB9, 0x26, 0x3A, 0xA4, 0x79, 0xF8 },
+ { 0xEE, 0x61, 0x72, 0x7A, 0x07, 0x66, 0xDF, 0x93, 0x9C, 0xCD, 0xC8, 0x60, 0x33, 0x40, 0x44, 0xC7, 0x9A, 0x3C, 0x9B, 0x15, 0x62, 0x00, 0xBC, 0x3A, 0xA3, 0x29, 0x73, 0x48, 0x3D, 0x83, 0x41, 0xAE },
+ { 0x3F, 0x68, 0xC7, 0xEC, 0x63, 0xAC, 0x11, 0xEB, 0xB9, 0x8F, 0x94, 0xB3, 0x39, 0xB0, 0x5C, 0x10, 0x49, 0x84, 0xFD, 0xA5, 0x01, 0x03, 0x06, 0x01, 0x44, 0xE5, 0xA2, 0xBF, 0xCC, 0xC9, 0xDA, 0x95 },
+ { 0x05, 0x6F, 0x29, 0x81, 0x6B, 0x8A, 0xF8, 0xF5, 0x66, 0x82, 0xBC, 0x4D, 0x7C, 0xF0, 0x94, 0x11, 0x1D, 0xA7, 0x73, 0x3E, 0x72, 0x6C, 0xD1, 0x3D, 0x6B, 0x3E, 0x8E, 0xA0, 0x3E, 0x92, 0xA0, 0xD5 },
+ { 0xF5, 0xEC, 0x43, 0xA2, 0x8A, 0xCB, 0xEF, 0xF1, 0xF3, 0x31, 0x8A, 0x5B, 0xCA, 0xC7, 0xC6, 0x6D, 0xDB, 0x52, 0x30, 0xB7, 0x9D, 0xB2, 0xD1, 0x05, 0xBC, 0xBE, 0x15, 0xF3, 0xC1, 0x14, 0x8D, 0x69 },
+ { 0x2A, 0x69, 0x60, 0xAD, 0x1D, 0x8D, 0xD5, 0x47, 0x55, 0x5C, 0xFB, 0xD5, 0xE4, 0x60, 0x0F, 0x1E, 0xAA, 0x1C, 0x8E, 0xDA, 0x34, 0xDE, 0x03, 0x74, 0xEC, 0x4A, 0x26, 0xEA, 0xAA, 0xA3, 0x3B, 0x4E },
+ { 0xDC, 0xC1, 0xEA, 0x7B, 0xAA, 0xB9, 0x33, 0x84, 0xF7, 0x6B, 0x79, 0x68, 0x66, 0x19, 0x97, 0x54, 0x74, 0x2F, 0x7B, 0x96, 0xD6, 0xB4, 0xC1, 0x20, 0x16, 0x5C, 0x04, 0xA6, 0xC4, 0xF5, 0xCE, 0x10 },
+ { 0x13, 0xD5, 0xDF, 0x17, 0x92, 0x21, 0x37, 0x9C, 0x6A, 0x78, 0xC0, 0x7C, 0x79, 0x3F, 0xF5, 0x34, 0x87, 0xCA, 0xE6, 0xBF, 0x9F, 0xE8, 0x82, 0x54, 0x1A, 0xB0, 0xE7, 0x35, 0xE3, 0xEA, 0xDA, 0x3B },
+ { 0x8C, 0x59, 0xE4, 0x40, 0x76, 0x41, 0xA0, 0x1E, 0x8F, 0xF9, 0x1F, 0x99, 0x80, 0xDC, 0x23, 0x6F, 0x4E, 0xCD, 0x6F, 0xCF, 0x52, 0x58, 0x9A, 0x09, 0x9A, 0x96, 0x16, 0x33, 0x96, 0x77, 0x14, 0xE1 },
+ { 0x83, 0x3B, 0x1A, 0xC6, 0xA2, 0x51, 0xFD, 0x08, 0xFD, 0x6D, 0x90, 0x8F, 0xEA, 0x2A, 0x4E, 0xE1, 0xE0, 0x40, 0xBC, 0xA9, 0x3F, 0xC1, 0xA3, 0x8E, 0xC3, 0x82, 0x0E, 0x0C, 0x10, 0xBD, 0x82, 0xEA },
+ { 0xA2, 0x44, 0xF9, 0x27, 0xF3, 0xB4, 0x0B, 0x8F, 0x6C, 0x39, 0x15, 0x70, 0xC7, 0x65, 0x41, 0x8F, 0x2F, 0x6E, 0x70, 0x8E, 0xAC, 0x90, 0x06, 0xC5, 0x1A, 0x7F, 0xEF, 0xF4, 0xAF, 0x3B, 0x2B, 0x9E },
+ { 0x3D, 0x99, 0xED, 0x95, 0x50, 0xCF, 0x11, 0x96, 0xE6, 0xC4, 0xD2, 0x0C, 0x25, 0x96, 0x20, 0xF8, 0x58, 0xC3, 0xD7, 0x03, 0x37, 0x4C, 0x12, 0x8C, 0xE7, 0xB5, 0x90, 0x31, 0x0C, 0x83, 0x04, 0x6D },
+ { 0x2B, 0x35, 0xC4, 0x7D, 0x7B, 0x87, 0x76, 0x1F, 0x0A, 0xE4, 0x3A, 0xC5, 0x6A, 0xC2, 0x7B, 0x9F, 0x25, 0x83, 0x03, 0x67, 0xB5, 0x95, 0xBE, 0x8C, 0x24, 0x0E, 0x94, 0x60, 0x0C, 0x6E, 0x33, 0x12 },
+ { 0x5D, 0x11, 0xED, 0x37, 0xD2, 0x4D, 0xC7, 0x67, 0x30, 0x5C, 0xB7, 0xE1, 0x46, 0x7D, 0x87, 0xC0, 0x65, 0xAC, 0x4B, 0xC8, 0xA4, 0x26, 0xDE, 0x38, 0x99, 0x1F, 0xF5, 0x9A, 0xA8, 0x73, 0x5D, 0x02 },
+ { 0xB8, 0x36, 0x47, 0x8E, 0x1C, 0xA0, 0x64, 0x0D, 0xCE, 0x6F, 0xD9, 0x10, 0xA5, 0x09, 0x62, 0x72, 0xC8, 0x33, 0x09, 0x90, 0xCD, 0x97, 0x86, 0x4A, 0xC2, 0xBF, 0x14, 0xEF, 0x6B, 0x23, 0x91, 0x4A },
+ { 0x91, 0x00, 0xF9, 0x46, 0xD6, 0xCC, 0xDE, 0x3A, 0x59, 0x7F, 0x90, 0xD3, 0x9F, 0xC1, 0x21, 0x5B, 0xAD, 0xDC, 0x74, 0x13, 0x64, 0x3D, 0x85, 0xC2, 0x1C, 0x3E, 0xEE, 0x5D, 0x2D, 0xD3, 0x28, 0x94 },
+ { 0xDA, 0x70, 0xEE, 0xDD, 0x23, 0xE6, 0x63, 0xAA, 0x1A, 0x74, 0xB9, 0x76, 0x69, 0x35, 0xB4, 0x79, 0x22, 0x2A, 0x72, 0xAF, 0xBA, 0x5C, 0x79, 0x51, 0x58, 0xDA, 0xD4, 0x1A, 0x3B, 0xD7, 0x7E, 0x40 },
+ { 0xF0, 0x67, 0xED, 0x6A, 0x0D, 0xBD, 0x43, 0xAA, 0x0A, 0x92, 0x54, 0xE6, 0x9F, 0xD6, 0x6B, 0xDD, 0x8A, 0xCB, 0x87, 0xDE, 0x93, 0x6C, 0x25, 0x8C, 0xFB, 0x02, 0x28, 0x5F, 0x2C, 0x11, 0xFA, 0x79 },
+ { 0x71, 0x5C, 0x99, 0xC7, 0xD5, 0x75, 0x80, 0xCF, 0x97, 0x53, 0xB4, 0xC1, 0xD7, 0x95, 0xE4, 0x5A, 0x83, 0xFB, 0xB2, 0x28, 0xC0, 0xD3, 0x6F, 0xBE, 0x20, 0xFA, 0xF3, 0x9B, 0xDD, 0x6D, 0x4E, 0x85 },
+ { 0xE4, 0x57, 0xD6, 0xAD, 0x1E, 0x67, 0xCB, 0x9B, 0xBD, 0x17, 0xCB, 0xD6, 0x98, 0xFA, 0x6D, 0x7D, 0xAE, 0x0C, 0x9B, 0x7A, 0xD6, 0xCB, 0xD6, 0x53, 0x96, 0x34, 0xE3, 0x2A, 0x71, 0x9C, 0x84, 0x92 },
+ { 0xEC, 0xE3, 0xEA, 0x81, 0x03, 0xE0, 0x24, 0x83, 0xC6, 0x4A, 0x70, 0xA4, 0xBD, 0xCE, 0xE8, 0xCE, 0xB6, 0x27, 0x8F, 0x25, 0x33, 0xF3, 0xF4, 0x8D, 0xBE, 0xED, 0xFB, 0xA9, 0x45, 0x31, 0xD4, 0xAE },
+ { 0x38, 0x8A, 0xA5, 0xD3, 0x66, 0x7A, 0x97, 0xC6, 0x8D, 0x3D, 0x56, 0xF8, 0xF3, 0xEE, 0x8D, 0x3D, 0x36, 0x09, 0x1F, 0x17, 0xFE, 0x5D, 0x1B, 0x0D, 0x5D, 0x84, 0xC9, 0x3B, 0x2F, 0xFE, 0x40, 0xBD },
+ { 0x8B, 0x6B, 0x31, 0xB9, 0xAD, 0x7C, 0x3D, 0x5C, 0xD8, 0x4B, 0xF9, 0x89, 0x47, 0xB9, 0xCD, 0xB5, 0x9D, 0xF8, 0xA2, 0x5F, 0xF7, 0x38, 0x10, 0x10, 0x13, 0xBE, 0x4F, 0xD6, 0x5E, 0x1D, 0xD1, 0xA3 },
+ { 0x06, 0x62, 0x91, 0xF6, 0xBB, 0xD2, 0x5F, 0x3C, 0x85, 0x3D, 0xB7, 0xD8, 0xB9, 0x5C, 0x9A, 0x1C, 0xFB, 0x9B, 0xF1, 0xC1, 0xC9, 0x9F, 0xB9, 0x5A, 0x9B, 0x78, 0x69, 0xD9, 0x0F, 0x1C, 0x29, 0x03 },
+ { 0xA7, 0x07, 0xEF, 0xBC, 0xCD, 0xCE, 0xED, 0x42, 0x96, 0x7A, 0x66, 0xF5, 0x53, 0x9B, 0x93, 0xED, 0x75, 0x60, 0xD4, 0x67, 0x30, 0x40, 0x16, 0xC4, 0x78, 0x0D, 0x77, 0x55, 0xA5, 0x65, 0xD4, 0xC4 },
+ { 0x38, 0xC5, 0x3D, 0xFB, 0x70, 0xBE, 0x7E, 0x79, 0x2B, 0x07, 0xA6, 0xA3, 0x5B, 0x8A, 0x6A, 0x0A, 0xBA, 0x02, 0xC5, 0xC5, 0xF3, 0x8B, 0xAF, 0x5C, 0x82, 0x3F, 0xDF, 0xD9, 0xE4, 0x2D, 0x65, 0x7E },
+ { 0xF2, 0x91, 0x13, 0x86, 0x50, 0x1D, 0x9A, 0xB9, 0xD7, 0x20, 0xCF, 0x8A, 0xD1, 0x05, 0x03, 0xD5, 0x63, 0x4B, 0xF4, 0xB7, 0xD1, 0x2B, 0x56, 0xDF, 0xB7, 0x4F, 0xEC, 0xC6, 0xE4, 0x09, 0x3F, 0x68 },
+ { 0xC6, 0xF2, 0xBD, 0xD5, 0x2B, 0x81, 0xE6, 0xE4, 0xF6, 0x59, 0x5A, 0xBD, 0x4D, 0x7F, 0xB3, 0x1F, 0x65, 0x11, 0x69, 0xD0, 0x0F, 0xF3, 0x26, 0x92, 0x6B, 0x34, 0x94, 0x7B, 0x28, 0xA8, 0x39, 0x59 },
+ { 0x29, 0x3D, 0x94, 0xB1, 0x8C, 0x98, 0xBB, 0x32, 0x23, 0x36, 0x6B, 0x8C, 0xE7, 0x4C, 0x28, 0xFB, 0xDF, 0x28, 0xE1, 0xF8, 0x4A, 0x33, 0x50, 0xB0, 0xEB, 0x2D, 0x18, 0x04, 0xA5, 0x77, 0x57, 0x9B },
+ { 0x2C, 0x2F, 0xA5, 0xC0, 0xB5, 0x15, 0x33, 0x16, 0x5B, 0xC3, 0x75, 0xC2, 0x2E, 0x27, 0x81, 0x76, 0x82, 0x70, 0xA3, 0x83, 0x98, 0x5D, 0x13, 0xBD, 0x6B, 0x67, 0xB6, 0xFD, 0x67, 0xF8, 0x89, 0xEB },
+ { 0xCA, 0xA0, 0x9B, 0x82, 0xB7, 0x25, 0x62, 0xE4, 0x3F, 0x4B, 0x22, 0x75, 0xC0, 0x91, 0x91, 0x8E, 0x62, 0x4D, 0x91, 0x16, 0x61, 0xCC, 0x81, 0x1B, 0xB5, 0xFA, 0xEC, 0x51, 0xF6, 0x08, 0x8E, 0xF7 },
+ { 0x24, 0x76, 0x1E, 0x45, 0xE6, 0x74, 0x39, 0x53, 0x79, 0xFB, 0x17, 0x72, 0x9C, 0x78, 0xCB, 0x93, 0x9E, 0x6F, 0x74, 0xC5, 0xDF, 0xFB, 0x9C, 0x96, 0x1F, 0x49, 0x59, 0x82, 0xC3, 0xED, 0x1F, 0xE3 },
+ { 0x55, 0xB7, 0x0A, 0x82, 0x13, 0x1E, 0xC9, 0x48, 0x88, 0xD7, 0xAB, 0x54, 0xA7, 0xC5, 0x15, 0x25, 0x5C, 0x39, 0x38, 0xBB, 0x10, 0xBC, 0x78, 0x4D, 0xC9, 0xB6, 0x7F, 0x07, 0x6E, 0x34, 0x1A, 0x73 },
+ { 0x6A, 0xB9, 0x05, 0x7B, 0x97, 0x7E, 0xBC, 0x3C, 0xA4, 0xD4, 0xCE, 0x74, 0x50, 0x6C, 0x25, 0xCC, 0xCD, 0xC5, 0x66, 0x49, 0x7C, 0x45, 0x0B, 0x54, 0x15, 0xA3, 0x94, 0x86, 0xF8, 0x65, 0x7A, 0x03 },
+ { 0x24, 0x06, 0x6D, 0xEE, 0xE0, 0xEC, 0xEE, 0x15, 0xA4, 0x5F, 0x0A, 0x32, 0x6D, 0x0F, 0x8D, 0xBC, 0x79, 0x76, 0x1E, 0xBB, 0x93, 0xCF, 0x8C, 0x03, 0x77, 0xAF, 0x44, 0x09, 0x78, 0xFC, 0xF9, 0x94 },
+ { 0x20, 0x00, 0x0D, 0x3F, 0x66, 0xBA, 0x76, 0x86, 0x0D, 0x5A, 0x95, 0x06, 0x88, 0xB9, 0xAA, 0x0D, 0x76, 0xCF, 0xEA, 0x59, 0xB0, 0x05, 0xD8, 0x59, 0x91, 0x4B, 0x1A, 0x46, 0x65, 0x3A, 0x93, 0x9B },
+ { 0xB9, 0x2D, 0xAA, 0x79, 0x60, 0x3E, 0x3B, 0xDB, 0xC3, 0xBF, 0xE0, 0xF4, 0x19, 0xE4, 0x09, 0xB2, 0xEA, 0x10, 0xDC, 0x43, 0x5B, 0xEE, 0xFE, 0x29, 0x59, 0xDA, 0x16, 0x89, 0x5D, 0x5D, 0xCA, 0x1C },
+ { 0xE9, 0x47, 0x94, 0x87, 0x05, 0xB2, 0x06, 0xD5, 0x72, 0xB0, 0xE8, 0xF6, 0x2F, 0x66, 0xA6, 0x55, 0x1C, 0xBD, 0x6B, 0xC3, 0x05, 0xD2, 0x6C, 0xE7, 0x53, 0x9A, 0x12, 0xF9, 0xAA, 0xDF, 0x75, 0x71 },
+ { 0x3D, 0x67, 0xC1, 0xB3, 0xF9, 0xB2, 0x39, 0x10, 0xE3, 0xD3, 0x5E, 0x6B, 0x0F, 0x2C, 0xCF, 0x44, 0xA0, 0xB5, 0x40, 0xA4, 0x5C, 0x18, 0xBA, 0x3C, 0x36, 0x26, 0x4D, 0xD4, 0x8E, 0x96, 0xAF, 0x6A },
+ { 0xC7, 0x55, 0x8B, 0xAB, 0xDA, 0x04, 0xBC, 0xCB, 0x76, 0x4D, 0x0B, 0xBF, 0x33, 0x58, 0x42, 0x51, 0x41, 0x90, 0x2D, 0x22, 0x39, 0x1D, 0x9F, 0x8C, 0x59, 0x15, 0x9F, 0xEC, 0x9E, 0x49, 0xB1, 0x51 },
+ { 0x0B, 0x73, 0x2B, 0xB0, 0x35, 0x67, 0x5A, 0x50, 0xFF, 0x58, 0xF2, 0xC2, 0x42, 0xE4, 0x71, 0x0A, 0xEC, 0xE6, 0x46, 0x70, 0x07, 0x9C, 0x13, 0x04, 0x4C, 0x79, 0xC9, 0xB7, 0x49, 0x1F, 0x70, 0x00 },
+ { 0xD1, 0x20, 0xB5, 0xEF, 0x6D, 0x57, 0xEB, 0xF0, 0x6E, 0xAF, 0x96, 0xBC, 0x93, 0x3C, 0x96, 0x7B, 0x16, 0xCB, 0xE6, 0xE2, 0xBF, 0x00, 0x74, 0x1C, 0x30, 0xAA, 0x1C, 0x54, 0xBA, 0x64, 0x80, 0x1F },
+ { 0x58, 0xD2, 0x12, 0xAD, 0x6F, 0x58, 0xAE, 0xF0, 0xF8, 0x01, 0x16, 0xB4, 0x41, 0xE5, 0x7F, 0x61, 0x95, 0xBF, 0xEF, 0x26, 0xB6, 0x14, 0x63, 0xED, 0xEC, 0x11, 0x83, 0xCD, 0xB0, 0x4F, 0xE7, 0x6D },
+ { 0xB8, 0x83, 0x6F, 0x51, 0xD1, 0xE2, 0x9B, 0xDF, 0xDB, 0xA3, 0x25, 0x56, 0x53, 0x60, 0x26, 0x8B, 0x8F, 0xAD, 0x62, 0x74, 0x73, 0xED, 0xEC, 0xEF, 0x7E, 0xAE, 0xFE, 0xE8, 0x37, 0xC7, 0x40, 0x03 },
+ { 0xC5, 0x47, 0xA3, 0xC1, 0x24, 0xAE, 0x56, 0x85, 0xFF, 0xA7, 0xB8, 0xED, 0xAF, 0x96, 0xEC, 0x86, 0xF8, 0xB2, 0xD0, 0xD5, 0x0C, 0xEE, 0x8B, 0xE3, 0xB1, 0xF0, 0xC7, 0x67, 0x63, 0x06, 0x9D, 0x9C },
+ { 0x5D, 0x16, 0x8B, 0x76, 0x9A, 0x2F, 0x67, 0x85, 0x3D, 0x62, 0x95, 0xF7, 0x56, 0x8B, 0xE4, 0x0B, 0xB7, 0xA1, 0x6B, 0x8D, 0x65, 0xBA, 0x87, 0x63, 0x5D, 0x19, 0x78, 0xD2, 0xAB, 0x11, 0xBA, 0x2A },
+ { 0xA2, 0xF6, 0x75, 0xDC, 0x73, 0x02, 0x63, 0x8C, 0xB6, 0x02, 0x01, 0x06, 0x4C, 0xA5, 0x50, 0x77, 0x71, 0x4D, 0x71, 0xFE, 0x09, 0x6A, 0x31, 0x5F, 0x2F, 0xE7, 0x40, 0x12, 0x77, 0xCA, 0xA5, 0xAF },
+ { 0xC8, 0xAA, 0xB5, 0xCD, 0x01, 0x60, 0xAE, 0x78, 0xCD, 0x2E, 0x8A, 0xC5, 0xFB, 0x0E, 0x09, 0x3C, 0xDB, 0x5C, 0x4B, 0x60, 0x52, 0xA0, 0xA9, 0x7B, 0xB0, 0x42, 0x16, 0x82, 0x6F, 0xA7, 0xA4, 0x37 },
+ { 0xFF, 0x68, 0xCA, 0x40, 0x35, 0xBF, 0xEB, 0x43, 0xFB, 0xF1, 0x45, 0xFD, 0xDD, 0x5E, 0x43, 0xF1, 0xCE, 0xA5, 0x4F, 0x11, 0xF7, 0xBE, 0xE1, 0x30, 0x58, 0xF0, 0x27, 0x32, 0x9A, 0x4A, 0x5F, 0xA4 },
+ { 0x1D, 0x4E, 0x54, 0x87, 0xAE, 0x3C, 0x74, 0x0F, 0x2B, 0xA6, 0xE5, 0x41, 0xAC, 0x91, 0xBC, 0x2B, 0xFC, 0xD2, 0x99, 0x9C, 0x51, 0x8D, 0x80, 0x7B, 0x42, 0x67, 0x48, 0x80, 0x3A, 0x35, 0x0F, 0xD4 },
+ { 0x6D, 0x24, 0x4E, 0x1A, 0x06, 0xCE, 0x4E, 0xF5, 0x78, 0xDD, 0x0F, 0x63, 0xAF, 0xF0, 0x93, 0x67, 0x06, 0x73, 0x51, 0x19, 0xCA, 0x9C, 0x8D, 0x22, 0xD8, 0x6C, 0x80, 0x14, 0x14, 0xAB, 0x97, 0x41 },
+ { 0xDE, 0xCF, 0x73, 0x29, 0xDB, 0xCC, 0x82, 0x7B, 0x8F, 0xC5, 0x24, 0xC9, 0x43, 0x1E, 0x89, 0x98, 0x02, 0x9E, 0xCE, 0x12, 0xCE, 0x93, 0xB7, 0xB2, 0xF3, 0xE7, 0x69, 0xA9, 0x41, 0xFB, 0x8C, 0xEA },
+ { 0x2F, 0xAF, 0xCC, 0x0F, 0x2E, 0x63, 0xCB, 0xD0, 0x77, 0x55, 0xBE, 0x7B, 0x75, 0xEC, 0xEA, 0x0A, 0xDF, 0xF9, 0xAA, 0x5E, 0xDE, 0x2A, 0x52, 0xFD, 0xAB, 0x4D, 0xFD, 0x03, 0x74, 0xCD, 0x48, 0x3F },
+ { 0xAA, 0x85, 0x01, 0x0D, 0xD4, 0x6A, 0x54, 0x6B, 0x53, 0x5E, 0xF4, 0xCF, 0x5F, 0x07, 0xD6, 0x51, 0x61, 0xE8, 0x98, 0x28, 0xF3, 0xA7, 0x7D, 0xB7, 0xB9, 0xB5, 0x6F, 0x0D, 0xF5, 0x9A, 0xAE, 0x45 },
+ { 0x07, 0xE8, 0xE1, 0xEE, 0x73, 0x2C, 0xB0, 0xD3, 0x56, 0xC9, 0xC0, 0xD1, 0x06, 0x9C, 0x89, 0xD1, 0x7A, 0xDF, 0x6A, 0x9A, 0x33, 0x4F, 0x74, 0x5E, 0xC7, 0x86, 0x73, 0x32, 0x54, 0x8C, 0xA8, 0xE9 },
+ { 0x0E, 0x01, 0xE8, 0x1C, 0xAD, 0xA8, 0x16, 0x2B, 0xFD, 0x5F, 0x8A, 0x8C, 0x81, 0x8A, 0x6C, 0x69, 0xFE, 0xDF, 0x02, 0xCE, 0xB5, 0x20, 0x85, 0x23, 0xCB, 0xE5, 0x31, 0x3B, 0x89, 0xCA, 0x10, 0x53 },
+ { 0x6B, 0xB6, 0xC6, 0x47, 0x26, 0x55, 0x08, 0x43, 0x99, 0x85, 0x2E, 0x00, 0x24, 0x9F, 0x8C, 0xB2, 0x47, 0x89, 0x6D, 0x39, 0x2B, 0x02, 0xD7, 0x3B, 0x7F, 0x0D, 0xD8, 0x18, 0xE1, 0xE2, 0x9B, 0x07 },
+ { 0x42, 0xD4, 0x63, 0x6E, 0x20, 0x60, 0xF0, 0x8F, 0x41, 0xC8, 0x82, 0xE7, 0x6B, 0x39, 0x6B, 0x11, 0x2E, 0xF6, 0x27, 0xCC, 0x24, 0xC4, 0x3D, 0xD5, 0xF8, 0x3A, 0x1D, 0x1A, 0x7E, 0xAD, 0x71, 0x1A },
+ { 0x48, 0x58, 0xC9, 0xA1, 0x88, 0xB0, 0x23, 0x4F, 0xB9, 0xA8, 0xD4, 0x7D, 0x0B, 0x41, 0x33, 0x65, 0x0A, 0x03, 0x0B, 0xD0, 0x61, 0x1B, 0x87, 0xC3, 0x89, 0x2E, 0x94, 0x95, 0x1F, 0x8D, 0xF8, 0x52 },
+ { 0x3F, 0xAB, 0x3E, 0x36, 0x98, 0x8D, 0x44, 0x5A, 0x51, 0xC8, 0x78, 0x3E, 0x53, 0x1B, 0xE3, 0xA0, 0x2B, 0xE4, 0x0C, 0xD0, 0x47, 0x96, 0xCF, 0xB6, 0x1D, 0x40, 0x34, 0x74, 0x42, 0xD3, 0xF7, 0x94 },
+ { 0xEB, 0xAB, 0xC4, 0x96, 0x36, 0xBD, 0x43, 0x3D, 0x2E, 0xC8, 0xF0, 0xE5, 0x18, 0x73, 0x2E, 0xF8, 0xFA, 0x21, 0xD4, 0xD0, 0x71, 0xCC, 0x3B, 0xC4, 0x6C, 0xD7, 0x9F, 0xA3, 0x8A, 0x28, 0xB8, 0x10 },
+ { 0xA1, 0xD0, 0x34, 0x35, 0x23, 0xB8, 0x93, 0xFC, 0xA8, 0x4F, 0x47, 0xFE, 0xB4, 0xA6, 0x4D, 0x35, 0x0A, 0x17, 0xD8, 0xEE, 0xF5, 0x49, 0x7E, 0xCE, 0x69, 0x7D, 0x02, 0xD7, 0x91, 0x78, 0xB5, 0x91 },
+ { 0x26, 0x2E, 0xBF, 0xD9, 0x13, 0x0B, 0x7D, 0x28, 0x76, 0x0D, 0x08, 0xEF, 0x8B, 0xFD, 0x3B, 0x86, 0xCD, 0xD3, 0xB2, 0x11, 0x3D, 0x2C, 0xAE, 0xF7, 0xEA, 0x95, 0x1A, 0x30, 0x3D, 0xFA, 0x38, 0x46 },
+ { 0xF7, 0x61, 0x58, 0xED, 0xD5, 0x0A, 0x15, 0x4F, 0xA7, 0x82, 0x03, 0xED, 0x23, 0x62, 0x93, 0x2F, 0xCB, 0x82, 0x53, 0xAA, 0xE3, 0x78, 0x90, 0x3E, 0xDE, 0xD1, 0xE0, 0x3F, 0x70, 0x21, 0xA2, 0x57 },
+ { 0x26, 0x17, 0x8E, 0x95, 0x0A, 0xC7, 0x22, 0xF6, 0x7A, 0xE5, 0x6E, 0x57, 0x1B, 0x28, 0x4C, 0x02, 0x07, 0x68, 0x4A, 0x63, 0x34, 0xA1, 0x77, 0x48, 0xA9, 0x4D, 0x26, 0x0B, 0xC5, 0xF5, 0x52, 0x74 },
+ { 0xC3, 0x78, 0xD1, 0xE4, 0x93, 0xB4, 0x0E, 0xF1, 0x1F, 0xE6, 0xA1, 0x5D, 0x9C, 0x27, 0x37, 0xA3, 0x78, 0x09, 0x63, 0x4C, 0x5A, 0xBA, 0xD5, 0xB3, 0x3D, 0x7E, 0x39, 0x3B, 0x4A, 0xE0, 0x5D, 0x03 },
+ { 0x98, 0x4B, 0xD8, 0x37, 0x91, 0x01, 0xBE, 0x8F, 0xD8, 0x06, 0x12, 0xD8, 0xEA, 0x29, 0x59, 0xA7, 0x86, 0x5E, 0xC9, 0x71, 0x85, 0x23, 0x55, 0x01, 0x07, 0xAE, 0x39, 0x38, 0xDF, 0x32, 0x01, 0x1B },
+ { 0xC6, 0xF2, 0x5A, 0x81, 0x2A, 0x14, 0x48, 0x58, 0xAC, 0x5C, 0xED, 0x37, 0xA9, 0x3A, 0x9F, 0x47, 0x59, 0xBA, 0x0B, 0x1C, 0x0F, 0xDC, 0x43, 0x1D, 0xCE, 0x35, 0xF9, 0xEC, 0x1F, 0x1F, 0x4A, 0x99 },
+ { 0x92, 0x4C, 0x75, 0xC9, 0x44, 0x24, 0xFF, 0x75, 0xE7, 0x4B, 0x8B, 0x4E, 0x94, 0x35, 0x89, 0x58, 0xB0, 0x27, 0xB1, 0x71, 0xDF, 0x5E, 0x57, 0x89, 0x9A, 0xD0, 0xD4, 0xDA, 0xC3, 0x73, 0x53, 0xB6 },
+ { 0x0A, 0xF3, 0x58, 0x92, 0xA6, 0x3F, 0x45, 0x93, 0x1F, 0x68, 0x46, 0xED, 0x19, 0x03, 0x61, 0xCD, 0x07, 0x30, 0x89, 0xE0, 0x77, 0x16, 0x57, 0x14, 0xB5, 0x0B, 0x81, 0xA2, 0xE3, 0xDD, 0x9B, 0xA1 },
+ { 0xCC, 0x80, 0xCE, 0xFB, 0x26, 0xC3, 0xB2, 0xB0, 0xDA, 0xEF, 0x23, 0x3E, 0x60, 0x6D, 0x5F, 0xFC, 0x80, 0xFA, 0x17, 0x42, 0x7D, 0x18, 0xE3, 0x04, 0x89, 0x67, 0x3E, 0x06, 0xEF, 0x4B, 0x87, 0xF7 },
+ { 0xC2, 0xF8, 0xC8, 0x11, 0x74, 0x47, 0xF3, 0x97, 0x8B, 0x08, 0x18, 0xDC, 0xF6, 0xF7, 0x01, 0x16, 0xAC, 0x56, 0xFD, 0x18, 0x4D, 0xD1, 0x27, 0x84, 0x94, 0xE1, 0x03, 0xFC, 0x6D, 0x74, 0xA8, 0x87 },
+ { 0xBD, 0xEC, 0xF6, 0xBF, 0xC1, 0xBA, 0x0D, 0xF6, 0xE8, 0x62, 0xC8, 0x31, 0x99, 0x22, 0x07, 0x79, 0x6A, 0xCC, 0x79, 0x79, 0x68, 0x35, 0x88, 0x28, 0xC0, 0x6E, 0x7A, 0x51, 0xE0, 0x90, 0x09, 0x8F },
+ { 0x24, 0xD1, 0xA2, 0x6E, 0x3D, 0xAB, 0x02, 0xFE, 0x45, 0x72, 0xD2, 0xAA, 0x7D, 0xBD, 0x3E, 0xC3, 0x0F, 0x06, 0x93, 0xDB, 0x26, 0xF2, 0x73, 0xD0, 0xAB, 0x2C, 0xB0, 0xC1, 0x3B, 0x5E, 0x64, 0x51 },
+ { 0xEC, 0x56, 0xF5, 0x8B, 0x09, 0x29, 0x9A, 0x30, 0x0B, 0x14, 0x05, 0x65, 0xD7, 0xD3, 0xE6, 0x87, 0x82, 0xB6, 0xE2, 0xFB, 0xEB, 0x4B, 0x7E, 0xA9, 0x7A, 0xC0, 0x57, 0x98, 0x90, 0x61, 0xDD, 0x3F },
+ { 0x11, 0xA4, 0x37, 0xC1, 0xAB, 0xA3, 0xC1, 0x19, 0xDD, 0xFA, 0xB3, 0x1B, 0x3E, 0x8C, 0x84, 0x1D, 0xEE, 0xEB, 0x91, 0x3E, 0xF5, 0x7F, 0x7E, 0x48, 0xF2, 0xC9, 0xCF, 0x5A, 0x28, 0xFA, 0x42, 0xBC },
+ { 0x53, 0xC7, 0xE6, 0x11, 0x4B, 0x85, 0x0A, 0x2C, 0xB4, 0x96, 0xC9, 0xB3, 0xC6, 0x9A, 0x62, 0x3E, 0xAE, 0xA2, 0xCB, 0x1D, 0x33, 0xDD, 0x81, 0x7E, 0x47, 0x65, 0xED, 0xAA, 0x68, 0x23, 0xC2, 0x28 },
+ { 0x15, 0x4C, 0x3E, 0x96, 0xFE, 0xE5, 0xDB, 0x14, 0xF8, 0x77, 0x3E, 0x18, 0xAF, 0x14, 0x85, 0x79, 0x13, 0x50, 0x9D, 0xA9, 0x99, 0xB4, 0x6C, 0xDD, 0x3D, 0x4C, 0x16, 0x97, 0x60, 0xC8, 0x3A, 0xD2 },
+ { 0x40, 0xB9, 0x91, 0x6F, 0x09, 0x3E, 0x02, 0x7A, 0x87, 0x86, 0x64, 0x18, 0x18, 0x92, 0x06, 0x20, 0x47, 0x2F, 0xBC, 0xF6, 0x8F, 0x70, 0x1D, 0x1B, 0x68, 0x06, 0x32, 0xE6, 0x99, 0x6B, 0xDE, 0xD3 },
+ { 0x24, 0xC4, 0xCB, 0xBA, 0x07, 0x11, 0x98, 0x31, 0xA7, 0x26, 0xB0, 0x53, 0x05, 0xD9, 0x6D, 0xA0, 0x2F, 0xF8, 0xB1, 0x48, 0xF0, 0xDA, 0x44, 0x0F, 0xE2, 0x33, 0xBC, 0xAA, 0x32, 0xC7, 0x2F, 0x6F },
+ { 0x5D, 0x20, 0x15, 0x10, 0x25, 0x00, 0x20, 0xB7, 0x83, 0x68, 0x96, 0x88, 0xAB, 0xBF, 0x8E, 0xCF, 0x25, 0x94, 0xA9, 0x6A, 0x08, 0xF2, 0xBF, 0xEC, 0x6C, 0xE0, 0x57, 0x44, 0x65, 0xDD, 0xED, 0x71 },
+ { 0x04, 0x3B, 0x97, 0xE3, 0x36, 0xEE, 0x6F, 0xDB, 0xBE, 0x2B, 0x50, 0xF2, 0x2A, 0xF8, 0x32, 0x75, 0xA4, 0x08, 0x48, 0x05, 0xD2, 0xD5, 0x64, 0x59, 0x62, 0x45, 0x4B, 0x6C, 0x9B, 0x80, 0x53, 0xA0 },
+ { 0x56, 0x48, 0x35, 0xCB, 0xAE, 0xA7, 0x74, 0x94, 0x85, 0x68, 0xBE, 0x36, 0xCF, 0x52, 0xFC, 0xDD, 0x83, 0x93, 0x4E, 0xB0, 0xA2, 0x75, 0x12, 0xDB, 0xE3, 0xE2, 0xDB, 0x47, 0xB9, 0xE6, 0x63, 0x5A },
+ { 0xF2, 0x1C, 0x33, 0xF4, 0x7B, 0xDE, 0x40, 0xA2, 0xA1, 0x01, 0xC9, 0xCD, 0xE8, 0x02, 0x7A, 0xAF, 0x61, 0xA3, 0x13, 0x7D, 0xE2, 0x42, 0x2B, 0x30, 0x03, 0x5A, 0x04, 0xC2, 0x70, 0x89, 0x41, 0x83 },
+ { 0x9D, 0xB0, 0xEF, 0x74, 0xE6, 0x6C, 0xBB, 0x84, 0x2E, 0xB0, 0xE0, 0x73, 0x43, 0xA0, 0x3C, 0x5C, 0x56, 0x7E, 0x37, 0x2B, 0x3F, 0x23, 0xB9, 0x43, 0xC7, 0x88, 0xA4, 0xF2, 0x50, 0xF6, 0x78, 0x91 },
+ { 0xAB, 0x8D, 0x08, 0x65, 0x5F, 0xF1, 0xD3, 0xFE, 0x87, 0x58, 0xD5, 0x62, 0x23, 0x5F, 0xD2, 0x3E, 0x7C, 0xF9, 0xDC, 0xAA, 0xD6, 0x58, 0x87, 0x2A, 0x49, 0xE5, 0xD3, 0x18, 0x3B, 0x6C, 0xCE, 0xBD },
+ { 0x6F, 0x27, 0xF7, 0x7E, 0x7B, 0xCF, 0x46, 0xA1, 0xE9, 0x63, 0xAD, 0xE0, 0x30, 0x97, 0x33, 0x54, 0x30, 0x31, 0xDC, 0xCD, 0xD4, 0x7C, 0xAA, 0xC1, 0x74, 0xD7, 0xD2, 0x7C, 0xE8, 0x07, 0x7E, 0x8B },
+ { 0xE3, 0xCD, 0x54, 0xDA, 0x7E, 0x44, 0x4C, 0xAA, 0x62, 0x07, 0x56, 0x95, 0x25, 0xA6, 0x70, 0xEB, 0xAE, 0x12, 0x78, 0xDE, 0x4E, 0x3F, 0xE2, 0x68, 0x4B, 0x3E, 0x33, 0xF5, 0xEF, 0x90, 0xCC, 0x1B },
+ { 0xB2, 0xC3, 0xE3, 0x3A, 0x51, 0xD2, 0x2C, 0x4C, 0x08, 0xFC, 0x09, 0x89, 0xC8, 0x73, 0xC9, 0xCC, 0x41, 0x50, 0x57, 0x9B, 0x1E, 0x61, 0x63, 0xFA, 0x69, 0x4A, 0xD5, 0x1D, 0x53, 0xD7, 0x12, 0xDC },
+ { 0xBE, 0x7F, 0xDA, 0x98, 0x3E, 0x13, 0x18, 0x9B, 0x4C, 0x77, 0xE0, 0xA8, 0x09, 0x20, 0xB6, 0xE0, 0xE0, 0xEA, 0x80, 0xC3, 0xB8, 0x4D, 0xBE, 0x7E, 0x71, 0x17, 0xD2, 0x53, 0xF4, 0x81, 0x12, 0xF4 },
+ { 0xB6, 0x00, 0x8C, 0x28, 0xFA, 0xE0, 0x8A, 0xA4, 0x27, 0xE5, 0xBD, 0x3A, 0xAD, 0x36, 0xF1, 0x00, 0x21, 0xF1, 0x6C, 0x77, 0xCF, 0xEA, 0xBE, 0xD0, 0x7F, 0x97, 0xCC, 0x7D, 0xC1, 0xF1, 0x28, 0x4A },
+ { 0x6E, 0x4E, 0x67, 0x60, 0xC5, 0x38, 0xF2, 0xE9, 0x7B, 0x3A, 0xDB, 0xFB, 0xBC, 0xDE, 0x57, 0xF8, 0x96, 0x6B, 0x7E, 0xA8, 0xFC, 0xB5, 0xBF, 0x7E, 0xFE, 0xC9, 0x13, 0xFD, 0x2A, 0x2B, 0x0C, 0x55 },
+ { 0x4A, 0xE5, 0x1F, 0xD1, 0x83, 0x4A, 0xA5, 0xBD, 0x9A, 0x6F, 0x7E, 0xC3, 0x9F, 0xC6, 0x63, 0x33, 0x8D, 0xC5, 0xD2, 0xE2, 0x07, 0x61, 0x56, 0x6D, 0x90, 0xCC, 0x68, 0xB1, 0xCB, 0x87, 0x5E, 0xD8 },
+ { 0xB6, 0x73, 0xAA, 0xD7, 0x5A, 0xB1, 0xFD, 0xB5, 0x40, 0x1A, 0xBF, 0xA1, 0xBF, 0x89, 0xF3, 0xAD, 0xD2, 0xEB, 0xC4, 0x68, 0xDF, 0x36, 0x24, 0xA4, 0x78, 0xF4, 0xFE, 0x85, 0x9D, 0x8D, 0x55, 0xE2 },
+ { 0x13, 0xC9, 0x47, 0x1A, 0x98, 0x55, 0x91, 0x35, 0x39, 0x83, 0x66, 0x60, 0x39, 0x8D, 0xA0, 0xF3, 0xF9, 0x9A, 0xDA, 0x08, 0x47, 0x9C, 0x69, 0xD1, 0xB7, 0xFC, 0xAA, 0x34, 0x61, 0xDD, 0x7E, 0x59 },
+ { 0x2C, 0x11, 0xF4, 0xA7, 0xF9, 0x9A, 0x1D, 0x23, 0xA5, 0x8B, 0xB6, 0x36, 0x35, 0x0F, 0xE8, 0x49, 0xF2, 0x9C, 0xBA, 0xC1, 0xB2, 0xA1, 0x11, 0x2D, 0x9F, 0x1E, 0xD5, 0xBC, 0x5B, 0x31, 0x3C, 0xCD },
+ { 0xC7, 0xD3, 0xC0, 0x70, 0x6B, 0x11, 0xAE, 0x74, 0x1C, 0x05, 0xA1, 0xEF, 0x15, 0x0D, 0xD6, 0x5B, 0x54, 0x94, 0xD6, 0xD5, 0x4C, 0x9A, 0x86, 0xE2, 0x61, 0x78, 0x54, 0xE6, 0xAE, 0xEE, 0xBB, 0xD9 },
+ { 0x19, 0x4E, 0x10, 0xC9, 0x38, 0x93, 0xAF, 0xA0, 0x64, 0xC3, 0xAC, 0x04, 0xC0, 0xDD, 0x80, 0x8D, 0x79, 0x1C, 0x3D, 0x4B, 0x75, 0x56, 0xE8, 0x9D, 0x8D, 0x9C, 0xB2, 0x25, 0xC4, 0xB3, 0x33, 0x39 },
+ { 0x6F, 0xC4, 0x98, 0x8B, 0x8F, 0x78, 0x54, 0x6B, 0x16, 0x88, 0x99, 0x18, 0x45, 0x90, 0x8F, 0x13, 0x4B, 0x6A, 0x48, 0x2E, 0x69, 0x94, 0xB3, 0xD4, 0x83, 0x17, 0xBF, 0x08, 0xDB, 0x29, 0x21, 0x85 },
+ { 0x56, 0x65, 0xBE, 0xB8, 0xB0, 0x95, 0x55, 0x25, 0x81, 0x3B, 0x59, 0x81, 0xCD, 0x14, 0x2E, 0xD4, 0xD0, 0x3F, 0xBA, 0x38, 0xA6, 0xF3, 0xE5, 0xAD, 0x26, 0x8E, 0x0C, 0xC2, 0x70, 0xD1, 0xCD, 0x11 },
+ { 0xB8, 0x83, 0xD6, 0x8F, 0x5F, 0xE5, 0x19, 0x36, 0x43, 0x1B, 0xA4, 0x25, 0x67, 0x38, 0x05, 0x3B, 0x1D, 0x04, 0x26, 0xD4, 0xCB, 0x64, 0xB1, 0x6E, 0x83, 0xBA, 0xDC, 0x5E, 0x9F, 0xBE, 0x3B, 0x81 },
+ { 0x53, 0xE7, 0xB2, 0x7E, 0xA5, 0x9C, 0x2F, 0x6D, 0xBB, 0x50, 0x76, 0x9E, 0x43, 0x55, 0x4D, 0xF3, 0x5A, 0xF8, 0x9F, 0x48, 0x22, 0xD0, 0x46, 0x6B, 0x00, 0x7D, 0xD6, 0xF6, 0xDE, 0xAF, 0xFF, 0x02 },
+ { 0x1F, 0x1A, 0x02, 0x29, 0xD4, 0x64, 0x0F, 0x01, 0x90, 0x15, 0x88, 0xD9, 0xDE, 0xC2, 0x2D, 0x13, 0xFC, 0x3E, 0xB3, 0x4A, 0x61, 0xB3, 0x29, 0x38, 0xEF, 0xBF, 0x53, 0x34, 0xB2, 0x80, 0x0A, 0xFA },
+ { 0xC2, 0xB4, 0x05, 0xAF, 0xA0, 0xFA, 0x66, 0x68, 0x85, 0x2A, 0xEE, 0x4D, 0x88, 0x04, 0x08, 0x53, 0xFA, 0xB8, 0x00, 0xE7, 0x2B, 0x57, 0x58, 0x14, 0x18, 0xE5, 0x50, 0x6F, 0x21, 0x4C, 0x7D, 0x1F },
+ { 0xC0, 0x8A, 0xA1, 0xC2, 0x86, 0xD7, 0x09, 0xFD, 0xC7, 0x47, 0x37, 0x44, 0x97, 0x71, 0x88, 0xC8, 0x95, 0xBA, 0x01, 0x10, 0x14, 0x24, 0x7E, 0x4E, 0xFA, 0x8D, 0x07, 0xE7, 0x8F, 0xEC, 0x69, 0x5C },
+ { 0xF0, 0x3F, 0x57, 0x89, 0xD3, 0x33, 0x6B, 0x80, 0xD0, 0x02, 0xD5, 0x9F, 0xDF, 0x91, 0x8B, 0xDB, 0x77, 0x5B, 0x00, 0x95, 0x6E, 0xD5, 0x52, 0x8E, 0x86, 0xAA, 0x99, 0x4A, 0xCB, 0x38, 0xFE, 0x2D }
+};
+
+static const u8 blake2s_keyed_testvecs[][BLAKE2S_OUTBYTES] __initconst = {
+ { 0x48, 0xA8, 0x99, 0x7D, 0xA4, 0x07, 0x87, 0x6B, 0x3D, 0x79, 0xC0, 0xD9, 0x23, 0x25, 0xAD, 0x3B, 0x89, 0xCB, 0xB7, 0x54, 0xD8, 0x6A, 0xB7, 0x1A, 0xEE, 0x04, 0x7A, 0xD3, 0x45, 0xFD, 0x2C, 0x49 },
+ { 0x40, 0xD1, 0x5F, 0xEE, 0x7C, 0x32, 0x88, 0x30, 0x16, 0x6A, 0xC3, 0xF9, 0x18, 0x65, 0x0F, 0x80, 0x7E, 0x7E, 0x01, 0xE1, 0x77, 0x25, 0x8C, 0xDC, 0x0A, 0x39, 0xB1, 0x1F, 0x59, 0x80, 0x66, 0xF1 },
+ { 0x6B, 0xB7, 0x13, 0x00, 0x64, 0x4C, 0xD3, 0x99, 0x1B, 0x26, 0xCC, 0xD4, 0xD2, 0x74, 0xAC, 0xD1, 0xAD, 0xEA, 0xB8, 0xB1, 0xD7, 0x91, 0x45, 0x46, 0xC1, 0x19, 0x8B, 0xBE, 0x9F, 0xC9, 0xD8, 0x03 },
+ { 0x1D, 0x22, 0x0D, 0xBE, 0x2E, 0xE1, 0x34, 0x66, 0x1F, 0xDF, 0x6D, 0x9E, 0x74, 0xB4, 0x17, 0x04, 0x71, 0x05, 0x56, 0xF2, 0xF6, 0xE5, 0xA0, 0x91, 0xB2, 0x27, 0x69, 0x74, 0x45, 0xDB, 0xEA, 0x6B },
+ { 0xF6, 0xC3, 0xFB, 0xAD, 0xB4, 0xCC, 0x68, 0x7A, 0x00, 0x64, 0xA5, 0xBE, 0x6E, 0x79, 0x1B, 0xEC, 0x63, 0xB8, 0x68, 0xAD, 0x62, 0xFB, 0xA6, 0x1B, 0x37, 0x57, 0xEF, 0x9C, 0xA5, 0x2E, 0x05, 0xB2 },
+ { 0x49, 0xC1, 0xF2, 0x11, 0x88, 0xDF, 0xD7, 0x69, 0xAE, 0xA0, 0xE9, 0x11, 0xDD, 0x6B, 0x41, 0xF1, 0x4D, 0xAB, 0x10, 0x9D, 0x2B, 0x85, 0x97, 0x7A, 0xA3, 0x08, 0x8B, 0x5C, 0x70, 0x7E, 0x85, 0x98 },
+ { 0xFD, 0xD8, 0x99, 0x3D, 0xCD, 0x43, 0xF6, 0x96, 0xD4, 0x4F, 0x3C, 0xEA, 0x0F, 0xF3, 0x53, 0x45, 0x23, 0x4E, 0xC8, 0xEE, 0x08, 0x3E, 0xB3, 0xCA, 0xDA, 0x01, 0x7C, 0x7F, 0x78, 0xC1, 0x71, 0x43 },
+ { 0xE6, 0xC8, 0x12, 0x56, 0x37, 0x43, 0x8D, 0x09, 0x05, 0xB7, 0x49, 0xF4, 0x65, 0x60, 0xAC, 0x89, 0xFD, 0x47, 0x1C, 0xF8, 0x69, 0x2E, 0x28, 0xFA, 0xB9, 0x82, 0xF7, 0x3F, 0x01, 0x9B, 0x83, 0xA9 },
+ { 0x19, 0xFC, 0x8C, 0xA6, 0x97, 0x9D, 0x60, 0xE6, 0xED, 0xD3, 0xB4, 0x54, 0x1E, 0x2F, 0x96, 0x7C, 0xED, 0x74, 0x0D, 0xF6, 0xEC, 0x1E, 0xAE, 0xBB, 0xFE, 0x81, 0x38, 0x32, 0xE9, 0x6B, 0x29, 0x74 },
+ { 0xA6, 0xAD, 0x77, 0x7C, 0xE8, 0x81, 0xB5, 0x2B, 0xB5, 0xA4, 0x42, 0x1A, 0xB6, 0xCD, 0xD2, 0xDF, 0xBA, 0x13, 0xE9, 0x63, 0x65, 0x2D, 0x4D, 0x6D, 0x12, 0x2A, 0xEE, 0x46, 0x54, 0x8C, 0x14, 0xA7 },
+ { 0xF5, 0xC4, 0xB2, 0xBA, 0x1A, 0x00, 0x78, 0x1B, 0x13, 0xAB, 0xA0, 0x42, 0x52, 0x42, 0xC6, 0x9C, 0xB1, 0x55, 0x2F, 0x3F, 0x71, 0xA9, 0xA3, 0xBB, 0x22, 0xB4, 0xA6, 0xB4, 0x27, 0x7B, 0x46, 0xDD },
+ { 0xE3, 0x3C, 0x4C, 0x9B, 0xD0, 0xCC, 0x7E, 0x45, 0xC8, 0x0E, 0x65, 0xC7, 0x7F, 0xA5, 0x99, 0x7F, 0xEC, 0x70, 0x02, 0x73, 0x85, 0x41, 0x50, 0x9E, 0x68, 0xA9, 0x42, 0x38, 0x91, 0xE8, 0x22, 0xA3 },
+ { 0xFB, 0xA1, 0x61, 0x69, 0xB2, 0xC3, 0xEE, 0x10, 0x5B, 0xE6, 0xE1, 0xE6, 0x50, 0xE5, 0xCB, 0xF4, 0x07, 0x46, 0xB6, 0x75, 0x3D, 0x03, 0x6A, 0xB5, 0x51, 0x79, 0x01, 0x4A, 0xD7, 0xEF, 0x66, 0x51 },
+ { 0xF5, 0xC4, 0xBE, 0xC6, 0xD6, 0x2F, 0xC6, 0x08, 0xBF, 0x41, 0xCC, 0x11, 0x5F, 0x16, 0xD6, 0x1C, 0x7E, 0xFD, 0x3F, 0xF6, 0xC6, 0x56, 0x92, 0xBB, 0xE0, 0xAF, 0xFF, 0xB1, 0xFE, 0xDE, 0x74, 0x75 },
+ { 0xA4, 0x86, 0x2E, 0x76, 0xDB, 0x84, 0x7F, 0x05, 0xBA, 0x17, 0xED, 0xE5, 0xDA, 0x4E, 0x7F, 0x91, 0xB5, 0x92, 0x5C, 0xF1, 0xAD, 0x4B, 0xA1, 0x27, 0x32, 0xC3, 0x99, 0x57, 0x42, 0xA5, 0xCD, 0x6E },
+ { 0x65, 0xF4, 0xB8, 0x60, 0xCD, 0x15, 0xB3, 0x8E, 0xF8, 0x14, 0xA1, 0xA8, 0x04, 0x31, 0x4A, 0x55, 0xBE, 0x95, 0x3C, 0xAA, 0x65, 0xFD, 0x75, 0x8A, 0xD9, 0x89, 0xFF, 0x34, 0xA4, 0x1C, 0x1E, 0xEA },
+ { 0x19, 0xBA, 0x23, 0x4F, 0x0A, 0x4F, 0x38, 0x63, 0x7D, 0x18, 0x39, 0xF9, 0xD9, 0xF7, 0x6A, 0xD9, 0x1C, 0x85, 0x22, 0x30, 0x71, 0x43, 0xC9, 0x7D, 0x5F, 0x93, 0xF6, 0x92, 0x74, 0xCE, 0xC9, 0xA7 },
+ { 0x1A, 0x67, 0x18, 0x6C, 0xA4, 0xA5, 0xCB, 0x8E, 0x65, 0xFC, 0xA0, 0xE2, 0xEC, 0xBC, 0x5D, 0xDC, 0x14, 0xAE, 0x38, 0x1B, 0xB8, 0xBF, 0xFE, 0xB9, 0xE0, 0xA1, 0x03, 0x44, 0x9E, 0x3E, 0xF0, 0x3C },
+ { 0xAF, 0xBE, 0xA3, 0x17, 0xB5, 0xA2, 0xE8, 0x9C, 0x0B, 0xD9, 0x0C, 0xCF, 0x5D, 0x7F, 0xD0, 0xED, 0x57, 0xFE, 0x58, 0x5E, 0x4B, 0xE3, 0x27, 0x1B, 0x0A, 0x6B, 0xF0, 0xF5, 0x78, 0x6B, 0x0F, 0x26 },
+ { 0xF1, 0xB0, 0x15, 0x58, 0xCE, 0x54, 0x12, 0x62, 0xF5, 0xEC, 0x34, 0x29, 0x9D, 0x6F, 0xB4, 0x09, 0x00, 0x09, 0xE3, 0x43, 0x4B, 0xE2, 0xF4, 0x91, 0x05, 0xCF, 0x46, 0xAF, 0x4D, 0x2D, 0x41, 0x24 },
+ { 0x13, 0xA0, 0xA0, 0xC8, 0x63, 0x35, 0x63, 0x5E, 0xAA, 0x74, 0xCA, 0x2D, 0x5D, 0x48, 0x8C, 0x79, 0x7B, 0xBB, 0x4F, 0x47, 0xDC, 0x07, 0x10, 0x50, 0x15, 0xED, 0x6A, 0x1F, 0x33, 0x09, 0xEF, 0xCE },
+ { 0x15, 0x80, 0xAF, 0xEE, 0xBE, 0xBB, 0x34, 0x6F, 0x94, 0xD5, 0x9F, 0xE6, 0x2D, 0xA0, 0xB7, 0x92, 0x37, 0xEA, 0xD7, 0xB1, 0x49, 0x1F, 0x56, 0x67, 0xA9, 0x0E, 0x45, 0xED, 0xF6, 0xCA, 0x8B, 0x03 },
+ { 0x20, 0xBE, 0x1A, 0x87, 0x5B, 0x38, 0xC5, 0x73, 0xDD, 0x7F, 0xAA, 0xA0, 0xDE, 0x48, 0x9D, 0x65, 0x5C, 0x11, 0xEF, 0xB6, 0xA5, 0x52, 0x69, 0x8E, 0x07, 0xA2, 0xD3, 0x31, 0xB5, 0xF6, 0x55, 0xC3 },
+ { 0xBE, 0x1F, 0xE3, 0xC4, 0xC0, 0x40, 0x18, 0xC5, 0x4C, 0x4A, 0x0F, 0x6B, 0x9A, 0x2E, 0xD3, 0xC5, 0x3A, 0xBE, 0x3A, 0x9F, 0x76, 0xB4, 0xD2, 0x6D, 0xE5, 0x6F, 0xC9, 0xAE, 0x95, 0x05, 0x9A, 0x99 },
+ { 0xE3, 0xE3, 0xAC, 0xE5, 0x37, 0xEB, 0x3E, 0xDD, 0x84, 0x63, 0xD9, 0xAD, 0x35, 0x82, 0xE1, 0x3C, 0xF8, 0x65, 0x33, 0xFF, 0xDE, 0x43, 0xD6, 0x68, 0xDD, 0x2E, 0x93, 0xBB, 0xDB, 0xD7, 0x19, 0x5A },
+ { 0x11, 0x0C, 0x50, 0xC0, 0xBF, 0x2C, 0x6E, 0x7A, 0xEB, 0x7E, 0x43, 0x5D, 0x92, 0xD1, 0x32, 0xAB, 0x66, 0x55, 0x16, 0x8E, 0x78, 0xA2, 0xDE, 0xCD, 0xEC, 0x33, 0x30, 0x77, 0x76, 0x84, 0xD9, 0xC1 },
+ { 0xE9, 0xBA, 0x8F, 0x50, 0x5C, 0x9C, 0x80, 0xC0, 0x86, 0x66, 0xA7, 0x01, 0xF3, 0x36, 0x7E, 0x6C, 0xC6, 0x65, 0xF3, 0x4B, 0x22, 0xE7, 0x3C, 0x3C, 0x04, 0x17, 0xEB, 0x1C, 0x22, 0x06, 0x08, 0x2F },
+ { 0x26, 0xCD, 0x66, 0xFC, 0xA0, 0x23, 0x79, 0xC7, 0x6D, 0xF1, 0x23, 0x17, 0x05, 0x2B, 0xCA, 0xFD, 0x6C, 0xD8, 0xC3, 0xA7, 0xB8, 0x90, 0xD8, 0x05, 0xF3, 0x6C, 0x49, 0x98, 0x97, 0x82, 0x43, 0x3A },
+ { 0x21, 0x3F, 0x35, 0x96, 0xD6, 0xE3, 0xA5, 0xD0, 0xE9, 0x93, 0x2C, 0xD2, 0x15, 0x91, 0x46, 0x01, 0x5E, 0x2A, 0xBC, 0x94, 0x9F, 0x47, 0x29, 0xEE, 0x26, 0x32, 0xFE, 0x1E, 0xDB, 0x78, 0xD3, 0x37 },
+ { 0x10, 0x15, 0xD7, 0x01, 0x08, 0xE0, 0x3B, 0xE1, 0xC7, 0x02, 0xFE, 0x97, 0x25, 0x36, 0x07, 0xD1, 0x4A, 0xEE, 0x59, 0x1F, 0x24, 0x13, 0xEA, 0x67, 0x87, 0x42, 0x7B, 0x64, 0x59, 0xFF, 0x21, 0x9A },
+ { 0x3C, 0xA9, 0x89, 0xDE, 0x10, 0xCF, 0xE6, 0x09, 0x90, 0x94, 0x72, 0xC8, 0xD3, 0x56, 0x10, 0x80, 0x5B, 0x2F, 0x97, 0x77, 0x34, 0xCF, 0x65, 0x2C, 0xC6, 0x4B, 0x3B, 0xFC, 0x88, 0x2D, 0x5D, 0x89 },
+ { 0xB6, 0x15, 0x6F, 0x72, 0xD3, 0x80, 0xEE, 0x9E, 0xA6, 0xAC, 0xD1, 0x90, 0x46, 0x4F, 0x23, 0x07, 0xA5, 0xC1, 0x79, 0xEF, 0x01, 0xFD, 0x71, 0xF9, 0x9F, 0x2D, 0x0F, 0x7A, 0x57, 0x36, 0x0A, 0xEA },
+ { 0xC0, 0x3B, 0xC6, 0x42, 0xB2, 0x09, 0x59, 0xCB, 0xE1, 0x33, 0xA0, 0x30, 0x3E, 0x0C, 0x1A, 0xBF, 0xF3, 0xE3, 0x1E, 0xC8, 0xE1, 0xA3, 0x28, 0xEC, 0x85, 0x65, 0xC3, 0x6D, 0xEC, 0xFF, 0x52, 0x65 },
+ { 0x2C, 0x3E, 0x08, 0x17, 0x6F, 0x76, 0x0C, 0x62, 0x64, 0xC3, 0xA2, 0xCD, 0x66, 0xFE, 0xC6, 0xC3, 0xD7, 0x8D, 0xE4, 0x3F, 0xC1, 0x92, 0x45, 0x7B, 0x2A, 0x4A, 0x66, 0x0A, 0x1E, 0x0E, 0xB2, 0x2B },
+ { 0xF7, 0x38, 0xC0, 0x2F, 0x3C, 0x1B, 0x19, 0x0C, 0x51, 0x2B, 0x1A, 0x32, 0xDE, 0xAB, 0xF3, 0x53, 0x72, 0x8E, 0x0E, 0x9A, 0xB0, 0x34, 0x49, 0x0E, 0x3C, 0x34, 0x09, 0x94, 0x6A, 0x97, 0xAE, 0xEC },
+ { 0x8B, 0x18, 0x80, 0xDF, 0x30, 0x1C, 0xC9, 0x63, 0x41, 0x88, 0x11, 0x08, 0x89, 0x64, 0x83, 0x92, 0x87, 0xFF, 0x7F, 0xE3, 0x1C, 0x49, 0xEA, 0x6E, 0xBD, 0x9E, 0x48, 0xBD, 0xEE, 0xE4, 0x97, 0xC5 },
+ { 0x1E, 0x75, 0xCB, 0x21, 0xC6, 0x09, 0x89, 0x02, 0x03, 0x75, 0xF1, 0xA7, 0xA2, 0x42, 0x83, 0x9F, 0x0B, 0x0B, 0x68, 0x97, 0x3A, 0x4C, 0x2A, 0x05, 0xCF, 0x75, 0x55, 0xED, 0x5A, 0xAE, 0xC4, 0xC1 },
+ { 0x62, 0xBF, 0x8A, 0x9C, 0x32, 0xA5, 0xBC, 0xCF, 0x29, 0x0B, 0x6C, 0x47, 0x4D, 0x75, 0xB2, 0xA2, 0xA4, 0x09, 0x3F, 0x1A, 0x9E, 0x27, 0x13, 0x94, 0x33, 0xA8, 0xF2, 0xB3, 0xBC, 0xE7, 0xB8, 0xD7 },
+ { 0x16, 0x6C, 0x83, 0x50, 0xD3, 0x17, 0x3B, 0x5E, 0x70, 0x2B, 0x78, 0x3D, 0xFD, 0x33, 0xC6, 0x6E, 0xE0, 0x43, 0x27, 0x42, 0xE9, 0xB9, 0x2B, 0x99, 0x7F, 0xD2, 0x3C, 0x60, 0xDC, 0x67, 0x56, 0xCA },
+ { 0x04, 0x4A, 0x14, 0xD8, 0x22, 0xA9, 0x0C, 0xAC, 0xF2, 0xF5, 0xA1, 0x01, 0x42, 0x8A, 0xDC, 0x8F, 0x41, 0x09, 0x38, 0x6C, 0xCB, 0x15, 0x8B, 0xF9, 0x05, 0xC8, 0x61, 0x8B, 0x8E, 0xE2, 0x4E, 0xC3 },
+ { 0x38, 0x7D, 0x39, 0x7E, 0xA4, 0x3A, 0x99, 0x4B, 0xE8, 0x4D, 0x2D, 0x54, 0x4A, 0xFB, 0xE4, 0x81, 0xA2, 0x00, 0x0F, 0x55, 0x25, 0x26, 0x96, 0xBB, 0xA2, 0xC5, 0x0C, 0x8E, 0xBD, 0x10, 0x13, 0x47 },
+ { 0x56, 0xF8, 0xCC, 0xF1, 0xF8, 0x64, 0x09, 0xB4, 0x6C, 0xE3, 0x61, 0x66, 0xAE, 0x91, 0x65, 0x13, 0x84, 0x41, 0x57, 0x75, 0x89, 0xDB, 0x08, 0xCB, 0xC5, 0xF6, 0x6C, 0xA2, 0x97, 0x43, 0xB9, 0xFD },
+ { 0x97, 0x06, 0xC0, 0x92, 0xB0, 0x4D, 0x91, 0xF5, 0x3D, 0xFF, 0x91, 0xFA, 0x37, 0xB7, 0x49, 0x3D, 0x28, 0xB5, 0x76, 0xB5, 0xD7, 0x10, 0x46, 0x9D, 0xF7, 0x94, 0x01, 0x66, 0x22, 0x36, 0xFC, 0x03 },
+ { 0x87, 0x79, 0x68, 0x68, 0x6C, 0x06, 0x8C, 0xE2, 0xF7, 0xE2, 0xAD, 0xCF, 0xF6, 0x8B, 0xF8, 0x74, 0x8E, 0xDF, 0x3C, 0xF8, 0x62, 0xCF, 0xB4, 0xD3, 0x94, 0x7A, 0x31, 0x06, 0x95, 0x80, 0x54, 0xE3 },
+ { 0x88, 0x17, 0xE5, 0x71, 0x98, 0x79, 0xAC, 0xF7, 0x02, 0x47, 0x87, 0xEC, 0xCD, 0xB2, 0x71, 0x03, 0x55, 0x66, 0xCF, 0xA3, 0x33, 0xE0, 0x49, 0x40, 0x7C, 0x01, 0x78, 0xCC, 0xC5, 0x7A, 0x5B, 0x9F },
+ { 0x89, 0x38, 0x24, 0x9E, 0x4B, 0x50, 0xCA, 0xDA, 0xCC, 0xDF, 0x5B, 0x18, 0x62, 0x13, 0x26, 0xCB, 0xB1, 0x52, 0x53, 0xE3, 0x3A, 0x20, 0xF5, 0x63, 0x6E, 0x99, 0x5D, 0x72, 0x47, 0x8D, 0xE4, 0x72 },
+ { 0xF1, 0x64, 0xAB, 0xBA, 0x49, 0x63, 0xA4, 0x4D, 0x10, 0x72, 0x57, 0xE3, 0x23, 0x2D, 0x90, 0xAC, 0xA5, 0xE6, 0x6A, 0x14, 0x08, 0x24, 0x8C, 0x51, 0x74, 0x1E, 0x99, 0x1D, 0xB5, 0x22, 0x77, 0x56 },
+ { 0xD0, 0x55, 0x63, 0xE2, 0xB1, 0xCB, 0xA0, 0xC4, 0xA2, 0xA1, 0xE8, 0xBD, 0xE3, 0xA1, 0xA0, 0xD9, 0xF5, 0xB4, 0x0C, 0x85, 0xA0, 0x70, 0xD6, 0xF5, 0xFB, 0x21, 0x06, 0x6E, 0xAD, 0x5D, 0x06, 0x01 },
+ { 0x03, 0xFB, 0xB1, 0x63, 0x84, 0xF0, 0xA3, 0x86, 0x6F, 0x4C, 0x31, 0x17, 0x87, 0x76, 0x66, 0xEF, 0xBF, 0x12, 0x45, 0x97, 0x56, 0x4B, 0x29, 0x3D, 0x4A, 0xAB, 0x0D, 0x26, 0x9F, 0xAB, 0xDD, 0xFA },
+ { 0x5F, 0xA8, 0x48, 0x6A, 0xC0, 0xE5, 0x29, 0x64, 0xD1, 0x88, 0x1B, 0xBE, 0x33, 0x8E, 0xB5, 0x4B, 0xE2, 0xF7, 0x19, 0x54, 0x92, 0x24, 0x89, 0x20, 0x57, 0xB4, 0xDA, 0x04, 0xBA, 0x8B, 0x34, 0x75 },
+ { 0xCD, 0xFA, 0xBC, 0xEE, 0x46, 0x91, 0x11, 0x11, 0x23, 0x6A, 0x31, 0x70, 0x8B, 0x25, 0x39, 0xD7, 0x1F, 0xC2, 0x11, 0xD9, 0xB0, 0x9C, 0x0D, 0x85, 0x30, 0xA1, 0x1E, 0x1D, 0xBF, 0x6E, 0xED, 0x01 },
+ { 0x4F, 0x82, 0xDE, 0x03, 0xB9, 0x50, 0x47, 0x93, 0xB8, 0x2A, 0x07, 0xA0, 0xBD, 0xCD, 0xFF, 0x31, 0x4D, 0x75, 0x9E, 0x7B, 0x62, 0xD2, 0x6B, 0x78, 0x49, 0x46, 0xB0, 0xD3, 0x6F, 0x91, 0x6F, 0x52 },
+ { 0x25, 0x9E, 0xC7, 0xF1, 0x73, 0xBC, 0xC7, 0x6A, 0x09, 0x94, 0xC9, 0x67, 0xB4, 0xF5, 0xF0, 0x24, 0xC5, 0x60, 0x57, 0xFB, 0x79, 0xC9, 0x65, 0xC4, 0xFA, 0xE4, 0x18, 0x75, 0xF0, 0x6A, 0x0E, 0x4C },
+ { 0x19, 0x3C, 0xC8, 0xE7, 0xC3, 0xE0, 0x8B, 0xB3, 0x0F, 0x54, 0x37, 0xAA, 0x27, 0xAD, 0xE1, 0xF1, 0x42, 0x36, 0x9B, 0x24, 0x6A, 0x67, 0x5B, 0x23, 0x83, 0xE6, 0xDA, 0x9B, 0x49, 0xA9, 0x80, 0x9E },
+ { 0x5C, 0x10, 0x89, 0x6F, 0x0E, 0x28, 0x56, 0xB2, 0xA2, 0xEE, 0xE0, 0xFE, 0x4A, 0x2C, 0x16, 0x33, 0x56, 0x5D, 0x18, 0xF0, 0xE9, 0x3E, 0x1F, 0xAB, 0x26, 0xC3, 0x73, 0xE8, 0xF8, 0x29, 0x65, 0x4D },
+ { 0xF1, 0x60, 0x12, 0xD9, 0x3F, 0x28, 0x85, 0x1A, 0x1E, 0xB9, 0x89, 0xF5, 0xD0, 0xB4, 0x3F, 0x3F, 0x39, 0xCA, 0x73, 0xC9, 0xA6, 0x2D, 0x51, 0x81, 0xBF, 0xF2, 0x37, 0x53, 0x6B, 0xD3, 0x48, 0xC3 },
+ { 0x29, 0x66, 0xB3, 0xCF, 0xAE, 0x1E, 0x44, 0xEA, 0x99, 0x6D, 0xC5, 0xD6, 0x86, 0xCF, 0x25, 0xFA, 0x05, 0x3F, 0xB6, 0xF6, 0x72, 0x01, 0xB9, 0xE4, 0x6E, 0xAD, 0xE8, 0x5D, 0x0A, 0xD6, 0xB8, 0x06 },
+ { 0xDD, 0xB8, 0x78, 0x24, 0x85, 0xE9, 0x00, 0xBC, 0x60, 0xBC, 0xF4, 0xC3, 0x3A, 0x6F, 0xD5, 0x85, 0x68, 0x0C, 0xC6, 0x83, 0xD5, 0x16, 0xEF, 0xA0, 0x3E, 0xB9, 0x98, 0x5F, 0xAD, 0x87, 0x15, 0xFB },
+ { 0x4C, 0x4D, 0x6E, 0x71, 0xAE, 0xA0, 0x57, 0x86, 0x41, 0x31, 0x48, 0xFC, 0x7A, 0x78, 0x6B, 0x0E, 0xCA, 0xF5, 0x82, 0xCF, 0xF1, 0x20, 0x9F, 0x5A, 0x80, 0x9F, 0xBA, 0x85, 0x04, 0xCE, 0x66, 0x2C },
+ { 0xFB, 0x4C, 0x5E, 0x86, 0xD7, 0xB2, 0x22, 0x9B, 0x99, 0xB8, 0xBA, 0x6D, 0x94, 0xC2, 0x47, 0xEF, 0x96, 0x4A, 0xA3, 0xA2, 0xBA, 0xE8, 0xED, 0xC7, 0x75, 0x69, 0xF2, 0x8D, 0xBB, 0xFF, 0x2D, 0x4E },
+ { 0xE9, 0x4F, 0x52, 0x6D, 0xE9, 0x01, 0x96, 0x33, 0xEC, 0xD5, 0x4A, 0xC6, 0x12, 0x0F, 0x23, 0x95, 0x8D, 0x77, 0x18, 0xF1, 0xE7, 0x71, 0x7B, 0xF3, 0x29, 0x21, 0x1A, 0x4F, 0xAE, 0xED, 0x4E, 0x6D },
+ { 0xCB, 0xD6, 0x66, 0x0A, 0x10, 0xDB, 0x3F, 0x23, 0xF7, 0xA0, 0x3D, 0x4B, 0x9D, 0x40, 0x44, 0xC7, 0x93, 0x2B, 0x28, 0x01, 0xAC, 0x89, 0xD6, 0x0B, 0xC9, 0xEB, 0x92, 0xD6, 0x5A, 0x46, 0xC2, 0xA0 },
+ { 0x88, 0x18, 0xBB, 0xD3, 0xDB, 0x4D, 0xC1, 0x23, 0xB2, 0x5C, 0xBB, 0xA5, 0xF5, 0x4C, 0x2B, 0xC4, 0xB3, 0xFC, 0xF9, 0xBF, 0x7D, 0x7A, 0x77, 0x09, 0xF4, 0xAE, 0x58, 0x8B, 0x26, 0x7C, 0x4E, 0xCE },
+ { 0xC6, 0x53, 0x82, 0x51, 0x3F, 0x07, 0x46, 0x0D, 0xA3, 0x98, 0x33, 0xCB, 0x66, 0x6C, 0x5E, 0xD8, 0x2E, 0x61, 0xB9, 0xE9, 0x98, 0xF4, 0xB0, 0xC4, 0x28, 0x7C, 0xEE, 0x56, 0xC3, 0xCC, 0x9B, 0xCD },
+ { 0x89, 0x75, 0xB0, 0x57, 0x7F, 0xD3, 0x55, 0x66, 0xD7, 0x50, 0xB3, 0x62, 0xB0, 0x89, 0x7A, 0x26, 0xC3, 0x99, 0x13, 0x6D, 0xF0, 0x7B, 0xAB, 0xAB, 0xBD, 0xE6, 0x20, 0x3F, 0xF2, 0x95, 0x4E, 0xD4 },
+ { 0x21, 0xFE, 0x0C, 0xEB, 0x00, 0x52, 0xBE, 0x7F, 0xB0, 0xF0, 0x04, 0x18, 0x7C, 0xAC, 0xD7, 0xDE, 0x67, 0xFA, 0x6E, 0xB0, 0x93, 0x8D, 0x92, 0x76, 0x77, 0xF2, 0x39, 0x8C, 0x13, 0x23, 0x17, 0xA8 },
+ { 0x2E, 0xF7, 0x3F, 0x3C, 0x26, 0xF1, 0x2D, 0x93, 0x88, 0x9F, 0x3C, 0x78, 0xB6, 0xA6, 0x6C, 0x1D, 0x52, 0xB6, 0x49, 0xDC, 0x9E, 0x85, 0x6E, 0x2C, 0x17, 0x2E, 0xA7, 0xC5, 0x8A, 0xC2, 0xB5, 0xE3 },
+ { 0x38, 0x8A, 0x3C, 0xD5, 0x6D, 0x73, 0x86, 0x7A, 0xBB, 0x5F, 0x84, 0x01, 0x49, 0x2B, 0x6E, 0x26, 0x81, 0xEB, 0x69, 0x85, 0x1E, 0x76, 0x7F, 0xD8, 0x42, 0x10, 0xA5, 0x60, 0x76, 0xFB, 0x3D, 0xD3 },
+ { 0xAF, 0x53, 0x3E, 0x02, 0x2F, 0xC9, 0x43, 0x9E, 0x4E, 0x3C, 0xB8, 0x38, 0xEC, 0xD1, 0x86, 0x92, 0x23, 0x2A, 0xDF, 0x6F, 0xE9, 0x83, 0x95, 0x26, 0xD3, 0xC3, 0xDD, 0x1B, 0x71, 0x91, 0x0B, 0x1A },
+ { 0x75, 0x1C, 0x09, 0xD4, 0x1A, 0x93, 0x43, 0x88, 0x2A, 0x81, 0xCD, 0x13, 0xEE, 0x40, 0x81, 0x8D, 0x12, 0xEB, 0x44, 0xC6, 0xC7, 0xF4, 0x0D, 0xF1, 0x6E, 0x4A, 0xEA, 0x8F, 0xAB, 0x91, 0x97, 0x2A },
+ { 0x5B, 0x73, 0xDD, 0xB6, 0x8D, 0x9D, 0x2B, 0x0A, 0xA2, 0x65, 0xA0, 0x79, 0x88, 0xD6, 0xB8, 0x8A, 0xE9, 0xAA, 0xC5, 0x82, 0xAF, 0x83, 0x03, 0x2F, 0x8A, 0x9B, 0x21, 0xA2, 0xE1, 0xB7, 0xBF, 0x18 },
+ { 0x3D, 0xA2, 0x91, 0x26, 0xC7, 0xC5, 0xD7, 0xF4, 0x3E, 0x64, 0x24, 0x2A, 0x79, 0xFE, 0xAA, 0x4E, 0xF3, 0x45, 0x9C, 0xDE, 0xCC, 0xC8, 0x98, 0xED, 0x59, 0xA9, 0x7F, 0x6E, 0xC9, 0x3B, 0x9D, 0xAB },
+ { 0x56, 0x6D, 0xC9, 0x20, 0x29, 0x3D, 0xA5, 0xCB, 0x4F, 0xE0, 0xAA, 0x8A, 0xBD, 0xA8, 0xBB, 0xF5, 0x6F, 0x55, 0x23, 0x13, 0xBF, 0xF1, 0x90, 0x46, 0x64, 0x1E, 0x36, 0x15, 0xC1, 0xE3, 0xED, 0x3F },
+ { 0x41, 0x15, 0xBE, 0xA0, 0x2F, 0x73, 0xF9, 0x7F, 0x62, 0x9E, 0x5C, 0x55, 0x90, 0x72, 0x0C, 0x01, 0xE7, 0xE4, 0x49, 0xAE, 0x2A, 0x66, 0x97, 0xD4, 0xD2, 0x78, 0x33, 0x21, 0x30, 0x36, 0x92, 0xF9 },
+ { 0x4C, 0xE0, 0x8F, 0x47, 0x62, 0x46, 0x8A, 0x76, 0x70, 0x01, 0x21, 0x64, 0x87, 0x8D, 0x68, 0x34, 0x0C, 0x52, 0xA3, 0x5E, 0x66, 0xC1, 0x88, 0x4D, 0x5C, 0x86, 0x48, 0x89, 0xAB, 0xC9, 0x66, 0x77 },
+ { 0x81, 0xEA, 0x0B, 0x78, 0x04, 0x12, 0x4E, 0x0C, 0x22, 0xEA, 0x5F, 0xC7, 0x11, 0x04, 0xA2, 0xAF, 0xCB, 0x52, 0xA1, 0xFA, 0x81, 0x6F, 0x3E, 0xCB, 0x7D, 0xCB, 0x5D, 0x9D, 0xEA, 0x17, 0x86, 0xD0 },
+ { 0xFE, 0x36, 0x27, 0x33, 0xB0, 0x5F, 0x6B, 0xED, 0xAF, 0x93, 0x79, 0xD7, 0xF7, 0x93, 0x6E, 0xDE, 0x20, 0x9B, 0x1F, 0x83, 0x23, 0xC3, 0x92, 0x25, 0x49, 0xD9, 0xE7, 0x36, 0x81, 0xB5, 0xDB, 0x7B },
+ { 0xEF, 0xF3, 0x7D, 0x30, 0xDF, 0xD2, 0x03, 0x59, 0xBE, 0x4E, 0x73, 0xFD, 0xF4, 0x0D, 0x27, 0x73, 0x4B, 0x3D, 0xF9, 0x0A, 0x97, 0xA5, 0x5E, 0xD7, 0x45, 0x29, 0x72, 0x94, 0xCA, 0x85, 0xD0, 0x9F },
+ { 0x17, 0x2F, 0xFC, 0x67, 0x15, 0x3D, 0x12, 0xE0, 0xCA, 0x76, 0xA8, 0xB6, 0xCD, 0x5D, 0x47, 0x31, 0x88, 0x5B, 0x39, 0xCE, 0x0C, 0xAC, 0x93, 0xA8, 0x97, 0x2A, 0x18, 0x00, 0x6C, 0x8B, 0x8B, 0xAF },
+ { 0xC4, 0x79, 0x57, 0xF1, 0xCC, 0x88, 0xE8, 0x3E, 0xF9, 0x44, 0x58, 0x39, 0x70, 0x9A, 0x48, 0x0A, 0x03, 0x6B, 0xED, 0x5F, 0x88, 0xAC, 0x0F, 0xCC, 0x8E, 0x1E, 0x70, 0x3F, 0xFA, 0xAC, 0x13, 0x2C },
+ { 0x30, 0xF3, 0x54, 0x83, 0x70, 0xCF, 0xDC, 0xED, 0xA5, 0xC3, 0x7B, 0x56, 0x9B, 0x61, 0x75, 0xE7, 0x99, 0xEE, 0xF1, 0xA6, 0x2A, 0xAA, 0x94, 0x32, 0x45, 0xAE, 0x76, 0x69, 0xC2, 0x27, 0xA7, 0xB5 },
+ { 0xC9, 0x5D, 0xCB, 0x3C, 0xF1, 0xF2, 0x7D, 0x0E, 0xEF, 0x2F, 0x25, 0xD2, 0x41, 0x38, 0x70, 0x90, 0x4A, 0x87, 0x7C, 0x4A, 0x56, 0xC2, 0xDE, 0x1E, 0x83, 0xE2, 0xBC, 0x2A, 0xE2, 0xE4, 0x68, 0x21 },
+ { 0xD5, 0xD0, 0xB5, 0xD7, 0x05, 0x43, 0x4C, 0xD4, 0x6B, 0x18, 0x57, 0x49, 0xF6, 0x6B, 0xFB, 0x58, 0x36, 0xDC, 0xDF, 0x6E, 0xE5, 0x49, 0xA2, 0xB7, 0xA4, 0xAE, 0xE7, 0xF5, 0x80, 0x07, 0xCA, 0xAF },
+ { 0xBB, 0xC1, 0x24, 0xA7, 0x12, 0xF1, 0x5D, 0x07, 0xC3, 0x00, 0xE0, 0x5B, 0x66, 0x83, 0x89, 0xA4, 0x39, 0xC9, 0x17, 0x77, 0xF7, 0x21, 0xF8, 0x32, 0x0C, 0x1C, 0x90, 0x78, 0x06, 0x6D, 0x2C, 0x7E },
+ { 0xA4, 0x51, 0xB4, 0x8C, 0x35, 0xA6, 0xC7, 0x85, 0x4C, 0xFA, 0xAE, 0x60, 0x26, 0x2E, 0x76, 0x99, 0x08, 0x16, 0x38, 0x2A, 0xC0, 0x66, 0x7E, 0x5A, 0x5C, 0x9E, 0x1B, 0x46, 0xC4, 0x34, 0x2D, 0xDF },
+ { 0xB0, 0xD1, 0x50, 0xFB, 0x55, 0xE7, 0x78, 0xD0, 0x11, 0x47, 0xF0, 0xB5, 0xD8, 0x9D, 0x99, 0xEC, 0xB2, 0x0F, 0xF0, 0x7E, 0x5E, 0x67, 0x60, 0xD6, 0xB6, 0x45, 0xEB, 0x5B, 0x65, 0x4C, 0x62, 0x2B },
+ { 0x34, 0xF7, 0x37, 0xC0, 0xAB, 0x21, 0x99, 0x51, 0xEE, 0xE8, 0x9A, 0x9F, 0x8D, 0xAC, 0x29, 0x9C, 0x9D, 0x4C, 0x38, 0xF3, 0x3F, 0xA4, 0x94, 0xC5, 0xC6, 0xEE, 0xFC, 0x92, 0xB6, 0xDB, 0x08, 0xBC },
+ { 0x1A, 0x62, 0xCC, 0x3A, 0x00, 0x80, 0x0D, 0xCB, 0xD9, 0x98, 0x91, 0x08, 0x0C, 0x1E, 0x09, 0x84, 0x58, 0x19, 0x3A, 0x8C, 0xC9, 0xF9, 0x70, 0xEA, 0x99, 0xFB, 0xEF, 0xF0, 0x03, 0x18, 0xC2, 0x89 },
+ { 0xCF, 0xCE, 0x55, 0xEB, 0xAF, 0xC8, 0x40, 0xD7, 0xAE, 0x48, 0x28, 0x1C, 0x7F, 0xD5, 0x7E, 0xC8, 0xB4, 0x82, 0xD4, 0xB7, 0x04, 0x43, 0x74, 0x95, 0x49, 0x5A, 0xC4, 0x14, 0xCF, 0x4A, 0x37, 0x4B },
+ { 0x67, 0x46, 0xFA, 0xCF, 0x71, 0x14, 0x6D, 0x99, 0x9D, 0xAB, 0xD0, 0x5D, 0x09, 0x3A, 0xE5, 0x86, 0x64, 0x8D, 0x1E, 0xE2, 0x8E, 0x72, 0x61, 0x7B, 0x99, 0xD0, 0xF0, 0x08, 0x6E, 0x1E, 0x45, 0xBF },
+ { 0x57, 0x1C, 0xED, 0x28, 0x3B, 0x3F, 0x23, 0xB4, 0xE7, 0x50, 0xBF, 0x12, 0xA2, 0xCA, 0xF1, 0x78, 0x18, 0x47, 0xBD, 0x89, 0x0E, 0x43, 0x60, 0x3C, 0xDC, 0x59, 0x76, 0x10, 0x2B, 0x7B, 0xB1, 0x1B },
+ { 0xCF, 0xCB, 0x76, 0x5B, 0x04, 0x8E, 0x35, 0x02, 0x2C, 0x5D, 0x08, 0x9D, 0x26, 0xE8, 0x5A, 0x36, 0xB0, 0x05, 0xA2, 0xB8, 0x04, 0x93, 0xD0, 0x3A, 0x14, 0x4E, 0x09, 0xF4, 0x09, 0xB6, 0xAF, 0xD1 },
+ { 0x40, 0x50, 0xC7, 0xA2, 0x77, 0x05, 0xBB, 0x27, 0xF4, 0x20, 0x89, 0xB2, 0x99, 0xF3, 0xCB, 0xE5, 0x05, 0x4E, 0xAD, 0x68, 0x72, 0x7E, 0x8E, 0xF9, 0x31, 0x8C, 0xE6, 0xF2, 0x5C, 0xD6, 0xF3, 0x1D },
+ { 0x18, 0x40, 0x70, 0xBD, 0x5D, 0x26, 0x5F, 0xBD, 0xC1, 0x42, 0xCD, 0x1C, 0x5C, 0xD0, 0xD7, 0xE4, 0x14, 0xE7, 0x03, 0x69, 0xA2, 0x66, 0xD6, 0x27, 0xC8, 0xFB, 0xA8, 0x4F, 0xA5, 0xE8, 0x4C, 0x34 },
+ { 0x9E, 0xDD, 0xA9, 0xA4, 0x44, 0x39, 0x02, 0xA9, 0x58, 0x8C, 0x0D, 0x0C, 0xCC, 0x62, 0xB9, 0x30, 0x21, 0x84, 0x79, 0xA6, 0x84, 0x1E, 0x6F, 0xE7, 0xD4, 0x30, 0x03, 0xF0, 0x4B, 0x1F, 0xD6, 0x43 },
+ { 0xE4, 0x12, 0xFE, 0xEF, 0x79, 0x08, 0x32, 0x4A, 0x6D, 0xA1, 0x84, 0x16, 0x29, 0xF3, 0x5D, 0x3D, 0x35, 0x86, 0x42, 0x01, 0x93, 0x10, 0xEC, 0x57, 0xC6, 0x14, 0x83, 0x6B, 0x63, 0xD3, 0x07, 0x63 },
+ { 0x1A, 0x2B, 0x8E, 0xDF, 0xF3, 0xF9, 0xAC, 0xC1, 0x55, 0x4F, 0xCB, 0xAE, 0x3C, 0xF1, 0xD6, 0x29, 0x8C, 0x64, 0x62, 0xE2, 0x2E, 0x5E, 0xB0, 0x25, 0x96, 0x84, 0xF8, 0x35, 0x01, 0x2B, 0xD1, 0x3F },
+ { 0x28, 0x8C, 0x4A, 0xD9, 0xB9, 0x40, 0x97, 0x62, 0xEA, 0x07, 0xC2, 0x4A, 0x41, 0xF0, 0x4F, 0x69, 0xA7, 0xD7, 0x4B, 0xEE, 0x2D, 0x95, 0x43, 0x53, 0x74, 0xBD, 0xE9, 0x46, 0xD7, 0x24, 0x1C, 0x7B },
+ { 0x80, 0x56, 0x91, 0xBB, 0x28, 0x67, 0x48, 0xCF, 0xB5, 0x91, 0xD3, 0xAE, 0xBE, 0x7E, 0x6F, 0x4E, 0x4D, 0xC6, 0xE2, 0x80, 0x8C, 0x65, 0x14, 0x3C, 0xC0, 0x04, 0xE4, 0xEB, 0x6F, 0xD0, 0x9D, 0x43 },
+ { 0xD4, 0xAC, 0x8D, 0x3A, 0x0A, 0xFC, 0x6C, 0xFA, 0x7B, 0x46, 0x0A, 0xE3, 0x00, 0x1B, 0xAE, 0xB3, 0x6D, 0xAD, 0xB3, 0x7D, 0xA0, 0x7D, 0x2E, 0x8A, 0xC9, 0x18, 0x22, 0xDF, 0x34, 0x8A, 0xED, 0x3D },
+ { 0xC3, 0x76, 0x61, 0x70, 0x14, 0xD2, 0x01, 0x58, 0xBC, 0xED, 0x3D, 0x3B, 0xA5, 0x52, 0xB6, 0xEC, 0xCF, 0x84, 0xE6, 0x2A, 0xA3, 0xEB, 0x65, 0x0E, 0x90, 0x02, 0x9C, 0x84, 0xD1, 0x3E, 0xEA, 0x69 },
+ { 0xC4, 0x1F, 0x09, 0xF4, 0x3C, 0xEC, 0xAE, 0x72, 0x93, 0xD6, 0x00, 0x7C, 0xA0, 0xA3, 0x57, 0x08, 0x7D, 0x5A, 0xE5, 0x9B, 0xE5, 0x00, 0xC1, 0xCD, 0x5B, 0x28, 0x9E, 0xE8, 0x10, 0xC7, 0xB0, 0x82 },
+ { 0x03, 0xD1, 0xCE, 0xD1, 0xFB, 0xA5, 0xC3, 0x91, 0x55, 0xC4, 0x4B, 0x77, 0x65, 0xCB, 0x76, 0x0C, 0x78, 0x70, 0x8D, 0xCF, 0xC8, 0x0B, 0x0B, 0xD8, 0xAD, 0xE3, 0xA5, 0x6D, 0xA8, 0x83, 0x0B, 0x29 },
+ { 0x09, 0xBD, 0xE6, 0xF1, 0x52, 0x21, 0x8D, 0xC9, 0x2C, 0x41, 0xD7, 0xF4, 0x53, 0x87, 0xE6, 0x3E, 0x58, 0x69, 0xD8, 0x07, 0xEC, 0x70, 0xB8, 0x21, 0x40, 0x5D, 0xBD, 0x88, 0x4B, 0x7F, 0xCF, 0x4B },
+ { 0x71, 0xC9, 0x03, 0x6E, 0x18, 0x17, 0x9B, 0x90, 0xB3, 0x7D, 0x39, 0xE9, 0xF0, 0x5E, 0xB8, 0x9C, 0xC5, 0xFC, 0x34, 0x1F, 0xD7, 0xC4, 0x77, 0xD0, 0xD7, 0x49, 0x32, 0x85, 0xFA, 0xCA, 0x08, 0xA4 },
+ { 0x59, 0x16, 0x83, 0x3E, 0xBB, 0x05, 0xCD, 0x91, 0x9C, 0xA7, 0xFE, 0x83, 0xB6, 0x92, 0xD3, 0x20, 0x5B, 0xEF, 0x72, 0x39, 0x2B, 0x2C, 0xF6, 0xBB, 0x0A, 0x6D, 0x43, 0xF9, 0x94, 0xF9, 0x5F, 0x11 },
+ { 0xF6, 0x3A, 0xAB, 0x3E, 0xC6, 0x41, 0xB3, 0xB0, 0x24, 0x96, 0x4C, 0x2B, 0x43, 0x7C, 0x04, 0xF6, 0x04, 0x3C, 0x4C, 0x7E, 0x02, 0x79, 0x23, 0x99, 0x95, 0x40, 0x19, 0x58, 0xF8, 0x6B, 0xBE, 0x54 },
+ { 0xF1, 0x72, 0xB1, 0x80, 0xBF, 0xB0, 0x97, 0x40, 0x49, 0x31, 0x20, 0xB6, 0x32, 0x6C, 0xBD, 0xC5, 0x61, 0xE4, 0x77, 0xDE, 0xF9, 0xBB, 0xCF, 0xD2, 0x8C, 0xC8, 0xC1, 0xC5, 0xE3, 0x37, 0x9A, 0x31 },
+ { 0xCB, 0x9B, 0x89, 0xCC, 0x18, 0x38, 0x1D, 0xD9, 0x14, 0x1A, 0xDE, 0x58, 0x86, 0x54, 0xD4, 0xE6, 0xA2, 0x31, 0xD5, 0xBF, 0x49, 0xD4, 0xD5, 0x9A, 0xC2, 0x7D, 0x86, 0x9C, 0xBE, 0x10, 0x0C, 0xF3 },
+ { 0x7B, 0xD8, 0x81, 0x50, 0x46, 0xFD, 0xD8, 0x10, 0xA9, 0x23, 0xE1, 0x98, 0x4A, 0xAE, 0xBD, 0xCD, 0xF8, 0x4D, 0x87, 0xC8, 0x99, 0x2D, 0x68, 0xB5, 0xEE, 0xB4, 0x60, 0xF9, 0x3E, 0xB3, 0xC8, 0xD7 },
+ { 0x60, 0x7B, 0xE6, 0x68, 0x62, 0xFD, 0x08, 0xEE, 0x5B, 0x19, 0xFA, 0xCA, 0xC0, 0x9D, 0xFD, 0xBC, 0xD4, 0x0C, 0x31, 0x21, 0x01, 0xD6, 0x6E, 0x6E, 0xBD, 0x2B, 0x84, 0x1F, 0x1B, 0x9A, 0x93, 0x25 },
+ { 0x9F, 0xE0, 0x3B, 0xBE, 0x69, 0xAB, 0x18, 0x34, 0xF5, 0x21, 0x9B, 0x0D, 0xA8, 0x8A, 0x08, 0xB3, 0x0A, 0x66, 0xC5, 0x91, 0x3F, 0x01, 0x51, 0x96, 0x3C, 0x36, 0x05, 0x60, 0xDB, 0x03, 0x87, 0xB3 },
+ { 0x90, 0xA8, 0x35, 0x85, 0x71, 0x7B, 0x75, 0xF0, 0xE9, 0xB7, 0x25, 0xE0, 0x55, 0xEE, 0xEE, 0xB9, 0xE7, 0xA0, 0x28, 0xEA, 0x7E, 0x6C, 0xBC, 0x07, 0xB2, 0x09, 0x17, 0xEC, 0x03, 0x63, 0xE3, 0x8C },
+ { 0x33, 0x6E, 0xA0, 0x53, 0x0F, 0x4A, 0x74, 0x69, 0x12, 0x6E, 0x02, 0x18, 0x58, 0x7E, 0xBB, 0xDE, 0x33, 0x58, 0xA0, 0xB3, 0x1C, 0x29, 0xD2, 0x00, 0xF7, 0xDC, 0x7E, 0xB1, 0x5C, 0x6A, 0xAD, 0xD8 },
+ { 0xA7, 0x9E, 0x76, 0xDC, 0x0A, 0xBC, 0xA4, 0x39, 0x6F, 0x07, 0x47, 0xCD, 0x7B, 0x74, 0x8D, 0xF9, 0x13, 0x00, 0x76, 0x26, 0xB1, 0xD6, 0x59, 0xDA, 0x0C, 0x1F, 0x78, 0xB9, 0x30, 0x3D, 0x01, 0xA3 },
+ { 0x44, 0xE7, 0x8A, 0x77, 0x37, 0x56, 0xE0, 0x95, 0x15, 0x19, 0x50, 0x4D, 0x70, 0x38, 0xD2, 0x8D, 0x02, 0x13, 0xA3, 0x7E, 0x0C, 0xE3, 0x75, 0x37, 0x17, 0x57, 0xBC, 0x99, 0x63, 0x11, 0xE3, 0xB8 },
+ { 0x77, 0xAC, 0x01, 0x2A, 0x3F, 0x75, 0x4D, 0xCF, 0xEA, 0xB5, 0xEB, 0x99, 0x6B, 0xE9, 0xCD, 0x2D, 0x1F, 0x96, 0x11, 0x1B, 0x6E, 0x49, 0xF3, 0x99, 0x4D, 0xF1, 0x81, 0xF2, 0x85, 0x69, 0xD8, 0x25 },
+ { 0xCE, 0x5A, 0x10, 0xDB, 0x6F, 0xCC, 0xDA, 0xF1, 0x40, 0xAA, 0xA4, 0xDE, 0xD6, 0x25, 0x0A, 0x9C, 0x06, 0xE9, 0x22, 0x2B, 0xC9, 0xF9, 0xF3, 0x65, 0x8A, 0x4A, 0xFF, 0x93, 0x5F, 0x2B, 0x9F, 0x3A },
+ { 0xEC, 0xC2, 0x03, 0xA7, 0xFE, 0x2B, 0xE4, 0xAB, 0xD5, 0x5B, 0xB5, 0x3E, 0x6E, 0x67, 0x35, 0x72, 0xE0, 0x07, 0x8D, 0xA8, 0xCD, 0x37, 0x5E, 0xF4, 0x30, 0xCC, 0x97, 0xF9, 0xF8, 0x00, 0x83, 0xAF },
+ { 0x14, 0xA5, 0x18, 0x6D, 0xE9, 0xD7, 0xA1, 0x8B, 0x04, 0x12, 0xB8, 0x56, 0x3E, 0x51, 0xCC, 0x54, 0x33, 0x84, 0x0B, 0x4A, 0x12, 0x9A, 0x8F, 0xF9, 0x63, 0xB3, 0x3A, 0x3C, 0x4A, 0xFE, 0x8E, 0xBB },
+ { 0x13, 0xF8, 0xEF, 0x95, 0xCB, 0x86, 0xE6, 0xA6, 0x38, 0x93, 0x1C, 0x8E, 0x10, 0x76, 0x73, 0xEB, 0x76, 0xBA, 0x10, 0xD7, 0xC2, 0xCD, 0x70, 0xB9, 0xD9, 0x92, 0x0B, 0xBE, 0xED, 0x92, 0x94, 0x09 },
+ { 0x0B, 0x33, 0x8F, 0x4E, 0xE1, 0x2F, 0x2D, 0xFC, 0xB7, 0x87, 0x13, 0x37, 0x79, 0x41, 0xE0, 0xB0, 0x63, 0x21, 0x52, 0x58, 0x1D, 0x13, 0x32, 0x51, 0x6E, 0x4A, 0x2C, 0xAB, 0x19, 0x42, 0xCC, 0xA4 },
+ { 0xEA, 0xAB, 0x0E, 0xC3, 0x7B, 0x3B, 0x8A, 0xB7, 0x96, 0xE9, 0xF5, 0x72, 0x38, 0xDE, 0x14, 0xA2, 0x64, 0xA0, 0x76, 0xF3, 0x88, 0x7D, 0x86, 0xE2, 0x9B, 0xB5, 0x90, 0x6D, 0xB5, 0xA0, 0x0E, 0x02 },
+ { 0x23, 0xCB, 0x68, 0xB8, 0xC0, 0xE6, 0xDC, 0x26, 0xDC, 0x27, 0x76, 0x6D, 0xDC, 0x0A, 0x13, 0xA9, 0x94, 0x38, 0xFD, 0x55, 0x61, 0x7A, 0xA4, 0x09, 0x5D, 0x8F, 0x96, 0x97, 0x20, 0xC8, 0x72, 0xDF },
+ { 0x09, 0x1D, 0x8E, 0xE3, 0x0D, 0x6F, 0x29, 0x68, 0xD4, 0x6B, 0x68, 0x7D, 0xD6, 0x52, 0x92, 0x66, 0x57, 0x42, 0xDE, 0x0B, 0xB8, 0x3D, 0xCC, 0x00, 0x04, 0xC7, 0x2C, 0xE1, 0x00, 0x07, 0xA5, 0x49 },
+ { 0x7F, 0x50, 0x7A, 0xBC, 0x6D, 0x19, 0xBA, 0x00, 0xC0, 0x65, 0xA8, 0x76, 0xEC, 0x56, 0x57, 0x86, 0x88, 0x82, 0xD1, 0x8A, 0x22, 0x1B, 0xC4, 0x6C, 0x7A, 0x69, 0x12, 0x54, 0x1F, 0x5B, 0xC7, 0xBA },
+ { 0xA0, 0x60, 0x7C, 0x24, 0xE1, 0x4E, 0x8C, 0x22, 0x3D, 0xB0, 0xD7, 0x0B, 0x4D, 0x30, 0xEE, 0x88, 0x01, 0x4D, 0x60, 0x3F, 0x43, 0x7E, 0x9E, 0x02, 0xAA, 0x7D, 0xAF, 0xA3, 0xCD, 0xFB, 0xAD, 0x94 },
+ { 0xDD, 0xBF, 0xEA, 0x75, 0xCC, 0x46, 0x78, 0x82, 0xEB, 0x34, 0x83, 0xCE, 0x5E, 0x2E, 0x75, 0x6A, 0x4F, 0x47, 0x01, 0xB7, 0x6B, 0x44, 0x55, 0x19, 0xE8, 0x9F, 0x22, 0xD6, 0x0F, 0xA8, 0x6E, 0x06 },
+ { 0x0C, 0x31, 0x1F, 0x38, 0xC3, 0x5A, 0x4F, 0xB9, 0x0D, 0x65, 0x1C, 0x28, 0x9D, 0x48, 0x68, 0x56, 0xCD, 0x14, 0x13, 0xDF, 0x9B, 0x06, 0x77, 0xF5, 0x3E, 0xCE, 0x2C, 0xD9, 0xE4, 0x77, 0xC6, 0x0A },
+ { 0x46, 0xA7, 0x3A, 0x8D, 0xD3, 0xE7, 0x0F, 0x59, 0xD3, 0x94, 0x2C, 0x01, 0xDF, 0x59, 0x9D, 0xEF, 0x78, 0x3C, 0x9D, 0xA8, 0x2F, 0xD8, 0x32, 0x22, 0xCD, 0x66, 0x2B, 0x53, 0xDC, 0xE7, 0xDB, 0xDF },
+ { 0xAD, 0x03, 0x8F, 0xF9, 0xB1, 0x4D, 0xE8, 0x4A, 0x80, 0x1E, 0x4E, 0x62, 0x1C, 0xE5, 0xDF, 0x02, 0x9D, 0xD9, 0x35, 0x20, 0xD0, 0xC2, 0xFA, 0x38, 0xBF, 0xF1, 0x76, 0xA8, 0xB1, 0xD1, 0x69, 0x8C },
+ { 0xAB, 0x70, 0xC5, 0xDF, 0xBD, 0x1E, 0xA8, 0x17, 0xFE, 0xD0, 0xCD, 0x06, 0x72, 0x93, 0xAB, 0xF3, 0x19, 0xE5, 0xD7, 0x90, 0x1C, 0x21, 0x41, 0xD5, 0xD9, 0x9B, 0x23, 0xF0, 0x3A, 0x38, 0xE7, 0x48 },
+ { 0x1F, 0xFF, 0xDA, 0x67, 0x93, 0x2B, 0x73, 0xC8, 0xEC, 0xAF, 0x00, 0x9A, 0x34, 0x91, 0xA0, 0x26, 0x95, 0x3B, 0xAB, 0xFE, 0x1F, 0x66, 0x3B, 0x06, 0x97, 0xC3, 0xC4, 0xAE, 0x8B, 0x2E, 0x7D, 0xCB },
+ { 0xB0, 0xD2, 0xCC, 0x19, 0x47, 0x2D, 0xD5, 0x7F, 0x2B, 0x17, 0xEF, 0xC0, 0x3C, 0x8D, 0x58, 0xC2, 0x28, 0x3D, 0xBB, 0x19, 0xDA, 0x57, 0x2F, 0x77, 0x55, 0x85, 0x5A, 0xA9, 0x79, 0x43, 0x17, 0xA0 },
+ { 0xA0, 0xD1, 0x9A, 0x6E, 0xE3, 0x39, 0x79, 0xC3, 0x25, 0x51, 0x0E, 0x27, 0x66, 0x22, 0xDF, 0x41, 0xF7, 0x15, 0x83, 0xD0, 0x75, 0x01, 0xB8, 0x70, 0x71, 0x12, 0x9A, 0x0A, 0xD9, 0x47, 0x32, 0xA5 },
+ { 0x72, 0x46, 0x42, 0xA7, 0x03, 0x2D, 0x10, 0x62, 0xB8, 0x9E, 0x52, 0xBE, 0xA3, 0x4B, 0x75, 0xDF, 0x7D, 0x8F, 0xE7, 0x72, 0xD9, 0xFE, 0x3C, 0x93, 0xDD, 0xF3, 0xC4, 0x54, 0x5A, 0xB5, 0xA9, 0x9B },
+ { 0xAD, 0xE5, 0xEA, 0xA7, 0xE6, 0x1F, 0x67, 0x2D, 0x58, 0x7E, 0xA0, 0x3D, 0xAE, 0x7D, 0x7B, 0x55, 0x22, 0x9C, 0x01, 0xD0, 0x6B, 0xC0, 0xA5, 0x70, 0x14, 0x36, 0xCB, 0xD1, 0x83, 0x66, 0xA6, 0x26 },
+ { 0x01, 0x3B, 0x31, 0xEB, 0xD2, 0x28, 0xFC, 0xDD, 0xA5, 0x1F, 0xAB, 0xB0, 0x3B, 0xB0, 0x2D, 0x60, 0xAC, 0x20, 0xCA, 0x21, 0x5A, 0xAF, 0xA8, 0x3B, 0xDD, 0x85, 0x5E, 0x37, 0x55, 0xA3, 0x5F, 0x0B },
+ { 0x33, 0x2E, 0xD4, 0x0B, 0xB1, 0x0D, 0xDE, 0x3C, 0x95, 0x4A, 0x75, 0xD7, 0xB8, 0x99, 0x9D, 0x4B, 0x26, 0xA1, 0xC0, 0x63, 0xC1, 0xDC, 0x6E, 0x32, 0xC1, 0xD9, 0x1B, 0xAB, 0x7B, 0xBB, 0x7D, 0x16 },
+ { 0xC7, 0xA1, 0x97, 0xB3, 0xA0, 0x5B, 0x56, 0x6B, 0xCC, 0x9F, 0xAC, 0xD2, 0x0E, 0x44, 0x1D, 0x6F, 0x6C, 0x28, 0x60, 0xAC, 0x96, 0x51, 0xCD, 0x51, 0xD6, 0xB9, 0xD2, 0xCD, 0xEE, 0xEA, 0x03, 0x90 },
+ { 0xBD, 0x9C, 0xF6, 0x4E, 0xA8, 0x95, 0x3C, 0x03, 0x71, 0x08, 0xE6, 0xF6, 0x54, 0x91, 0x4F, 0x39, 0x58, 0xB6, 0x8E, 0x29, 0xC1, 0x67, 0x00, 0xDC, 0x18, 0x4D, 0x94, 0xA2, 0x17, 0x08, 0xFF, 0x60 },
+ { 0x88, 0x35, 0xB0, 0xAC, 0x02, 0x11, 0x51, 0xDF, 0x71, 0x64, 0x74, 0xCE, 0x27, 0xCE, 0x4D, 0x3C, 0x15, 0xF0, 0xB2, 0xDA, 0xB4, 0x80, 0x03, 0xCF, 0x3F, 0x3E, 0xFD, 0x09, 0x45, 0x10, 0x6B, 0x9A },
+ { 0x3B, 0xFE, 0xFA, 0x33, 0x01, 0xAA, 0x55, 0xC0, 0x80, 0x19, 0x0C, 0xFF, 0xDA, 0x8E, 0xAE, 0x51, 0xD9, 0xAF, 0x48, 0x8B, 0x4C, 0x1F, 0x24, 0xC3, 0xD9, 0xA7, 0x52, 0x42, 0xFD, 0x8E, 0xA0, 0x1D },
+ { 0x08, 0x28, 0x4D, 0x14, 0x99, 0x3C, 0xD4, 0x7D, 0x53, 0xEB, 0xAE, 0xCF, 0x0D, 0xF0, 0x47, 0x8C, 0xC1, 0x82, 0xC8, 0x9C, 0x00, 0xE1, 0x85, 0x9C, 0x84, 0x85, 0x16, 0x86, 0xDD, 0xF2, 0xC1, 0xB7 },
+ { 0x1E, 0xD7, 0xEF, 0x9F, 0x04, 0xC2, 0xAC, 0x8D, 0xB6, 0xA8, 0x64, 0xDB, 0x13, 0x10, 0x87, 0xF2, 0x70, 0x65, 0x09, 0x8E, 0x69, 0xC3, 0xFE, 0x78, 0x71, 0x8D, 0x9B, 0x94, 0x7F, 0x4A, 0x39, 0xD0 },
+ { 0xC1, 0x61, 0xF2, 0xDC, 0xD5, 0x7E, 0x9C, 0x14, 0x39, 0xB3, 0x1A, 0x9D, 0xD4, 0x3D, 0x8F, 0x3D, 0x7D, 0xD8, 0xF0, 0xEB, 0x7C, 0xFA, 0xC6, 0xFB, 0x25, 0xA0, 0xF2, 0x8E, 0x30, 0x6F, 0x06, 0x61 },
+ { 0xC0, 0x19, 0x69, 0xAD, 0x34, 0xC5, 0x2C, 0xAF, 0x3D, 0xC4, 0xD8, 0x0D, 0x19, 0x73, 0x5C, 0x29, 0x73, 0x1A, 0xC6, 0xE7, 0xA9, 0x20, 0x85, 0xAB, 0x92, 0x50, 0xC4, 0x8D, 0xEA, 0x48, 0xA3, 0xFC },
+ { 0x17, 0x20, 0xB3, 0x65, 0x56, 0x19, 0xD2, 0xA5, 0x2B, 0x35, 0x21, 0xAE, 0x0E, 0x49, 0xE3, 0x45, 0xCB, 0x33, 0x89, 0xEB, 0xD6, 0x20, 0x8A, 0xCA, 0xF9, 0xF1, 0x3F, 0xDA, 0xCC, 0xA8, 0xBE, 0x49 },
+ { 0x75, 0x62, 0x88, 0x36, 0x1C, 0x83, 0xE2, 0x4C, 0x61, 0x7C, 0xF9, 0x5C, 0x90, 0x5B, 0x22, 0xD0, 0x17, 0xCD, 0xC8, 0x6F, 0x0B, 0xF1, 0xD6, 0x58, 0xF4, 0x75, 0x6C, 0x73, 0x79, 0x87, 0x3B, 0x7F },
+ { 0xE7, 0xD0, 0xED, 0xA3, 0x45, 0x26, 0x93, 0xB7, 0x52, 0xAB, 0xCD, 0xA1, 0xB5, 0x5E, 0x27, 0x6F, 0x82, 0x69, 0x8F, 0x5F, 0x16, 0x05, 0x40, 0x3E, 0xFF, 0x83, 0x0B, 0xEA, 0x00, 0x71, 0xA3, 0x94 },
+ { 0x2C, 0x82, 0xEC, 0xAA, 0x6B, 0x84, 0x80, 0x3E, 0x04, 0x4A, 0xF6, 0x31, 0x18, 0xAF, 0xE5, 0x44, 0x68, 0x7C, 0xB6, 0xE6, 0xC7, 0xDF, 0x49, 0xED, 0x76, 0x2D, 0xFD, 0x7C, 0x86, 0x93, 0xA1, 0xBC },
+ { 0x61, 0x36, 0xCB, 0xF4, 0xB4, 0x41, 0x05, 0x6F, 0xA1, 0xE2, 0x72, 0x24, 0x98, 0x12, 0x5D, 0x6D, 0xED, 0x45, 0xE1, 0x7B, 0x52, 0x14, 0x39, 0x59, 0xC7, 0xF4, 0xD4, 0xE3, 0x95, 0x21, 0x8A, 0xC2 },
+ { 0x72, 0x1D, 0x32, 0x45, 0xAA, 0xFE, 0xF2, 0x7F, 0x6A, 0x62, 0x4F, 0x47, 0x95, 0x4B, 0x6C, 0x25, 0x50, 0x79, 0x52, 0x6F, 0xFA, 0x25, 0xE9, 0xFF, 0x77, 0xE5, 0xDC, 0xFF, 0x47, 0x3B, 0x15, 0x97 },
+ { 0x9D, 0xD2, 0xFB, 0xD8, 0xCE, 0xF1, 0x6C, 0x35, 0x3C, 0x0A, 0xC2, 0x11, 0x91, 0xD5, 0x09, 0xEB, 0x28, 0xDD, 0x9E, 0x3E, 0x0D, 0x8C, 0xEA, 0x5D, 0x26, 0xCA, 0x83, 0x93, 0x93, 0x85, 0x1C, 0x3A },
+ { 0xB2, 0x39, 0x4C, 0xEA, 0xCD, 0xEB, 0xF2, 0x1B, 0xF9, 0xDF, 0x2C, 0xED, 0x98, 0xE5, 0x8F, 0x1C, 0x3A, 0x4B, 0xBB, 0xFF, 0x66, 0x0D, 0xD9, 0x00, 0xF6, 0x22, 0x02, 0xD6, 0x78, 0x5C, 0xC4, 0x6E },
+ { 0x57, 0x08, 0x9F, 0x22, 0x27, 0x49, 0xAD, 0x78, 0x71, 0x76, 0x5F, 0x06, 0x2B, 0x11, 0x4F, 0x43, 0xBA, 0x20, 0xEC, 0x56, 0x42, 0x2A, 0x8B, 0x1E, 0x3F, 0x87, 0x19, 0x2C, 0x0E, 0xA7, 0x18, 0xC6 },
+ { 0xE4, 0x9A, 0x94, 0x59, 0x96, 0x1C, 0xD3, 0x3C, 0xDF, 0x4A, 0xAE, 0x1B, 0x10, 0x78, 0xA5, 0xDE, 0xA7, 0xC0, 0x40, 0xE0, 0xFE, 0xA3, 0x40, 0xC9, 0x3A, 0x72, 0x48, 0x72, 0xFC, 0x4A, 0xF8, 0x06 },
+ { 0xED, 0xE6, 0x7F, 0x72, 0x0E, 0xFF, 0xD2, 0xCA, 0x9C, 0x88, 0x99, 0x41, 0x52, 0xD0, 0x20, 0x1D, 0xEE, 0x6B, 0x0A, 0x2D, 0x2C, 0x07, 0x7A, 0xCA, 0x6D, 0xAE, 0x29, 0xF7, 0x3F, 0x8B, 0x63, 0x09 },
+ { 0xE0, 0xF4, 0x34, 0xBF, 0x22, 0xE3, 0x08, 0x80, 0x39, 0xC2, 0x1F, 0x71, 0x9F, 0xFC, 0x67, 0xF0, 0xF2, 0xCB, 0x5E, 0x98, 0xA7, 0xA0, 0x19, 0x4C, 0x76, 0xE9, 0x6B, 0xF4, 0xE8, 0xE1, 0x7E, 0x61 },
+ { 0x27, 0x7C, 0x04, 0xE2, 0x85, 0x34, 0x84, 0xA4, 0xEB, 0xA9, 0x10, 0xAD, 0x33, 0x6D, 0x01, 0xB4, 0x77, 0xB6, 0x7C, 0xC2, 0x00, 0xC5, 0x9F, 0x3C, 0x8D, 0x77, 0xEE, 0xF8, 0x49, 0x4F, 0x29, 0xCD },
+ { 0x15, 0x6D, 0x57, 0x47, 0xD0, 0xC9, 0x9C, 0x7F, 0x27, 0x09, 0x7D, 0x7B, 0x7E, 0x00, 0x2B, 0x2E, 0x18, 0x5C, 0xB7, 0x2D, 0x8D, 0xD7, 0xEB, 0x42, 0x4A, 0x03, 0x21, 0x52, 0x81, 0x61, 0x21, 0x9F },
+ { 0x20, 0xDD, 0xD1, 0xED, 0x9B, 0x1C, 0xA8, 0x03, 0x94, 0x6D, 0x64, 0xA8, 0x3A, 0xE4, 0x65, 0x9D, 0xA6, 0x7F, 0xBA, 0x7A, 0x1A, 0x3E, 0xDD, 0xB1, 0xE1, 0x03, 0xC0, 0xF5, 0xE0, 0x3E, 0x3A, 0x2C },
+ { 0xF0, 0xAF, 0x60, 0x4D, 0x3D, 0xAB, 0xBF, 0x9A, 0x0F, 0x2A, 0x7D, 0x3D, 0xDA, 0x6B, 0xD3, 0x8B, 0xBA, 0x72, 0xC6, 0xD0, 0x9B, 0xE4, 0x94, 0xFC, 0xEF, 0x71, 0x3F, 0xF1, 0x01, 0x89, 0xB6, 0xE6 },
+ { 0x98, 0x02, 0xBB, 0x87, 0xDE, 0xF4, 0xCC, 0x10, 0xC4, 0xA5, 0xFD, 0x49, 0xAA, 0x58, 0xDF, 0xE2, 0xF3, 0xFD, 0xDB, 0x46, 0xB4, 0x70, 0x88, 0x14, 0xEA, 0xD8, 0x1D, 0x23, 0xBA, 0x95, 0x13, 0x9B },
+ { 0x4F, 0x8C, 0xE1, 0xE5, 0x1D, 0x2F, 0xE7, 0xF2, 0x40, 0x43, 0xA9, 0x04, 0xD8, 0x98, 0xEB, 0xFC, 0x91, 0x97, 0x54, 0x18, 0x75, 0x34, 0x13, 0xAA, 0x09, 0x9B, 0x79, 0x5E, 0xCB, 0x35, 0xCE, 0xDB },
+ { 0xBD, 0xDC, 0x65, 0x14, 0xD7, 0xEE, 0x6A, 0xCE, 0x0A, 0x4A, 0xC1, 0xD0, 0xE0, 0x68, 0x11, 0x22, 0x88, 0xCB, 0xCF, 0x56, 0x04, 0x54, 0x64, 0x27, 0x05, 0x63, 0x01, 0x77, 0xCB, 0xA6, 0x08, 0xBD },
+ { 0xD6, 0x35, 0x99, 0x4F, 0x62, 0x91, 0x51, 0x7B, 0x02, 0x81, 0xFF, 0xDD, 0x49, 0x6A, 0xFA, 0x86, 0x27, 0x12, 0xE5, 0xB3, 0xC4, 0xE5, 0x2E, 0x4C, 0xD5, 0xFD, 0xAE, 0x8C, 0x0E, 0x72, 0xFB, 0x08 },
+ { 0x87, 0x8D, 0x9C, 0xA6, 0x00, 0xCF, 0x87, 0xE7, 0x69, 0xCC, 0x30, 0x5C, 0x1B, 0x35, 0x25, 0x51, 0x86, 0x61, 0x5A, 0x73, 0xA0, 0xDA, 0x61, 0x3B, 0x5F, 0x1C, 0x98, 0xDB, 0xF8, 0x12, 0x83, 0xEA },
+ { 0xA6, 0x4E, 0xBE, 0x5D, 0xC1, 0x85, 0xDE, 0x9F, 0xDD, 0xE7, 0x60, 0x7B, 0x69, 0x98, 0x70, 0x2E, 0xB2, 0x34, 0x56, 0x18, 0x49, 0x57, 0x30, 0x7D, 0x2F, 0xA7, 0x2E, 0x87, 0xA4, 0x77, 0x02, 0xD6 },
+ { 0xCE, 0x50, 0xEA, 0xB7, 0xB5, 0xEB, 0x52, 0xBD, 0xC9, 0xAD, 0x8E, 0x5A, 0x48, 0x0A, 0xB7, 0x80, 0xCA, 0x93, 0x20, 0xE4, 0x43, 0x60, 0xB1, 0xFE, 0x37, 0xE0, 0x3F, 0x2F, 0x7A, 0xD7, 0xDE, 0x01 },
+ { 0xEE, 0xDD, 0xB7, 0xC0, 0xDB, 0x6E, 0x30, 0xAB, 0xE6, 0x6D, 0x79, 0xE3, 0x27, 0x51, 0x1E, 0x61, 0xFC, 0xEB, 0xBC, 0x29, 0xF1, 0x59, 0xB4, 0x0A, 0x86, 0xB0, 0x46, 0xEC, 0xF0, 0x51, 0x38, 0x23 },
+ { 0x78, 0x7F, 0xC9, 0x34, 0x40, 0xC1, 0xEC, 0x96, 0xB5, 0xAD, 0x01, 0xC1, 0x6C, 0xF7, 0x79, 0x16, 0xA1, 0x40, 0x5F, 0x94, 0x26, 0x35, 0x6E, 0xC9, 0x21, 0xD8, 0xDF, 0xF3, 0xEA, 0x63, 0xB7, 0xE0 },
+ { 0x7F, 0x0D, 0x5E, 0xAB, 0x47, 0xEE, 0xFD, 0xA6, 0x96, 0xC0, 0xBF, 0x0F, 0xBF, 0x86, 0xAB, 0x21, 0x6F, 0xCE, 0x46, 0x1E, 0x93, 0x03, 0xAB, 0xA6, 0xAC, 0x37, 0x41, 0x20, 0xE8, 0x90, 0xE8, 0xDF },
+ { 0xB6, 0x80, 0x04, 0xB4, 0x2F, 0x14, 0xAD, 0x02, 0x9F, 0x4C, 0x2E, 0x03, 0xB1, 0xD5, 0xEB, 0x76, 0xD5, 0x71, 0x60, 0xE2, 0x64, 0x76, 0xD2, 0x11, 0x31, 0xBE, 0xF2, 0x0A, 0xDA, 0x7D, 0x27, 0xF4 },
+ { 0xB0, 0xC4, 0xEB, 0x18, 0xAE, 0x25, 0x0B, 0x51, 0xA4, 0x13, 0x82, 0xEA, 0xD9, 0x2D, 0x0D, 0xC7, 0x45, 0x5F, 0x93, 0x79, 0xFC, 0x98, 0x84, 0x42, 0x8E, 0x47, 0x70, 0x60, 0x8D, 0xB0, 0xFA, 0xEC },
+ { 0xF9, 0x2B, 0x7A, 0x87, 0x0C, 0x05, 0x9F, 0x4D, 0x46, 0x46, 0x4C, 0x82, 0x4E, 0xC9, 0x63, 0x55, 0x14, 0x0B, 0xDC, 0xE6, 0x81, 0x32, 0x2C, 0xC3, 0xA9, 0x92, 0xFF, 0x10, 0x3E, 0x3F, 0xEA, 0x52 },
+ { 0x53, 0x64, 0x31, 0x26, 0x14, 0x81, 0x33, 0x98, 0xCC, 0x52, 0x5D, 0x4C, 0x4E, 0x14, 0x6E, 0xDE, 0xB3, 0x71, 0x26, 0x5F, 0xBA, 0x19, 0x13, 0x3A, 0x2C, 0x3D, 0x21, 0x59, 0x29, 0x8A, 0x17, 0x42 },
+ { 0xF6, 0x62, 0x0E, 0x68, 0xD3, 0x7F, 0xB2, 0xAF, 0x50, 0x00, 0xFC, 0x28, 0xE2, 0x3B, 0x83, 0x22, 0x97, 0xEC, 0xD8, 0xBC, 0xE9, 0x9E, 0x8B, 0xE4, 0xD0, 0x4E, 0x85, 0x30, 0x9E, 0x3D, 0x33, 0x74 },
+ { 0x53, 0x16, 0xA2, 0x79, 0x69, 0xD7, 0xFE, 0x04, 0xFF, 0x27, 0xB2, 0x83, 0x96, 0x1B, 0xFF, 0xC3, 0xBF, 0x5D, 0xFB, 0x32, 0xFB, 0x6A, 0x89, 0xD1, 0x01, 0xC6, 0xC3, 0xB1, 0x93, 0x7C, 0x28, 0x71 },
+ { 0x81, 0xD1, 0x66, 0x4F, 0xDF, 0x3C, 0xB3, 0x3C, 0x24, 0xEE, 0xBA, 0xC0, 0xBD, 0x64, 0x24, 0x4B, 0x77, 0xC4, 0xAB, 0xEA, 0x90, 0xBB, 0xE8, 0xB5, 0xEE, 0x0B, 0x2A, 0xAF, 0xCF, 0x2D, 0x6A, 0x53 },
+ { 0x34, 0x57, 0x82, 0xF2, 0x95, 0xB0, 0x88, 0x03, 0x52, 0xE9, 0x24, 0xA0, 0x46, 0x7B, 0x5F, 0xBC, 0x3E, 0x8F, 0x3B, 0xFB, 0xC3, 0xC7, 0xE4, 0x8B, 0x67, 0x09, 0x1F, 0xB5, 0xE8, 0x0A, 0x94, 0x42 },
+ { 0x79, 0x41, 0x11, 0xEA, 0x6C, 0xD6, 0x5E, 0x31, 0x1F, 0x74, 0xEE, 0x41, 0xD4, 0x76, 0xCB, 0x63, 0x2C, 0xE1, 0xE4, 0xB0, 0x51, 0xDC, 0x1D, 0x9E, 0x9D, 0x06, 0x1A, 0x19, 0xE1, 0xD0, 0xBB, 0x49 },
+ { 0x2A, 0x85, 0xDA, 0xF6, 0x13, 0x88, 0x16, 0xB9, 0x9B, 0xF8, 0xD0, 0x8B, 0xA2, 0x11, 0x4B, 0x7A, 0xB0, 0x79, 0x75, 0xA7, 0x84, 0x20, 0xC1, 0xA3, 0xB0, 0x6A, 0x77, 0x7C, 0x22, 0xDD, 0x8B, 0xCB },
+ { 0x89, 0xB0, 0xD5, 0xF2, 0x89, 0xEC, 0x16, 0x40, 0x1A, 0x06, 0x9A, 0x96, 0x0D, 0x0B, 0x09, 0x3E, 0x62, 0x5D, 0xA3, 0xCF, 0x41, 0xEE, 0x29, 0xB5, 0x9B, 0x93, 0x0C, 0x58, 0x20, 0x14, 0x54, 0x55 },
+ { 0xD0, 0xFD, 0xCB, 0x54, 0x39, 0x43, 0xFC, 0x27, 0xD2, 0x08, 0x64, 0xF5, 0x21, 0x81, 0x47, 0x1B, 0x94, 0x2C, 0xC7, 0x7C, 0xA6, 0x75, 0xBC, 0xB3, 0x0D, 0xF3, 0x1D, 0x35, 0x8E, 0xF7, 0xB1, 0xEB },
+ { 0xB1, 0x7E, 0xA8, 0xD7, 0x70, 0x63, 0xC7, 0x09, 0xD4, 0xDC, 0x6B, 0x87, 0x94, 0x13, 0xC3, 0x43, 0xE3, 0x79, 0x0E, 0x9E, 0x62, 0xCA, 0x85, 0xB7, 0x90, 0x0B, 0x08, 0x6F, 0x6B, 0x75, 0xC6, 0x72 },
+ { 0xE7, 0x1A, 0x3E, 0x2C, 0x27, 0x4D, 0xB8, 0x42, 0xD9, 0x21, 0x14, 0xF2, 0x17, 0xE2, 0xC0, 0xEA, 0xC8, 0xB4, 0x50, 0x93, 0xFD, 0xFD, 0x9D, 0xF4, 0xCA, 0x71, 0x62, 0x39, 0x48, 0x62, 0xD5, 0x01 },
+ { 0xC0, 0x47, 0x67, 0x59, 0xAB, 0x7A, 0xA3, 0x33, 0x23, 0x4F, 0x6B, 0x44, 0xF5, 0xFD, 0x85, 0x83, 0x90, 0xEC, 0x23, 0x69, 0x4C, 0x62, 0x2C, 0xB9, 0x86, 0xE7, 0x69, 0xC7, 0x8E, 0xDD, 0x73, 0x3E },
+ { 0x9A, 0xB8, 0xEA, 0xBB, 0x14, 0x16, 0x43, 0x4D, 0x85, 0x39, 0x13, 0x41, 0xD5, 0x69, 0x93, 0xC5, 0x54, 0x58, 0x16, 0x7D, 0x44, 0x18, 0xB1, 0x9A, 0x0F, 0x2A, 0xD8, 0xB7, 0x9A, 0x83, 0xA7, 0x5B },
+ { 0x79, 0x92, 0xD0, 0xBB, 0xB1, 0x5E, 0x23, 0x82, 0x6F, 0x44, 0x3E, 0x00, 0x50, 0x5D, 0x68, 0xD3, 0xED, 0x73, 0x72, 0x99, 0x5A, 0x5C, 0x3E, 0x49, 0x86, 0x54, 0x10, 0x2F, 0xBC, 0xD0, 0x96, 0x4E },
+ { 0xC0, 0x21, 0xB3, 0x00, 0x85, 0x15, 0x14, 0x35, 0xDF, 0x33, 0xB0, 0x07, 0xCC, 0xEC, 0xC6, 0x9D, 0xF1, 0x26, 0x9F, 0x39, 0xBA, 0x25, 0x09, 0x2B, 0xED, 0x59, 0xD9, 0x32, 0xAC, 0x0F, 0xDC, 0x28 },
+ { 0x91, 0xA2, 0x5E, 0xC0, 0xEC, 0x0D, 0x9A, 0x56, 0x7F, 0x89, 0xC4, 0xBF, 0xE1, 0xA6, 0x5A, 0x0E, 0x43, 0x2D, 0x07, 0x06, 0x4B, 0x41, 0x90, 0xE2, 0x7D, 0xFB, 0x81, 0x90, 0x1F, 0xD3, 0x13, 0x9B },
+ { 0x59, 0x50, 0xD3, 0x9A, 0x23, 0xE1, 0x54, 0x5F, 0x30, 0x12, 0x70, 0xAA, 0x1A, 0x12, 0xF2, 0xE6, 0xC4, 0x53, 0x77, 0x6E, 0x4D, 0x63, 0x55, 0xDE, 0x42, 0x5C, 0xC1, 0x53, 0xF9, 0x81, 0x88, 0x67 },
+ { 0xD7, 0x9F, 0x14, 0x72, 0x0C, 0x61, 0x0A, 0xF1, 0x79, 0xA3, 0x76, 0x5D, 0x4B, 0x7C, 0x09, 0x68, 0xF9, 0x77, 0x96, 0x2D, 0xBF, 0x65, 0x5B, 0x52, 0x12, 0x72, 0xB6, 0xF1, 0xE1, 0x94, 0x48, 0x8E },
+ { 0xE9, 0x53, 0x1B, 0xFC, 0x8B, 0x02, 0x99, 0x5A, 0xEA, 0xA7, 0x5B, 0xA2, 0x70, 0x31, 0xFA, 0xDB, 0xCB, 0xF4, 0xA0, 0xDA, 0xB8, 0x96, 0x1D, 0x92, 0x96, 0xCD, 0x7E, 0x84, 0xD2, 0x5D, 0x60, 0x06 },
+ { 0x34, 0xE9, 0xC2, 0x6A, 0x01, 0xD7, 0xF1, 0x61, 0x81, 0xB4, 0x54, 0xA9, 0xD1, 0x62, 0x3C, 0x23, 0x3C, 0xB9, 0x9D, 0x31, 0xC6, 0x94, 0x65, 0x6E, 0x94, 0x13, 0xAC, 0xA3, 0xE9, 0x18, 0x69, 0x2F },
+ { 0xD9, 0xD7, 0x42, 0x2F, 0x43, 0x7B, 0xD4, 0x39, 0xDD, 0xD4, 0xD8, 0x83, 0xDA, 0xE2, 0xA0, 0x83, 0x50, 0x17, 0x34, 0x14, 0xBE, 0x78, 0x15, 0x51, 0x33, 0xFF, 0xF1, 0x96, 0x4C, 0x3D, 0x79, 0x72 },
+ { 0x4A, 0xEE, 0x0C, 0x7A, 0xAF, 0x07, 0x54, 0x14, 0xFF, 0x17, 0x93, 0xEA, 0xD7, 0xEA, 0xCA, 0x60, 0x17, 0x75, 0xC6, 0x15, 0xDB, 0xD6, 0x0B, 0x64, 0x0B, 0x0A, 0x9F, 0x0C, 0xE5, 0x05, 0xD4, 0x35 },
+ { 0x6B, 0xFD, 0xD1, 0x54, 0x59, 0xC8, 0x3B, 0x99, 0xF0, 0x96, 0xBF, 0xB4, 0x9E, 0xE8, 0x7B, 0x06, 0x3D, 0x69, 0xC1, 0x97, 0x4C, 0x69, 0x28, 0xAC, 0xFC, 0xFB, 0x40, 0x99, 0xF8, 0xC4, 0xEF, 0x67 },
+ { 0x9F, 0xD1, 0xC4, 0x08, 0xFD, 0x75, 0xC3, 0x36, 0x19, 0x3A, 0x2A, 0x14, 0xD9, 0x4F, 0x6A, 0xF5, 0xAD, 0xF0, 0x50, 0xB8, 0x03, 0x87, 0xB4, 0xB0, 0x10, 0xFB, 0x29, 0xF4, 0xCC, 0x72, 0x70, 0x7C },
+ { 0x13, 0xC8, 0x84, 0x80, 0xA5, 0xD0, 0x0D, 0x6C, 0x8C, 0x7A, 0xD2, 0x11, 0x0D, 0x76, 0xA8, 0x2D, 0x9B, 0x70, 0xF4, 0xFA, 0x66, 0x96, 0xD4, 0xE5, 0xDD, 0x42, 0xA0, 0x66, 0xDC, 0xAF, 0x99, 0x20 },
+ { 0x82, 0x0E, 0x72, 0x5E, 0xE2, 0x5F, 0xE8, 0xFD, 0x3A, 0x8D, 0x5A, 0xBE, 0x4C, 0x46, 0xC3, 0xBA, 0x88, 0x9D, 0xE6, 0xFA, 0x91, 0x91, 0xAA, 0x22, 0xBA, 0x67, 0xD5, 0x70, 0x54, 0x21, 0x54, 0x2B },
+ { 0x32, 0xD9, 0x3A, 0x0E, 0xB0, 0x2F, 0x42, 0xFB, 0xBC, 0xAF, 0x2B, 0xAD, 0x00, 0x85, 0xB2, 0x82, 0xE4, 0x60, 0x46, 0xA4, 0xDF, 0x7A, 0xD1, 0x06, 0x57, 0xC9, 0xD6, 0x47, 0x63, 0x75, 0xB9, 0x3E },
+ { 0xAD, 0xC5, 0x18, 0x79, 0x05, 0xB1, 0x66, 0x9C, 0xD8, 0xEC, 0x9C, 0x72, 0x1E, 0x19, 0x53, 0x78, 0x6B, 0x9D, 0x89, 0xA9, 0xBA, 0xE3, 0x07, 0x80, 0xF1, 0xE1, 0xEA, 0xB2, 0x4A, 0x00, 0x52, 0x3C },
+ { 0xE9, 0x07, 0x56, 0xFF, 0x7F, 0x9A, 0xD8, 0x10, 0xB2, 0x39, 0xA1, 0x0C, 0xED, 0x2C, 0xF9, 0xB2, 0x28, 0x43, 0x54, 0xC1, 0xF8, 0xC7, 0xE0, 0xAC, 0xCC, 0x24, 0x61, 0xDC, 0x79, 0x6D, 0x6E, 0x89 },
+ { 0x12, 0x51, 0xF7, 0x6E, 0x56, 0x97, 0x84, 0x81, 0x87, 0x53, 0x59, 0x80, 0x1D, 0xB5, 0x89, 0xA0, 0xB2, 0x2F, 0x86, 0xD8, 0xD6, 0x34, 0xDC, 0x04, 0x50, 0x6F, 0x32, 0x2E, 0xD7, 0x8F, 0x17, 0xE8 },
+ { 0x3A, 0xFA, 0x89, 0x9F, 0xD9, 0x80, 0xE7, 0x3E, 0xCB, 0x7F, 0x4D, 0x8B, 0x8F, 0x29, 0x1D, 0xC9, 0xAF, 0x79, 0x6B, 0xC6, 0x5D, 0x27, 0xF9, 0x74, 0xC6, 0xF1, 0x93, 0xC9, 0x19, 0x1A, 0x09, 0xFD },
+ { 0xAA, 0x30, 0x5B, 0xE2, 0x6E, 0x5D, 0xED, 0xDC, 0x3C, 0x10, 0x10, 0xCB, 0xC2, 0x13, 0xF9, 0x5F, 0x05, 0x1C, 0x78, 0x5C, 0x5B, 0x43, 0x1E, 0x6A, 0x7C, 0xD0, 0x48, 0xF1, 0x61, 0x78, 0x75, 0x28 },
+ { 0x8E, 0xA1, 0x88, 0x4F, 0xF3, 0x2E, 0x9D, 0x10, 0xF0, 0x39, 0xB4, 0x07, 0xD0, 0xD4, 0x4E, 0x7E, 0x67, 0x0A, 0xBD, 0x88, 0x4A, 0xEE, 0xE0, 0xFB, 0x75, 0x7A, 0xE9, 0x4E, 0xAA, 0x97, 0x37, 0x3D },
+ { 0xD4, 0x82, 0xB2, 0x15, 0x5D, 0x4D, 0xEC, 0x6B, 0x47, 0x36, 0xA1, 0xF1, 0x61, 0x7B, 0x53, 0xAA, 0xA3, 0x73, 0x10, 0x27, 0x7D, 0x3F, 0xEF, 0x0C, 0x37, 0xAD, 0x41, 0x76, 0x8F, 0xC2, 0x35, 0xB4 },
+ { 0x4D, 0x41, 0x39, 0x71, 0x38, 0x7E, 0x7A, 0x88, 0x98, 0xA8, 0xDC, 0x2A, 0x27, 0x50, 0x07, 0x78, 0x53, 0x9E, 0xA2, 0x14, 0xA2, 0xDF, 0xE9, 0xB3, 0xD7, 0xE8, 0xEB, 0xDC, 0xE5, 0xCF, 0x3D, 0xB3 },
+ { 0x69, 0x6E, 0x5D, 0x46, 0xE6, 0xC5, 0x7E, 0x87, 0x96, 0xE4, 0x73, 0x5D, 0x08, 0x91, 0x6E, 0x0B, 0x79, 0x29, 0xB3, 0xCF, 0x29, 0x8C, 0x29, 0x6D, 0x22, 0xE9, 0xD3, 0x01, 0x96, 0x53, 0x37, 0x1C },
+ { 0x1F, 0x56, 0x47, 0xC1, 0xD3, 0xB0, 0x88, 0x22, 0x88, 0x85, 0x86, 0x5C, 0x89, 0x40, 0x90, 0x8B, 0xF4, 0x0D, 0x1A, 0x82, 0x72, 0x82, 0x19, 0x73, 0xB1, 0x60, 0x00, 0x8E, 0x7A, 0x3C, 0xE2, 0xEB },
+ { 0xB6, 0xE7, 0x6C, 0x33, 0x0F, 0x02, 0x1A, 0x5B, 0xDA, 0x65, 0x87, 0x50, 0x10, 0xB0, 0xED, 0xF0, 0x91, 0x26, 0xC0, 0xF5, 0x10, 0xEA, 0x84, 0x90, 0x48, 0x19, 0x20, 0x03, 0xAE, 0xF4, 0xC6, 0x1C },
+ { 0x3C, 0xD9, 0x52, 0xA0, 0xBE, 0xAD, 0xA4, 0x1A, 0xBB, 0x42, 0x4C, 0xE4, 0x7F, 0x94, 0xB4, 0x2B, 0xE6, 0x4E, 0x1F, 0xFB, 0x0F, 0xD0, 0x78, 0x22, 0x76, 0x80, 0x79, 0x46, 0xD0, 0xD0, 0xBC, 0x55 },
+ { 0x98, 0xD9, 0x26, 0x77, 0x43, 0x9B, 0x41, 0xB7, 0xBB, 0x51, 0x33, 0x12, 0xAF, 0xB9, 0x2B, 0xCC, 0x8E, 0xE9, 0x68, 0xB2, 0xE3, 0xB2, 0x38, 0xCE, 0xCB, 0x9B, 0x0F, 0x34, 0xC9, 0xBB, 0x63, 0xD0 },
+ { 0xEC, 0xBC, 0xA2, 0xCF, 0x08, 0xAE, 0x57, 0xD5, 0x17, 0xAD, 0x16, 0x15, 0x8A, 0x32, 0xBF, 0xA7, 0xDC, 0x03, 0x82, 0xEA, 0xED, 0xA1, 0x28, 0xE9, 0x18, 0x86, 0x73, 0x4C, 0x24, 0xA0, 0xB2, 0x9D },
+ { 0x94, 0x2C, 0xC7, 0xC0, 0xB5, 0x2E, 0x2B, 0x16, 0xA4, 0xB8, 0x9F, 0xA4, 0xFC, 0x7E, 0x0B, 0xF6, 0x09, 0xE2, 0x9A, 0x08, 0xC1, 0xA8, 0x54, 0x34, 0x52, 0xB7, 0x7C, 0x7B, 0xFD, 0x11, 0xBB, 0x28 },
+ { 0x8A, 0x06, 0x5D, 0x8B, 0x61, 0xA0, 0xDF, 0xFB, 0x17, 0x0D, 0x56, 0x27, 0x73, 0x5A, 0x76, 0xB0, 0xE9, 0x50, 0x60, 0x37, 0x80, 0x8C, 0xBA, 0x16, 0xC3, 0x45, 0x00, 0x7C, 0x9F, 0x79, 0xCF, 0x8F },
+ { 0x1B, 0x9F, 0xA1, 0x97, 0x14, 0x65, 0x9C, 0x78, 0xFF, 0x41, 0x38, 0x71, 0x84, 0x92, 0x15, 0x36, 0x10, 0x29, 0xAC, 0x80, 0x2B, 0x1C, 0xBC, 0xD5, 0x4E, 0x40, 0x8B, 0xD8, 0x72, 0x87, 0xF8, 0x1F },
+ { 0x8D, 0xAB, 0x07, 0x1B, 0xCD, 0x6C, 0x72, 0x92, 0xA9, 0xEF, 0x72, 0x7B, 0x4A, 0xE0, 0xD8, 0x67, 0x13, 0x30, 0x1D, 0xA8, 0x61, 0x8D, 0x9A, 0x48, 0xAD, 0xCE, 0x55, 0xF3, 0x03, 0xA8, 0x69, 0xA1 },
+ { 0x82, 0x53, 0xE3, 0xE7, 0xC7, 0xB6, 0x84, 0xB9, 0xCB, 0x2B, 0xEB, 0x01, 0x4C, 0xE3, 0x30, 0xFF, 0x3D, 0x99, 0xD1, 0x7A, 0xBB, 0xDB, 0xAB, 0xE4, 0xF4, 0xD6, 0x74, 0xDE, 0xD5, 0x3F, 0xFC, 0x6B },
+ { 0xF1, 0x95, 0xF3, 0x21, 0xE9, 0xE3, 0xD6, 0xBD, 0x7D, 0x07, 0x45, 0x04, 0xDD, 0x2A, 0xB0, 0xE6, 0x24, 0x1F, 0x92, 0xE7, 0x84, 0xB1, 0xAA, 0x27, 0x1F, 0xF6, 0x48, 0xB1, 0xCA, 0xB6, 0xD7, 0xF6 },
+ { 0x27, 0xE4, 0xCC, 0x72, 0x09, 0x0F, 0x24, 0x12, 0x66, 0x47, 0x6A, 0x7C, 0x09, 0x49, 0x5F, 0x2D, 0xB1, 0x53, 0xD5, 0xBC, 0xBD, 0x76, 0x19, 0x03, 0xEF, 0x79, 0x27, 0x5E, 0xC5, 0x6B, 0x2E, 0xD8 },
+ { 0x89, 0x9C, 0x24, 0x05, 0x78, 0x8E, 0x25, 0xB9, 0x9A, 0x18, 0x46, 0x35, 0x5E, 0x64, 0x6D, 0x77, 0xCF, 0x40, 0x00, 0x83, 0x41, 0x5F, 0x7D, 0xC5, 0xAF, 0xE6, 0x9D, 0x6E, 0x17, 0xC0, 0x00, 0x23 },
+ { 0xA5, 0x9B, 0x78, 0xC4, 0x90, 0x57, 0x44, 0x07, 0x6B, 0xFE, 0xE8, 0x94, 0xDE, 0x70, 0x7D, 0x4F, 0x12, 0x0B, 0x5C, 0x68, 0x93, 0xEA, 0x04, 0x00, 0x29, 0x7D, 0x0B, 0xB8, 0x34, 0x72, 0x76, 0x32 },
+ { 0x59, 0xDC, 0x78, 0xB1, 0x05, 0x64, 0x97, 0x07, 0xA2, 0xBB, 0x44, 0x19, 0xC4, 0x8F, 0x00, 0x54, 0x00, 0xD3, 0x97, 0x3D, 0xE3, 0x73, 0x66, 0x10, 0x23, 0x04, 0x35, 0xB1, 0x04, 0x24, 0xB2, 0x4F },
+ { 0xC0, 0x14, 0x9D, 0x1D, 0x7E, 0x7A, 0x63, 0x53, 0xA6, 0xD9, 0x06, 0xEF, 0xE7, 0x28, 0xF2, 0xF3, 0x29, 0xFE, 0x14, 0xA4, 0x14, 0x9A, 0x3E, 0xA7, 0x76, 0x09, 0xBC, 0x42, 0xB9, 0x75, 0xDD, 0xFA },
+ { 0xA3, 0x2F, 0x24, 0x14, 0x74, 0xA6, 0xC1, 0x69, 0x32, 0xE9, 0x24, 0x3B, 0xE0, 0xCF, 0x09, 0xBC, 0xDC, 0x7E, 0x0C, 0xA0, 0xE7, 0xA6, 0xA1, 0xB9, 0xB1, 0xA0, 0xF0, 0x1E, 0x41, 0x50, 0x23, 0x77 },
+ { 0xB2, 0x39, 0xB2, 0xE4, 0xF8, 0x18, 0x41, 0x36, 0x1C, 0x13, 0x39, 0xF6, 0x8E, 0x2C, 0x35, 0x9F, 0x92, 0x9A, 0xF9, 0xAD, 0x9F, 0x34, 0xE0, 0x1A, 0xAB, 0x46, 0x31, 0xAD, 0x6D, 0x55, 0x00, 0xB0 },
+ { 0x85, 0xFB, 0x41, 0x9C, 0x70, 0x02, 0xA3, 0xE0, 0xB4, 0xB6, 0xEA, 0x09, 0x3B, 0x4C, 0x1A, 0xC6, 0x93, 0x66, 0x45, 0xB6, 0x5D, 0xAC, 0x5A, 0xC1, 0x5A, 0x85, 0x28, 0xB7, 0xB9, 0x4C, 0x17, 0x54 },
+ { 0x96, 0x19, 0x72, 0x06, 0x25, 0xF1, 0x90, 0xB9, 0x3A, 0x3F, 0xAD, 0x18, 0x6A, 0xB3, 0x14, 0x18, 0x96, 0x33, 0xC0, 0xD3, 0xA0, 0x1E, 0x6F, 0x9B, 0xC8, 0xC4, 0xA8, 0xF8, 0x2F, 0x38, 0x3D, 0xBF },
+ { 0x7D, 0x62, 0x0D, 0x90, 0xFE, 0x69, 0xFA, 0x46, 0x9A, 0x65, 0x38, 0x38, 0x89, 0x70, 0xA1, 0xAA, 0x09, 0xBB, 0x48, 0xA2, 0xD5, 0x9B, 0x34, 0x7B, 0x97, 0xE8, 0xCE, 0x71, 0xF4, 0x8C, 0x7F, 0x46 },
+ { 0x29, 0x43, 0x83, 0x56, 0x85, 0x96, 0xFB, 0x37, 0xC7, 0x5B, 0xBA, 0xCD, 0x97, 0x9C, 0x5F, 0xF6, 0xF2, 0x0A, 0x55, 0x6B, 0xF8, 0x87, 0x9C, 0xC7, 0x29, 0x24, 0x85, 0x5D, 0xF9, 0xB8, 0x24, 0x0E },
+ { 0x16, 0xB1, 0x8A, 0xB3, 0x14, 0x35, 0x9C, 0x2B, 0x83, 0x3C, 0x1C, 0x69, 0x86, 0xD4, 0x8C, 0x55, 0xA9, 0xFC, 0x97, 0xCD, 0xE9, 0xA3, 0xC1, 0xF1, 0x0A, 0x31, 0x77, 0x14, 0x0F, 0x73, 0xF7, 0x38 },
+ { 0x8C, 0xBB, 0xDD, 0x14, 0xBC, 0x33, 0xF0, 0x4C, 0xF4, 0x58, 0x13, 0xE4, 0xA1, 0x53, 0xA2, 0x73, 0xD3, 0x6A, 0xDA, 0xD5, 0xCE, 0x71, 0xF4, 0x99, 0xEE, 0xB8, 0x7F, 0xB8, 0xAC, 0x63, 0xB7, 0x29 },
+ { 0x69, 0xC9, 0xA4, 0x98, 0xDB, 0x17, 0x4E, 0xCA, 0xEF, 0xCC, 0x5A, 0x3A, 0xC9, 0xFD, 0xED, 0xF0, 0xF8, 0x13, 0xA5, 0xBE, 0xC7, 0x27, 0xF1, 0xE7, 0x75, 0xBA, 0xBD, 0xEC, 0x77, 0x18, 0x81, 0x6E },
+ { 0xB4, 0x62, 0xC3, 0xBE, 0x40, 0x44, 0x8F, 0x1D, 0x4F, 0x80, 0x62, 0x62, 0x54, 0xE5, 0x35, 0xB0, 0x8B, 0xC9, 0xCD, 0xCF, 0xF5, 0x99, 0xA7, 0x68, 0x57, 0x8D, 0x4B, 0x28, 0x81, 0xA8, 0xE3, 0xF0 },
+ { 0x55, 0x3E, 0x9D, 0x9C, 0x5F, 0x36, 0x0A, 0xC0, 0xB7, 0x4A, 0x7D, 0x44, 0xE5, 0xA3, 0x91, 0xDA, 0xD4, 0xCE, 0xD0, 0x3E, 0x0C, 0x24, 0x18, 0x3B, 0x7E, 0x8E, 0xCA, 0xBD, 0xF1, 0x71, 0x5A, 0x64 },
+ { 0x7A, 0x7C, 0x55, 0xA5, 0x6F, 0xA9, 0xAE, 0x51, 0xE6, 0x55, 0xE0, 0x19, 0x75, 0xD8, 0xA6, 0xFF, 0x4A, 0xE9, 0xE4, 0xB4, 0x86, 0xFC, 0xBE, 0x4E, 0xAC, 0x04, 0x45, 0x88, 0xF2, 0x45, 0xEB, 0xEA },
+ { 0x2A, 0xFD, 0xF3, 0xC8, 0x2A, 0xBC, 0x48, 0x67, 0xF5, 0xDE, 0x11, 0x12, 0x86, 0xC2, 0xB3, 0xBE, 0x7D, 0x6E, 0x48, 0x65, 0x7B, 0xA9, 0x23, 0xCF, 0xBF, 0x10, 0x1A, 0x6D, 0xFC, 0xF9, 0xDB, 0x9A },
+ { 0x41, 0x03, 0x7D, 0x2E, 0xDC, 0xDC, 0xE0, 0xC4, 0x9B, 0x7F, 0xB4, 0xA6, 0xAA, 0x09, 0x99, 0xCA, 0x66, 0x97, 0x6C, 0x74, 0x83, 0xAF, 0xE6, 0x31, 0xD4, 0xED, 0xA2, 0x83, 0x14, 0x4F, 0x6D, 0xFC },
+ { 0xC4, 0x46, 0x6F, 0x84, 0x97, 0xCA, 0x2E, 0xEB, 0x45, 0x83, 0xA0, 0xB0, 0x8E, 0x9D, 0x9A, 0xC7, 0x43, 0x95, 0x70, 0x9F, 0xDA, 0x10, 0x9D, 0x24, 0xF2, 0xE4, 0x46, 0x21, 0x96, 0x77, 0x9C, 0x5D },
+ { 0x75, 0xF6, 0x09, 0x33, 0x8A, 0xA6, 0x7D, 0x96, 0x9A, 0x2A, 0xE2, 0xA2, 0x36, 0x2B, 0x2D, 0xA9, 0xD7, 0x7C, 0x69, 0x5D, 0xFD, 0x1D, 0xF7, 0x22, 0x4A, 0x69, 0x01, 0xDB, 0x93, 0x2C, 0x33, 0x64 },
+ { 0x68, 0x60, 0x6C, 0xEB, 0x98, 0x9D, 0x54, 0x88, 0xFC, 0x7C, 0xF6, 0x49, 0xF3, 0xD7, 0xC2, 0x72, 0xEF, 0x05, 0x5D, 0xA1, 0xA9, 0x3F, 0xAE, 0xCD, 0x55, 0xFE, 0x06, 0xF6, 0x96, 0x70, 0x98, 0xCA },
+ { 0x44, 0x34, 0x6B, 0xDE, 0xB7, 0xE0, 0x52, 0xF6, 0x25, 0x50, 0x48, 0xF0, 0xD9, 0xB4, 0x2C, 0x42, 0x5B, 0xAB, 0x9C, 0x3D, 0xD2, 0x41, 0x68, 0x21, 0x2C, 0x3E, 0xCF, 0x1E, 0xBF, 0x34, 0xE6, 0xAE },
+ { 0x8E, 0x9C, 0xF6, 0xE1, 0xF3, 0x66, 0x47, 0x1F, 0x2A, 0xC7, 0xD2, 0xEE, 0x9B, 0x5E, 0x62, 0x66, 0xFD, 0xA7, 0x1F, 0x8F, 0x2E, 0x41, 0x09, 0xF2, 0x23, 0x7E, 0xD5, 0xF8, 0x81, 0x3F, 0xC7, 0x18 },
+ { 0x84, 0xBB, 0xEB, 0x84, 0x06, 0xD2, 0x50, 0x95, 0x1F, 0x8C, 0x1B, 0x3E, 0x86, 0xA7, 0xC0, 0x10, 0x08, 0x29, 0x21, 0x83, 0x3D, 0xFD, 0x95, 0x55, 0xA2, 0xF9, 0x09, 0xB1, 0x08, 0x6E, 0xB4, 0xB8 },
+ { 0xEE, 0x66, 0x6F, 0x3E, 0xEF, 0x0F, 0x7E, 0x2A, 0x9C, 0x22, 0x29, 0x58, 0xC9, 0x7E, 0xAF, 0x35, 0xF5, 0x1C, 0xED, 0x39, 0x3D, 0x71, 0x44, 0x85, 0xAB, 0x09, 0xA0, 0x69, 0x34, 0x0F, 0xDF, 0x88 },
+ { 0xC1, 0x53, 0xD3, 0x4A, 0x65, 0xC4, 0x7B, 0x4A, 0x62, 0xC5, 0xCA, 0xCF, 0x24, 0x01, 0x09, 0x75, 0xD0, 0x35, 0x6B, 0x2F, 0x32, 0xC8, 0xF5, 0xDA, 0x53, 0x0D, 0x33, 0x88, 0x16, 0xAD, 0x5D, 0xE6 },
+ { 0x9F, 0xC5, 0x45, 0x01, 0x09, 0xE1, 0xB7, 0x79, 0xF6, 0xC7, 0xAE, 0x79, 0xD5, 0x6C, 0x27, 0x63, 0x5C, 0x8D, 0xD4, 0x26, 0xC5, 0xA9, 0xD5, 0x4E, 0x25, 0x78, 0xDB, 0x98, 0x9B, 0x8C, 0x3B, 0x4E },
+ { 0xD1, 0x2B, 0xF3, 0x73, 0x2E, 0xF4, 0xAF, 0x5C, 0x22, 0xFA, 0x90, 0x35, 0x6A, 0xF8, 0xFC, 0x50, 0xFC, 0xB4, 0x0F, 0x8F, 0x2E, 0xA5, 0xC8, 0x59, 0x47, 0x37, 0xA3, 0xB3, 0xD5, 0xAB, 0xDB, 0xD7 },
+ { 0x11, 0x03, 0x0B, 0x92, 0x89, 0xBB, 0xA5, 0xAF, 0x65, 0x26, 0x06, 0x72, 0xAB, 0x6F, 0xEE, 0x88, 0xB8, 0x74, 0x20, 0xAC, 0xEF, 0x4A, 0x17, 0x89, 0xA2, 0x07, 0x3B, 0x7E, 0xC2, 0xF2, 0xA0, 0x9E },
+ { 0x69, 0xCB, 0x19, 0x2B, 0x84, 0x44, 0x00, 0x5C, 0x8C, 0x0C, 0xEB, 0x12, 0xC8, 0x46, 0x86, 0x07, 0x68, 0x18, 0x8C, 0xDA, 0x0A, 0xEC, 0x27, 0xA9, 0xC8, 0xA5, 0x5C, 0xDE, 0xE2, 0x12, 0x36, 0x32 },
+ { 0xDB, 0x44, 0x4C, 0x15, 0x59, 0x7B, 0x5F, 0x1A, 0x03, 0xD1, 0xF9, 0xED, 0xD1, 0x6E, 0x4A, 0x9F, 0x43, 0xA6, 0x67, 0xCC, 0x27, 0x51, 0x75, 0xDF, 0xA2, 0xB7, 0x04, 0xE3, 0xBB, 0x1A, 0x9B, 0x83 },
+ { 0x3F, 0xB7, 0x35, 0x06, 0x1A, 0xBC, 0x51, 0x9D, 0xFE, 0x97, 0x9E, 0x54, 0xC1, 0xEE, 0x5B, 0xFA, 0xD0, 0xA9, 0xD8, 0x58, 0xB3, 0x31, 0x5B, 0xAD, 0x34, 0xBD, 0xE9, 0x99, 0xEF, 0xD7, 0x24, 0xDD }
+};
+
+bool __init blake2s_selftest(void)
+{
+ u8 key[BLAKE2S_KEYBYTES];
+ u8 buf[ARRAY_SIZE(blake2s_testvecs)];
+ u8 hash[BLAKE2S_OUTBYTES];
+ size_t i;
+ bool success = true;
+
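+	/* Key and message bytes are the simple sequence 0, 1, 2, ...,
+	 * matching the reference BLAKE2 known-answer test vectors. */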
+ for (i = 0; i < BLAKE2S_KEYBYTES; ++i)
+ key[i] = (u8)i;
+
+ for (i = 0; i < ARRAY_SIZE(blake2s_testvecs); ++i)
+ buf[i] = (u8)i;
+
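+	/* Vector i hashes the first i bytes of buf, so every message length
+	 * from 0 to ARRAY_SIZE(blake2s_keyed_testvecs) - 1 is exercised,
+	 * first keyed and then unkeyed below. */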
+ for (i = 0; i < ARRAY_SIZE(blake2s_keyed_testvecs); ++i) {
+ blake2s(hash, buf, key, BLAKE2S_OUTBYTES, i, BLAKE2S_KEYBYTES);
+ if (memcmp(hash, blake2s_keyed_testvecs[i], BLAKE2S_OUTBYTES)) {
+ pr_info("blake2s keyed self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+ }
+
+ for (i = 0; i < ARRAY_SIZE(blake2s_testvecs); ++i) {
+ blake2s(hash, buf, NULL, BLAKE2S_OUTBYTES, i, 0);
+ if (memcmp(hash, blake2s_testvecs[i], BLAKE2S_OUTBYTES)) {
+ pr_info("blake2s unkeyed self-test %zu: FAIL\n", i + i);
+ success = false;
+ }
+ }
+
+ if (success)
+ pr_info("blake2s self-tests: pass\n");
+ return success;
+}
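+
+/* A minimal, hypothetical call-site sketch (not part of this patch): the
+ * selftest is intended to run once at init time of whatever module pulls in
+ * BLAKE2s, along the lines of:
+ *
+ *	static int __init mod_init(void)
+ *	{
+ *		if (!blake2s_selftest())
+ *			return -ENOTRECOVERABLE;
+ *		return 0;
+ *	}
+ */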
+#endif
diff --git a/lib/zinc/selftest/chacha20poly1305.h b/lib/zinc/selftest/chacha20poly1305.h
new file mode 100644
index 000000000000..4e375212afd7
--- /dev/null
+++ b/lib/zinc/selftest/chacha20poly1305.h
@@ -0,0 +1,1559 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifdef CONFIG_ZINC_DEBUG
+struct chacha20poly1305_testvec {
+ u8 *key, *nonce, *assoc, *input, *result;
+ size_t nlen, alen, ilen;
+	bool failure; /* true when decryption is expected to fail authentication */
+};
+
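+/* A hypothetical sketch of how an encryption vector is consumed, assuming the
+ * chacha20poly1305_encrypt() signature introduced earlier in this patch;
+ * MAX_TESTVEC_LEN is an assumption, not a real constant. .result holds the
+ * ciphertext followed by the 16-byte Poly1305 tag:
+ *
+ *	u8 out[MAX_TESTVEC_LEN + POLY1305_MAC_SIZE];
+ *	const struct chacha20poly1305_testvec *v = &chacha20poly1305_enc_vectors[i];
+ *
+ *	chacha20poly1305_encrypt(out, v->input, v->ilen, v->assoc, v->alen,
+ *				 get_unaligned_le64(v->nonce), v->key);
+ *	success &= !memcmp(out, v->result, v->ilen + POLY1305_MAC_SIZE);
+ */
+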
+/* The first of these are the ChaCha20-Poly1305 AEAD test vectors from RFC7539. After those, the
+ * vectors are generated by the Python program below. The final ones, marked as such, are taken
+ * from Wycheproof, but we only use those on the encrypt side, because we're mostly stressing the
+ * primitives rather than the actual ChaPoly construction. The Wycheproof vectors also require
+ * adding a 96-bit nonce construction, just for the purpose of the tests.
+ *
+ * #!/usr/bin/env python3
+ *
+ * from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
+ * import os
+ *
+ * def encode_blob(blob):
+ * a = ""
+ * for i in blob:
+ * a += "\\x" + hex(i)[2:]
+ * return a
+ *
+ * enc = [ ]
+ * dec = [ ]
+ *
+ * def make_vector(plen, adlen):
+ * key = os.urandom(32)
+ * nonce = os.urandom(8)
+ * p = os.urandom(plen)
+ * ad = os.urandom(adlen)
+ * c = ChaCha20Poly1305(key).encrypt(nonce=bytes(4) + nonce, data=p, associated_data=ad)
+ *
+ * out = "{\n"
+ * out += "\t.key\t= \"" + encode_blob(key) + "\",\n"
+ * out += "\t.nonce\t= \"" + encode_blob(nonce) + "\",\n"
+ * out += "\t.assoc\t= \"" + encode_blob(ad) + "\",\n"
+ * out += "\t.alen\t= " + str(len(ad)) + ",\n"
+ * out += "\t.input\t= \"" + encode_blob(p) + "\",\n"
+ * out += "\t.ilen\t= " + str(len(p)) + ",\n"
+ * out += "\t.result\t= \"" + encode_blob(c) + "\"\n"
+ * out += "}"
+ * enc.append(out)
+ *
+ *
+ * out = "{\n"
+ * out += "\t.key\t= \"" + encode_blob(key) + "\",\n"
+ * out += "\t.nonce\t= \"" + encode_blob(nonce) + "\",\n"
+ * out += "\t.assoc\t= \"" + encode_blob(ad) + "\",\n"
+ * out += "\t.alen\t= " + str(len(ad)) + ",\n"
+ * out += "\t.input\t= \"" + encode_blob(c) + "\",\n"
+ * out += "\t.ilen\t= " + str(len(c)) + ",\n"
+ * out += "\t.result\t= \"" + encode_blob(p) + "\"\n"
+ * out += "}"
+ * dec.append(out)
+ *
+ *
+ * make_vector(0, 0)
+ * make_vector(0, 8)
+ * make_vector(1, 8)
+ * make_vector(1, 0)
+ * make_vector(129, 7)
+ * make_vector(256, 0)
+ * make_vector(512, 0)
+ * make_vector(513, 9)
+ * make_vector(1024, 16)
+ * make_vector(1933, 7)
+ * make_vector(2011, 63)
+ *
+ * print("======== encryption vectors ========")
+ * print(", ".join(enc))
+ *
+ * print("\n\n\n======== decryption vectors ========")
+ * print(", ".join(dec))
+ */
+
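+/* Illustrative only: the generator above widens the 64-bit nonce to the
+ * 96-bit RFC7539 nonce by prepending four zero bytes ("bytes(4) + nonce").
+ * The equivalent C helper would look like this (the function name is
+ * hypothetical, not part of Zinc):
+ *
+ *	static inline void selftest_widen_nonce(u8 out[12], const u8 nonce[8])
+ *	{
+ *		memset(out, 0, 4);	   // leading 32 bits are zero
+ *		memcpy(out + 4, nonce, 8); // 64-bit nonce fills the rest
+ *	}
+ */
+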
+static const struct chacha20poly1305_testvec chacha20poly1305_enc_vectors[] __initconst = { {
+ .key = "\x1c\x92\x40\xa5\xeb\x55\xd3\x8a\xf3\x33\x88\x86\x04\xf6\xb5\xf0\x47\x39\x17\xc1\x40\x2b\x80\x09\x9d\xca\x5c\xbc\x20\x70\x75\xc0",
+ .nonce = "\x01\x02\x03\x04\x05\x06\x07\x08",
+ .nlen = 8,
+ .assoc = "\xf3\x33\x88\x86\x00\x00\x00\x00\x00\x00\x4e\x91",
+ .alen = 12,
+ .input = "\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x72\x65\x20\x64\x72\x61\x66\x74\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x76\x61\x6c\x69\x64\x20\x66\x6f\x72\x20\x61\x20\x6d\x61\x78\x69\x6d\x75\x6d\x20\x6f\x66\x20\x73\x69\x78\x20\x6d\x6f\x6e\x74\x68\x73\x20\x61\x6e\x64\x20\x6d\x61\x79\x20\x62\x65\x20\x75\x70\x64\x61\x74\x65\x64\x2c\x20\x72\x65\x70\x6c\x61\x63\x65\x64\x2c\x20\x6f\x72\x20\x6f\x62\x73\x6f\x6c\x65\x74\x65\x64\x20\x62\x79\x20\x6f\x74\x68\x65\x72\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x61\x74\x20\x61\x6e\x79\x20\x74\x69\x6d\x65\x2e\x20\x49\x74\x20\x69\x73\x20\x69\x6e\x61\x70\x70\x72\x6f\x70\x72\x69\x61\x74\x65\x20\x74\x6f\x20\x75\x73\x65\x20\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x73\x20\x72\x65\x66\x65\x72\x65"
+ "\x6e\x63\x65\x20\x6d\x61\x74\x65\x72\x69\x61\x6c\x20\x6f\x72\x20\x74\x6f\x20\x63\x69\x74\x65\x20\x74\x68\x65\x6d\x20\x6f\x74\x68\x65\x72\x20\x74\x68\x61\x6e\x20\x61\x73\x20\x2f\xe2\x80\x9c\x77\x6f\x72\x6b\x20\x69\x6e\x20\x70\x72\x6f\x67\x72\x65\x73\x73\x2e\x2f\xe2\x80\x9d",
+ .ilen = 265,
+ .result = "\x64\xa0\x86\x15\x75\x86\x1a\xf4\x60\xf0\x62\xc7\x9b\xe6\x43\xbd\x5e\x80\x5c\xfd\x34\x5c\xf3\x89\xf1\x08\x67\x0a\xc7\x6c\x8c\xb2\x4c\x6c\xfc\x18\x75\x5d\x43\xee\xa0\x9e\xe9\x4e\x38\x2d\x26\xb0\xbd\xb7\xb7\x3c\x32\x1b\x01\x00\xd4\xf0\x3b\x7f\x35\x58\x94\xcf\x33\x2f\x83\x0e\x71\x0b\x97\xce\x98\xc8\xa8\x4a\xbd\x0b\x94\x81\x14\xad\x17\x6e\x00\x8d\x33\xbd\x60\xf9\x82\xb1\xff\x37\xc8\x55\x97\x97\xa0\x6e\xf4\xf0\xef\x61\xc1\x86\x32\x4e\x2b\x35\x06\x38\x36\x06\x90\x7b\x6a\x7c\x02\xb0\xf9\xf6\x15\x7b\x53\xc8\x67\xe4\xb9\x16\x6c\x76\x7b\x80\x4d\x46\xa5\x9b\x52\x16\xcd\xe7\xa4\xe9\x90\x40\xc5\xa4\x04\x33\x22\x5e\xe2\x82\xa1\xb0\xa0\x6c\x52\x3e\xaf\x45\x34\xd7\xf8\x3f\xa1\x15\x5b\x00\x47\x71\x8c\xbc\x54\x6a\x0d\x07\x2b\x04\xb3\x56\x4e\xea\x1b\x42\x22\x73\xf5\x48\x27\x1a\x0b\xb2\x31\x60\x53"
+ "\xfa\x76\x99\x19\x55\xeb\xd6\x31\x59\x43\x4e\xce\xbb\x4e\x46\x6d\xae\x5a\x10\x73\xa6\x72\x76\x27\x09\x7a\x10\x49\xe6\x17\xd9\x1d\x36\x10\x94\xfa\x68\xf0\xff\x77\x98\x71\x30\x30\x5b\xea\xba\x2e\xda\x04\xdf\x99\x7b\x71\x4d\x6c\x6f\x2c\x29\xa6\xad\x5c\xb4\x02\x2b\x02\x70\x9b\xee\xad\x9d\x67\x89\x0c\xbb\x22\x39\x23\x36\xfe\xa1\x85\x1f\x38"
+}, {
+ .key = "\x4c\xf5\x96\x83\x38\xe6\xae\x7f\x2d\x29\x25\x76\xd5\x75\x27\x86\x91\x9a\x27\x7a\xfb\x46\xc5\xef\x94\x81\x79\x57\x14\x59\x40\x68",
+ .nonce = "\xca\xbf\x33\x71\x32\x45\x77\x8e",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "",
+ .ilen = 0,
+ .result = "\xea\xe0\x1e\x9e\x2c\x91\xaa\xe1\xdb\x5d\x99\x3f\x8a\xf7\x69\x92"
+}, {
+ .key = "\x2d\xb0\x5d\x40\xc8\xed\x44\x88\x34\xd1\x13\xaf\x57\xa1\xeb\x3a\x2a\x80\x51\x36\xec\x5b\xbc\x08\x93\x84\x21\xb5\x13\x88\x3c\x0d",
+ .nonce = "\x3d\x86\xb5\x6b\xc8\xa3\x1f\x1d",
+ .nlen = 8,
+ .assoc = "\x33\x10\x41\x12\x1f\xf3\xd2\x6b",
+ .alen = 8,
+ .input = "",
+ .ilen = 0,
+ .result = "\xdd\x6b\x3b\x82\xce\x5a\xbd\xd6\xa9\x35\x83\xd8\x8c\x3d\x85\x77"
+}, {
+ .key = "\x4b\x28\x4b\xa3\x7b\xbe\xe9\xf8\x31\x80\x82\xd7\xd8\xe8\xb5\xa1\xe2\x18\x18\x8a\x9c\xfa\xa3\x3d\x25\x71\x3e\x40\xbc\x54\x7a\x3e",
+ .nonce = "\xd2\x32\x1f\x29\x28\xc6\xc4\xc4",
+ .nlen = 8,
+ .assoc = "\x6a\xe2\xad\x3f\x88\x39\x5a\x40",
+ .alen = 8,
+ .input = "\xa4",
+ .ilen = 1,
+ .result = "\xb7\x1b\xb0\x73\x59\xb0\x84\xb2\x6d\x8e\xab\x94\x31\xa1\xae\xac\x89"
+}, {
+ .key = "\x66\xca\x9c\x23\x2a\x4b\x4b\x31\x0e\x92\x89\x8b\xf4\x93\xc7\x87\x98\xa3\xd8\x39\xf8\xf4\xa7\x01\xc0\x2e\x0a\xa6\x7e\x5a\x78\x87",
+ .nonce = "\x20\x1c\xaa\x5f\x9c\xbf\x92\x30",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x2d",
+ .ilen = 1,
+ .result = "\xbf\xe1\x5b\x0b\xdb\x6b\xf5\x5e\x6c\x5d\x84\x44\x39\x81\xc1\x9c\xac"
+}, {
+ .key = "\x68\x7b\x8d\x8e\xe3\xc4\xdd\xae\xdf\x72\x7f\x53\x72\x25\x1e\x78\x91\xcb\x69\x76\x1f\x49\x93\xf9\x6f\x21\xcc\x39\x9c\xad\xb1\x01",
+ .nonce = "\xdf\x51\x84\x82\x42\x0c\x75\x9c",
+ .nlen = 8,
+ .assoc = "\x70\xd3\x33\xf3\x8b\x18\x0b",
+ .alen = 7,
+ .input = "\x33\x2f\x94\xc1\xa4\xef\xcc\x2a\x5b\xa6\xe5\x8f\x1d\x40\xf0\x92\x3c\xd9\x24\x11\xa9\x71\xf9\x37\x14\x99\xfa\xbe\xe6\x80\xde\x50\xc9\x96\xd4\xb0\xec\x9e\x17\xec\xd2\x5e\x72\x99\xfc\x0a\xe1\xcb\x48\xd2\x85\xdd\x2f\x90\xe0\x66\x3b\xe6\x20\x74\xbe\x23\x8f\xcb\xb4\xe4\xda\x48\x40\xa6\xd1\x1b\xc7\x42\xce\x2f\x0c\xa6\x85\x6e\x87\x37\x03\xb1\x7c\x25\x96\xa3\x05\xd8\xb0\xf4\xed\xea\xc2\xf0\x31\x98\x6c\xd1\x14\x25\xc0\xcb\x01\x74\xd0\x82\xf4\x36\xf5\x41\xd5\xdc\xca\xc5\xbb\x98\xfe\xfc\x69\x21\x70\xd8\xa4\x4b\xc8\xde\x8f",
+ .ilen = 129,
+ .result = "\x8b\x06\xd3\x31\xb0\x93\x45\xb1\x75\x6e\x26\xf9\x67\xbc\x90\x15\x81\x2c\xb5\xf0\xc6\x2b\xc7\x8c\x56\xd1\xbf\x69\x6c\x07\xa0\xda\x65\x27\xc9\x90\x3d\xef\x4b\x11\x0f\x19\x07\xfd\x29\x92\xd9\xc8\xf7\x99\x2e\x4a\xd0\xb8\x2c\xdc\x93\xf5\x9e\x33\x78\xd1\x37\xc3\x66\xd7\x5e\xbc\x44\xbf\x53\xa5\xbc\xc4\xcb\x7b\x3a\x8e\x7f\x02\xbd\xbb\xe7\xca\xa6\x6c\x6b\x93\x21\x93\x10\x61\xe7\x69\xd0\x78\xf3\x07\x5a\x1a\x8f\x73\xaa\xb1\x4e\xd3\xda\x4f\xf3\x32\xe1\x66\x3e\x6c\xc6\x13\xba\x06\x5b\xfc\x6a\xe5\x6f\x60\xfb\x07\x40\xb0\x8c\x9d\x84\x43\x6b\xc1\xf7\x8d\x8d\x31\xf7\x7a\x39\x4d\x8f\x9a\xeb"
+}, {
+ .key = "\x8d\xb8\x91\x48\xf0\xe7\x0a\xbd\xf9\x3f\xcd\xd9\xa0\x1e\x42\x4c\xe7\xde\x25\x3d\xa3\xd7\x05\x80\x8d\xf2\x82\xac\x44\x16\x51\x01",
+ .nonce = "\xde\x7b\xef\xc3\x65\x1b\x68\xb0",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x9b\x18\xdb\xdd\x9a\x0f\x3e\xa5\x15\x17\xde\xdf\x08\x9d\x65\x0a\x67\x30\x12\xe2\x34\x77\x4b\xc1\xd9\xc6\x1f\xab\xc6\x18\x50\x17\xa7\x9d\x3c\xa6\xc5\x35\x8c\x1c\xc0\xa1\x7c\x9f\x03\x89\xca\xe1\xe6\xe9\xd4\xd3\x88\xdb\xb4\x51\x9d\xec\xb4\xfc\x52\xee\x6d\xf1\x75\x42\xc6\xfd\xbd\x7a\x8e\x86\xfc\x44\xb3\x4f\xf3\xea\x67\x5a\x41\x13\xba\xb0\xdc\xe1\xd3\x2a\x7c\x22\xb3\xca\xac\x6a\x37\x98\x3e\x1d\x40\x97\xf7\x9b\x1d\x36\x6b\xb3\x28\xbd\x60\x82\x47\x34\xaa\x2f\x7d\xe9\xa8\x70\x81\x57\xd4\xb9\x77\x0a\x9d\x29\xa7\x84\x52\x4f\xc2\x4a\x40\x3b\x3c\xd4\xc9\x2a\xdb\x4a\x53\xc4\xbe\x80\xe9\x51\x7f\x8f\xc7\xa2\xce\x82\x5c\x91\x1e\x74\xd9\xd0\xbd\xd5\xf3\xfd\xda\x4d\x25\xb4\xbb\x2d\xac\x2f\x3d\x71\x85\x7b\xcf\x3c\x7b\x3e\x0e\x22\x78\x0c\x29\xbf\xe4\xf4\x57\xb3\xcb\x49\xa0\xfc\x1e\x05\x4e\x16\xbc\xd5\xa8"
+ "\xa3\xee\x05\x35\xc6\x7c\xab\x60\x14\x55\x1a\x8e\xc5\x88\x5d\xd5\x81\xc2\x81\xa5\xc4\x60\xdb\xaf\x77\x91\xe1\xce\xa2\x7e\x7f\x42\xe3\xb0\x13\x1c\x1f\x25\x60\x21\xe2\x40\x5f\x99\xb7\x73\xec\x9b\x2b\xf0\x65\x11\xc8\xd0\x0a\x9f\xd3",
+ .ilen = 256,
+ .result = "\x85\x04\xc2\xed\x8d\xfd\x97\x5c\xd2\xb7\xe2\xc1\x6b\xa3\xba\xf8\xc9\x50\xc3\xc6\xa5\xe3\xa4\x7c\xc3\x23\x49\x5e\xa9\xb9\x32\xeb\x8a\x7c\xca\xe5\xec\xfb\x7c\xc0\xcb\x7d\xdc\x2c\x9d\x92\x55\x21\x0a\xc8\x43\x63\x59\x0a\x31\x70\x82\x67\x41\x03\xf8\xdf\xf2\xac\xa7\x02\xd4\xd5\x8a\x2d\xc8\x99\x19\x66\xd0\xf6\x88\x2c\x77\xd9\xd4\x0d\x6c\xbd\x98\xde\xe7\x7f\xad\x7e\x8a\xfb\xe9\x4b\xe5\xf7\xe5\x50\xa0\x90\x3f\xd6\x22\x53\xe3\xfe\x1b\xcc\x79\x3b\xec\x12\x47\x52\xa7\xd6\x04\xe3\x52\xe6\x93\x90\x91\x32\x73\x79\xb8\xd0\x31\xde\x1f\x9f\x2f\x05\x38\x54\x2f\x35\x04\x39\xe0\xa7\xba\xc6\x52\xf6\x37\x65\x4c\x07\xa9\x7e\xb3\x21\x6f\x74\x8c\xc9\xde\xdb\x65\x1b\x9b\xaa\x60\xb1\x03\x30\x6b\xb2\x03\xc4\x1c\x04\xf8\x0f\x64\xaf\x46\xe4\x65\x99\x49\xe2\xea\xce\x78\x00\xd8\x8b\xd5\x2e\xcf\xfc\x40\x49\xe8\x58\xdc\x34"
+ "\x9c\x8c\x61\xbf\x0a\x8e\xec\x39\xa9\x30\x05\x5a\xd2\x56\x01\xc7\xda\x8f\x4e\xbb\x43\xa3\x3a\xf9\x15\x2a\xd0\xa0\x7a\x87\x34\x82\xfe\x8a\xd1\x2d\x5e\xc7\xbf\x04\x53\x5f\x3b\x36\xd4\x25\x5c\x34\x7a\x8d\xd5\x05\xce\x72\xca\xef\x7a\x4b\xbc\xb0\x10\x5c\x96\x42\x3a\x00\x98\xcd\x15\xe8\xb7\x53"
+}, {
+ .key = "\xf2\xaa\x4f\x99\xfd\x3e\xa8\x53\xc1\x44\xe9\x81\x18\xdc\xf5\xf0\x3e\x44\x15\x59\xe0\xc5\x44\x86\xc3\x91\xa8\x75\xc0\x12\x46\xba",
+ .nonce = "\x0e\x0d\x57\xbb\x7b\x40\x54\x02",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xc3\x09\x94\x62\xe6\x46\x2e\x10\xbe\x00\xe4\xfc\xf3\x40\xa3\xe2\x0f\xc2\x8b\x28\xdc\xba\xb4\x3c\xe4\x21\x58\x61\xcd\x8b\xcd\xfb\xac\x94\xa1\x45\xf5\x1c\xe1\x12\xe0\x3b\x67\x21\x54\x5e\x8c\xaa\xcf\xdb\xb4\x51\xd4\x13\xda\xe6\x83\x89\xb6\x92\xe9\x21\x76\xa4\x93\x7d\x0e\xfd\x96\x36\x03\x91\x43\x5c\x92\x49\x62\x61\x7b\xeb\x43\x89\xb8\x12\x20\x43\xd4\x47\x06\x84\xee\x47\xe9\x8a\x73\x15\x0f\x72\xcf\xed\xce\x96\xb2\x7f\x21\x45\x76\xeb\x26\x28\x83\x6a\xad\xaa\xa6\x81\xd8\x55\xb1\xa3\x85\xb3\x0c\xdf\xf1\x69\x2d\x97\x05\x2a\xbc\x7c\x7b\x25\xf8\x80\x9d\x39\x25\xf3\x62\xf0\x66\x5e\xf4\xa0\xcf\xd8\xfd\x4f\xb1\x1f\x60\x3a\x08\x47\xaf\xe1\xf6\x10\x77\x09\xa7\x27\x8f\x9a\x97\x5a\x26\xfa\xfe\x41\x32\x83\x10\xe0\x1d\xbf\x64\x0d\xf4\x1c\x32\x35\xe5\x1b\x36\xef\xd4\x4a\x93\x4d\x00\x7c\xec\x02\x07\x8b\x5d\x7d\x1b"
+ "\x0e\xd1\xa6\xa5\x5d\x7d\x57\x88\xa8\xcc\x81\xb4\x86\x4e\xb4\x40\xe9\x1d\xc3\xb1\x24\x3e\x7f\xcc\x8a\x24\x9b\xdf\x6d\xf0\x39\x69\x3e\x4c\xc0\x96\xe4\x13\xda\x90\xda\xf4\x95\x66\x8b\x17\x17\xfe\x39\x43\x25\xaa\xda\xa0\x43\x3c\xb1\x41\x02\xa3\xf0\xa7\x19\x59\xbc\x1d\x7d\x6c\x6d\x91\x09\x5c\xb7\x5b\x01\xd1\x6f\x17\x21\x97\xbf\x89\x71\xa5\xb0\x6e\x07\x45\xfd\x9d\xea\x07\xf6\x7a\x9f\x10\x18\x22\x30\x73\xac\xd4\x6b\x72\x44\xed\xd9\x19\x9b\x2d\x4a\x41\xdd\xd1\x85\x5e\x37\x19\xed\xd2\x15\x8f\x5e\x91\xdb\x33\xf2\xe4\xdb\xff\x98\xfb\xa3\xb5\xca\x21\x69\x08\xe7\x8a\xdf\x90\xff\x3e\xe9\x20\x86\x3c\xe9\xfc\x0b\xfe\x5c\x61\xaa\x13\x92\x7f\x7b\xec\xe0\x6d\xa8\x23\x22\xf6\x6b\x77\xc4\xfe\x40\x07\x3b\xb6\xf6\x8e\x5f\xd4\xb9\xb7\x0f\x21\x04\xef\x83\x63\x91\x69\x40\xa3\x48\x5c\xd2\x60\xf9\x4f\x6c\x47\x8b\x3b\xb1"
+ "\x9f\x8e\xee\x16\x8a\x13\xfc\x46\x17\xc3\xc3\x32\x56\xf8\x3c\x85\x3a\xb6\x3e\xaa\x89\x4f\xb3\xdf\x38\xfd\xf1\xe4\x3a\xc0\xe6\x58\xb5\x8f\xc5\x29\xa2\x92\x4a\xb6\xa0\x34\x7f\xab\xb5\x8a\x90\xa1\xdb\x4d\xca\xb6\x2c\x41\x3c\xf7\x2b\x21\xc3\xfd\xf4\x17\x5c\xb5\x33\x17\x68\x2b\x08\x30\xf3\xf7\x30\x3c\x96\xe6\x6a\x20\x97\xe7\x4d\x10\x5f\x47\x5f\x49\x96\x09\xf0\x27\x91\xc8\xf8\x5a\x2e\x79\xb5\xe2\xb8\xe8\xb9\x7b\xd5\x10\xcb\xff\x5d\x14\x73\xf3",
+ .ilen = 512,
+ .result = "\x14\xf6\x41\x37\xa6\xd4\x27\xcd\xdb\x06\x3e\x9a\x4e\xab\xd5\xb1\x1e\x6b\xd2\xbc\x11\xf4\x28\x93\x63\x54\xef\xbb\x5e\x1d\x3a\x1d\x37\x3c\x0a\x6c\x1e\xc2\xd1\x2c\xb5\xa3\xb5\x7b\xb8\x8f\x25\xa6\x1b\x61\x1c\xec\x28\x58\x26\xa4\xa8\x33\x28\x25\x5c\x45\x05\xe5\x6c\x99\xe5\x45\xc4\xa2\x03\x84\x03\x73\x1e\x8c\x49\xac\x20\xdd\x8d\xb3\xc4\xf5\xe7\x4f\xf1\xed\xa1\x98\xde\xa4\x96\xdd\x2f\xab\xab\x97\xcf\x3e\xd2\x9e\xb8\x13\x07\x28\x29\x19\xaf\xfd\xf2\x49\x43\xea\x49\x26\x91\xc1\x07\xd6\xbb\x81\x75\x35\x0d\x24\x7f\xc8\xda\xd4\xb7\xeb\xe8\x5c\x09\xa2\x2f\xdc\x28\x7d\x3a\x03\xfa\x94\xb5\x1d\x17\x99\x36\xc3\x1c\x18\x34\xe3\x9f\xf5\x55\x7c\xb0\x60\x9d\xff\xac\xd4\x61\xf2\xad\xf8\xce\xc7\xbe\x5c\xd2\x95\xa8\x4b\x77\x13\x19\x59\x26\xc9\xb7\x8f\x6a\xcb\x2d\x37\x91\xea\x92\x9c\x94\x5b\xda\x0b\xce\xfe\x30"
+ "\x20\xf8\x51\xad\xf2\xbe\xe7\xc7\xff\xb3\x33\x91\x6a\xc9\x1a\x41\xc9\x0f\xf3\x10\x0e\xfd\x53\xff\x6c\x16\x52\xd9\xf3\xf7\x98\x2e\xc9\x07\x31\x2c\x0c\x72\xd7\xc5\xc6\x08\x2a\x7b\xda\xbd\x7e\x02\xea\x1a\xbb\xf2\x04\x27\x61\x28\x8e\xf5\x04\x03\x1f\x4c\x07\x55\x82\xec\x1e\xd7\x8b\x2f\x65\x56\xd1\xd9\x1e\x3c\xe9\x1f\x5e\x98\x70\x38\x4a\x8c\x49\xc5\x43\xa0\xa1\x8b\x74\x9d\x4c\x62\x0d\x10\x0c\xf4\x6c\x8f\xe0\xaa\x9a\x8d\xb7\xe0\xbe\x4c\x87\xf1\x98\x2f\xcc\xed\xc0\x52\x29\xdc\x83\xf8\xfc\x2c\x0e\xa8\x51\x4d\x80\x0d\xa3\xfe\xd8\x37\xe7\x41\x24\xfc\xfb\x75\xe3\x71\x7b\x57\x45\xf5\x97\x73\x65\x63\x14\x74\xb8\x82\x9f\xf8\x60\x2f\x8a\xf2\x4e\xf1\x39\xda\x33\x91\xf8\x36\xe0\x8d\x3f\x1f\x3b\x56\xdc\xa0\x8f\x3c\x9d\x71\x52\xa7\xb8\xc0\xa5\xc6\xa2\x73\xda\xf4\x4b\x74\x5b\x00\x3d\x99\xd7\x96\xba\xe6\xe1\xa6\x96\x38"
+ "\xad\xb3\xc0\xd2\xba\x91\x6b\xf9\x19\xdd\x3b\xbe\xbe\x9c\x20\x50\xba\xa1\xd0\xce\x11\xbd\x95\xd8\xd1\xdd\x33\x85\x74\xdc\xdb\x66\x76\x44\xdc\x03\x74\x48\x35\x98\xb1\x18\x47\x94\x7d\xff\x62\xe4\x58\x78\xab\xed\x95\x36\xd9\x84\x91\x82\x64\x41\xbb\x58\xe6\x1c\x20\x6d\x15\x6b\x13\x96\xe8\x35\x7f\xdc\x40\x2c\xe9\xbc\x8a\x4f\x92\xec\x06\x2d\x50\xdf\x93\x5d\x65\x5a\xa8\xfc\x20\x50\x14\xa9\x8a\x7e\x1d\x08\x1f\xe2\x99\xd0\xbe\xfb\x3a\x21\x9d\xad\x86\x54\xfd\x0d\x98\x1c\x5a\x6f\x1f\x9a\x40\xcd\xa2\xff\x6a\xf1\x54"
+}, {
+ .key = "\xea\xbc\x56\x99\xe3\x50\xff\xc5\xcc\x1a\xd7\xc1\x57\x72\xea\x86\x5b\x89\x88\x61\x3d\x2f\x9b\xb2\xe7\x9c\xec\x74\x6e\x3e\xf4\x3b",
+ .nonce = "\xef\x2d\x63\xee\x6b\x80\x8b\x78",
+ .nlen = 8,
+ .assoc = "\x5a\x27\xff\xeb\xdf\x84\xb2\x9e\xef",
+ .alen = 9,
+ .input = "\xe6\xc3\xdb\x63\x55\x15\xe3\x5b\xb7\x4b\x27\x8b\x5a\xdd\xc2\xe8\x3a\x6b\xd7\x81\x96\x35\x97\xca\xd7\x68\xe8\xef\xce\xab\xda\x09\x6e\xd6\x8e\xcb\x55\xb5\xe1\xe5\x57\xfd\xc4\xe3\xe0\x18\x4f\x85\xf5\x3f\x7e\x4b\x88\xc9\x52\x44\x0f\xea\xaf\x1f\x71\x48\x9f\x97\x6d\xb9\x6f\x00\xa6\xde\x2b\x77\x8b\x15\xad\x10\xa0\x2b\x7b\x41\x90\x03\x2d\x69\xae\xcc\x77\x7c\xa5\x9d\x29\x22\xc2\xea\xb4\x00\x1a\xd2\x7a\x98\x8a\xf9\xf7\x82\xb0\xab\xd8\xa6\x94\x8d\x58\x2f\x01\x9e\x00\x20\xfc\x49\xdc\x0e\x03\xe8\x45\x10\xd6\xa8\xda\x55\x10\x9a\xdf\x67\x22\x8b\x43\xab\x00\xbb\x02\xc8\xdd\x7b\x97\x17\xd7\x1d\x9e\x02\x5e\x48\xde\x8e\xcf\x99\x07\x95\x92\x3c\x5f\x9f\xc5\x8a\xc0\x23\xaa\xd5\x8c\x82\x6e\x16\x92\xb1\x12\x17\x07\xc3\xfb\x36\xf5\x6c\x35\xd6\x06\x1f\x9f\xa7\x94\xa2\x38\x63\x9c\xb0\x71\xb3\xa5\xd2\xd8\xba\x9f\x08\x01"
+ "\xb3\xff\x04\x97\x73\x45\x1b\xd5\xa9\x9c\x80\xaf\x04\x9a\x85\xdb\x32\x5b\x5d\x1a\xc1\x36\x28\x10\x79\xf1\x3c\xbf\x1a\x41\x5c\x4e\xdf\xb2\x7c\x79\x3b\x7a\x62\x3d\x4b\xc9\x9b\x2a\x2e\x7c\xa2\xb1\x11\x98\xa7\x34\x1a\x00\xf3\xd1\xbc\x18\x22\xba\x02\x56\x62\x31\x10\x11\x6d\xe0\x54\x9d\x40\x1f\x26\x80\x41\xca\x3f\x68\x0f\x32\x1d\x0a\x8e\x79\xd8\xa4\x1b\x29\x1c\x90\x8e\xc5\xe3\xb4\x91\x37\x9a\x97\x86\x99\xd5\x09\xc5\xbb\xa3\x3f\x21\x29\x82\x14\x5c\xab\x25\xfb\xf2\x4f\x58\x26\xd4\x83\xaa\x66\x89\x67\x7e\xc0\x49\xe1\x11\x10\x7f\x7a\xda\x29\x04\xff\xf0\xcb\x09\x7c\x9d\xfa\x03\x6f\x81\x09\x31\x60\xfb\x08\xfa\x74\xd3\x64\x44\x7c\x55\x85\xec\x9c\x6e\x25\xb7\x6c\xc5\x37\xb6\x83\x87\x72\x95\x8b\x9d\xe1\x69\x5c\x31\x95\x42\xa6\x2c\xd1\x36\x47\x1f\xec\x54\xab\xa2\x1c\xd8\x00\xcc\xbc\x0d\x65\xe2\x67\xbf\xbc\xea\xee"
+ "\x9e\xe4\x36\x95\xbe\x73\xd9\xa6\xd9\x0f\xa0\xcc\x82\x76\x26\xad\x5b\x58\x6c\x4e\xab\x29\x64\xd3\xd9\xa9\x08\x8c\x1d\xa1\x4f\x80\xd8\x3f\x94\xfb\xd3\x7b\xfc\xd1\x2b\xc3\x21\xeb\xe5\x1c\x84\x23\x7f\x4b\xfa\xdb\x34\x18\xa2\xc2\xe5\x13\xfe\x6c\x49\x81\xd2\x73\xe7\xe2\xd7\xe4\x4f\x4b\x08\x6e\xb1\x12\x22\x10\x9d\xac\x51\x1e\x17\xd9\x8a\x0b\x42\x88\x16\x81\x37\x7c\x6a\xf7\xef\x2d\xe3\xd9\xf8\x5f\xe0\x53\x27\x74\xb9\xe2\xd6\x1c\x80\x2c\x52\x65",
+ .ilen = 513,
+ .result = "\xfd\x81\x8d\xd0\x3d\xb4\xd5\xdf\xd3\x42\x47\x5a\x6d\x19\x27\x66\x4b\x2e\x0c\x27\x9c\x96\x4c\x72\x02\xa3\x65\xc3\xb3\x6f\x2e\xbd\x63\x8a\x4a\x5d\x29\xa2\xd0\x28\x48\xc5\x3d\x98\xa3\xbc\xe0\xbe\x3b\x3f\xe6\x8a\xa4\x7f\x53\x06\xfa\x7f\x27\x76\x72\x31\xa1\xf5\xd6\x0c\x52\x47\xba\xcd\x4f\xd7\xeb\x05\x48\x0d\x7c\x35\x4a\x09\xc9\x76\x71\x02\xa3\xfb\xb7\x1a\x65\xb7\xed\x98\xc6\x30\x8a\x00\xae\xa1\x31\xe5\xb5\x9e\x6d\x62\xda\xda\x07\x0f\x38\x38\xd3\xcb\xc1\xb0\xad\xec\x72\xec\xb1\xa2\x7b\x59\xf3\x3d\x2b\xef\xcd\x28\x5b\x83\xcc\x18\x91\x88\xb0\x2e\xf9\x29\x31\x18\xf9\x4e\xe9\x0a\x91\x92\x9f\xae\x2d\xad\xf4\xe6\x1a\xe2\xa4\xee\x47\x15\xbf\x83\x6e\xd7\x72\x12\x3b\x2d\x24\xe9\xb2\x55\xcb\x3c\x10\xf0\x24\x8a\x4a\x02\xea\x90\x25\xf0\xb4\x79\x3a\xef\x6e\xf5\x52\xdf\xb0\x0a\xcd\x24\x1c\xd3\x2e\x22\x74\xea"
+ "\x21\x6f\xe9\xbd\xc8\x3e\x36\x5b\x19\xf1\xca\x99\x0a\xb4\xa7\x52\x1a\x4e\xf2\xad\x8d\x56\x85\xbb\x64\x89\xba\x26\xf9\xc7\xe1\x89\x19\x22\x77\xc3\xa8\xfc\xff\xad\xfe\xb9\x48\xae\x12\x30\x9f\x19\xfb\x1b\xef\x14\x87\x8a\x78\x71\xf3\xf4\xb7\x00\x9c\x1d\xb5\x3d\x49\x00\x0c\x06\xd4\x50\xf9\x54\x45\xb2\x5b\x43\xdb\x6d\xcf\x1a\xe9\x7a\x7a\xcf\xfc\x8a\x4e\x4d\x0b\x07\x63\x28\xd8\xe7\x08\x95\xdf\xa6\x72\x93\x2e\xbb\xa0\x42\x89\x16\xf1\xd9\x0c\xf9\xa1\x16\xfd\xd9\x03\xb4\x3b\x8a\xf5\xf6\xe7\x6b\x2e\x8e\x4c\x3d\xe2\xaf\x08\x45\x03\xff\x09\xb6\xeb\x2d\xc6\x1b\x88\x94\xac\x3e\xf1\x9f\x0e\x0e\x2b\xd5\x00\x4d\x3f\x3b\x53\xae\xaf\x1c\x33\x5f\x55\x6e\x8d\xaf\x05\x7a\x10\x34\xc9\xf4\x66\xcb\x62\x12\xa6\xee\xe8\x1c\x5d\x12\x86\xdb\x6f\x1c\x33\xc4\x1c\xda\x82\x2d\x3b\x59\xfe\xb1\xa4\x59\x41\x86\xd0\xef\xae\xfb\xda\x6d\x11"
+ "\xb8\xca\xe9\x6e\xff\xf7\xa9\xd9\x70\x30\xfc\x53\xe2\xd7\xa2\x4e\xc7\x91\xd9\x07\x06\xaa\xdd\xb0\x59\x28\x1d\x00\x66\xc5\x54\xc2\xfc\x06\xda\x05\x90\x52\x1d\x37\x66\xee\xf0\xb2\x55\x8a\x5d\xd2\x38\x86\x94\x9b\xfc\x10\x4c\xa1\xb9\x64\x3e\x44\xb8\x5f\xb0\x0c\xec\xe0\xc9\xe5\x62\x75\x3f\x09\xd5\xf5\xd9\x26\xba\x9e\xd2\xf4\xb9\x48\x0a\xbc\xa2\xd6\x7c\x36\x11\x7d\x26\x81\x89\xcf\xa4\xad\x73\x0e\xee\xcc\x06\xa9\xdb\xb1\xfd\xfb\x09\x7f\x90\x42\x37\x2f\xe1\x9c\x0f\x6f\xcf\x43\xb5\xd9\x90\xe1\x85\xf5\xa8\xae"
+}, {
+ .key = "\x47\x11\xeb\x86\x2b\x2c\xab\x44\x34\xda\x7f\x57\x03\x39\x0c\xaf\x2c\x14\xfd\x65\x23\xe9\x8e\x74\xd5\x08\x68\x08\xe7\xb4\x72\xd7",
+ .nonce = "\xdb\x92\x0f\x7f\x17\x54\x0c\x30",
+ .nlen = 8,
+ .assoc = "\xd2\xa1\x70\xdb\x7a\xf8\xfa\x27\xba\x73\x0f\xbf\x3d\x1e\x82\xb2",
+ .alen = 16,
+ .input = "\x42\x93\xe4\xeb\x97\xb0\x57\xbf\x1a\x8b\x1f\xe4\x5f\x36\x20\x3c\xef\x0a\xa9\x48\x5f\x5f\x37\x22\x3a\xde\xe3\xae\xbe\xad\x07\xcc\xb1\xf6\xf5\xf9\x56\xdd\xe7\x16\x1e\x7f\xdf\x7a\x9e\x75\xb7\xc7\xbe\xbe\x8a\x36\x04\xc0\x10\xf4\x95\x20\x03\xec\xdc\x05\xa1\x7d\xc4\xa9\x2c\x82\xd0\xbc\x8b\xc5\xc7\x45\x50\xf6\xa2\x1a\xb5\x46\x3b\x73\x02\xa6\x83\x4b\x73\x82\x58\x5e\x3b\x65\x2f\x0e\xfd\x2b\x59\x16\xce\xa1\x60\x9c\xe8\x3a\x99\xed\x8d\x5a\xcf\xf6\x83\xaf\xba\xd7\x73\x73\x40\x97\x3d\xca\xef\x07\x57\xe6\xd9\x70\x0e\x95\xae\xa6\x8d\x04\xcc\xee\xf7\x09\x31\x77\x12\xa3\x23\x97\x62\xb3\x7b\x32\xfb\x80\x14\x48\x81\xc3\xe5\xea\x91\x39\x52\x81\xa2\x4f\xe4\xb3\x09\xff\xde\x5e\xe9\x58\x84\x6e\xf9\x3d\xdf\x25\xea\xad\xae\xe6\x9a\xd1\x89\x55\xd3\xde\x6c\x52\xdb\x70\xfe\x37\xce\x44\x0a\xa8\x25\x5f\x92\xc1\x33\x4a"
+ "\x4f\x9b\x62\x35\xff\xce\xc0\xa9\x60\xce\x52\x00\x97\x51\x35\x26\x2e\xb9\x36\xa9\x87\x6e\x1e\xcc\x91\x78\x53\x98\x86\x5b\x9c\x74\x7d\x88\x33\xe1\xdf\x37\x69\x2b\xbb\xf1\x4d\xf4\xd1\xf1\x39\x93\x17\x51\x19\xe3\x19\x1e\x76\x37\x25\xfb\x09\x27\x6a\xab\x67\x6f\x14\x12\x64\xe7\xc4\x07\xdf\x4d\x17\xbb\x6d\xe0\xe9\xb9\xab\xca\x10\x68\xaf\x7e\xb7\x33\x54\x73\x07\x6e\xf7\x81\x97\x9c\x05\x6f\x84\x5f\xd2\x42\xfb\x38\xcf\xd1\x2f\x14\x30\x88\x98\x4d\x5a\xa9\x76\xd5\x4f\x3e\x70\x6c\x85\x76\xd7\x01\xa0\x1a\xc8\x4e\xaa\xac\x78\xfe\x46\xde\x6a\x05\x46\xa7\x43\x0c\xb9\xde\xb9\x68\xfb\xce\x42\x99\x07\x4d\x0b\x3b\x5a\x30\x35\xa8\xf9\x3a\x73\xef\x0f\xdb\x1e\x16\x42\xc4\xba\xae\x58\xaa\xf8\xe5\x75\x2f\x1b\x15\x5c\xfd\x0a\x97\xd0\xe4\x37\x83\x61\x5f\x43\xa6\xc7\x3f\x38\x59\xe6\xeb\xa3\x90\xc3\xaa\xaa\x5a\xd3\x34\xd4"
+ "\x17\xc8\x65\x3e\x57\xbc\x5e\xdd\x9e\xb7\xf0\x2e\x5b\xb2\x1f\x8a\x08\x0d\x45\x91\x0b\x29\x53\x4f\x4c\x5a\x73\x56\xfe\xaf\x41\x01\x39\x0a\x24\x3c\x7e\xbe\x4e\x53\xf3\xeb\x06\x66\x51\x28\x1d\xbd\x41\x0a\x01\xab\x16\x47\x27\x47\x47\xf7\xcb\x46\x0a\x70\x9e\x01\x9c\x09\xe1\x2a\x00\x1a\xd8\xd4\x79\x9d\x80\x15\x8e\x53\x2a\x65\x83\x78\x3e\x03\x00\x07\x12\x1f\x33\x3e\x7b\x13\x37\xf1\xc3\xef\xb7\xc1\x20\x3c\x3e\x67\x66\x5d\x88\xa7\x7d\x33\x50\x77\xb0\x28\x8e\xe7\x2c\x2e\x7a\xf4\x3c\x8d\x74\x83\xaf\x8e\x87\x0f\xe4\x50\xff\x84\x5c\x47\x0c\x6a\x49\xbf\x42\x86\x77\x15\x48\xa5\x90\x5d\x93\xd6\x2a\x11\xd5\xd5\x11\xaa\xce\xe7\x6f\xa5\xb0\x09\x2c\x8d\xd3\x92\xf0\x5a\x2a\xda\x5b\x1e\xd5\x9a\xc4\xc4\xf3\x49\x74\x41\xca\xe8\xc1\xf8\x44\xd6\x3c\xae\x6c\x1d\x9a\x30\x04\x4d\x27\x0e\xb1\x5f\x59\xa2\x24\xe8\xe1\x98\xc5\x6a\x4c"
+ "\xfe\x41\xd2\x27\x42\x52\xe1\xe9\x7d\x62\xe4\x88\x0f\xad\xb2\x70\xcb\x9d\x4c\x27\x2e\x76\x1e\x1a\x63\x65\xf5\x3b\xf8\x57\x69\xeb\x5b\x38\x26\x39\x33\x25\x45\x3e\x91\xb8\xd8\xc7\xd5\x42\xc0\x22\x31\x74\xf4\xbc\x0c\x23\xf1\xca\xc1\x8d\xd7\xbe\xc9\x62\xe4\x08\x1a\xcf\x36\xd5\xfe\x55\x21\x59\x91\x87\x87\xdf\x06\xdb\xdf\x96\x45\x58\xda\x05\xcd\x50\x4d\xd2\x7d\x05\x18\x73\x6a\x8d\x11\x85\xa6\x88\xe8\xda\xe6\x30\x33\xa4\x89\x31\x75\xbe\x69\x43\x84\x43\x50\x87\xdd\x71\x36\x83\xc3\x78\x74\x24\x0a\xed\x7b\xdb\xa4\x24\x0b\xb9\x7e\x5d\xff\xde\xb1\xef\x61\x5a\x45\x33\xf6\x17\x07\x08\x98\x83\x92\x0f\x23\x6d\xe6\xaa\x17\x54\xad\x6a\xc8\xdb\x26\xbe\xb8\xb6\x08\xfa\x68\xf1\xd7\x79\x6f\x18\xb4\x9e\x2d\x3f\x1b\x64\xaf\x8d\x06\x0e\x49\x28\xe0\x5d\x45\x68\x13\x87\xfa\xde\x40\x7b\xd2\xc3\x94\xd5\xe1\xd9\xc2\xaf\x55\x89"
+ "\xeb\xb4\x12\x59\xa8\xd4\xc5\x29\x66\x38\xe6\xac\x22\x22\xd9\x64\x9b\x34\x0a\x32\x9f\xc2\xbf\x17\x6c\x3f\x71\x7a\x38\x6b\x98\xfb\x49\x36\x89\xc9\xe2\xd6\xc7\x5d\xd0\x69\x5f\x23\x35\xc9\x30\xe2\xfd\x44\x58\x39\xd7\x97\xfb\x5c\x00\xd5\x4f\x7a\x1a\x95\x8b\x62\x4b\xce\xe5\x91\x21\x7b\x30\x00\xd6\xdd\x6d\x02\x86\x49\x0f\x3c\x1a\x27\x3c\xd3\x0e\x71\xf2\xff\xf5\x2f\x87\xac\x67\x59\x81\xa3\xf7\xf8\xd6\x11\x0c\x84\xa9\x03\xee\x2a\xc4\xf3\x22\xab\x7c\xe2\x25\xf5\x67\xa3\xe4\x11\xe0\x59\xb3\xca\x87\xa0\xae\xc9\xa6\x62\x1b\x6e\x4d\x02\x6b\x07\x9d\xfd\xd0\x92\x06\xe1\xb2\x9a\x4a\x1f\x1f\x13\x49\x99\x97\x08\xde\x7f\x98\xaf\x51\x98\xee\x2c\xcb\xf0\x0b\xc6\xb6\xb7\x2d\x9a\xb1\xac\xa6\xe3\x15\x77\x9d\x6b\x1a\xe4\xfc\x8b\xf2\x17\x59\x08\x04\x58\x81\x9d\x1b\x1b\x69\x55\xc2\xb4\x3c\x1f\x50\xf1\x7f\x77\x90\x4c\x66\x40"
+ "\x5a\xc0\x33\x1f\xcb\x05\x6d\x5c\x06\x87\x52\xa2\x8f\x26\xd5\x4f",
+ .ilen = 1024,
+ .result = "\xe5\x26\xa4\x3d\xbd\x33\xd0\x4b\x6f\x05\xa7\x6e\x12\x7a\xd2\x74\xa6\xdd\xbd\x95\xeb\xf9\xa4\xf1\x59\x93\x91\x70\xd9\xfe\x9a\xcd\x53\x1f\x3a\xab\xa6\x7c\x9f\xa6\x9e\xbd\x99\xd9\xb5\x97\x44\xd5\x14\x48\x4d\x9d\xc0\xd0\x05\x96\xeb\x4c\x78\x55\x09\x08\x01\x02\x30\x90\x7b\x96\x7a\x7b\x5f\x30\x41\x24\xce\x68\x61\x49\x86\x57\x82\xdd\x53\x1c\x51\x28\x2b\x53\x6e\x2d\xc2\x20\x4c\xdd\x8f\x65\x10\x20\x50\xdd\x9d\x50\xe5\x71\x40\x53\x69\xfc\x77\x48\x11\xb9\xde\xa4\x8d\x58\xe4\xa6\x1a\x18\x47\x81\x7e\xfc\xdd\xf6\xef\xce\x2f\x43\x68\xd6\x06\xe2\x74\x6a\xad\x90\xf5\x37\xf3\x3d\x82\x69\x40\xe9\x6b\xa7\x3d\xa8\x1e\xd2\x02\x7c\xb7\x9b\xe4\xda\x8f\x95\x06\xc5\xdf\x73\xa3\x20\x9a\x49\xde\x9c\xbc\xee\x14\x3f\x81\x5e\xf8\x3b\x59\x3c\xe1\x68\x12\x5a\x3a\x76\x3a\x3f\xf7\x87\x33\x0a\x01\xb8\xd4\xed\xb6\xbe\x94"
+ "\x5e\x70\x40\x56\x67\x1f\x50\x44\x19\xce\x82\x70\x10\x87\x13\x20\x0b\x4c\x5a\xb6\xf6\xa7\xae\x81\x75\x01\x81\xe6\x4b\x57\x7c\xdd\x6d\xf8\x1c\x29\x32\xf7\xda\x3c\x2d\xf8\x9b\x25\x6e\x00\xb4\xf7\x2f\xf7\x04\xf7\xa1\x56\xac\x4f\x1a\x64\xb8\x47\x55\x18\x7b\x07\x4d\xbd\x47\x24\x80\x5d\xa2\x70\xc5\xdd\x8e\x82\xd4\xeb\xec\xb2\x0c\x39\xd2\x97\xc1\xcb\xeb\xf4\x77\x59\xb4\x87\xef\xcb\x43\x2d\x46\x54\xd1\xa7\xd7\x15\x99\x0a\x43\xa1\xe0\x99\x33\x71\xc1\xed\xfe\x72\x46\x33\x8e\x91\x08\x9f\xc8\x2e\xca\xfa\xdc\x59\xd5\xc3\x76\x84\x9f\xa3\x37\x68\xc3\xf0\x47\x2c\x68\xdb\x5e\xc3\x49\x4c\xe8\x92\x85\xe2\x23\xd3\x3f\xad\x32\xe5\x2b\x82\xd7\x8f\x99\x0a\x59\x5c\x45\xd9\xb4\x51\x52\xc2\xae\xbf\x80\xcf\xc9\xc9\x51\x24\x2a\x3b\x3a\x4d\xae\xeb\xbd\x22\xc3\x0e\x0f\x59\x25\x92\x17\xe9\x74\xc7\x8b\x70\x70\x36\x55\x95\x75"
+ "\x4b\xad\x61\x2b\x09\xbc\x82\xf2\x6e\x94\x43\xae\xc3\xd5\xcd\x8e\xfe\x5b\x9a\x88\x43\x01\x75\xb2\x23\x09\xf7\x89\x83\xe7\xfa\xf9\xb4\x9b\xf8\xef\xbd\x1c\x92\xc1\xda\x7e\xfe\x05\xba\x5a\xcd\x07\x6a\x78\x9e\x5d\xfb\x11\x2f\x79\x38\xb6\xc2\x5b\x6b\x51\xb4\x71\xdd\xf7\x2a\xe4\xf4\x72\x76\xad\xc2\xdd\x64\x5d\x79\xb6\xf5\x7a\x77\x20\x05\x3d\x30\x06\xd4\x4c\x0a\x2c\x98\x5a\xb9\xd4\x98\xa9\x3f\xc6\x12\xea\x3b\x4b\xc5\x79\x64\x63\x6b\x09\x54\x3b\x14\x27\xba\x99\x80\xc8\x72\xa8\x12\x90\x29\xba\x40\x54\x97\x2b\x7b\xfe\xeb\xcd\x01\x05\x44\x72\xdb\x99\xe4\x61\xc9\x69\xd6\xb9\x28\xd1\x05\x3e\xf9\x0b\x49\x0a\x49\xe9\x8d\x0e\xa7\x4a\x0f\xaf\x32\xd0\xe0\xb2\x3a\x55\x58\xfe\x5c\x28\x70\x51\x23\xb0\x7b\x6a\x5f\x1e\xb8\x17\xd7\x94\x15\x8f\xee\x20\xc7\x42\x25\x3e\x9a\x14\xd7\x60\x72\x39\x47\x48\xa9\xfe\xdd\x47\x0a\xb1\xe6"
+ "\x60\x28\x8c\x11\x68\xe1\xff\xd7\xce\xc8\xbe\xb3\xfe\x27\x30\x09\x70\xd7\xfa\x02\x33\x3a\x61\x2e\xc7\xff\xa4\x2a\xa8\x6e\xb4\x79\x35\x6d\x4c\x1e\x38\xf8\xee\xd4\x84\x4e\x6e\x28\xa7\xce\xc8\xc1\xcf\x80\x05\xf3\x04\xef\xc8\x18\x28\x2e\x8d\x5e\x0c\xdf\xb8\x5f\x96\xe8\xc6\x9c\x2f\xe5\xa6\x44\xd7\xe7\x99\x44\x0c\xec\xd7\x05\x60\x97\xbb\x74\x77\x58\xd5\xbb\x48\xde\x5a\xb2\x54\x7f\x0e\x46\x70\x6a\x6f\x78\xa5\x08\x89\x05\x4e\x7e\xa0\x69\xb4\x40\x60\x55\x77\x75\x9b\x19\xf2\xd5\x13\x80\x77\xf9\x4b\x3f\x1e\xee\xe6\x76\x84\x7b\x8c\xe5\x27\xa8\x0a\x91\x01\x68\x71\x8a\x3f\x06\xab\xf6\xa9\xa5\xe6\x72\x92\xe4\x67\xe2\xa2\x46\x35\x84\x55\x7d\xca\xa8\x85\xd0\xf1\x3f\xbe\xd7\x34\x64\xfc\xae\xe3\xe4\x04\x9f\x66\x02\xb9\x88\x10\xd9\xc4\x4c\x31\x43\x7a\x93\xe2\x9b\x56\x43\x84\xdc\xdc\xde\x1d\xa4\x02\x0e\xc2\xef\xc3\xf8\x78"
+ "\xd1\xb2\x6b\x63\x18\xc9\xa9\xe5\x72\xd8\xf3\xb9\xd1\x8a\xc7\x1a\x02\x27\x20\x77\x10\xe5\xc8\xd4\x4a\x47\xe5\xdf\x5f\x01\xaa\xb0\xd4\x10\xbb\x69\xe3\x36\xc8\xe1\x3d\x43\xfb\x86\xcd\xcc\xbf\xf4\x88\xe0\x20\xca\xb7\x1b\xf1\x2f\x5c\xee\xd4\xd3\xa3\xcc\xa4\x1e\x1c\x47\xfb\xbf\xfc\xa2\x41\x55\x9d\xf6\x5a\x5e\x65\x32\x34\x7b\x52\x8d\xd5\xd0\x20\x60\x03\xab\x3f\x8c\xd4\x21\xea\x2a\xd9\xc4\xd0\xd3\x65\xd8\x7a\x13\x28\x62\x32\x4b\x2c\x87\x93\xa8\xb4\x52\x45\x09\x44\xec\xec\xc3\x17\xdb\x9a\x4d\x5c\xa9\x11\xd4\x7d\xaf\x9e\xf1\x2d\xb2\x66\xc5\x1d\xed\xb7\xcd\x0b\x25\x5e\x30\x47\x3f\x40\xf4\xa1\xa0\x00\x94\x10\xc5\x6a\x63\x1a\xd5\x88\x92\x8e\x82\x39\x87\x3c\x78\x65\x58\x42\x75\x5b\xdd\x77\x3e\x09\x4e\x76\x5b\xe6\x0e\x4d\x38\xb2\xc0\xb8\x95\x01\x7a\x10\xe0\xfb\x07\xf2\xab\x2d\x8c\x32\xed\x2b\xc0\x46\xc2\xf5"
+ "\x38\x83\xf0\x17\xec\xc1\x20\x6a\x9a\x0b\x00\xa0\x98\x22\x50\x23\xd5\x80\x6b\xf6\x1f\xc3\xcc\x97\xc9\x24\x9f\xf3\xaf\x43\x14\xd5\xa0"
+}, {
+ .key = "\x35\x4e\xb5\x70\x50\x42\x8a\x85\xf2\xfb\xed\x7b\xd0\x9e\x97\xca\xfa\x98\x66\x63\xee\x37\xcc\x52\xfe\xd1\xdf\x95\x15\x34\x29\x38",
+ .nonce = "\xfd\x87\xd4\xd8\x62\xfd\xec\xaa",
+ .nlen = 8,
+ .assoc = "\xd6\x31\xda\x5d\x42\x5e\xd7",
+ .alen = 7,
+ .input = "\x7a\x57\xf2\xc7\x06\x3f\x50\x7b\x36\x1a\x66\x5c\xb9\x0e\x5e\x3b\x45\x60\xbe\x9a\x31\x9f\xff\x5d\x66\x34\xb4\xdc\xfb\x9d\x8e\xee\x6a\x33\xa4\x07\x3c\xf9\x4c\x30\xa1\x24\x52\xf9\x50\x46\x88\x20\x02\x32\x3a\x0e\x99\x63\xaf\x1f\x15\x28\x2a\x05\xff\x57\x59\x5e\x18\xa1\x1f\xd0\x92\x5c\x88\x66\x1b\x00\x64\xa5\x93\x8d\x06\x46\xb0\x64\x8b\x8b\xef\x99\x05\x35\x85\xb3\xf3\x33\xbb\xec\x66\xb6\x3d\x57\x42\xe3\xb4\xc6\xaa\xb0\x41\x2a\xb9\x59\xa9\xf6\x3e\x15\x26\x12\x03\x21\x4c\x74\x43\x13\x2a\x03\x27\x09\xb4\xfb\xe7\xb7\x40\xff\x5e\xce\x48\x9a\x60\xe3\x8b\x80\x8c\x38\x2d\xcb\x93\x37\x74\x05\x52\x6f\x73\x3e\xc3\xbc\xca\x72\x0a\xeb\xf1\x3b\xa0\x95\xdc\x8a\xc4\xa9\xdc\xca\x44\xd8\x08\x63\x6a\x36\xd3\x3c\xb8\xac\x46\x7d\xfd\xaa\xeb\x3e\x0f\x45\x8f\x49\xda\x2b\xf2\x12\xbd\xaf\x67\x8a\x63\x48\x4b\x55\x5f\x6d\x8c"
+ "\xb9\x76\x34\x84\xae\xc2\xfc\x52\x64\x82\xf7\xb0\x06\xf0\x45\x73\x12\x50\x30\x72\xea\x78\x9a\xa8\xaf\xb5\xe3\xbb\x77\x52\xec\x59\x84\xbf\x6b\x8f\xce\x86\x5e\x1f\x23\xe9\xfb\x08\x86\xf7\x10\xb9\xf2\x44\x96\x44\x63\xa9\xa8\x78\x00\x23\xd6\xc7\xe7\x6e\x66\x4f\xcc\xee\x15\xb3\xbd\x1d\xa0\xe5\x9c\x1b\x24\x2c\x4d\x3c\x62\x35\x9c\x88\x59\x09\xdd\x82\x1b\xcf\x0a\x83\x6b\x3f\xae\x03\xc4\xb4\xdd\x7e\x5b\x28\x76\x25\x96\xd9\xc9\x9d\x5f\x86\xfa\xf6\xd7\xd2\xe6\x76\x1d\x0f\xa1\xdc\x74\x05\x1b\x1d\xe0\xcd\x16\xb0\xa8\x8a\x34\x7b\x15\x11\x77\xe5\x7b\x7e\x20\xf7\xda\x38\xda\xce\x70\xe9\xf5\x6c\xd9\xbe\x0c\x4c\x95\x4c\xc2\x9b\x34\x55\x55\xe1\xf3\x46\x8e\x48\x74\x14\x4f\x9d\xc9\xf5\xe8\x1a\xf0\x11\x4a\xc1\x8d\xe0\x93\xa0\xbe\x09\x1c\x2b\x4e\x0f\xb2\x87\x8b\x84\xfe\x92\x32\x14\xd7\x93\xdf\xe7\x44\xbc\xc5\xae\x53"
+ "\x69\xd8\xb3\x79\x37\x80\xe3\x17\x5c\xec\x53\x00\x9a\xe3\x8e\xdc\x38\xb8\x66\xf0\xd3\xad\x1d\x02\x96\x86\x3e\x9d\x3b\x5d\xa5\x7f\x21\x10\xf1\x1f\x13\x20\xf9\x57\x87\x20\xf5\x5f\xf1\x17\x48\x0a\x51\x5a\xcd\x19\x03\xa6\x5a\xd1\x12\x97\xe9\x48\xe2\x1d\x83\x75\x50\xd9\x75\x7d\x6a\x82\xa1\xf9\x4e\x54\x87\x89\xc9\x0c\xb7\x5b\x6a\x91\xc1\x9c\xb2\xa9\xdc\x9a\xa4\x49\x0a\x6d\x0d\xbb\xde\x86\x44\xdd\x5d\x89\x2b\x96\x0f\x23\x95\xad\xcc\xa2\xb3\xb9\x7e\x74\x38\xba\x9f\x73\xae\x5f\xf8\x68\xa2\xe0\xa9\xce\xbd\x40\xd4\x4c\x6b\xd2\x56\x62\xb0\xcc\x63\x7e\x5b\xd3\xae\xd1\x75\xce\xbb\xb4\x5b\xa8\xf8\xb4\xac\x71\x75\xaa\xc9\x9f\xbb\x6c\xad\x0f\x55\x5d\xe8\x85\x7d\xf9\x21\x35\xea\x92\x85\x2b\x00\xec\x84\x90\x0a\x63\x96\xe4\x6b\xa9\x77\xb8\x91\xf8\x46\x15\x72\x63\x70\x01\x40\xa3\xa5\x76\x62\x2b\xbf\xf1\xe5\x8d\x9f"
+ "\xa3\xfa\x9b\x03\xbe\xfe\x65\x6f\xa2\x29\x0d\x54\xb4\x71\xce\xa9\xd6\x3d\x88\xf9\xaf\x6b\xa8\x9e\xf4\x16\x96\x36\xb9\x00\xdc\x10\xab\xb5\x08\x31\x1f\x00\xb1\x3c\xd9\x38\x3e\xc6\x04\xa7\x4e\xe8\xae\xed\x98\xc2\xf7\xb9\x00\x5f\x8c\x60\xd1\xe5\x15\xf7\xae\x1e\x84\x88\xd1\xf6\xbc\x3a\x89\x35\x22\x83\x7c\xca\xf0\x33\x82\x4c\x79\x3c\xfd\xb1\xae\x52\x62\x55\xd2\x41\x60\xc6\xbb\xfa\x0e\x59\xd6\xa8\xfe\x5d\xed\x47\x3d\xe0\xea\x1f\x6e\x43\x51\xec\x10\x52\x56\x77\x42\x6b\x52\x87\xd8\xec\xe0\xaa\x76\xa5\x84\x2a\x22\x24\xfd\x92\x40\x88\xd5\x85\x1c\x1f\x6b\x47\xa0\xc4\xe4\xef\xf4\xea\xd7\x59\xac\x2a\x9e\x8c\xfa\x1f\x42\x08\xfe\x4f\x74\xa0\x26\xf5\xb3\x84\xf6\x58\x5f\x26\x66\x3e\xd7\xe4\x22\x91\x13\xc8\xac\x25\x96\x23\xd8\x09\xea\x45\x75\x23\xb8\x5f\xc2\x90\x8b\x09\xc4\xfc\x47\x6c\x6d\x0a\xef\x69\xa4\x38\x19"
+ "\xcf\x7d\xf9\x09\x73\x9b\x60\x5a\xf7\x37\xb5\xfe\x9f\xe3\x2b\x4c\x0d\x6e\x19\xf1\xd6\xc0\x70\xf3\x9d\x22\x3c\xf9\x49\xce\x30\x8e\x44\xb5\x76\x15\x8f\x52\xfd\xa5\x04\xb8\x55\x6a\x36\x59\x7c\xc4\x48\xb8\xd7\xab\x05\x66\xe9\x5e\x21\x6f\x6b\x36\x29\xbb\xe9\xe3\xa2\x9a\xa8\xcd\x55\x25\x11\xba\x5a\x58\xa0\xde\xae\x19\x2a\x48\x5a\xff\x36\xcd\x6d\x16\x7a\x73\x38\x46\xe5\x47\x59\xc8\xa2\xf6\xe2\x6c\x83\xc5\x36\x2c\x83\x7d\xb4\x01\x05\x69\xe7\xaf\x5c\xc4\x64\x82\x12\x21\xef\xf7\xd1\x7d\xb8\x8d\x8c\x98\x7c\x5f\x7d\x92\x88\xb9\x94\x07\x9c\xd8\xe9\x9c\x17\x38\xe3\x57\x6c\xe0\xdc\xa5\x92\x42\xb3\xbd\x50\xa2\x7e\xb5\xb1\x52\x72\x03\x97\xd8\xaa\x9a\x1e\x75\x41\x11\xa3\x4f\xcc\xd4\xe3\x73\xad\x96\xdc\x47\x41\x9f\xb0\xbe\x79\x91\xf5\xb6\x18\xfe\xc2\x83\x18\x7d\x73\xd9\x4f\x83\x84\x03\xb3\xf0\x77\x66\x3d\x83\x63"
+ "\x2e\x2c\xf9\xdd\xa6\x1f\x89\x82\xb8\x23\x42\xeb\xe2\xca\x70\x82\x61\x41\x0a\x6d\x5f\x75\xc5\xe2\xc4\x91\x18\x44\x22\xfa\x34\x10\xf5\x20\xdc\xb7\xdd\x2a\x20\x77\xf5\xf9\xce\xdb\xa0\x0a\x52\x2a\x4e\xdd\xcc\x97\xdf\x05\xe4\x5e\xb7\xaa\xf0\xe2\x80\xff\xba\x1a\x0f\xac\xdf\x02\x32\xe6\xf7\xc7\x17\x13\xb7\xfc\x98\x48\x8c\x0d\x82\xc9\x80\x7a\xe2\x0a\xc5\xb4\xde\x7c\x3c\x79\x81\x0e\x28\x65\x79\x67\x82\x69\x44\x66\x09\xf7\x16\x1a\xf9\x7d\x80\xa1\x79\x14\xa9\xc8\x20\xfb\xa2\x46\xbe\x08\x35\x17\x58\xc1\x1a\xda\x2a\x6b\x2e\x1e\xe6\x27\x55\x7b\x19\xe2\xfb\x64\xfc\x5e\x15\x54\x3c\xe7\xc2\x11\x50\x30\xb8\x72\x03\x0b\x1a\x9f\x86\x27\x11\x5c\x06\x2b\xbd\x75\x1a\x0a\xda\x01\xfa\x5c\x4a\xc1\x80\x3a\x6e\x30\xc8\x2c\xeb\x56\xec\x89\xfa\x35\x7b\xb2\xf0\x97\x08\x86\x53\xbe\xbd\x40\x41\x38\x1c\xb4\x8b\x79\x2e\x18\x96\x94"
+ "\xde\xe8\xca\xe5\x9f\x92\x9f\x15\x5d\x56\x60\x5c\x09\xf9\x16\xf4\x17\x0f\xf6\x4c\xda\xe6\x67\x89\x9f\xca\x6c\xe7\x9b\x04\x62\x0e\x26\xa6\x52\xbd\x29\xff\xc7\xa4\x96\xe6\x6a\x02\xa5\x2e\x7b\xfe\x97\x68\x3e\x2e\x5f\x3b\x0f\x36\xd6\x98\x19\x59\x48\xd2\xc6\xe1\x55\x1a\x6e\xd6\xed\x2c\xba\xc3\x9e\x64\xc9\x95\x86\x35\x5e\x3e\x88\x69\x99\x4b\xee\xbe\x9a\x99\xb5\x6e\x58\xae\xdd\x22\xdb\xdd\x6b\xfc\xaf\x90\xa3\x3d\xa4\xc1\x15\x92\x18\x8d\xd2\x4b\x7b\x06\xd1\x37\xb5\xe2\x7c\x2c\xf0\x25\xe4\x94\x2a\xbd\xe3\x82\x70\x78\xa3\x82\x10\x5a\x90\xd7\xa4\xfa\xaf\x1a\x88\x59\xdc\x74\x12\xb4\x8e\xd7\x19\x46\xf4\x84\x69\x9f\xbb\x70\xa8\x4c\x52\x81\xa9\xff\x76\x1c\xae\xd8\x11\x3d\x7f\x7d\xc5\x12\x59\x28\x18\xc2\xa2\xb7\x1c\x88\xf8\xd6\x1b\xa6\x7d\x9e\xde\x29\xf8\xed\xff\xeb\x92\x24\x4f\x05\xaa\xd9\x49\xba\x87\x59"
+ "\x51\xc9\x20\x5c\x9b\x74\xcf\x03\xd9\x2d\x34\xc7\x5b\xa5\x40\xb2\x99\xf5\xcb\xb4\xf6\xb7\x72\x4a\xd6\xbd\xb0\xf3\x93\xe0\x1b\xa8\x04\x1e\x35\xd4\x80\x20\xf4\x9c\x31\x6b\x45\xb9\x15\xb0\x5e\xdd\x0a\x33\x9c\x83\xcd\x58\x89\x50\x56\xbb\x81\x00\x91\x32\xf3\x1b\x3e\xcf\x45\xe1\xf9\xe1\x2c\x26\x78\x93\x9a\x60\x46\xc9\xb5\x5e\x6a\x28\x92\x87\x3f\x63\x7b\xdb\xf7\xd0\x13\x9d\x32\x40\x5e\xcf\xfb\x79\x68\x47\x4c\xfd\x01\x17\xe6\x97\x93\x78\xbb\xa6\x27\xa3\xe8\x1a\xe8\x94\x55\x7d\x08\xe5\xdc\x66\xa3\x69\xc8\xca\xc5\xa1\x84\x55\xde\x08\x91\x16\x3a\x0c\x86\xab\x27\x2b\x64\x34\x02\x6c\x76\x8b\xc6\xaf\xcc\xe1\xd6\x8c\x2a\x18\x3d\xa6\x1b\x37\x75\x45\x73\xc2\x75\xd7\x53\x78\x3a\xd6\xe8\x29\xd2\x4a\xa8\x1e\x82\xf6\xb6\x81\xde\x21\xed\x2b\x56\xbb\xf2\xd0\x57\xc1\x7c\xd2\x6a\xd2\x56\xf5\x13\x5f\x1c\x6a\x0b\x74\xfb"
+ "\xe9\xfe\x9e\xea\x95\xb2\x46\xab\x0a\xfc\xfd\xf3\xbb\x04\x2b\x76\x1b\xa4\x74\xb0\xc1\x78\xc3\x69\xe2\xb0\x01\xe1\xde\x32\x4c\x8d\x1a\xb3\x38\x08\xd5\xfc\x1f\xdc\x0e\x2c\x9c\xb1\xa1\x63\x17\x22\xf5\x6c\x93\x70\x74\x00\xf8\x39\x01\x94\xd1\x32\x23\x56\x5d\xa6\x02\x76\x76\x93\xce\x2f\x19\xe9\x17\x52\xae\x6e\x2c\x6d\x61\x7f\x3b\xaa\xe0\x52\x85\xc5\x65\xc1\xbb\x8e\x5b\x21\xd5\xc9\x78\x83\x07\x97\x4c\x62\x61\x41\xd4\xfc\xc9\x39\xe3\x9b\xd0\xcc\x75\xc4\x97\xe6\xdd\x2a\x5f\xa6\xe8\x59\x6c\x98\xb9\x02\xe2\xa2\xd6\x68\xee\x3b\x1d\xe3\x4d\x5b\x30\xef\x03\xf2\xeb\x18\x57\x36\xe8\xa1\xf4\x47\xfb\xcb\x8f\xcb\xc8\xf3\x4f\x74\x9d\x9d\xb1\x8d\x14\x44\xd9\x19\xb4\x54\x4f\x75\x19\x09\xa0\x75\xbc\x3b\x82\xc6\x3f\xb8\x83\x19\x6e\xd6\x37\xfe\x6e\x8a\x4e\xe0\x4a\xab\x7b\xc8\xb4\x1d\xf4\xed\x27\x03\x65\xa2\xa1\xae\x11\xe7"
+ "\x98\x78\x48\x91\xd2\xd2\xd4\x23\x78\x50\xb1\x5b\x85\x10\x8d\xca\x5f\x0f\x71\xae\x72\x9a\xf6\x25\x19\x60\x06\xf7\x10\x34\x18\x0d\xc9\x9f\x7b\x0c\x9b\x8f\x91\x1b\x9f\xcd\x10\xee\x75\xf9\x97\x66\xfc\x4d\x33\x6e\x28\x2b\x92\x85\x4f\xab\x43\x8d\x8f\x7d\x86\xa7\xc7\xd8\xd3\x0b\x8b\x57\xb6\x1d\x95\x0d\xe9\xbc\xd9\x03\xd9\x10\x19\xc3\x46\x63\x55\x87\x61\x79\x6c\x95\x0e\x9c\xdd\xca\xc3\xf3\x64\xf0\x7d\x76\xb7\x53\x67\x2b\x1e\x44\x56\x81\xea\x8f\x5c\x42\x16\xb8\x28\xeb\x1b\x61\x10\x1e\xbf\xec\xa8",
+ .ilen = 1933,
+ .result = "\x6a\xfc\x4b\x25\xdf\xc0\xe4\xe8\x17\x4d\x4c\xc9\x7e\xde\x3a\xcc\x3c\xba\x6a\x77\x47\xdb\xe3\x74\x7a\x4d\x5f\x8d\x37\x55\x80\x73\x90\x66\x5d\x3a\x7d\x5d\x86\x5e\x8d\xfd\x83\xff\x4e\x74\x6f\xf9\xe6\x70\x17\x70\x3e\x96\xa7\x7e\xcb\xab\x8f\x58\x24\x9b\x01\xfd\xcb\xe6\x4d\x9b\xf0\x88\x94\x57\x66\xef\x72\x4c\x42\x6e\x16\x19\x15\xea\x70\x5b\xac\x13\xdb\x9f\x18\xe2\x3c\x26\x97\xbc\xdc\x45\x8c\x6c\x24\x69\x9c\xf7\x65\x1e\x18\x59\x31\x7c\xe4\x73\xbc\x39\x62\xc6\x5c\x9f\xbf\xfa\x90\x03\xc9\x72\x26\xb6\x1b\xc2\xb7\x3f\xf2\x13\x77\xf2\x8d\xb9\x47\xd0\x53\xdd\xc8\x91\x83\x8b\xb1\xce\xa3\xfe\xcd\xd9\xdd\x92\x7b\xdb\xb8\xfb\xc9\x2d\x01\x59\x39\x52\xad\x1b\xec\xcf\xd7\x70\x13\x21\xf5\x47\xaa\x18\x21\x5c\xc9\x9a\xd2\x6b\x05\x9c\x01\xa1\xda\x35\x5d\xb3\x70\xe6\xa9\x80\x8b\x91\xb7\xb3\x5f\x24\x9a\xb7"
+ "\xd1\x6b\xa1\x1c\x50\xba\x49\xe0\xee\x2e\x75\xac\x69\xc0\xeb\x03\xdd\x19\xe5\xf6\x06\xdd\xc3\xd7\x2b\x07\x07\x30\xa7\x19\x0c\xbf\xe6\x18\xcc\xb1\x01\x11\x85\x77\x1d\x96\xa7\xa3\x00\x84\x02\xa2\x83\x68\xda\x17\x27\xc8\x7f\x23\xb7\xf4\x13\x85\xcf\xdd\x7a\x7d\x24\x57\xfe\x05\x93\xf5\x74\xce\xed\x0c\x20\x98\x8d\x92\x30\xa1\x29\x23\x1a\xa0\x4f\x69\x56\x4c\xe1\xc8\xce\xf6\x9a\x0c\xa4\xfa\x04\xf6\x62\x95\xf2\xfa\xc7\x40\x68\x40\x8f\x41\xda\xb4\x26\x6f\x70\xab\x40\x61\xa4\x0e\x75\xfb\x86\xeb\x9d\x9a\x1f\xec\x76\x99\xe7\xea\xaa\x1e\x2d\xb5\xd4\xa6\x1a\xb8\x61\x0a\x1d\x16\x5b\x98\xc2\x31\x40\xe7\x23\x1d\x66\x99\xc8\xc0\xd7\xce\xf3\x57\x40\x04\x3f\xfc\xea\xb3\xfc\xd2\xd3\x99\xa4\x94\x69\xa0\xef\xd1\x85\xb3\xa6\xb1\x28\xbf\x94\x67\x22\xc3\x36\x46\xf8\xd2\x0f\x5f\xf4\x59\x80\xe6\x2d\x43\x08\x7d\x19\x09\x97\xa7\x4c"
+ "\x3d\x8d\xba\x65\x62\xa3\x71\x33\x29\x62\xdb\xc1\x33\x34\x1a\x63\x33\x16\xb6\x64\x7e\xab\x33\xf0\xe6\x26\x68\xba\x1d\x2e\x38\x08\xe6\x02\xd3\x25\x2c\x47\x23\x58\x34\x0f\x9d\x63\x4f\x63\xbb\x7f\x3b\x34\x38\xa7\xb5\x8d\x65\xd9\x9f\x79\x55\x3e\x4d\xe7\x73\xd8\xf6\x98\x97\x84\x60\x9c\xc8\xa9\x3c\xf6\xdc\x12\x5c\xe1\xbb\x0b\x8b\x98\x9c\x9d\x26\x7c\x4a\xe6\x46\x36\x58\x21\x4a\xee\xca\xd7\x3b\xc2\x6c\x49\x2f\xe5\xd5\x03\x59\x84\x53\xcb\xfe\x92\x71\x2e\x7c\x21\xcc\x99\x85\x7f\xb8\x74\x90\x13\x42\x3f\xe0\x6b\x1d\xf2\x4d\x54\xd4\xfc\x3a\x05\xe6\x74\xaf\xa6\xa0\x2a\x20\x23\x5d\x34\x5c\xd9\x3e\x4e\xfa\x93\xe7\xaa\xe9\x6f\x08\x43\x67\x41\xc5\xad\xfb\x31\x95\x82\x73\x32\xd8\xa6\xa3\xed\x0e\x2d\xf6\x5f\xfd\x80\xa6\x7a\xe0\xdf\x78\x15\x29\x74\x33\xd0\x9e\x83\x86\x72\x22\x57\x29\xb9\x9e\x5d\xd3\x1a\xb5\x96"
+ "\x72\x41\x3d\xf1\x64\x43\x67\xee\xaa\x5c\xd3\x9a\x96\x13\x11\x5d\xf3\x0c\x87\x82\x1e\x41\x9e\xd0\x27\xd7\x54\x3b\x67\x73\x09\x91\xe9\xd5\x36\xa7\xb5\x55\xe4\xf3\x21\x51\x49\x22\x07\x55\x4f\x44\x4b\xd2\x15\x93\x17\x2a\xfa\x4d\x4a\x57\xdb\x4c\xa6\xeb\xec\x53\x25\x6c\x21\xed\x00\x4c\x3b\xca\x14\x57\xa9\xd6\x6a\xcd\x8d\x5e\x74\xac\x72\xc1\x97\xe5\x1b\x45\x4e\xda\xfc\xcc\x40\xe8\x48\x88\x0b\xa3\xe3\x8d\x83\x42\xc3\x23\xfd\x68\xb5\x8e\xf1\x9d\x63\x77\xe9\xa3\x8e\x8c\x26\x6b\xbd\x72\x73\x35\x0c\x03\xf8\x43\x78\x52\x71\x15\x1f\x71\x5d\x6e\xed\xb9\xcc\x86\x30\xdb\x2b\xd3\x82\x88\x23\x71\x90\x53\x5c\xa9\x2f\x76\x01\xb7\x9a\xfe\x43\x55\xa3\x04\x9b\x0e\xe4\x59\xdf\xc9\xe9\xb1\xea\x29\x28\x3c\x5c\xae\x72\x84\xb6\xc6\xeb\x0c\x27\x07\x74\x90\x0d\x31\xb0\x00\x77\xe9\x40\x70\x6f\x68\xa7\xfd\x06\xec\x4b\xc0\xb7\xac"
+ "\xbc\x33\xb7\x6d\x0a\xbd\x12\x1b\x59\xcb\xdd\x32\xf5\x1d\x94\x57\x76\x9e\x0c\x18\x98\x71\xd7\x2a\xdb\x0b\x7b\xa7\x71\xb7\x67\x81\x23\x96\xae\xb9\x7e\x32\x43\x92\x8a\x19\xa0\xc4\xd4\x3b\x57\xf9\x4a\x2c\xfb\x51\x46\xbb\xcb\x5d\xb3\xef\x13\x93\x6e\x68\x42\x54\x57\xd3\x6a\x3a\x8f\x9d\x66\xbf\xbd\x36\x23\xf5\x93\x83\x7b\x9c\xc0\xdd\xc5\x49\xc0\x64\xed\x07\x12\xb3\xe6\xe4\xe5\x38\x95\x23\xb1\xa0\x3b\x1a\x61\xda\x17\xac\xc3\x58\xdd\x74\x64\x22\x11\xe8\x32\x1d\x16\x93\x85\x99\xa5\x9c\x34\x55\xb1\xe9\x20\x72\xc9\x28\x7b\x79\x00\xa1\xa6\xa3\x27\x40\x18\x8a\x54\xe0\xcc\xe8\x4e\x8e\x43\x96\xe7\x3f\xc8\xe9\xb2\xf9\xc9\xda\x04\x71\x50\x47\xe4\xaa\xce\xa2\x30\xc8\xe4\xac\xc7\x0d\x06\x2e\xe6\xe8\x80\x36\x29\x9e\x01\xb8\xc3\xf0\xa0\x5d\x7a\xca\x4d\xa0\x57\xbd\x2a\x45\xa7\x7f\x9c\x93\x07\x8f\x35\x67\x92\xe3\xe9"
+ "\x7f\xa8\x61\x43\x9e\x25\x4f\x33\x76\x13\x6e\x12\xb9\xdd\xa4\x7c\x08\x9f\x7c\xe7\x0a\x8d\x84\x06\xa4\x33\x17\x34\x5e\x10\x7c\xc0\xa8\x3d\x1f\x42\x20\x51\x65\x5d\x09\xc3\xaa\xc0\xc8\x0d\xf0\x79\xbc\x20\x1b\x95\xe7\x06\x7d\x47\x20\x03\x1a\x74\xdd\xe2\xd4\xae\x38\x71\x9b\xf5\x80\xec\x08\x4e\x56\xba\x76\x12\x1a\xdf\x48\xf3\xae\xb3\xe6\xe6\xbe\xc0\x91\x2e\x01\xb3\x01\x86\xa2\xb9\x52\xd1\x21\xae\xd4\x97\x1d\xef\x41\x12\x95\x3d\x48\x45\x1c\x56\x32\x8f\xb8\x43\xbb\x19\xf3\xca\xe9\xeb\x6d\x84\xbe\x86\x06\xe2\x36\xb2\x62\x9d\xd3\x4c\x48\x18\x54\x13\x4e\xcf\xfd\xba\x84\xb9\x30\x53\xcf\xfb\xb9\x29\x8f\xdc\x9f\xef\x60\x0b\x64\xf6\x8b\xee\xa6\x91\xc2\x41\x6c\xf6\xfa\x79\x67\x4b\xc1\x3f\xaf\x09\x81\xd4\x5d\xcb\x09\xdf\x36\x31\xc0\x14\x3c\x7c\x0e\x65\x95\x99\x6d\xa3\xf4\xd7\x38\xee\x1a\x2b\x37\xe2\xa4\x3b\x4b\xd0"
+ "\x65\xca\xf8\xc3\xe8\x15\x20\xef\xf2\x00\xfd\x01\x09\xc5\xc8\x17\x04\x93\xd0\x93\x03\x55\xc5\xfe\x32\xa3\x3e\x28\x2d\x3b\x93\x8a\xcc\x07\x72\x80\x8b\x74\x16\x24\xbb\xda\x94\x39\x30\x8f\xb1\xcd\x4a\x90\x92\x7c\x14\x8f\x95\x4e\xac\x9b\xd8\x8f\x1a\x87\xa4\x32\x27\x8a\xba\xf7\x41\xcf\x84\x37\x19\xe6\x06\xf5\x0e\xcf\x36\xf5\x9e\x6c\xde\xbc\xff\x64\x7e\x4e\x59\x57\x48\xfe\x14\xf7\x9c\x93\x5d\x15\xad\xcc\x11\xb1\x17\x18\xb2\x7e\xcc\xab\xe9\xce\x7d\x77\x5b\x51\x1b\x1e\x20\xa8\x32\x06\x0e\x75\x93\xac\xdb\x35\x37\x1f\xe9\x19\x1d\xb4\x71\x97\xd6\x4e\x2c\x08\xa5\x13\xf9\x0e\x7e\x78\x6e\x14\xe0\xa9\xb9\x96\x4c\x80\x82\xba\x17\xb3\x9d\x69\xb0\x84\x46\xff\xf9\x52\x79\x94\x58\x3a\x62\x90\x15\x35\x71\x10\x37\xed\xa1\x8e\x53\x6e\xf4\x26\x57\x93\x15\x93\xf6\x81\x2c\x5a\x10\xda\x92\xad\x2f\xdb\x28\x31\x2d\x55\x04\xd2"
+ "\x06\x28\x8c\x1e\xdc\xea\x54\xac\xff\xb7\x6c\x30\x15\xd4\xb4\x0d\x00\x93\x57\xdd\xd2\x07\x07\x06\xd9\x43\x9b\xcd\x3a\xf4\x7d\x4c\x36\x5d\x23\xa2\xcc\x57\x40\x91\xe9\x2c\x2f\x2c\xd5\x30\x9b\x17\xb0\xc9\xf7\xa7\x2f\xd1\x93\x20\x6b\xc6\xc1\xe4\x6f\xcb\xd1\xe7\x09\x0f\x9e\xdc\xaa\x9f\x2f\xdf\x56\x9f\xd4\x33\x04\xaf\xd3\x6c\x58\x61\xf0\x30\xec\xf2\x7f\xf2\x9c\xdf\x39\xbb\x6f\xa2\x8c\x7e\xc4\x22\x51\x71\xc0\x4d\x14\x1a\xc4\xcd\x04\xd9\x87\x08\x50\x05\xcc\xaf\xf6\xf0\x8f\x92\x54\x58\xc2\xc7\x09\x7a\x59\x02\x05\xe8\xb0\x86\xd9\xbf\x7b\x35\x51\x4d\xaf\x08\x97\x2c\x65\xda\x2a\x71\x3a\xa8\x51\xcc\xf2\x73\x27\xc3\xfd\x62\xcf\xe3\xb2\xca\xcb\xbe\x1a\x0a\xa1\x34\x7b\x77\xc4\x62\x68\x78\x5f\x94\x07\x04\x65\x16\x4b\x61\xcb\xff\x75\x26\x50\x66\x1f\x6e\x93\xf8\xc5\x51\xeb\xa4\x4a\x48\x68\x6b\xe2\x5e\x44\xb2\x50\x2c\x6c"
+ "\xae\x79\x4e\x66\x35\x81\x50\xac\xbc\x3f\xb1\x0c\xf3\x05\x3c\x4a\xa3\x6c\x2a\x79\xb4\xb7\xab\xca\xc7\x9b\x8e\xcd\x5f\x11\x03\xcb\x30\xa3\xab\xda\xfe\x64\xb9\xbb\xd8\x5e\x3a\x1a\x56\xe5\x05\x48\x90\x1e\x61\x69\x1b\x22\xe6\x1a\x3c\x75\xad\x1f\x37\x28\xdc\xe4\x6d\xbd\x42\xdc\xd3\xc8\xb6\x1c\x48\xfe\x94\x77\x7f\xbd\x62\xac\xa3\x47\x27\xcf\x5f\xd9\xdb\xaf\xec\xf7\x5e\xc1\xb0\x9d\x01\x26\x99\x7e\x8f\x03\x70\xb5\x42\xbe\x67\x28\x1b\x7c\xbd\x61\x21\x97\xcc\x5c\xe1\x97\x8f\x8d\xde\x2b\xaa\xa7\x71\x1d\x1e\x02\x73\x70\x58\x32\x5b\x1d\x67\x3d\xe0\x74\x4f\x03\xf2\x70\x51\x79\xf1\x61\x70\x15\x74\x9d\x23\x89\xde\xac\xfd\xde\xd0\x1f\xc3\x87\x44\x35\x4b\xe5\xb0\x60\xc5\x22\xe4\x9e\xca\xeb\xd5\x3a\x09\x45\xa4\xdb\xfa\x3f\xeb\x1b\xc7\xc8\x14\x99\x51\x92\x10\xed\xed\x28\xe0\xa1\xf8\x26\xcf\xcd\xcb\x63\xa1\x3b\xe3"
+ "\xdf\x7e\xfe\xa6\xf0\x81\x9a\xbf\x55\xde\x54\xd5\x56\x60\x98\x10\x68\xf4\x38\x96\x8e\x6f\x1d\x44\x7f\xd6\x2f\xfe\x55\xfb\x0c\x7e\x67\xe2\x61\x44\xed\xf2\x35\x30\x5d\xe9\xc7\xd6\x6d\xe0\xa0\xed\xf3\xfc\xd8\x3e\x0a\x7b\xcd\xaf\x65\x68\x18\xc0\xec\x04\x1c\x74\x6d\xe2\x6e\x79\xd4\x11\x2b\x62\xd5\x27\xad\x4f\x01\x59\x73\xcc\x6a\x53\xfb\x2d\xd5\x4e\x99\x21\x65\x4d\xf5\x82\xf7\xd8\x42\xce\x6f\x3d\x36\x47\xf1\x05\x16\xe8\x1b\x6a\x8f\x93\xf2\x8f\x37\x40\x12\x28\xa3\xe6\xb9\x17\x4a\x1f\xb1\xd1\x66\x69\x86\xc4\xfc\x97\xae\x3f\x8f\x1e\x2b\xdf\xcd\xf9\x3c"
+}, {
+ .key = "\xb3\x35\x50\x03\x54\x2e\x40\x5e\x8f\x59\x8e\xc5\x90\xd5\x27\x2d\xba\x29\x2e\xcb\x1b\x70\x44\x1e\x65\x91\x6e\x2a\x79\x22\xda\x64",
+ .nonce = "\x05\xa3\x93\xed\x30\xc5\xa2\x06",
+ .nlen = 8,
+ .assoc = "\xb1\x69\x83\x87\x30\xaa\x5d\xb8\x77\xe8\x21\xff\x06\x59\x35\xce\x75\xfe\x38\xef\xb8\x91\x43\x8c\xcf\x70\xdd\x0a\x68\xbf\xd4\xbc\x16\x76\x99\x36\x1e\x58\x79\x5e\xd4\x29\xf7\x33\x93\x48\xdb\x5f\x01\xae\x9c\xb6\xe4\x88\x6d\x2b\x76\x75\xe0\xf3\x74\xe2\xc9",
+ .alen = 63,
+ .input = "\x74\xa6\x3e\xe4\xb1\xcb\xaf\xb0\x40\xe5\x0f\x9e\xf1\xf2\x89\xb5\x42\x34\x8a\xa1\x03\xb7\xe9\x57\x46\xbe\x20\xe4\x6e\xb0\xeb\xff\xea\x07\x7e\xef\xe2\x55\x9f\xe5\x78\x3a\xb7\x83\xc2\x18\x40\x7b\xeb\xcd\x81\xfb\x90\x12\x9e\x46\xa9\xd6\x4a\xba\xb0\x62\xdb\x6b\x99\xc4\xdb\x54\x4b\xb8\xa5\x71\xcb\xcd\x63\x32\x55\xfb\x31\xf0\x38\xf5\xbe\x78\xe4\x45\xce\x1b\x6a\x5b\x0e\xf4\x16\xe4\xb1\x3d\xf6\x63\x7b\xa7\x0c\xde\x6f\x8f\x74\xdf\xe0\x1e\x9d\xce\x8f\x24\xef\x23\x35\x33\x7b\x83\x34\x23\x58\x74\x14\x77\x1f\xc2\x4f\x4e\xc6\x89\xf9\x52\x09\x37\x64\x14\xc4\x01\x6b\x9d\x77\xe8\x90\x5d\xa8\x4a\x2a\xef\x5c\x7f\xeb\xbb\xb2\xc6\x93\x99\x66\xdc\x7f\xd4\x9e\x2a\xca\x8d\xdb\xe7\x20\xcf\xe4\x73\xae\x49\x7d\x64\x0f\x0e\x28\x46\xa9\xa8\x32\xe4\x0e\xf6\x51\x53\xb8\x3c\xb1\xff\xa3\x33\x41\x75\xff\xf1\x6f\xf1\xfb"
+ "\xbb\x83\x7f\x06\x9b\xe7\x1b\x0a\xe0\x5c\x33\x60\x5b\xdb\x5b\xed\xfe\xa5\x16\x19\x72\xa3\x64\x23\x00\x02\xc7\xf3\x6a\x81\x3e\x44\x1d\x79\x15\x5f\x9a\xde\xe2\xfd\x1b\x73\xc1\xbc\x23\xba\x31\xd2\x50\xd5\xad\x7f\x74\xa7\xc9\xf8\x3e\x2b\x26\x10\xf6\x03\x36\x74\xe4\x0e\x6a\x72\xb7\x73\x0a\x42\x28\xc2\xad\x5e\x03\xbe\xb8\x0b\xa8\x5b\xd4\xb8\xba\x52\x89\xb1\x9b\xc1\xc3\x65\x87\xed\xa5\xf4\x86\xfd\x41\x80\x91\x27\x59\x53\x67\x15\x78\x54\x8b\x2d\x3d\xc7\xff\x02\x92\x07\x5f\x7a\x4b\x60\x59\x3c\x6f\x5c\xd8\xec\x95\xd2\xfe\xa0\x3b\xd8\x3f\xd1\x69\xa6\xd6\x41\xb2\xf4\x4d\x12\xf4\x58\x3e\x66\x64\x80\x31\x9b\xa8\x4c\x8b\x07\xb2\xec\x66\x94\x66\x47\x50\x50\x5f\x18\x0b\x0e\xd6\xc0\x39\x21\x13\x9e\x33\xbc\x79\x36\x02\x96\x70\xf0\x48\x67\x2f\x26\xe9\x6d\x10\xbb\xd6\x3f\xd1\x64\x7a\x2e\xbe\x0c\x61\xf0\x75\x42\x38\x23"
+ "\xb1\x9e\x9f\x7c\x67\x66\xd9\x58\x9a\xf1\xbb\x41\x2a\x8d\x65\x84\x94\xfc\xdc\x6a\x50\x64\xdb\x56\x33\x76\x00\x10\xed\xbe\xd2\x12\xf6\xf6\x1b\xa2\x16\xde\xae\x31\x95\xdd\xb1\x08\x7e\x4e\xee\xe7\xf9\xa5\xfb\x5b\x61\x43\x00\x40\xf6\x7e\x02\x04\x32\x4e\x0c\xe2\x66\x0d\xd7\x07\x98\x0e\xf8\x72\x34\x6d\x95\x86\xd7\xcb\x31\x54\x47\xd0\x38\x29\x9c\x5a\x68\xd4\x87\x76\xc9\xe7\x7e\xe3\xf4\x81\x6d\x18\xcb\xc9\x05\xaf\xa0\xfb\x66\xf7\xf1\x1c\xc6\x14\x11\x4f\x2b\x79\x42\x8b\xbc\xac\xe7\x6c\xfe\x0f\x58\xe7\x7c\x78\x39\x30\xb0\x66\x2c\x9b\x6d\x3a\xe1\xcf\xc9\xa4\x0e\x6d\x6d\x8a\xa1\x3a\xe7\x28\xd4\x78\x4c\xa6\xa2\x2a\xa6\x03\x30\xd7\xa8\x25\x66\x87\x2f\x69\x5c\x4e\xdd\xa5\x49\x5d\x37\x4a\x59\xc4\xaf\x1f\xa2\xe4\xf8\xa6\x12\x97\xd5\x79\xf5\xe2\x4a\x2b\x5f\x61\xe4\x9e\xe3\xee\xb8\xa7\x5b\x2f\xf4\x9e\x6c\xfb\xd1\xc6"
+ "\x56\x77\xba\x75\xaa\x3d\x1a\xa8\x0b\xb3\x68\x24\x00\x10\x7f\xfd\xd7\xa1\x8d\x83\x54\x4f\x1f\xd8\x2a\xbe\x8a\x0c\x87\xab\xa2\xde\xc3\x39\xbf\x09\x03\xa5\xf3\x05\x28\xe1\xe1\xee\x39\x70\x9c\xd8\x81\x12\x1e\x02\x40\xd2\x6e\xf0\xeb\x1b\x3d\x22\xc6\xe5\xe3\xb4\x5a\x98\xbb\xf0\x22\x28\x8d\xe5\xd3\x16\x48\x24\xa5\xe6\x66\x0c\xf9\x08\xf9\x7e\x1e\xe1\x28\x26\x22\xc7\xc7\x0a\x32\x47\xfa\xa3\xbe\x3c\xc4\xc5\x53\x0a\xd5\x94\x4a\xd7\x93\xd8\x42\x99\xb9\x0a\xdb\x56\xf7\xb9\x1c\x53\x4f\xfa\xd3\x74\xad\xd9\x68\xf1\x1b\xdf\x61\xc6\x5e\xa8\x48\xfc\xd4\x4a\x4c\x3c\x32\xf7\x1c\x96\x21\x9b\xf9\xa3\xcc\x5a\xce\xd5\xd7\x08\x24\xf6\x1c\xfd\xdd\x38\xc2\x32\xe9\xb8\xe7\xb6\xfa\x9d\x45\x13\x2c\x83\xfd\x4a\x69\x82\xcd\xdc\xb3\x76\x0c\x9e\xd8\xf4\x1b\x45\x15\xb4\x97\xe7\x58\x34\xe2\x03\x29\x5a\xbf\xb6\xe0\x5d\x13\xd9\x2b\xb4"
+ "\x80\xb2\x45\x81\x6a\x2e\x6c\x89\x7d\xee\xbb\x52\xdd\x1f\x18\xe7\x13\x6b\x33\x0e\xea\x36\x92\x77\x7b\x6d\x9c\x5a\x5f\x45\x7b\x7b\x35\x62\x23\xd1\xbf\x0f\xd0\x08\x1b\x2b\x80\x6b\x7e\xf1\x21\x47\xb0\x57\xd1\x98\x72\x90\x34\x1c\x20\x04\xff\x3d\x5c\xee\x0e\x57\x5f\x6f\x24\x4e\x3c\xea\xfc\xa5\xa9\x83\xc9\x61\xb4\x51\x24\xf8\x27\x5e\x46\x8c\xb1\x53\x02\x96\x35\xba\xb8\x4c\x71\xd3\x15\x59\x35\x22\x20\xad\x03\x9f\x66\x44\x3b\x9c\x35\x37\x1f\x9b\xbb\xf3\xdb\x35\x63\x30\x64\xaa\xa2\x06\xa8\x5d\xbb\xe1\x9f\x70\xec\x82\x11\x06\x36\xec\x8b\x69\x66\x24\x44\xc9\x4a\x57\xbb\x9b\x78\x13\xce\x9c\x0c\xba\x92\x93\x63\xb8\xe2\x95\x0f\x0f\x16\x39\x52\xfd\x3a\x6d\x02\x4b\xdf\x13\xd3\x2a\x22\xb4\x03\x7c\x54\x49\x96\x68\x54\x10\xfa\xef\xaa\x6c\xe8\x22\xdc\x71\x16\x13\x1a\xf6\x28\xe5\x6d\x77\x3d\xcd\x30\x63\xb1\x70\x52\xa1"
+ "\xc5\x94\x5f\xcf\xe8\xb8\x26\x98\xf7\x06\xa0\x0a\x70\xfa\x03\x80\xac\xc1\xec\xd6\x4c\x54\xd7\xfe\x47\xb6\x88\x4a\xf7\x71\x24\xee\xf3\xd2\xc2\x4a\x7f\xfe\x61\xc7\x35\xc9\x37\x67\xcb\x24\x35\xda\x7e\xca\x5f\xf3\x8d\xd4\x13\x8e\xd6\xcb\x4d\x53\x8f\x53\x1f\xc0\x74\xf7\x53\xb9\x5e\x23\x37\xba\x6e\xe3\x9d\x07\x55\x25\x7b\xe6\x2a\x64\xd1\x32\xdd\x54\x1b\x4b\xc0\xe1\xd7\x69\x58\xf8\x93\x29\xc4\xdd\x23\x2f\xa5\xfc\x9d\x7e\xf8\xd4\x90\xcd\x82\x55\xdc\x16\x16\x9f\x07\x52\x9b\x9d\x25\xed\x32\xc5\x7b\xdf\xf6\x83\x46\x3d\x65\xb7\xef\x87\x7a\x12\x69\x8f\x06\x7c\x51\x15\x4a\x08\xe8\xac\x9a\x0c\x24\xa7\x27\xd8\x46\x2f\xe7\x01\x0e\x1c\xc6\x91\xb0\x6e\x85\x65\xf0\x29\x0d\x2e\x6b\x3b\xfb\x4b\xdf\xe4\x80\x93\x03\x66\x46\x3e\x8a\x6e\xf3\x5e\x4d\x62\x0e\x49\x05\xaf\xd4\xf8\x21\x20\x61\x1d\x39\x17\xf4\x61\x47\x95\xfb\x15"
+ "\x2e\xb3\x4f\xd0\x5d\xf5\x7d\x40\xda\x90\x3c\x6b\xcb\x17\x00\x13\x3b\x64\x34\x1b\xf0\xf2\xe5\x3b\xb2\xc7\xd3\x5f\x3a\x44\xa6\x9b\xb7\x78\x0e\x42\x5d\x4c\xc1\xe9\xd2\xcb\xb7\x78\xd1\xfe\x9a\xb5\x07\xe9\xe0\xbe\xe2\x8a\xa7\x01\x83\x00\x8c\x5c\x08\xe6\x63\x12\x92\xb7\xb7\xa6\x19\x7d\x38\x13\x38\x92\x87\x24\xf9\x48\xb3\x5e\x87\x6a\x40\x39\x5c\x3f\xed\x8f\xee\xdb\x15\x82\x06\xda\x49\x21\x2b\xb5\xbf\x32\x7c\x9f\x42\x28\x63\xcf\xaf\x1e\xf8\xc6\xa0\xd1\x02\x43\x57\x62\xec\x9b\x0f\x01\x9e\x71\xd8\x87\x9d\x01\xc1\x58\x77\xd9\xaf\xb1\x10\x7e\xdd\xa6\x50\x96\xe5\xf0\x72\x00\x6d\x4b\xf8\x2a\x8f\x19\xf3\x22\x88\x11\x4a\x8b\x7c\xfd\xb7\xed\xe1\xf6\x40\x39\xe0\xe9\xf6\x3d\x25\xe6\x74\x3c\x58\x57\x7f\xe1\x22\x96\x47\x31\x91\xba\x70\x85\x28\x6b\x9f\x6e\x25\xac\x23\x66\x2f\x29\x88\x28\xce\x8c\x5c\x88\x53\xd1\x3b"
+ "\xcc\x6a\x51\xb2\xe1\x28\x3f\x91\xb4\x0d\x00\x3a\xe3\xf8\xc3\x8f\xd7\x96\x62\x0e\x2e\xfc\xc8\x6c\x77\xa6\x1d\x22\xc1\xb8\xe6\x61\xd7\x67\x36\x13\x7b\xbb\x9b\x59\x09\xa6\xdf\xf7\x6b\xa3\x40\x1a\xf5\x4f\xb4\xda\xd3\xf3\x81\x93\xc6\x18\xd9\x26\xee\xac\xf0\xaa\xdf\xc5\x9c\xca\xc2\xa2\xcc\x7b\x5c\x24\xb0\xbc\xd0\x6a\x4d\x89\x09\xb8\x07\xfe\x87\xad\x0a\xea\xb8\x42\xf9\x5e\xb3\x3e\x36\x4c\xaf\x75\x9e\x1c\xeb\xbd\xbc\xbb\x80\x40\xa7\x3a\x30\xbf\xa8\x44\xf4\xeb\x38\xad\x29\xba\x23\xed\x41\x0c\xea\xd2\xbb\x41\x18\xd6\xb9\xba\x65\x2b\xa3\x91\x6d\x1f\xa9\xf4\xd1\x25\x8d\x4d\x38\xff\x64\xa0\xec\xde\xa6\xb6\x79\xab\x8e\x33\x6c\x47\xde\xaf\x94\xa4\xa5\x86\x77\x55\x09\x92\x81\x31\x76\xc7\x34\x22\x89\x8e\x3d\x26\x26\xd7\xfc\x1e\x16\x72\x13\x33\x63\xd5\x22\xbe\xb8\x04\x34\x84\x41\xbb\x80\xd0\x9f\x46\x48\x07\xa7"
+ "\xfc\x2b\x3a\x75\x55\x8c\xc7\x6a\xbd\x7e\x46\x08\x84\x0f\xd5\x74\xc0\x82\x8e\xaa\x61\x05\x01\xb2\x47\x6e\x20\x6a\x2d\x58\x70\x48\x32\xa7\x37\xd2\xb8\x82\x1a\x51\xb9\x61\xdd\xfd\x9d\x6b\x0e\x18\x97\xf8\x45\x5f\x87\x10\xcf\x34\x72\x45\x26\x49\x70\xe7\xa3\x78\xe0\x52\x89\x84\x94\x83\x82\xc2\x69\x8f\xe3\xe1\x3f\x60\x74\x88\xc4\xf7\x75\x2c\xfb\xbd\xb6\xc4\x7e\x10\x0a\x6c\x90\x04\x9e\xc3\x3f\x59\x7c\xce\x31\x18\x60\x57\x73\x46\x94\x7d\x06\xa0\x6d\x44\xec\xa2\x0a\x9e\x05\x15\xef\xca\x5c\xbf\x00\xeb\xf7\x3d\x32\xd4\xa5\xef\x49\x89\x5e\x46\xb0\xa6\x63\x5b\x8a\x73\xae\x6f\xd5\x9d\xf8\x4f\x40\xb5\xb2\x6e\xd3\xb6\x01\xa9\x26\xa2\x21\xcf\x33\x7a\x3a\xa4\x23\x13\xb0\x69\x6a\xee\xce\xd8\x9d\x01\x1d\x50\xc1\x30\x6c\xb1\xcd\xa0\xf0\xf0\xa2\x64\x6f\xbb\xbf\x5e\xe6\xab\x87\xb4\x0f\x4f\x15\xaf\xb5\x25\xa1\xb2\xd0\x80"
+ "\x2c\xfb\xf9\xfe\xd2\x33\xbb\x76\xfe\x7c\xa8\x66\xf7\xe7\x85\x9f\x1f\x85\x57\x88\xe1\xe9\x63\xe4\xd8\x1c\xa1\xfb\xda\x44\x05\x2e\x1d\x3a\x1c\xff\xc8\x3b\xc0\xfe\xda\x22\x0b\x43\xd6\x88\x39\x4c\x4a\xa6\x69\x18\x93\x42\x4e\xb5\xcc\x66\x0d\x09\xf8\x1e\x7c\xd3\x3c\x99\x0d\x50\x1d\x62\xe9\x57\x06\xbf\x19\x88\xdd\xad\x7b\x4f\xf9\xc7\x82\x6d\x8d\xc8\xc4\xc5\x78\x17\x20\x15\xc5\x52\x41\xcf\x5b\xd6\x7f\x94\x02\x41\xe0\x40\x22\x03\x5e\xd1\x53\xd4\x86\xd3\x2c\x9f\x0f\x96\xe3\x6b\x9a\x76\x32\x06\x47\x4b\x11\xb3\xdd\x03\x65\xbd\x9b\x01\xda\x9c\xb9\x7e\x3f\x6a\xc4\x7b\xea\xd4\x3c\xb9\xfb\x5c\x6b\x64\x33\x52\xba\x64\x78\x8f\xa4\xaf\x7a\x61\x8d\xbc\xc5\x73\xe9\x6b\x58\x97\x4b\xbf\x63\x22\xd3\x37\x02\x54\xc5\xb9\x16\x4a\xf0\x19\xd8\x94\x57\xb8\x8a\xb3\x16\x3b\xd0\x84\x8e\x67\xa6\xa3\x7d\x78\xec\x00",
+ .ilen = 2011,
+ .result = "\x52\x34\xb3\x65\x3b\xb7\xe5\xd3\xab\x49\x17\x60\xd2\x52\x56\xdf\xdf\x34\x56\x82\xe2\xbe\xe5\xe1\x28\xd1\x4e\x5f\x4f\x01\x7d\x3f\x99\x6b\x30\x6e\x1a\x7c\x4c\x8e\x62\x81\xae\x86\x3f\x6b\xd0\xb5\xa9\xcf\x50\xf1\x02\x12\xa0\x0b\x24\xe9\xe6\x72\x89\x2c\x52\x1b\x34\x38\xf8\x75\x5f\xa0\x74\xe2\x99\xdd\xa6\x4b\x14\x50\x4e\xf1\xbe\xd6\x9e\xdb\xb2\x24\x27\x74\x12\x4a\x78\x78\x17\xa5\x58\x8e\x2f\xf9\xf4\x8d\xee\x03\x88\xae\xb8\x29\xa1\x2f\x4b\xee\x92\xbd\x87\xb3\xce\x34\x21\x57\x46\x04\x49\x0c\x80\xf2\x01\x13\xa1\x55\xb3\xff\x44\x30\x3c\x1c\xd0\xef\xbc\x18\x74\x26\xad\x41\x5b\x5b\x3e\x9a\x7a\x46\x4f\x16\xd6\x74\x5a\xb7\x3a\x28\x31\xd8\xae\x26\xac\x50\x53\x86\xf2\x56\xd7\x3f\x29\xbc\x45\x68\x8e\xcb\x98\x64\xdd\xc9\xba\xb8\x4b\x7b\x82\xdd\x14\xa7\xcb\x71\x72\x00\x5c\xad\x7b\x6a\x89\xa4\x3d\xbf\xb5"
+ "\x4b\x3e\x7c\x5a\xcf\xb8\xa1\xc5\x6e\xc8\xb6\x31\x57\x7b\xdf\xa5\x7e\xb1\xd6\x42\x2a\x31\x36\xd1\xd0\x3f\x7a\xe5\x94\xd6\x36\xa0\x6f\xb7\x40\x7d\x37\xc6\x55\x7c\x50\x40\x6d\x29\x89\xe3\x5a\xae\x97\xe7\x44\x49\x6e\xbd\x81\x3d\x03\x93\x06\x12\x06\xe2\x41\x12\x4a\xf1\x6a\xa4\x58\xa2\xfb\xd2\x15\xba\xc9\x79\xc9\xce\x5e\x13\xbb\xf1\x09\x04\xcc\xfd\xe8\x51\x34\x6a\xe8\x61\x88\xda\xed\x01\x47\x84\xf5\x73\x25\xf9\x1c\x42\x86\x07\xf3\x5b\x1a\x01\xb3\xeb\x24\x32\x8d\xf6\xed\x7c\x4b\xeb\x3c\x36\x42\x28\xdf\xdf\xb6\xbe\xd9\x8c\x52\xd3\x2b\x08\x90\x8c\xe7\x98\x31\xe2\x32\x8e\xfc\x11\x48\x00\xa8\x6a\x42\x4a\x02\xc6\x4b\x09\xf1\xe3\x49\xf3\x45\x1f\x0e\xbc\x56\xe2\xe4\xdf\xfb\xeb\x61\xfa\x24\xc1\x63\x75\xbb\x47\x75\xaf\xe1\x53\x16\x96\x21\x85\x26\x11\xb3\x76\xe3\x23\xa1\x6b\x74\x37\xd0\xde\x06\x90\x71\x5d\x43\x88"
+ "\x9b\x00\x54\xa6\x75\x2f\xa1\xc2\x0b\x73\x20\x1d\xb6\x21\x79\x57\x3f\xfa\x09\xbe\x8a\x33\xc3\x52\xf0\x1d\x82\x31\xd1\x55\xb5\x6c\x99\x25\xcf\x5c\x32\xce\xe9\x0d\xfa\x69\x2c\xd5\x0d\xc5\x6d\x86\xd0\x0c\x3b\x06\x50\x79\xe8\xc3\xae\x04\xe6\xcd\x51\xe4\x26\x9b\x4f\x7e\xa6\x0f\xab\xd8\xe5\xde\xa9\x00\x95\xbe\xa3\x9d\x5d\xb2\x09\x70\x18\x1c\xf0\xac\x29\x23\x02\x29\x28\xd2\x74\x35\x57\x62\x0f\x24\xea\x5e\x33\xc2\x92\xf3\x78\x4d\x30\x1e\xa1\x99\xa9\x82\xb0\x42\x31\x8d\xad\x8a\xbc\xfc\xd4\x57\x47\x3e\xb4\x50\xdd\x6e\x2c\x80\x4d\x22\xf1\xfb\x57\xc4\xdd\x17\xe1\x8a\x36\x4a\xb3\x37\xca\xc9\x4e\xab\xd5\x69\xc4\xf4\xbc\x0b\x3b\x44\x4b\x29\x9c\xee\xd4\x35\x22\x21\xb0\x1f\x27\x64\xa8\x51\x1b\xf0\x9f\x19\x5c\xfb\x5a\x64\x74\x70\x45\x09\xf5\x64\xfe\x1a\x2d\xc9\x14\x04\x14\xcf\xd5\x7d\x60\xaf\x94\x39\x94\xe2\x7d\x79"
+ "\x82\xd0\x65\x3b\x6b\x9c\x19\x84\xb4\x6d\xb3\x0c\x99\xc0\x56\xa8\xbd\x73\xce\x05\x84\x3e\x30\xaa\xc4\x9b\x1b\x04\x2a\x9f\xd7\x43\x2b\x23\xdf\xbf\xaa\xd5\xc2\x43\x2d\x70\xab\xdc\x75\xad\xac\xf7\xc0\xbe\x67\xb2\x74\xed\x67\x10\x4a\x92\x60\xc1\x40\x50\x19\x8a\x8a\x8c\x09\x0e\x72\xe1\x73\x5e\xe8\x41\x85\x63\x9f\x3f\xd7\x7d\xc4\xfb\x22\x5d\x92\x6c\xb3\x1e\xe2\x50\x2f\x82\xa8\x28\xc0\xb5\xd7\x5f\x68\x0d\x2c\x2d\xaf\x7e\xfa\x2e\x08\x0f\x1f\x70\x9f\xe9\x19\x72\x55\xf8\xfb\x51\xd2\x33\x5d\xa0\xd3\x2b\x0a\x6c\xbc\x4e\xcf\x36\x4d\xdc\x3b\xe9\x3e\x81\x7c\x61\xdb\x20\x2d\x3a\xc3\xb3\x0c\x1e\x00\xb9\x7c\xf5\xca\x10\x5f\x3a\x71\xb3\xe4\x20\xdb\x0c\x2a\x98\x63\x45\x00\x58\xf6\x68\xe4\x0b\xda\x13\x3b\x60\x5c\x76\xdb\xb9\x97\x71\xe4\xd9\xb7\xdb\xbd\x68\xc7\x84\x84\xaa\x7c\x68\x62\x5e\x16\xfc\xba\x72\xaa\x9a\xa9\xeb"
+ "\x7c\x75\x47\x97\x7e\xad\xe2\xd9\x91\xe8\xe4\xa5\x31\xd7\x01\x8e\xa2\x11\x88\x95\xb9\xf2\x9b\xd3\x7f\x1b\x81\x22\xf7\x98\x60\x0a\x64\xa6\xc1\xf6\x49\xc7\xe3\x07\x4d\x94\x7a\xcf\x6e\x68\x0c\x1b\x3f\x6e\x2e\xee\x92\xfa\x52\xb3\x59\xf8\xf1\x8f\x6a\x66\xa3\x82\x76\x4a\x07\x1a\xc7\xdd\xf5\xda\x9c\x3c\x24\xbf\xfd\x42\xa1\x10\x64\x6a\x0f\x89\xee\x36\xa5\xce\x99\x48\x6a\xf0\x9f\x9e\x69\xa4\x40\x20\xe9\x16\x15\xf7\xdb\x75\x02\xcb\xe9\x73\x8b\x3b\x49\x2f\xf0\xaf\x51\x06\x5c\xdf\x27\x27\x49\x6a\xd1\xcc\xc7\xb5\x63\xb5\xfc\xb8\x5c\x87\x7f\x84\xb4\xcc\x14\xa9\x53\xda\xa4\x56\xf8\xb6\x1b\xcc\x40\x27\x52\x06\x5a\x13\x81\xd7\x3a\xd4\x3b\xfb\x49\x65\x31\x33\xb2\xfa\xcd\xad\x58\x4e\x2b\xae\xd2\x20\xfb\x1a\x48\xb4\x3f\x9a\xd8\x7a\x35\x4a\xc8\xee\x88\x5e\x07\x66\x54\xb9\xec\x9f\xa3\xe3\xb9\x37\xaa\x49\x76\x31\xda"
+ "\x74\x2d\x3c\xa4\x65\x10\x32\x38\xf0\xde\xd3\x99\x17\xaa\x71\xaa\x8f\x0f\x8c\xaf\xa2\xf8\x5d\x64\xba\x1d\xa3\xef\x96\x73\xe8\xa1\x02\x8d\x0c\x6d\xb8\x06\x90\xb8\x08\x56\x2c\xa7\x06\xc9\xc2\x38\xdb\x7c\x63\xb1\x57\x8e\xea\x7c\x79\xf3\x49\x1d\xfe\x9f\xf3\x6e\xb1\x1d\xba\x19\x80\x1a\x0a\xd3\xb0\x26\x21\x40\xb1\x7c\xf9\x4d\x8d\x10\xc1\x7e\xf4\xf6\x3c\xa8\xfd\x7c\xa3\x92\xb2\x0f\xaa\xcc\xa6\x11\xfe\x04\xe3\xd1\x7a\x32\x89\xdf\x0d\xc4\x8f\x79\x6b\xca\x16\x7c\x6e\xf9\xad\x0f\xf6\xfe\x27\xdb\xc4\x13\x70\xf1\x62\x1a\x4f\x79\x40\xc9\x9b\x8b\x21\xea\x84\xfa\xf5\xf1\x89\xce\xb7\x55\x0a\x80\x39\x2f\x55\x36\x16\x9c\x7b\x08\xbd\x87\x0d\xa5\x32\xf1\x52\x7c\xe8\x55\x60\x5b\xd7\x69\xe4\xfc\xfa\x12\x85\x96\xea\x50\x28\xab\x8a\xf7\xbb\x0e\x53\x74\xca\xa6\x27\x09\xc2\xb5\xde\x18\x14\xd9\xea\xe5\x29\x1c\x40\x56\xcf\xd7"
+ "\xae\x05\x3f\x65\xaf\x05\x73\xe2\x35\x96\x27\x07\x14\xc0\xad\x33\xf1\xdc\x44\x7a\x89\x17\x77\xd2\x9c\x58\x60\xf0\x3f\x7b\x2d\x2e\x57\x95\x54\x87\xed\xf2\xc7\x4c\xf0\xae\x56\x29\x19\x7d\x66\x4b\x9b\x83\x84\x42\x3b\x01\x25\x66\x8e\x02\xde\xb9\x83\x54\x19\xf6\x9f\x79\x0d\x67\xc5\x1d\x7a\x44\x02\x98\xa7\x16\x1c\x29\x0d\x74\xff\x85\x40\x06\xef\x2c\xa9\xc6\xf5\x53\x07\x06\xae\xe4\xfa\x5f\xd8\x39\x4d\xf1\x9b\x6b\xd9\x24\x84\xfe\x03\x4c\xb2\x3f\xdf\xa1\x05\x9e\x50\x14\x5a\xd9\x1a\xa2\xa7\xfa\xfa\x17\xf7\x78\xd6\xb5\x92\x61\x91\xac\x36\xfa\x56\x0d\x38\x32\x18\x85\x08\x58\x37\xf0\x4b\xdb\x59\xe7\xa4\x34\xc0\x1b\x01\xaf\x2d\xde\xa1\xaa\x5d\xd3\xec\xe1\xd4\xf7\xe6\x54\x68\xf0\x51\x97\xa7\x89\xea\x24\xad\xd3\x6e\x47\x93\x8b\x4b\xb4\xf7\x1c\x42\x06\x67\xe8\x99\xf6\xf5\x7b\x85\xb5\x65\xb5\xb5\xd2\x37\xf5\xf3\x02\xa6"
+ "\x4d\x11\xa7\xdc\x51\x09\x7f\xa0\xd8\x88\x1c\x13\x71\xae\x9c\xb7\x7b\x34\xd6\x4e\x68\x26\x83\x51\xaf\x1d\xee\x8b\xbb\x69\x43\x2b\x9e\x8a\xbc\x02\x0e\xa0\x1b\xe0\xa8\x5f\x6f\xaf\x1b\x8f\xe7\x64\x71\x74\x11\x7e\xa8\xd8\xf9\x97\x06\xc3\xb6\xfb\xfb\xb7\x3d\x35\x9d\x3b\x52\xed\x54\xca\xf4\x81\x01\x2d\x1b\xc3\xa7\x00\x3d\x1a\x39\x54\xe1\xf6\xff\xed\x6f\x0b\x5a\x68\xda\x58\xdd\xa9\xcf\x5c\x4a\xe5\x09\x4e\xde\x9d\xbc\x3e\xee\x5a\x00\x3b\x2c\x87\x10\x65\x60\xdd\xd7\x56\xd1\x4c\x64\x45\xe4\x21\xec\x78\xf8\x25\x7a\x3e\x16\x5d\x09\x53\x14\xbe\x4f\xae\x87\xd8\xd1\xaa\x3c\xf6\x3e\xa4\x70\x8c\x5e\x70\xa4\xb3\x6b\x66\x73\xd3\xbf\x31\x06\x19\x62\x93\x15\xf2\x86\xe4\x52\x7e\x53\x4c\x12\x38\xcc\x34\x7d\x57\xf6\x42\x93\x8a\xc4\xee\x5c\x8a\xe1\x52\x8f\x56\x64\xf6\xa6\xd1\x91\x57\x70\xcd\x11\x76\xf5\x59\x60\x60\x3c"
+ "\xc1\xc3\x0b\x7f\x58\x1a\x50\x91\xf1\x68\x8f\x6e\x74\x74\xa8\x51\x0b\xf7\x7a\x98\x37\xf2\x0a\x0e\xa4\x97\x04\xb8\x9b\xfd\xa0\xea\xf7\x0d\xe1\xdb\x03\xf0\x31\x29\xf8\xdd\x6b\x8b\x5d\xd8\x59\xa9\x29\xcf\x9a\x79\x89\x19\x63\x46\x09\x79\x6a\x11\xda\x63\x68\x48\x77\x23\xfb\x7d\x3a\x43\xcb\x02\x3b\x7a\x6d\x10\x2a\x9e\xac\xf1\xd4\x19\xf8\x23\x64\x1d\x2c\x5f\xf2\xb0\x5c\x23\x27\xf7\x27\x30\x16\x37\xb1\x90\xab\x38\xfb\x55\xcd\x78\x58\xd4\x7d\x43\xf6\x45\x5e\x55\x8d\xb1\x02\x65\x58\xb4\x13\x4b\x36\xf7\xcc\xfe\x3d\x0b\x82\xe2\x12\x11\xbb\xe6\xb8\x3a\x48\x71\xc7\x50\x06\x16\x3a\xe6\x7c\x05\xc7\xc8\x4d\x2f\x08\x6a\x17\x9a\x95\x97\x50\x68\xdc\x28\x18\xc4\x61\x38\xb9\xe0\x3e\x78\xdb\x29\xe0\x9f\x52\xdd\xf8\x4f\x91\xc1\xd0\x33\xa1\x7a\x8e\x30\x13\x82\x07\x9f\xd3\x31\x0f\x23\xbe\x32\x5a\x75\xcf\x96\xb2\xec\xb5\x32"
+ "\xac\x21\xd1\x82\x33\xd3\x15\x74\xbd\x90\xf1\x2c\xe6\x5f\x8d\xe3\x02\xe8\xe9\xc4\xca\x96\xeb\x0e\xbc\x91\xf4\xb9\xea\xd9\x1b\x75\xbd\xe1\xac\x2a\x05\x37\x52\x9b\x1b\x3f\x5a\xdc\x21\xc3\x98\xbb\xaf\xa3\xf2\x00\xbf\x0d\x30\x89\x05\xcc\xa5\x76\xf5\x06\xf0\xc6\x54\x8a\x5d\xd4\x1e\xc1\xf2\xce\xb0\x62\xc8\xfc\x59\x42\x9a\x90\x60\x55\xfe\x88\xa5\x8b\xb8\x33\x0c\x23\x24\x0d\x15\x70\x37\x1e\x3d\xf6\xd2\xea\x92\x10\xb2\xc4\x51\xac\xf2\xac\xf3\x6b\x6c\xaa\xcf\x12\xc5\x6c\x90\x50\xb5\x0c\xfc\x1a\x15\x52\xe9\x26\xc6\x52\xa4\xe7\x81\x69\xe1\xe7\x9e\x30\x01\xec\x84\x89\xb2\x0d\x66\xdd\xce\x28\x5c\xec\x98\x46\x68\x21\x9f\x88\x3f\x1f\x42\x77\xce\xd0\x61\xd4\x20\xa7\xff\x53\xad\x37\xd0\x17\x35\xc9\xfc\xba\x0a\x78\x3f\xf2\xcc\x86\x89\xe8\x4b\x3c\x48\x33\x09\x7f\xc6\xc0\xdd\xb8\xfd\x7a\x66\x66\x65\xeb\x47\xa7\x04\x28"
+ "\xa3\x19\x8e\xa9\xb1\x13\x67\x62\x70\xcf\xd6"
+}, { /* wycheproof - rfc7539 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x07\x00\x00\x00\x40\x41\x42\x43\x44\x45\x46\x47",
+ .nlen = 12,
+ .assoc = "\x50\x51\x52\x53\xc0\xc1\xc2\xc3\xc4\xc5\xc6\xc7",
+ .alen = 12,
+ .input = "\x4c\x61\x64\x69\x65\x73\x20\x61\x6e\x64\x20\x47\x65\x6e\x74\x6c\x65\x6d\x65\x6e\x20\x6f\x66\x20\x74\x68\x65\x20\x63\x6c\x61\x73\x73\x20\x6f\x66\x20\x27\x39\x39\x3a\x20\x49\x66\x20\x49\x20\x63\x6f\x75\x6c\x64\x20\x6f\x66\x66\x65\x72\x20\x79\x6f\x75\x20\x6f\x6e\x6c\x79\x20\x6f\x6e\x65\x20\x74\x69\x70\x20\x66\x6f\x72\x20\x74\x68\x65\x20\x66\x75\x74\x75\x72\x65\x2c\x20\x73\x75\x6e\x73\x63\x72\x65\x65\x6e\x20\x77\x6f\x75\x6c\x64\x20\x62\x65\x20\x69\x74\x2e",
+ .ilen = 114,
+ .result = "\xd3\x1a\x8d\x34\x64\x8e\x60\xdb\x7b\x86\xaf\xbc\x53\xef\x7e\xc2\xa4\xad\xed\x51\x29\x6e\x08\xfe\xa9\xe2\xb5\xa7\x36\xee\x62\xd6\x3d\xbe\xa4\x5e\x8c\xa9\x67\x12\x82\xfa\xfb\x69\xda\x92\x72\x8b\x1a\x71\xde\x0a\x9e\x06\x0b\x29\x05\xd6\xa5\xb6\x7e\xcd\x3b\x36\x92\xdd\xbd\x7f\x2d\x77\x8b\x8c\x98\x03\xae\xe3\x28\x09\x1b\x58\xfa\xb3\x24\xe4\xfa\xd6\x75\x94\x55\x85\x80\x8b\x48\x31\xd7\xbc\x3f\xf4\xde\xf0\x8e\x4b\x7a\x9d\xe5\x76\xd2\x65\x86\xce\xc6\x4b\x61\x16\x1a\xe1\x0b\x59\x4f\x09\xe2\x6a\x7e\x90\x2e\xcb\xd0\x60\x06\x91"
+}, { /* wycheproof - misc */
+ .key = "\x80\xba\x31\x92\xc8\x03\xce\x96\x5e\xa3\x71\xd5\xff\x07\x3c\xf0\xf4\x3b\x6a\x2a\xb5\x76\xb2\x08\x42\x6e\x11\x40\x9c\x09\xb9\xb0",
+ .nonce = "\x4d\xa5\xbf\x8d\xfd\x58\x52\xc1\xea\x12\x37\x9d",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "",
+ .ilen = 0,
+ .result = "\x76\xac\xb3\x42\xcf\x31\x66\xa5\xb6\x3c\x0c\x0e\xa1\x38\x3c\x8d"
+}, { /* wycheproof - misc */
+ .key = "\x7a\x4c\xd7\x59\x17\x2e\x02\xeb\x20\x4d\xb2\xc3\xf5\xc7\x46\x22\x7d\xf5\x84\xfc\x13\x45\x19\x63\x91\xdb\xb9\x57\x7a\x25\x07\x42",
+ .nonce = "\xa9\x2e\xf0\xac\x99\x1d\xd5\x16\xa3\xc6\xf6\x89",
+ .nlen = 12,
+ .assoc = "\xbd\x50\x67\x64\xf2\xd2\xc4\x10",
+ .alen = 8,
+ .input = "",
+ .ilen = 0,
+ .result = "\x90\x6f\xa6\x28\x4b\x52\xf8\x7b\x73\x59\xcb\xaa\x75\x63\xc7\x09"
+}, { /* wycheproof - misc */
+ .key = "\xcc\x56\xb6\x80\x55\x2e\xb7\x50\x08\xf5\x48\x4b\x4c\xb8\x03\xfa\x50\x63\xeb\xd6\xea\xb9\x1f\x6a\xb6\xae\xf4\x91\x6a\x76\x62\x73",
+ .nonce = "\x99\xe2\x3e\xc4\x89\x85\xbc\xcd\xee\xab\x60\xf1",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x2a",
+ .ilen = 1,
+ .result = "\x3a\xca\xc2\x7d\xec\x09\x68\x80\x1e\x9f\x6e\xde\xd6\x9d\x80\x75\x22"
+}, { /* wycheproof - misc */
+ .key = "\x46\xf0\x25\x49\x65\xf7\x69\xd5\x2b\xdb\x4a\x70\xb4\x43\x19\x9f\x8e\xf2\x07\x52\x0d\x12\x20\xc5\x5e\x4b\x70\xf0\xfd\xa6\x20\xee",
+ .nonce = "\xab\x0d\xca\x71\x6e\xe0\x51\xd2\x78\x2f\x44\x03",
+ .nlen = 12,
+ .assoc = "\x91\xca\x6c\x59\x2c\xbc\xca\x53",
+ .alen = 8,
+ .input = "\x51",
+ .ilen = 1,
+ .result = "\xc4\x16\x83\x10\xca\x45\xb1\xf7\xc6\x6c\xad\x4e\x99\xe4\x3f\x72\xb9"
+}, { /* wycheproof - misc */
+ .key = "\x2f\x7f\x7e\x4f\x59\x2b\xb3\x89\x19\x49\x89\x74\x35\x07\xbf\x3e\xe9\xcb\xde\x17\x86\xb6\x69\x5f\xe6\xc0\x25\xfd\x9b\xa4\xc1\x00",
+ .nonce = "\x46\x1a\xf1\x22\xe9\xf2\xe0\x34\x7e\x03\xf2\xdb",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x5c\x60",
+ .ilen = 2,
+ .result = "\x4d\x13\x91\xe8\xb6\x1e\xfb\x39\xc1\x22\x19\x54\x53\x07\x7b\x22\xe5\xe2"
+}, { /* wycheproof - misc */
+ .key = "\xc8\x83\x3d\xce\x5e\xa9\xf2\x48\xaa\x20\x30\xea\xcf\xe7\x2b\xff\xe6\x9a\x62\x0c\xaf\x79\x33\x44\xe5\x71\x8f\xe0\xd7\xab\x1a\x58",
+ .nonce = "\x61\x54\x6b\xa5\xf1\x72\x05\x90\xb6\x04\x0a\xc6",
+ .nlen = 12,
+ .assoc = "\x88\x36\x4f\xc8\x06\x05\x18\xbf",
+ .alen = 8,
+ .input = "\xdd\xf2",
+ .ilen = 2,
+ .result = "\xb6\x0d\xea\xd0\xfd\x46\x97\xec\x2e\x55\x58\x23\x77\x19\xd0\x24\x37\xa2"
+}, { /* wycheproof - misc */
+ .key = "\x55\x56\x81\x58\xd3\xa6\x48\x3f\x1f\x70\x21\xea\xb6\x9b\x70\x3f\x61\x42\x51\xca\xdc\x1a\xf5\xd3\x4a\x37\x4f\xdb\xfc\x5a\xda\xc7",
+ .nonce = "\x3c\x4e\x65\x4d\x66\x3f\xa4\x59\x6d\xc5\x5b\xb7",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xab\x85\xe9\xc1\x57\x17\x31",
+ .ilen = 7,
+ .result = "\x5d\xfe\x34\x40\xdb\xb3\xc3\xed\x7a\x43\x4e\x26\x02\xd3\x94\x28\x1e\x0a\xfa\x9f\xb7\xaa\x42"
+}, { /* wycheproof - misc */
+ .key = "\xe3\xc0\x9e\x7f\xab\x1a\xef\xb5\x16\xda\x6a\x33\x02\x2a\x1d\xd4\xeb\x27\x2c\x80\xd5\x40\xc5\xda\x52\xa7\x30\xf3\x4d\x84\x0d\x7f",
+ .nonce = "\x58\x38\x93\x75\xc6\x9e\xe3\x98\xde\x94\x83\x96",
+ .nlen = 12,
+ .assoc = "\x84\xe4\x6b\xe8\xc0\x91\x90\x53",
+ .alen = 8,
+ .input = "\x4e\xe5\xcd\xa2\x0d\x42\x90",
+ .ilen = 7,
+ .result = "\x4b\xd4\x72\x12\x94\x1c\xe3\x18\x5f\x14\x08\xee\x7f\xbf\x18\xf5\xab\xad\x6e\x22\x53\xa1\xba"
+}, { /* wycheproof - misc */
+ .key = "\x51\xe4\xbf\x2b\xad\x92\xb7\xaf\xf1\xa4\xbc\x05\x55\x0b\xa8\x1d\xf4\xb9\x6f\xab\xf4\x1c\x12\xc7\xb0\x0e\x60\xe4\x8d\xb7\xe1\x52",
+ .nonce = "\x4f\x07\xaf\xed\xfd\xc3\xb6\xc2\x36\x18\x23\xd3",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xbe\x33\x08\xf7\x2a\x2c\x6a\xed",
+ .ilen = 8,
+ .result = "\x8e\x94\x39\xa5\x6e\xee\xc8\x17\xfb\xe8\xa6\xed\x8f\xab\xb1\x93\x75\x39\xdd\x6c\x00\xe9\x00\x21"
+}, { /* wycheproof - misc */
+ .key = "\x11\x31\xc1\x41\x85\x77\xa0\x54\xde\x7a\x4a\xc5\x51\x95\x0f\x1a\x05\x3f\x9a\xe4\x6e\x5b\x75\xfe\x4a\xbd\x56\x08\xd7\xcd\xda\xdd",
+ .nonce = "\xb4\xea\x66\x6e\xe1\x19\x56\x33\x66\x48\x4a\x78",
+ .nlen = 12,
+ .assoc = "\x66\xc0\xae\x70\x07\x6c\xb1\x4d",
+ .alen = 8,
+ .input = "\xa4\xc9\xc2\x80\x1b\x71\xf7\xdf",
+ .ilen = 8,
+ .result = "\xb9\xb9\x10\x43\x3a\xf0\x52\xb0\x45\x30\xf5\x1a\xee\xe0\x24\xe0\xa4\x45\xa6\x32\x8f\xa6\x7a\x18"
+}, { /* wycheproof - misc */
+ .key = "\x99\xb6\x2b\xd5\xaf\xbe\x3f\xb0\x15\xbd\xe9\x3f\x0a\xbf\x48\x39\x57\xa1\xc3\xeb\x3c\xa5\x9c\xb5\x0b\x39\xf7\xf8\xa9\xcc\x51\xbe",
+ .nonce = "\x9a\x59\xfc\xe2\x6d\xf0\x00\x5e\x07\x53\x86\x56",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x42\xba\xae\x59\x78\xfe\xaf\x5c\x36\x8d\x14\xe0",
+ .ilen = 12,
+ .result = "\xff\x7d\xc2\x03\xb2\x6c\x46\x7a\x6b\x50\xdb\x33\x57\x8c\x0f\x27\x58\xc2\xe1\x4e\x36\xd4\xfc\x10\x6d\xcb\x29\xb4"
+}, { /* wycheproof - misc */
+ .key = "\x85\xf3\x5b\x62\x82\xcf\xf4\x40\xbc\x10\x20\xc8\x13\x6f\xf2\x70\x31\x11\x0f\xa6\x3e\xc1\x6f\x1e\x82\x51\x18\xb0\x06\xb9\x12\x57",
+ .nonce = "\x58\xdb\xd4\xad\x2c\x4a\xd3\x5d\xd9\x06\xe9\xce",
+ .nlen = 12,
+ .assoc = "\xa5\x06\xe1\xa5\xc6\x90\x93\xf9",
+ .alen = 8,
+ .input = "\xfd\xc8\x5b\x94\xa4\xb2\xa6\xb7\x59\xb1\xa0\xda",
+ .ilen = 12,
+ .result = "\x9f\x88\x16\xde\x09\x94\xe9\x38\xd9\xe5\x3f\x95\xd0\x86\xfc\x6c\x9d\x8f\xa9\x15\xfd\x84\x23\xa7\xcf\x05\x07\x2f"
+}, { /* wycheproof - misc */
+ .key = "\x67\x11\x96\x27\xbd\x98\x8e\xda\x90\x62\x19\xe0\x8c\x0d\x0d\x77\x9a\x07\xd2\x08\xce\x8a\x4f\xe0\x70\x9a\xf7\x55\xee\xec\x6d\xcb",
+ .nonce = "\x68\xab\x7f\xdb\xf6\x19\x01\xda\xd4\x61\xd2\x3c",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x51\xf8\xc1\xf7\x31\xea\x14\xac\xdb\x21\x0a\x6d\x97\x3e\x07",
+ .ilen = 15,
+ .result = "\x0b\x29\x63\x8e\x1f\xbd\xd6\xdf\x53\x97\x0b\xe2\x21\x00\x42\x2a\x91\x34\x08\x7d\x67\xa4\x6e\x79\x17\x8d\x0a\x93\xf5\xe1\xd2"
+}, { /* wycheproof - misc */
+ .key = "\xe6\xf1\x11\x8d\x41\xe4\xb4\x3f\xb5\x82\x21\xb7\xed\x79\x67\x38\x34\xe0\xd8\xac\x5c\x4f\xa6\x0b\xbc\x8b\xc4\x89\x3a\x58\x89\x4d",
+ .nonce = "\xd9\x5b\x32\x43\xaf\xae\xf7\x14\xc5\x03\x5b\x6a",
+ .nlen = 12,
+ .assoc = "\x64\x53\xa5\x33\x84\x63\x22\x12",
+ .alen = 8,
+ .input = "\x97\x46\x9d\xa6\x67\xd6\x11\x0f\x9c\xbd\xa1\xd1\xa2\x06\x73",
+ .ilen = 15,
+ .result = "\x32\xdb\x66\xc4\xa3\x81\x9d\x81\x55\x74\x55\xe5\x98\x0f\xed\xfe\xae\x30\xde\xc9\x4e\x6a\xd3\xa9\xee\xa0\x6a\x0d\x70\x39\x17"
+}, { /* wycheproof - misc */
+ .key = "\x59\xd4\xea\xfb\x4d\xe0\xcf\xc7\xd3\xdb\x99\xa8\xf5\x4b\x15\xd7\xb3\x9f\x0a\xcc\x8d\xa6\x97\x63\xb0\x19\xc1\x69\x9f\x87\x67\x4a",
+ .nonce = "\x2f\xcb\x1b\x38\xa9\x9e\x71\xb8\x47\x40\xad\x9b",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x54\x9b\x36\x5a\xf9\x13\xf3\xb0\x81\x13\x1c\xcb\x6b\x82\x55\x88",
+ .ilen = 16,
+ .result = "\xe9\x11\x0e\x9f\x56\xab\x3c\xa4\x83\x50\x0c\xea\xba\xb6\x7a\x13\x83\x6c\xca\xbf\x15\xa6\xa2\x2a\x51\xc1\x07\x1c\xfa\x68\xfa\x0c"
+}, { /* wycheproof - misc */
+ .key = "\xb9\x07\xa4\x50\x75\x51\x3f\xe8\xa8\x01\x9e\xde\xe3\xf2\x59\x14\x87\xb2\xa0\x30\xb0\x3c\x6e\x1d\x77\x1c\x86\x25\x71\xd2\xea\x1e",
+ .nonce = "\x11\x8a\x69\x64\xc2\xd3\xe3\x80\x07\x1f\x52\x66",
+ .nlen = 12,
+ .assoc = "\x03\x45\x85\x62\x1a\xf8\xd7\xff",
+ .alen = 8,
+ .input = "\x55\xa4\x65\x64\x4f\x5b\x65\x09\x28\xcb\xee\x7c\x06\x32\x14\xd6",
+ .ilen = 16,
+ .result = "\xe4\xb1\x13\xcb\x77\x59\x45\xf3\xd3\xa8\xae\x9e\xc1\x41\xc0\x0c\x7c\x43\xf1\x6c\xe0\x96\xd0\xdc\x27\xc9\x58\x49\xdc\x38\x3b\x7d"
+}, { /* wycheproof - misc */
+ .key = "\x3b\x24\x58\xd8\x17\x6e\x16\x21\xc0\xcc\x24\xc0\xc0\xe2\x4c\x1e\x80\xd7\x2f\x7e\xe9\x14\x9a\x4b\x16\x61\x76\x62\x96\x16\xd0\x11",
+ .nonce = "\x45\xaa\xa3\xe5\xd1\x6d\x2d\x42\xdc\x03\x44\x5d",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x3f\xf1\x51\x4b\x1c\x50\x39\x15\x91\x8f\x0c\x0c\x31\x09\x4a\x6e\x1f",
+ .ilen = 17,
+ .result = "\x02\xcc\x3a\xcb\x5e\xe1\xfc\xdd\x12\xa0\x3b\xb8\x57\x97\x64\x74\xd3\xd8\x3b\x74\x63\xa2\xc3\x80\x0f\xe9\x58\xc2\x8e\xaa\x29\x08\x13"
+}, { /* wycheproof - misc */
+ .key = "\xf6\x0c\x6a\x1b\x62\x57\x25\xf7\x6c\x70\x37\xb4\x8f\xe3\x57\x7f\xa7\xf7\xb8\x7b\x1b\xd5\xa9\x82\x17\x6d\x18\x23\x06\xff\xb8\x70",
+ .nonce = "\xf0\x38\x4f\xb8\x76\x12\x14\x10\x63\x3d\x99\x3d",
+ .nlen = 12,
+ .assoc = "\x9a\xaf\x29\x9e\xee\xa7\x8f\x79",
+ .alen = 8,
+ .input = "\x63\x85\x8c\xa3\xe2\xce\x69\x88\x7b\x57\x8a\x3c\x16\x7b\x42\x1c\x9c",
+ .ilen = 17,
+ .result = "\x35\x76\x64\x88\xd2\xbc\x7c\x2b\x8d\x17\xcb\xbb\x9a\xbf\xad\x9e\x6d\x1f\x39\x1e\x65\x7b\x27\x38\xdd\xa0\x84\x48\xcb\xa2\x81\x1c\xeb"
+}, { /* wycheproof - misc */
+ .key = "\x02\x12\xa8\xde\x50\x07\xed\x87\xb3\x3f\x1a\x70\x90\xb6\x11\x4f\x9e\x08\xce\xfd\x96\x07\xf2\xc2\x76\xbd\xcf\xdb\xc5\xce\x9c\xd7",
+ .nonce = "\xe6\xb1\xad\xf2\xfd\x58\xa8\x76\x2c\x65\xf3\x1b",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x10\xf1\xec\xf9\xc6\x05\x84\x66\x5d\x9a\xe5\xef\xe2\x79\xe7\xf7\x37\x7e\xea\x69\x16\xd2\xb1\x11",
+ .ilen = 24,
+ .result = "\x42\xf2\x6c\x56\xcb\x4b\xe2\x1d\x9d\x8d\x0c\x80\xfc\x99\xdd\xe0\x0d\x75\xf3\x80\x74\xbf\xe7\x64\x54\xaa\x7e\x13\xd4\x8f\xff\x7d\x75\x57\x03\x94\x57\x04\x0a\x3a"
+}, { /* wycheproof - misc */
+ .key = "\xc5\xbc\x09\x56\x56\x46\xe7\xed\xda\x95\x4f\x1f\x73\x92\x23\xda\xda\x20\xb9\x5c\x44\xab\x03\x3d\x0f\xae\x4b\x02\x83\xd1\x8b\xe3",
+ .nonce = "\x6b\x28\x2e\xbe\xcc\x54\x1b\xcd\x78\x34\xed\x55",
+ .nlen = 12,
+ .assoc = "\x3e\x8b\xc5\xad\xe1\x82\xff\x08",
+ .alen = 8,
+ .input = "\x92\x22\xf9\x01\x8e\x54\xfd\x6d\xe1\x20\x08\x06\xa9\xee\x8e\x4c\xc9\x04\xd2\x9f\x25\xcb\xa1\x93",
+ .ilen = 24,
+ .result = "\x12\x30\x32\x43\x7b\x4b\xfd\x69\x20\xe8\xf7\xe7\xe0\x08\x7a\xe4\x88\x9e\xbe\x7a\x0a\xd0\xe9\x00\x3c\xf6\x8f\x17\x95\x50\xda\x63\xd3\xb9\x6c\x2d\x55\x41\x18\x65"
+}, { /* wycheproof - misc */
+ .key = "\x2e\xb5\x1c\x46\x9a\xa8\xeb\x9e\x6c\x54\xa8\x34\x9b\xae\x50\xa2\x0f\x0e\x38\x27\x11\xbb\xa1\x15\x2c\x42\x4f\x03\xb6\x67\x1d\x71",
+ .nonce = "\x04\xa9\xbe\x03\x50\x8a\x5f\x31\x37\x1a\x6f\xd2",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xb0\x53\x99\x92\x86\xa2\x82\x4f\x42\xcc\x8c\x20\x3a\xb2\x4e\x2c\x97\xa6\x85\xad\xcc\x2a\xd3\x26\x62\x55\x8e\x55\xa5\xc7\x29",
+ .ilen = 31,
+ .result = "\x45\xc7\xd6\xb5\x3a\xca\xd4\xab\xb6\x88\x76\xa6\xe9\x6a\x48\xfb\x59\x52\x4d\x2c\x92\xc9\xd8\xa1\x89\xc9\xfd\x2d\xb9\x17\x46\x56\x6d\x3c\xa1\x0e\x31\x1b\x69\x5f\x3e\xae\x15\x51\x65\x24\x93"
+}, { /* wycheproof - misc */
+ .key = "\x7f\x5b\x74\xc0\x7e\xd1\xb4\x0f\xd1\x43\x58\xfe\x2f\xf2\xa7\x40\xc1\x16\xc7\x70\x65\x10\xe6\xa4\x37\xf1\x9e\xa4\x99\x11\xce\xc4",
+ .nonce = "\x47\x0a\x33\x9e\xcb\x32\x19\xb8\xb8\x1a\x1f\x8b",
+ .nlen = 12,
+ .assoc = "\x37\x46\x18\xa0\x6e\xa9\x8a\x48",
+ .alen = 8,
+ .input = "\xf4\x52\x06\xab\xc2\x55\x52\xb2\xab\xc9\xab\x7f\xa2\x43\x03\x5f\xed\xaa\xdd\xc3\xb2\x29\x39\x56\xf1\xea\x6e\x71\x56\xe7\xeb",
+ .ilen = 31,
+ .result = "\x46\xa8\x0c\x41\x87\x02\x47\x20\x08\x46\x27\x58\x00\x80\xdd\xe5\xa3\xf4\xa1\x10\x93\xa7\x07\x6e\xd6\xf3\xd3\x26\xbc\x7b\x70\x53\x4d\x4a\xa2\x83\x5a\x52\xe7\x2d\x14\xdf\x0e\x4f\x47\xf2\x5f"
+}, { /* wycheproof - misc */
+ .key = "\xe1\x73\x1d\x58\x54\xe1\xb7\x0c\xb3\xff\xe8\xb7\x86\xa2\xb3\xeb\xf0\x99\x43\x70\x95\x47\x57\xb9\xdc\x8c\x7b\xc5\x35\x46\x34\xa3",
+ .nonce = "\x72\xcf\xd9\x0e\xf3\x02\x6c\xa2\x2b\x7e\x6e\x6a",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xb9\xc5\x54\xcb\xc3\x6a\xc1\x8a\xe8\x97\xdf\x7b\xee\xca\xc1\xdb\xeb\x4e\xaf\xa1\x56\xbb\x60\xce\x2e\x5d\x48\xf0\x57\x15\xe6\x78",
+ .ilen = 32,
+ .result = "\xea\x29\xaf\xa4\x9d\x36\xe8\x76\x0f\x5f\xe1\x97\x23\xb9\x81\x1e\xd5\xd5\x19\x93\x4a\x44\x0f\x50\x81\xac\x43\x0b\x95\x3b\x0e\x21\x22\x25\x41\xaf\x46\xb8\x65\x33\xc6\xb6\x8d\x2f\xf1\x08\xa7\xea"
+}, { /* wycheproof - misc */
+ .key = "\x27\xd8\x60\x63\x1b\x04\x85\xa4\x10\x70\x2f\xea\x61\xbc\x87\x3f\x34\x42\x26\x0c\xad\xed\x4a\xbd\xe2\x5b\x78\x6a\x2d\x97\xf1\x45",
+ .nonce = "\x26\x28\x80\xd4\x75\xf3\xda\xc5\x34\x0d\xd1\xb8",
+ .nlen = 12,
+ .assoc = "\x23\x33\xe5\xce\x0f\x93\xb0\x59",
+ .alen = 8,
+ .input = "\x6b\x26\x04\x99\x6c\xd3\x0c\x14\xa1\x3a\x52\x57\xed\x6c\xff\xd3\xbc\x5e\x29\xd6\xb9\x7e\xb1\x79\x9e\xb3\x35\xe2\x81\xea\x45\x1e",
+ .ilen = 32,
+ .result = "\x6d\xad\x63\x78\x97\x54\x4d\x8b\xf6\xbe\x95\x07\xed\x4d\x1b\xb2\xe9\x54\xbc\x42\x7e\x5d\xe7\x29\xda\xf5\x07\x62\x84\x6f\xf2\xf4\x7b\x99\x7d\x93\xc9\x82\x18\x9d\x70\x95\xdc\x79\x4c\x74\x62\x32"
+}, { /* wycheproof - misc */
+ .key = "\xcf\x0d\x40\xa4\x64\x4e\x5f\x51\x81\x51\x65\xd5\x30\x1b\x22\x63\x1f\x45\x44\xc4\x9a\x18\x78\xe3\xa0\xa5\xe8\xe1\xaa\xe0\xf2\x64",
+ .nonce = "\xe7\x4a\x51\x5e\x7e\x21\x02\xb9\x0b\xef\x55\xd2",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x97\x3d\x0c\x75\x38\x26\xba\xe4\x66\xcf\x9a\xbb\x34\x93\x15\x2e\x9d\xe7\x81\x9e\x2b\xd0\xc7\x11\x71\x34\x6b\x4d\x2c\xeb\xf8\x04\x1a\xa3\xce\xdc\x0d\xfd\x7b\x46\x7e\x26\x22\x8b\xc8\x6c\x9a",
+ .ilen = 47,
+ .result = "\xfb\xa7\x8a\xe4\xf9\xd8\x08\xa6\x2e\x3d\xa4\x0b\xe2\xcb\x77\x00\xc3\x61\x3d\x9e\xb2\xc5\x29\xc6\x52\xe7\x6a\x43\x2c\x65\x8d\x27\x09\x5f\x0e\xb8\xf9\x40\xc3\x24\x98\x1e\xa9\x35\xe5\x07\xf9\x8f\x04\x69\x56\xdb\x3a\x51\x29\x08\xbd\x7a\xfc\x8f\x2a\xb0\xa9"
+}, { /* wycheproof - misc */
+ .key = "\x6c\xbf\xd7\x1c\x64\x5d\x18\x4c\xf5\xd2\x3c\x40\x2b\xdb\x0d\x25\xec\x54\x89\x8c\x8a\x02\x73\xd4\x2e\xb5\xbe\x10\x9f\xdc\xb2\xac",
+ .nonce = "\xd4\xd8\x07\x34\x16\x83\x82\x5b\x31\xcd\x4d\x95",
+ .nlen = 12,
+ .assoc = "\xb3\xe4\x06\x46\x83\xb0\x2d\x84",
+ .alen = 8,
+ .input = "\xa9\x89\x95\x50\x4d\xf1\x6f\x74\x8b\xfb\x77\x85\xff\x91\xee\xb3\xb6\x60\xea\x9e\xd3\x45\x0c\x3d\x5e\x7b\x0e\x79\xef\x65\x36\x59\xa9\x97\x8d\x75\x54\x2e\xf9\x1c\x45\x67\x62\x21\x56\x40\xb9",
+ .ilen = 47,
+ .result = "\xa1\xff\xed\x80\x76\x18\x29\xec\xce\x24\x2e\x0e\x88\xb1\x38\x04\x90\x16\xbc\xa0\x18\xda\x2b\x6e\x19\x98\x6b\x3e\x31\x8c\xae\x8d\x80\x61\x98\xfb\x4c\x52\x7c\xc3\x93\x50\xeb\xdd\xea\xc5\x73\xc4\xcb\xf0\xbe\xfd\xa0\xb7\x02\x42\xc6\x40\xd7\xcd\x02\xd7\xa3"
+}, { /* wycheproof - misc */
+ .key = "\x5b\x1d\x10\x35\xc0\xb1\x7e\xe0\xb0\x44\x47\x67\xf8\x0a\x25\xb8\xc1\xb7\x41\xf4\xb5\x0a\x4d\x30\x52\x22\x6b\xaa\x1c\x6f\xb7\x01",
+ .nonce = "\xd6\x10\x40\xa3\x13\xed\x49\x28\x23\xcc\x06\x5b",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xd0\x96\x80\x31\x81\xbe\xef\x9e\x00\x8f\xf8\x5d\x5d\xdc\x38\xdd\xac\xf0\xf0\x9e\xe5\xf7\xe0\x7f\x1e\x40\x79\xcb\x64\xd0\xdc\x8f\x5e\x67\x11\xcd\x49\x21\xa7\x88\x7d\xe7\x6e\x26\x78\xfd\xc6\x76\x18\xf1\x18\x55\x86\xbf\xea\x9d\x4c\x68\x5d\x50\xe4\xbb\x9a\x82",
+ .ilen = 64,
+ .result = "\x9a\x4e\xf2\x2b\x18\x16\x77\xb5\x75\x5c\x08\xf7\x47\xc0\xf8\xd8\xe8\xd4\xc1\x8a\x9c\xc2\x40\x5c\x12\xbb\x51\xbb\x18\x72\xc8\xe8\xb8\x77\x67\x8b\xec\x44\x2c\xfc\xbb\x0f\xf4\x64\xa6\x4b\x74\x33\x2c\xf0\x72\x89\x8c\x7e\x0e\xdd\xf6\x23\x2e\xa6\xe2\x7e\xfe\x50\x9f\xf3\x42\x7a\x0f\x32\xfa\x56\x6d\x9c\xa0\xa7\x8a\xef\xc0\x13"
+}, { /* wycheproof - misc */
+ .key = "\x97\xd6\x35\xc4\xf4\x75\x74\xd9\x99\x8a\x90\x87\x5d\xa1\xd3\xa2\x84\xb7\x55\xb2\xd3\x92\x97\xa5\x72\x52\x35\x19\x0e\x10\xa9\x7e",
+ .nonce = "\xd3\x1c\x21\xab\xa1\x75\xb7\x0d\xe4\xeb\xb1\x9c",
+ .nlen = 12,
+ .assoc = "\x71\x93\xf6\x23\x66\x33\x21\xa2",
+ .alen = 8,
+ .input = "\x94\xee\x16\x6d\x6d\x6e\xcf\x88\x32\x43\x71\x36\xb4\xae\x80\x5d\x42\x88\x64\x35\x95\x86\xd9\x19\x3a\x25\x01\x62\x93\xed\xba\x44\x3c\x58\xe0\x7e\x7b\x71\x95\xec\x5b\xd8\x45\x82\xa9\xd5\x6c\x8d\x4a\x10\x8c\x7d\x7c\xe3\x4e\x6c\x6f\x8e\xa1\xbe\xc0\x56\x73\x17",
+ .ilen = 64,
+ .result = "\x5f\xbb\xde\xcc\x34\xbe\x20\x16\x14\xf6\x36\x03\x1e\xeb\x42\xf1\xca\xce\x3c\x79\xa1\x2c\xff\xd8\x71\xee\x8e\x73\x82\x0c\x82\x97\x49\xf1\xab\xb4\x29\x43\x67\x84\x9f\xb6\xc2\xaa\x56\xbd\xa8\xa3\x07\x8f\x72\x3d\x7c\x1c\x85\x20\x24\xb0\x17\xb5\x89\x73\xfb\x1e\x09\x26\x3d\xa7\xb4\xcb\x92\x14\x52\xf9\x7d\xca\x40\xf5\x80\xec"
+}, { /* wycheproof - misc */
+ .key = "\xfe\x6e\x55\xbd\xae\xd1\xf7\x28\x4c\xa5\xfc\x0f\x8c\x5f\x2b\x8d\xf5\x6d\xc0\xf4\x9e\x8c\xa6\x6a\x41\x99\x5e\x78\x33\x51\xf9\x01",
+ .nonce = "\x17\xc8\x6a\x8a\xbb\xb7\xe0\x03\xac\xde\x27\x99",
+ .nlen = 12,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xb4\x29\xeb\x80\xfb\x8f\xe8\xba\xed\xa0\xc8\x5b\x9c\x33\x34\x58\xe7\xc2\x99\x2e\x55\x84\x75\x06\x9d\x12\xd4\x5c\x22\x21\x75\x64\x12\x15\x88\x03\x22\x97\xef\xf5\x67\x83\x74\x2a\x5f\xc2\x2d\x74\x10\xff\xb2\x9d\x66\x09\x86\x61\xd7\x6f\x12\x6c\x3c\x27\x68\x9e\x43\xb3\x72\x67\xca\xc5\xa3\xa6\xd3\xab\x49\xe3\x91\xda\x29\xcd\x30\x54\xa5\x69\x2e\x28\x07\xe4\xc3\xea\x46\xc8\x76\x1d\x50\xf5\x92",
+ .ilen = 97,
+ .result = "\xd0\x10\x2f\x6c\x25\x8b\xf4\x97\x42\xce\xc3\x4c\xf2\xd0\xfe\xdf\x23\xd1\x05\xfb\x4c\x84\xcf\x98\x51\x5e\x1b\xc9\xa6\x4f\x8a\xd5\xbe\x8f\x07\x21\xbd\xe5\x06\x45\xd0\x00\x83\xc3\xa2\x63\xa3\x10\x53\xb7\x60\x24\x5f\x52\xae\x28\x66\xa5\xec\x83\xb1\x9f\x61\xbe\x1d\x30\xd5\xc5\xd9\xfe\xcc\x4c\xbb\xe0\x8f\xd3\x85\x81\x3a\x2a\xa3\x9a\x00\xff\x9c\x10\xf7\xf2\x37\x02\xad\xd1\xe4\xb2\xff\xa3\x1c\x41\x86\x5f\xc7\x1d\xe1\x2b\x19\x61\x21\x27\xce\x49\x99\x3b\xb0"
+}, { /* wycheproof - misc */
+ .key = "\xaa\xbc\x06\x34\x74\xe6\x5c\x4c\x3e\x9b\xdc\x48\x0d\xea\x97\xb4\x51\x10\xc8\x61\x88\x46\xff\x6b\x15\xbd\xd2\xa4\xa5\x68\x2c\x4e",
+ .nonce = "\x46\x36\x2f\x45\xd6\x37\x9e\x63\xe5\x22\x94\x60",
+ .nlen = 12,
+ .assoc = "\xa1\x1c\x40\xb6\x03\x76\x73\x30",
+ .alen = 8,
+ .input = "\xce\xb5\x34\xce\x50\xdc\x23\xff\x63\x8a\xce\x3e\xf6\x3a\xb2\xcc\x29\x73\xee\xad\xa8\x07\x85\xfc\x16\x5d\x06\xc2\xf5\x10\x0f\xf5\xe8\xab\x28\x82\xc4\x75\xaf\xcd\x05\xcc\xd4\x9f\x2e\x7d\x8f\x55\xef\x3a\x72\xe3\xdc\x51\xd6\x85\x2b\x8e\x6b\x9e\x7a\xec\xe5\x7b\xe6\x55\x6b\x0b\x6d\x94\x13\xe3\x3f\xc5\xfc\x24\xa9\xa2\x05\xad\x59\x57\x4b\xb3\x9d\x94\x4a\x92\xdc\x47\x97\x0d\x84\xa6\xad\x31\x76",
+ .ilen = 97,
+ .result = "\x75\x45\x39\x1b\x51\xde\x01\xd5\xc5\x3d\xfa\xca\x77\x79\x09\x06\x3e\x58\xed\xee\x4b\xb1\x22\x7e\x71\x10\xac\x4d\x26\x20\xc2\xae\xc2\xf8\x48\xf5\x6d\xee\xb0\x37\xa8\xdc\xed\x75\xaf\xa8\xa6\xc8\x90\xe2\xde\xe4\x2f\x95\x0b\xb3\x3d\x9e\x24\x24\xd0\x8a\x50\x5d\x89\x95\x63\x97\x3e\xd3\x88\x70\xf3\xde\x6e\xe2\xad\xc7\xfe\x07\x2c\x36\x6c\x14\xe2\xcf\x7c\xa6\x2f\xb3\xd3\x6b\xee\x11\x68\x54\x61\xb7\x0d\x44\xef\x8c\x66\xc5\xc7\xbb\xf1\x0d\xca\xdd\x7f\xac\xf6"
+}, { /* wycheproof - misc */
+ .key = "\x7d\x00\xb4\x80\x95\xad\xfa\x32\x72\x05\x06\x07\xb2\x64\x18\x50\x02\xba\x99\x95\x7c\x49\x8b\xe0\x22\x77\x0f\x2c\xe2\xf3\x14\x3c",
+ .nonce = "\x87\x34\x5f\x10\x55\xfd\x9e\x21\x02\xd5\x06\x56",
+ .nlen = 12,
+ .assoc = "\x02",
+ .alen = 1,
+ .input = "\xe5\xcc\xaa\x44\x1b\xc8\x14\x68\x8f\x8f\x6e\x8f\x28\xb5\x00\xb2",
+ .ilen = 16,
+ .result = "\x7e\x72\xf5\xa1\x85\xaf\x16\xa6\x11\x92\x1b\x43\x8f\x74\x9f\x0b\x12\x42\xc6\x70\x73\x23\x34\x02\x9a\xdf\xe1\xc5\x00\x16\x51\xe4"
+}, { /* wycheproof - misc */
+ .key = "\x64\x32\x71\x7f\x1d\xb8\x5e\x41\xac\x78\x36\xbc\xe2\x51\x85\xa0\x80\xd5\x76\x2b\x9e\x2b\x18\x44\x4b\x6e\xc7\x2c\x3b\xd8\xe4\xdc",
+ .nonce = "\x87\xa3\x16\x3e\xc0\x59\x8a\xd9\x5b\x3a\xa7\x13",
+ .nlen = 12,
+ .assoc = "\xb6\x48",
+ .alen = 2,
+ .input = "\x02\xcd\xe1\x68\xfb\xa3\xf5\x44\xbb\xd0\x33\x2f\x7a\xde\xad\xa8",
+ .ilen = 16,
+ .result = "\x85\xf2\x9a\x71\x95\x57\xcd\xd1\x4d\x1f\x8f\xff\xab\x6d\x9e\x60\x73\x2c\xa3\x2b\xec\xd5\x15\xa1\xed\x35\x3f\x54\x2e\x99\x98\x58"
+}, { /* wycheproof - misc */
+ .key = "\x8e\x34\xcf\x73\xd2\x45\xa1\x08\x2a\x92\x0b\x86\x36\x4e\xb8\x96\xc4\x94\x64\x67\xbc\xb3\xd5\x89\x29\xfc\xb3\x66\x90\xe6\x39\x4f",
+ .nonce = "\x6f\x57\x3a\xa8\x6b\xaa\x49\x2b\xa4\x65\x96\xdf",
+ .nlen = 12,
+ .assoc = "\xbd\x4c\xd0\x2f\xc7\x50\x2b\xbd\xbd\xf6\xc9\xa3\xcb\xe8\xf0",
+ .alen = 15,
+ .input = "\x16\xdd\xd2\x3f\xf5\x3f\x3d\x23\xc0\x63\x34\x48\x70\x40\xeb\x47",
+ .ilen = 16,
+ .result = "\xc1\xb2\x95\x93\x6d\x56\xfa\xda\xc0\x3e\x5f\x74\x2b\xff\x73\xa1\x39\xc4\x57\xdb\xab\x66\x38\x2b\xab\xb3\xb5\x58\x00\xcd\xa5\xb8"
+}, { /* wycheproof - misc */
+ .key = "\xcb\x55\x75\xf5\xc7\xc4\x5c\x91\xcf\x32\x0b\x13\x9f\xb5\x94\x23\x75\x60\xd0\xa3\xe6\xf8\x65\xa6\x7d\x4f\x63\x3f\x2c\x08\xf0\x16",
+ .nonce = "\x1a\x65\x18\xf0\x2e\xde\x1d\xa6\x80\x92\x66\xd9",
+ .nlen = 12,
+ .assoc = "\x89\xcc\xe9\xfb\x47\x44\x1d\x07\xe0\x24\x5a\x66\xfe\x8b\x77\x8b",
+ .alen = 16,
+ .input = "\x62\x3b\x78\x50\xc3\x21\xe2\xcf\x0c\x6f\xbc\xc8\xdf\xd1\xaf\xf2",
+ .ilen = 16,
+ .result = "\xc8\x4c\x9b\xb7\xc6\x1c\x1b\xcb\x17\x77\x2a\x1c\x50\x0c\x50\x95\xdb\xad\xf7\xa5\x13\x8c\xa0\x34\x59\xa2\xcd\x65\x83\x1e\x09\x2f"
+}, { /* wycheproof - misc */
+ .key = "\xa5\x56\x9e\x72\x9a\x69\xb2\x4b\xa6\xe0\xff\x15\xc4\x62\x78\x97\x43\x68\x24\xc9\x41\xe9\xd0\x0b\x2e\x93\xfd\xdc\x4b\xa7\x76\x57",
+ .nonce = "\x56\x4d\xee\x49\xab\x00\xd2\x40\xfc\x10\x68\xc3",
+ .nlen = 12,
+ .assoc = "\xd1\x9f\x2d\x98\x90\x95\xf7\xab\x03\xa5\xfd\xe8\x44\x16\xe0\x0c\x0e",
+ .alen = 17,
+ .input = "\x87\xb3\xa4\xd7\xb2\x6d\x8d\x32\x03\xa0\xde\x1d\x64\xef\x82\xe3",
+ .ilen = 16,
+ .result = "\x94\xbc\x80\x62\x1e\xd1\xe7\x1b\x1f\xd2\xb5\xc3\xa1\x5e\x35\x68\x33\x35\x11\x86\x17\x96\x97\x84\x01\x59\x8b\x96\x37\x22\xf5\xb3"
+}, { /* wycheproof - misc */
+ .key = "\x56\x20\x74\x65\xb4\xe4\x8e\x6d\x04\x63\x0f\x4a\x42\xf3\x5c\xfc\x16\x3a\xb2\x89\xc2\x2a\x2b\x47\x84\xf6\xf9\x29\x03\x30\xbe\xe0",
+ .nonce = "\xdf\x87\x13\xe8\x7e\xc3\xdb\xcf\xad\x14\xd5\x3e",
+ .nlen = 12,
+ .assoc = "\x5e\x64\x70\xfa\xcd\x99\xc1\xd8\x1e\x37\xcd\x44\x01\x5f\xe1\x94\x80\xa2\xa4\xd3\x35\x2a\x4f\xf5\x60\xc0\x64\x0f\xdb\xda",
+ .alen = 30,
+ .input = "\xe6\x01\xb3\x85\x57\x79\x7d\xa2\xf8\xa4\x10\x6a\x08\x9d\x1d\xa6",
+ .ilen = 16,
+ .result = "\x29\x9b\x5d\x3f\x3d\x03\xc0\x87\x20\x9a\x16\xe2\x85\x14\x31\x11\x4b\x45\x4e\xd1\x98\xde\x11\x7e\x83\xec\x49\xfa\x8d\x85\x08\xd6"
+}, { /* wycheproof - misc */
+ .key = "\x39\x37\x98\x6a\xf8\x6d\xaf\xc1\xba\x0c\x46\x72\xd8\xab\xc4\x6c\x20\x70\x62\x68\x2d\x9c\x26\x4a\xb0\x6d\x6c\x58\x07\x20\x51\x30",
+ .nonce = "\x8d\xf4\xb1\x5a\x88\x8c\x33\x28\x6a\x7b\x76\x51",
+ .nlen = 12,
+ .assoc = "\xba\x44\x6f\x6f\x9a\x0c\xed\x22\x45\x0f\xeb\x10\x73\x7d\x90\x07\xfd\x69\xab\xc1\x9b\x1d\x4d\x90\x49\xa5\x55\x1e\x86\xec\x2b\x37",
+ .alen = 32,
+ .input = "\xdc\x9e\x9e\xaf\x11\xe3\x14\x18\x2d\xf6\xa4\xeb\xa1\x7a\xec\x9c",
+ .ilen = 16,
+ .result = "\x60\x5b\xbf\x90\xae\xb9\x74\xf6\x60\x2b\xc7\x78\x05\x6f\x0d\xca\x38\xea\x23\xd9\x90\x54\xb4\x6b\x42\xff\xe0\x04\x12\x9d\x22\x04"
+}, { /* wycheproof - misc */
+ .key = "\x36\x37\x2a\xbc\xdb\x78\xe0\x27\x96\x46\xac\x3d\x17\x6b\x96\x74\xe9\x15\x4e\xec\xf0\xd5\x46\x9c\x65\x1e\xc7\xe1\x6b\x4c\x11\x99",
+ .nonce = "\xbe\x40\xe5\xf1\xa1\x18\x17\xa0\xa8\xfa\x89\x49",
+ .nlen = 12,
+ .assoc = "\xd4\x1a\x82\x8d\x5e\x71\x82\x92\x47\x02\x19\x05\x40\x2e\xa2\x57\xdc\xcb\xc3\xb8\x0f\xcd\x56\x75\x05\x6b\x68\xbb\x59\xe6\x2e\x88\x73",
+ .alen = 33,
+ .input = "\x81\xce\x84\xed\xe9\xb3\x58\x59\xcc\x8c\x49\xa8\xf6\xbe\x7d\xc6",
+ .ilen = 16,
+ .result = "\x7b\x7c\xe0\xd8\x24\x80\x9a\x70\xde\x32\x56\x2c\xcf\x2c\x2b\xbd\x15\xd4\x4a\x00\xce\x0d\x19\xb4\x23\x1f\x92\x1e\x22\xbc\x0a\x43"
+}, { /* wycheproof - misc */
+ .key = "\x9f\x14\x79\xed\x09\x7d\x7f\xe5\x29\xc1\x1f\x2f\x5a\xdd\x9a\xaf\xf4\xa1\xca\x0b\x68\x99\x7a\x2c\xb7\xf7\x97\x49\xbd\x90\xaa\xf4",
+ .nonce = "\x84\xc8\x7d\xae\x4e\xee\x27\x73\x0e\xc3\x5d\x12",
+ .nlen = 12,
+ .assoc = "\x3f\x2d\xd4\x9b\xbf\x09\xd6\x9a\x78\xa3\xd8\x0e\xa2\x56\x66\x14\xfc\x37\x94\x74\x19\x6c\x1a\xae\x84\x58\x3d\xa7\x3d\x7f\xf8\x5c\x6f\x42\xca\x42\x05\x6a\x97\x92\xcc\x1b\x9f\xb3\xc7\xd2\x61",
+ .alen = 47,
+ .input = "\xa6\x67\x47\xc8\x9e\x85\x7a\xf3\xa1\x8e\x2c\x79\x50\x00\x87\xed",
+ .ilen = 16,
+ .result = "\xca\x82\xbf\xf3\xe2\xf3\x10\xcc\xc9\x76\x67\x2c\x44\x15\xe6\x9b\x57\x63\x8c\x62\xa5\xd8\x5d\xed\x77\x4f\x91\x3c\x81\x3e\xa0\x32"
+}, { /* wycheproof - misc */
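+	/*
+	 * From here on the nonce shrinks to 8 bytes (nlen = 8). In each of
+	 * these entries the plaintext appears to be chosen so that the
+	 * resulting ciphertext is the same repeating 4-byte pattern as the
+	 * AAD (all zeros, all 0xff, 0x00 0x00 0x00 0x80, and so on),
+	 * exercising structured edge-case ciphertexts.
+	 */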
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x88\x80\x94\x17\x83\x55\xd3\x04\x84\x64\x43\xfe\xe8\xdf\x99\x47\x03\x03\xfb\x3b\x7b\x80\xe0\x30\xbe\xeb\xd3\x29\xbe",
+ .ilen = 32,
+ .result = "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xe6\xd3\xd7\x32\x4a\x1c\xbb\xa7\x77\xbb\xb0\xec\xdd\xa3\x78\x07"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x88\x80\x94\x17\x83\x55\xd3\x04\x84\x64\x43\xfe\xe8\xdf\x99\x47\x03\x03\xfb\x3b\x7b\x80\xe0\x30\xbe\xeb\xd3\x29\xbe\xe3\xbc\xdb\x5b\x1e\xde\xfc\xfe\x8b\xcd\xa1\xb6\xa1\x5c\x8c\x2b\x08\x69\xff\xd2\xec\x5e\x26\xe5\x53\xb7\xb2\x27\xfe\x87\xfd\xbd",
+ .ilen = 64,
+ .result = "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x06\x2d\xe6\x79\x5f\x27\x4f\xd2\xa3\x05\xd7\x69\x80\xbc\x9c\xce"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x88\x80\x94\x17\x83\x55\xd3\x04\x84\x64\x43\xfe\xe8\xdf\x99\x47\x03\x03\xfb\x3b\x7b\x80\xe0\x30\xbe\xeb\xd3\x29\xbe\xe3\xbc\xdb\x5b\x1e\xde\xfc\xfe\x8b\xcd\xa1\xb6\xa1\x5c\x8c\x2b\x08\x69\xff\xd2\xec\x5e\x26\xe5\x53\xb7\xb2\x27\xfe\x87\xfd\xbd\x7a\xda\x44\x42\x42\x69\xbf\xfa\x55\x27\xf2\x70\xac\xf6\x85\x02\xb7\x4c\x5a\xe2\xe6\x0c\x05\x80\x98\x1a\x49\x38\x45\x93\x92\xc4\x9b\xb2\xf2\x84\xb6\x46\xef\xc7\xf3\xf0\xb1\x36\x1d\xc3\x48\xed\x77\xd3\x0b\xc5\x76\x92\xed\x38\xfb\xac\x01\x88\x38\x04\x88\xc7",
+ .ilen = 128,
+ .result = "\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xd8\xb4\x79\x02\xba\xae\xaf\xb3\x42\x03\x05\x15\x29\xaf\x28\x2e"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 16,
+ .input = "\xda\x92\xbf\x77\x7f\x6b\xe8\x7c\xaa\x2c\xfb\x7b\x9b\xbc\x01\x17\x20\x66\xb8\xfc\xfc\x04\xc4\x84\x7f\x1f\xcf\x41\x14\x2c\xd6\x41",
+ .ilen = 32,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xb3\x89\x1c\x84\x9c\xb5\x2c\x27\x74\x7e\xdf\xcf\x31\x21\x3b\xb6"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 16,
+ .input = "\xda\x92\xbf\x77\x7f\x6b\xe8\x7c\xaa\x2c\xfb\x7b\x9b\xbc\x01\x17\x20\x66\xb8\xfc\xfc\x04\xc4\x84\x7f\x1f\xcf\x41\x14\x2c\xd6\x41\x1c\x43\x24\xa4\xe1\x21\x03\x01\x74\x32\x5e\x49\x5e\xa3\x73\xd4\xf7\x96\x00\x2d\x13\xa1\xd9\x1a\xac\x48\x4d\xd8\x01\x78\x02\x42",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf0\xc1\x2d\x26\xef\x03\x02\x9b\x62\xc0\x08\xda\x27\xc5\xdc\x68"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 16,
+ .input = "\xda\x92\xbf\x77\x7f\x6b\xe8\x7c\xaa\x2c\xfb\x7b\x9b\xbc\x01\x17\x20\x66\xb8\xfc\xfc\x04\xc4\x84\x7f\x1f\xcf\x41\x14\x2c\xd6\x41\x1c\x43\x24\xa4\xe1\x21\x03\x01\x74\x32\x5e\x49\x5e\xa3\x73\xd4\xf7\x96\x00\x2d\x13\xa1\xd9\x1a\xac\x48\x4d\xd8\x01\x78\x02\x42\x85\x25\xbb\xbd\xbd\x96\x40\x05\xaa\xd8\x0d\x8f\x53\x09\x7a\xfd\x48\xb3\xa5\x1d\x19\xf3\xfa\x7f\x67\xe5\xb6\xc7\xba\x6c\x6d\x3b\x64\x4d\x0d\x7b\x49\xb9\x10\x38\x0c\x0f\x4e\xc9\xe2\x3c\xb7\x12\x88\x2c\xf4\x3a\x89\x6d\x12\xc7\x04\x53\xfe\x77\xc7\xfb\x77\x38",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xee\x65\x78\x30\x01\xc2\x56\x91\xfa\x28\xd0\xf5\xf1\xc1\xd7\x62"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x08\x80\x94\x17\x03\x55\xd3\x04\x04\x64\x43\xfe\x68\xdf\x99\x47\x83\x03\xfb\x3b\xfb\x80\xe0\x30\x3e\xeb\xd3\x29\x3e",
+ .ilen = 32,
+ .result = "\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x79\xba\x7a\x29\xf5\xa7\xbb\x75\x79\x7a\xf8\x7a\x61\x01\x29\xa4"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x08\x80\x94\x17\x03\x55\xd3\x04\x04\x64\x43\xfe\x68\xdf\x99\x47\x83\x03\xfb\x3b\xfb\x80\xe0\x30\x3e\xeb\xd3\x29\x3e\xe3\xbc\xdb\xdb\x1e\xde\xfc\x7e\x8b\xcd\xa1\x36\xa1\x5c\x8c\xab\x08\x69\xff\x52\xec\x5e\x26\x65\x53\xb7\xb2\xa7\xfe\x87\xfd\x3d",
+ .ilen = 64,
+ .result = "\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x36\xb1\x74\x38\x19\xe1\xb9\xba\x15\x51\xe8\xed\x92\x2a\x95\x9a"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x08\x80\x94\x17\x03\x55\xd3\x04\x04\x64\x43\xfe\x68\xdf\x99\x47\x83\x03\xfb\x3b\xfb\x80\xe0\x30\x3e\xeb\xd3\x29\x3e\xe3\xbc\xdb\xdb\x1e\xde\xfc\x7e\x8b\xcd\xa1\x36\xa1\x5c\x8c\xab\x08\x69\xff\x52\xec\x5e\x26\x65\x53\xb7\xb2\xa7\xfe\x87\xfd\x3d\x7a\xda\x44\xc2\x42\x69\xbf\x7a\x55\x27\xf2\xf0\xac\xf6\x85\x82\xb7\x4c\x5a\x62\xe6\x0c\x05\x00\x98\x1a\x49\xb8\x45\x93\x92\x44\x9b\xb2\xf2\x04\xb6\x46\xef\x47\xf3\xf0\xb1\xb6\x1d\xc3\x48\x6d\x77\xd3\x0b\x45\x76\x92\xed\xb8\xfb\xac\x01\x08\x38\x04\x88\x47",
+ .ilen = 128,
+ .result = "\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\xfe\xac\x49\x55\x55\x4e\x80\x6f\x3a\x19\x02\xe2\x44\x32\xc0\x8a"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f",
+ .alen = 16,
+ .input = "\xda\x92\xbf\xf7\x7f\x6b\xe8\xfc\xaa\x2c\xfb\xfb\x9b\xbc\x01\x97\x20\x66\xb8\x7c\xfc\x04\xc4\x04\x7f\x1f\xcf\xc1\x14\x2c\xd6\xc1",
+ .ilen = 32,
+ .result = "\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\x20\xa3\x79\x8d\xf1\x29\x2c\x59\x72\xbf\x97\x41\xae\xc3\x8a\x19"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f",
+ .alen = 16,
+ .input = "\xda\x92\xbf\xf7\x7f\x6b\xe8\xfc\xaa\x2c\xfb\xfb\x9b\xbc\x01\x97\x20\x66\xb8\x7c\xfc\x04\xc4\x04\x7f\x1f\xcf\xc1\x14\x2c\xd6\xc1\x1c\x43\x24\x24\xe1\x21\x03\x81\x74\x32\x5e\xc9\x5e\xa3\x73\x54\xf7\x96\x00\xad\x13\xa1\xd9\x9a\xac\x48\x4d\x58\x01\x78\x02\xc2",
+ .ilen = 64,
+ .result = "\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xc0\x3d\x9f\x67\x35\x4a\x97\xb2\xf0\x74\xf7\x55\x15\x57\xe4\x9c"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f",
+ .alen = 16,
+ .input = "\xda\x92\xbf\xf7\x7f\x6b\xe8\xfc\xaa\x2c\xfb\xfb\x9b\xbc\x01\x97\x20\x66\xb8\x7c\xfc\x04\xc4\x04\x7f\x1f\xcf\xc1\x14\x2c\xd6\xc1\x1c\x43\x24\x24\xe1\x21\x03\x81\x74\x32\x5e\xc9\x5e\xa3\x73\x54\xf7\x96\x00\xad\x13\xa1\xd9\x9a\xac\x48\x4d\x58\x01\x78\x02\xc2\x85\x25\xbb\x3d\xbd\x96\x40\x85\xaa\xd8\x0d\x0f\x53\x09\x7a\x7d\x48\xb3\xa5\x9d\x19\xf3\xfa\xff\x67\xe5\xb6\x47\xba\x6c\x6d\xbb\x64\x4d\x0d\xfb\x49\xb9\x10\xb8\x0c\x0f\x4e\x49\xe2\x3c\xb7\x92\x88\x2c\xf4\xba\x89\x6d\x12\x47\x04\x53\xfe\xf7\xc7\xfb\x77\xb8",
+ .ilen = 128,
+ .result = "\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xc8\x6d\xa8\xdd\x65\x22\x86\xd5\x02\x13\xd3\x28\xd6\x3e\x40\x06"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff",
+ .alen = 16,
+ .input = "\x5a\x92\xbf\x77\xff\x6b\xe8\x7c\x2a\x2c\xfb\x7b\x1b\xbc\x01\x17\xa0\x66\xb8\xfc\x7c\x04\xc4\x84\xff\x1f\xcf\x41\x94\x2c\xd6\x41",
+ .ilen = 32,
+ .result = "\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\xbe\xde\x90\x83\xce\xb3\x6d\xdf\xe5\xfa\x81\x1f\x95\x47\x1c\x67"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff",
+ .alen = 16,
+ .input = "\x5a\x92\xbf\x77\xff\x6b\xe8\x7c\x2a\x2c\xfb\x7b\x1b\xbc\x01\x17\xa0\x66\xb8\xfc\x7c\x04\xc4\x84\xff\x1f\xcf\x41\x94\x2c\xd6\x41\x9c\x43\x24\xa4\x61\x21\x03\x01\xf4\x32\x5e\x49\xde\xa3\x73\xd4\x77\x96\x00\x2d\x93\xa1\xd9\x1a\x2c\x48\x4d\xd8\x81\x78\x02\x42",
+ .ilen = 64,
+ .result = "\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x30\x08\x74\xbb\x06\x92\xb6\x89\xde\xad\x9a\xe1\x5b\x06\x73\x90"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff",
+ .alen = 16,
+ .input = "\x5a\x92\xbf\x77\xff\x6b\xe8\x7c\x2a\x2c\xfb\x7b\x1b\xbc\x01\x17\xa0\x66\xb8\xfc\x7c\x04\xc4\x84\xff\x1f\xcf\x41\x94\x2c\xd6\x41\x9c\x43\x24\xa4\x61\x21\x03\x01\xf4\x32\x5e\x49\xde\xa3\x73\xd4\x77\x96\x00\x2d\x93\xa1\xd9\x1a\x2c\x48\x4d\xd8\x81\x78\x02\x42\x05\x25\xbb\xbd\x3d\x96\x40\x05\x2a\xd8\x0d\x8f\xd3\x09\x7a\xfd\xc8\xb3\xa5\x1d\x99\xf3\xfa\x7f\xe7\xe5\xb6\xc7\x3a\x6c\x6d\x3b\xe4\x4d\x0d\x7b\xc9\xb9\x10\x38\x8c\x0f\x4e\xc9\x62\x3c\xb7\x12\x08\x2c\xf4\x3a\x09\x6d\x12\xc7\x84\x53\xfe\x77\x47\xfb\x77\x38",
+ .ilen = 128,
+ .result = "\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x99\xca\xd8\x5f\x45\xca\x40\x94\x2d\x0d\x4d\x5e\x95\x0a\xde\x22"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x88\x7f\x6b\xe8\x7c\x55\xd3\x04\x84\x9b\xbc\x01\x17\xdf\x99\x47\x03\xfc\x04\xc4\x84\x80\xe0\x30\xbe\x14\x2c\xd6\x41",
+ .ilen = 32,
+ .result = "\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x8b\xbe\x14\x52\x72\xe7\xc2\xd9\xa1\x89\x1a\x3a\xb0\x98\x3d\x9d"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x88\x7f\x6b\xe8\x7c\x55\xd3\x04\x84\x9b\xbc\x01\x17\xdf\x99\x47\x03\xfc\x04\xc4\x84\x80\xe0\x30\xbe\x14\x2c\xd6\x41\xe3\xbc\xdb\x5b\xe1\x21\x03\x01\x8b\xcd\xa1\xb6\x5e\xa3\x73\xd4\x08\x69\xff\xd2\x13\xa1\xd9\x1a\x53\xb7\xb2\x27\x01\x78\x02\x42",
+ .ilen = 64,
+ .result = "\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x3b\x41\x86\x19\x13\xa8\xf6\xde\x7f\x61\xe2\x25\x63\x1b\xc3\x82"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff",
+ .alen = 16,
+ .input = "\x25\x6d\x40\x88\x7f\x6b\xe8\x7c\x55\xd3\x04\x84\x9b\xbc\x01\x17\xdf\x99\x47\x03\xfc\x04\xc4\x84\x80\xe0\x30\xbe\x14\x2c\xd6\x41\xe3\xbc\xdb\x5b\xe1\x21\x03\x01\x8b\xcd\xa1\xb6\x5e\xa3\x73\xd4\x08\x69\xff\xd2\x13\xa1\xd9\x1a\x53\xb7\xb2\x27\x01\x78\x02\x42\x7a\xda\x44\x42\xbd\x96\x40\x05\x55\x27\xf2\x70\x53\x09\x7a\xfd\xb7\x4c\x5a\xe2\x19\xf3\xfa\x7f\x98\x1a\x49\x38\xba\x6c\x6d\x3b\x9b\xb2\xf2\x84\x49\xb9\x10\x38\xf3\xf0\xb1\x36\xe2\x3c\xb7\x12\x77\xd3\x0b\xc5\x89\x6d\x12\xc7\xfb\xac\x01\x88\xc7\xfb\x77\x38",
+ .ilen = 128,
+ .result = "\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x84\x28\xbc\xf0\x23\xec\x6b\xf3\x1f\xd9\xef\xb2\x03\xff\x08\x71"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00",
+ .alen = 16,
+ .input = "\xda\x92\xbf\x77\x80\x94\x17\x83\xaa\x2c\xfb\x7b\x64\x43\xfe\xe8\x20\x66\xb8\xfc\x03\xfb\x3b\x7b\x7f\x1f\xcf\x41\xeb\xd3\x29\xbe",
+ .ilen = 32,
+ .result = "\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\x13\x9f\xdf\x64\x74\xea\x24\xf5\x49\xb0\x75\x82\x5f\x2c\x76\x20"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00",
+ .alen = 16,
+ .input = "\xda\x92\xbf\x77\x80\x94\x17\x83\xaa\x2c\xfb\x7b\x64\x43\xfe\xe8\x20\x66\xb8\xfc\x03\xfb\x3b\x7b\x7f\x1f\xcf\x41\xeb\xd3\x29\xbe\x1c\x43\x24\xa4\x1e\xde\xfc\xfe\x74\x32\x5e\x49\xa1\x5c\x8c\x2b\xf7\x96\x00\x2d\xec\x5e\x26\xe5\xac\x48\x4d\xd8\xfe\x87\xfd\xbd",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xbb\xad\x8d\x86\x3b\x83\x5a\x8e\x86\x64\xfd\x1d\x45\x66\xb6\xb4"
+}, { /* wycheproof - misc */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\xee\x32\x00",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00",
+ .alen = 16,
+ .input = "\xda\x92\xbf\x77\x80\x94\x17\x83\xaa\x2c\xfb\x7b\x64\x43\xfe\xe8\x20\x66\xb8\xfc\x03\xfb\x3b\x7b\x7f\x1f\xcf\x41\xeb\xd3\x29\xbe\x1c\x43\x24\xa4\x1e\xde\xfc\xfe\x74\x32\x5e\x49\xa1\x5c\x8c\x2b\xf7\x96\x00\x2d\xec\x5e\x26\xe5\xac\x48\x4d\xd8\xfe\x87\xfd\xbd\x85\x25\xbb\xbd\x42\x69\xbf\xfa\xaa\xd8\x0d\x8f\xac\xf6\x85\x02\x48\xb3\xa5\x1d\xe6\x0c\x05\x80\x67\xe5\xb6\xc7\x45\x93\x92\xc4\x64\x4d\x0d\x7b\xb6\x46\xef\xc7\x0c\x0f\x4e\xc9\x1d\xc3\x48\xed\x88\x2c\xf4\x3a\x76\x92\xed\x38\x04\x53\xfe\x77\x38\x04\x88\xc7",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\xff\xff\xff\xff\x00\x00\x00\x00\x42\xf2\x35\x42\x97\x84\x9a\x51\x1d\x53\xe5\x57\x17\x72\xf7\x1f"
+}, { /* wycheproof - checking for int overflows */
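+	/*
+	 * These "int overflow" vectors use a 64-byte all-0xff AAD and
+	 * 128-byte messages whose ciphertext also comes out as all 0xff;
+	 * the sizes and values are presumably picked by Wycheproof to flush
+	 * out integer overflows in length and block-counter arithmetic.
+	 */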
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x30\x30\x30\x30\x30\x30\x30\x30\x00\x02\x50\x6e",
+ .nlen = 12,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\xd4\x50\x0b\xf0\x09\x49\x35\x51\xc3\x80\xad\xf5\x2c\x57\x3a\x69\xdf\x7e\x8b\x76\x24\x63\x33\x0f\xac\xc1\x6a\x57\x26\xbe\x71\x90\xc6\x3c\x5a\x1c\x92\x65\x84\xa0\x96\x75\x68\x28\xdc\xdc\x64\xac\xdf\x96\x3d\x93\x1b\xf1\xda\xe2\x38\xf3\xf1\x57\x22\x4a\xc4\xb5\x42\xd7\x85\xb0\xdd\x84\xdb\x6b\xe3\xbc\x5a\x36\x63\xe8\x41\x49\xff\xbe\xd0\x9e\x54\xf7\x8f\x16\xa8\x22\x3b\x24\xcb\x01\x9f\x58\xb2\x1b\x0e\x55\x1e\x7a\xa0\x73\x27\x62\x95\x51\x37\x6c\xcb\xc3\x93\x76\x71\xa0\x62\x9b\xd9\x5c\x99\x15\xc7\x85\x55\x77\x1e\x7a",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x0b\x30\x0d\x8d\xa5\x6c\x21\x85\x75\x52\x79\x55\x3c\x4c\x82\xca"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x30\x30\x30\x30\x30\x30\x30\x30\x00\x03\x18\xa5",
+ .nlen = 12,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\x7d\xe8\x7f\x67\x29\x94\x52\x75\xd0\x65\x5d\xa4\xc7\xfd\xe4\x56\x9e\x16\xf1\x11\xb5\xeb\x26\xc2\x2d\x85\x9e\x3f\xf8\x22\xec\xed\x3a\x6d\xd9\xa6\x0f\x22\x95\x7f\x7b\x7c\x85\x7e\x88\x22\xeb\x9f\xe0\xb8\xd7\x02\x21\x41\xf2\xd0\xb4\x8f\x4b\x56\x12\xd3\x22\xa8\x8d\xd0\xfe\x0b\x4d\x91\x79\x32\x4f\x7c\x6c\x9e\x99\x0e\xfb\xd8\x0e\x5e\xd6\x77\x58\x26\x49\x8b\x1e\xfe\x0f\x71\xa0\xf3\xec\x5b\x29\xcb\x28\xc2\x54\x0a\x7d\xcd\x51\xb7\xda\xae\xe0\xff\x4a\x7f\x3a\xc1\xee\x54\xc2\x9e\xe4\xc1\x70\xde\x40\x8f\x66\x69\x21\x94",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xc5\x78\xe2\xaa\x44\xd3\x09\xb7\xb6\xa5\x19\x3b\xdc\x61\x18\xf5"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x00\x00\x00\x00\x00\x07\xb4\xf0",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\x1b\x99\x6f\x9a\x3c\xcc\x67\x85\xde\x22\xff\x5b\x8a\xdd\x95\x02\xce\x03\xa0\xfa\xf5\x99\x2a\x09\x52\x2c\xdd\x12\x06\xd2\x20\xb8\xf8\xbd\x07\xd1\xf1\xf5\xa1\xbd\x9a\x71\xd1\x1c\x7f\x57\x9b\x85\x58\x18\xc0\x8d\x4d\xe0\x36\x39\x31\x83\xb7\xf5\x90\xb3\x35\xae\xd8\xde\x5b\x57\xb1\x3c\x5f\xed\xe2\x44\x1c\x3e\x18\x4a\xa9\xd4\x6e\x61\x59\x85\x06\xb3\xe1\x1c\x43\xc6\x2c\xbc\xac\xec\xed\x33\x19\x08\x75\xb0\x12\x21\x8b\x19\x30\xfb\x7c\x38\xec\x45\xac\x11\xc3\x53\xd0\xcf\x93\x8d\xcc\xb9\xef\xad\x8f\xed\xbe\x46\xda\xa5",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x4b\x0b\xda\x8a\xd0\x43\x83\x0d\x83\x19\xab\x82\xc5\x0c\x76\x63"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x00\x00\x00\x00\x00\x20\xfb\x66",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\x86\xcb\xac\xae\x4d\x3f\x74\xae\x01\x21\x3e\x05\x51\xcc\x15\x16\x0e\xa1\xbe\x84\x08\xe3\xd5\xd7\x4f\x01\x46\x49\x95\xa6\x9e\x61\x76\xcb\x9e\x02\xb2\x24\x7e\xd2\x99\x89\x2f\x91\x82\xa4\x5c\xaf\x4c\x69\x40\x56\x11\x76\x6e\xdf\xaf\xdc\x28\x55\x19\xea\x30\x48\x0c\x44\xf0\x5e\x78\x1e\xac\xf8\xfc\xec\xc7\x09\x0a\xbb\x28\xfa\x5f\xd5\x85\xac\x8c\xda\x7e\x87\x72\xe5\x94\xe4\xce\x6c\x88\x32\x81\x93\x2e\x0f\x89\xf8\x77\xa1\xf0\x4d\x9c\x32\xb0\x6c\xf9\x0b\x0e\x76\x2b\x43\x0c\x4d\x51\x7c\x97\x10\x70\x68\xf4\x98\xef\x7f",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x4b\xc9\x8f\x72\xc4\x94\xc2\xa4\x3c\x2b\x15\xa1\x04\x3f\x1c\xfa"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x00\x00\x00\x00\x00\x38\xbb\x90",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\xfa\xb1\xcd\xdf\x4f\xe1\x98\xef\x63\xad\xd8\x81\xd6\xea\xd6\xc5\x76\x37\xbb\xe9\x20\x18\xca\x7c\x0b\x96\xfb\xa0\x87\x1e\x93\x2d\xb1\xfb\xf9\x07\x61\xbe\x25\xdf\x8d\xfa\xf9\x31\xce\x57\x57\xe6\x17\xb3\xd7\xa9\xf0\xbf\x0f\xfe\x5d\x59\x1a\x33\xc1\x43\xb8\xf5\x3f\xd0\xb5\xa1\x96\x09\xfd\x62\xe5\xc2\x51\xa4\x28\x1a\x20\x0c\xfd\xc3\x4f\x28\x17\x10\x40\x6f\x4e\x37\x62\x54\x46\xff\x6e\xf2\x24\x91\x3d\xeb\x0d\x89\xaf\x33\x71\x28\xe3\xd1\x55\xd1\x6d\x3e\xc3\x24\x60\x41\x43\x21\x43\xe9\xab\x3a\x6d\x2c\xcc\x2f\x4d\x62",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf7\xe9\xe1\x51\xb0\x25\x33\xc7\x46\x58\xbf\xc7\x73\x7c\x68\x0d"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x00\x00\x00\x00\x00\x70\x48\x4a",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\x22\x72\x02\xbe\x7f\x35\x15\xe9\xd1\xc0\x2e\xea\x2f\x19\x50\xb6\x48\x1b\x04\x8a\x4c\x91\x50\x6c\xb4\x0d\x50\x4e\x6c\x94\x9f\x82\xd1\x97\xc2\x5a\xd1\x7d\xc7\x21\x65\x11\x25\x78\x2a\xc7\xa7\x12\x47\xfe\xae\xf3\x2f\x1f\x25\x0c\xe4\xbb\x8f\x79\xac\xaa\x17\x9d\x45\xa7\xb0\x54\x5f\x09\x24\x32\x5e\xfa\x87\xd5\xe4\x41\xd2\x84\x78\xc6\x1f\x22\x23\xee\x67\xc3\xb4\x1f\x43\x94\x53\x5e\x2a\x24\x36\x9a\x2e\x16\x61\x3c\x45\x94\x90\xc1\x4f\xb1\xd7\x55\xfe\x53\xfb\xe1\xee\x45\xb1\xb2\x1f\x71\x62\xe2\xfc\xaa\x74\x2a\xbe\xfd",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x79\x5b\xcf\xf6\x47\xc5\x53\xc2\xe4\xeb\x6e\x0e\xaf\xd9\xe0\x4e"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x00\x00\x00\x00\x00\x93\x2f\x40",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\xfa\xe5\x83\x45\xc1\x6c\xb0\xf5\xcc\x53\x7f\x2b\x1b\x34\x69\xc9\x69\x46\x3b\x3e\xa7\x1b\xcf\x6b\x98\xd6\x69\xa8\xe6\x0e\x04\xfc\x08\xd5\xfd\x06\x9c\x36\x26\x38\xe3\x40\x0e\xf4\xcb\x24\x2e\x27\xe2\x24\x5e\x68\xcb\x9e\xc5\x83\xda\x53\x40\xb1\x2e\xdf\x42\x3b\x73\x26\xad\x20\xfe\xeb\x57\xda\xca\x2e\x04\x67\xa3\x28\x99\xb4\x2d\xf8\xe5\x6d\x84\xe0\x06\xbc\x8a\x7a\xcc\x73\x1e\x7c\x1f\x6b\xec\xb5\x71\x9f\x70\x77\xf0\xd4\xf4\xc6\x1a\xb1\x1e\xba\xc1\x00\x18\x01\xce\x33\xc4\xe4\xa7\x7d\x83\x1d\x3c\xe3\x4e\x84\x10\xe1",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x19\x46\xd6\x53\x96\x0f\x94\x7a\x74\xd3\xe8\x09\x3c\xf4\x85\x02"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30\x30",
+ .nonce = "\x00\x00\x00\x00\x00\xe2\x93\x35",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\xeb\xb2\x16\xdd\xd7\xca\x70\x92\x15\xf5\x03\xdf\x9c\xe6\x3c\x5c\xd2\x19\x4e\x7d\x90\x99\xe8\xa9\x0b\x2a\xfa\xad\x5e\xba\x35\x06\x99\x25\xa6\x03\xfd\xbc\x34\x1a\xae\xd4\x15\x05\xb1\x09\x41\xfa\x38\x56\xa7\xe2\x47\xb1\x04\x07\x09\x74\x6c\xfc\x20\x96\xca\xa6\x31\xb2\xff\xf4\x1c\x25\x05\x06\xd8\x89\xc1\xc9\x06\x71\xad\xe8\x53\xee\x63\x94\xc1\x91\x92\xa5\xcf\x37\x10\xd1\x07\x30\x99\xe5\xbc\x94\x65\x82\xfc\x0f\xab\x9f\x54\x3c\x71\x6a\xe2\x48\x6a\x86\x83\xfd\xca\x39\xd2\xe1\x4f\x23\xd0\x0a\x58\x26\x64\xf4\xec\xb1",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x36\xc3\x00\x29\x85\xdd\x21\xba\xf8\x95\xd6\x33\x57\x3f\x12\xc0"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x00\x0e\xf7\xd5",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\x40\x8a\xe6\xef\x1c\x7e\xf0\xfb\x2c\x2d\x61\x08\x16\xfc\x78\x49\xef\xa5\x8f\x78\x27\x3f\x5f\x16\x6e\xa6\x5f\x81\xb5\x75\x74\x7d\x03\x5b\x30\x40\xfe\xde\x1e\xb9\x45\x97\x88\x66\x97\x88\x40\x8e\x00\x41\x3b\x3e\x37\x6d\x15\x2d\x20\x4a\xa2\xb7\xa8\x35\x58\xfc\xd4\x8a\x0e\xf7\xa2\x6b\x1c\xd6\xd3\x5d\x23\xb3\xf5\xdf\xe0\xca\x77\xa4\xce\x32\xb9\x4a\xbf\x83\xda\x2a\xef\xca\xf0\x68\x38\x08\x79\xe8\x9f\xb0\xa3\x82\x95\x95\xcf\x44\xc3\x85\x2a\xe2\xcc\x66\x2b\x68\x9f\x93\x55\xd9\xc1\x83\x80\x1f\x6a\xcc\x31\x3f\x89\x07",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x65\x14\x51\x8e\x0a\x26\x41\x42\xe0\xb7\x35\x1f\x96\x7f\xc2\xae"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x00\x3d\xfc\xe4",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\x0a\x0a\x24\x49\x9b\xca\xde\x58\xcf\x15\x76\xc3\x12\xac\xa9\x84\x71\x8c\xb4\xcc\x7e\x01\x53\xf5\xa9\x01\x58\x10\x85\x96\x44\xdf\xc0\x21\x17\x4e\x0b\x06\x0a\x39\x74\x48\xde\x8b\x48\x4a\x86\x03\xbe\x68\x0a\x69\x34\xc0\x90\x6f\x30\xdd\x17\xea\xe2\xd4\xc5\xfa\xa7\x77\xf8\xca\x53\x37\x0e\x08\x33\x1b\x88\xc3\x42\xba\xc9\x59\x78\x7b\xbb\x33\x93\x0e\x3b\x56\xbe\x86\xda\x7f\x2a\x6e\xb1\xf9\x40\x89\xd1\xd1\x81\x07\x4d\x43\x02\xf8\xe0\x55\x2d\x0d\xe1\xfa\xb3\x06\xa2\x1b\x42\xd4\xc3\xba\x6e\x6f\x0c\xbc\xc8\x1e\x87\x7a",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x4c\x19\x4d\xa6\xa9\x9f\xd6\x5b\x40\xe9\xca\xd7\x98\xf4\x4b\x19"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x01\x84\x86\xa8",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\x4a\x0a\xaf\xf8\x49\x47\x29\x18\x86\x91\x70\x13\x40\xf3\xce\x2b\x8a\x78\xee\xd3\xa0\xf0\x65\x99\x4b\x72\x48\x4e\x79\x91\xd2\x5c\x29\xaa\x07\x5e\xb1\xfc\x16\xde\x93\xfe\x06\x90\x58\x11\x2a\xb2\x84\xa3\xed\x18\x78\x03\x26\xd1\x25\x8a\x47\x22\x2f\xa6\x33\xd8\xb2\x9f\x3b\xd9\x15\x0b\x23\x9b\x15\x46\xc2\xbb\x9b\x9f\x41\x0f\xeb\xea\xd3\x96\x00\x0e\xe4\x77\x70\x15\x32\xc3\xd0\xf5\xfb\xf8\x95\xd2\x80\x19\x6d\x2f\x73\x7c\x5e\x9f\xec\x50\xd9\x2b\xb0\xdf\x5d\x7e\x51\x3b\xe5\xb8\xea\x97\x13\x10\xd5\xbf\x16\xba\x7a\xee",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xc8\xae\x77\x88\xcd\x28\x74\xab\xc1\x38\x54\x1e\x11\xfd\x05\x87"
+}, { /* wycheproof - checking for int overflows */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff",
+ .alen = 64,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x78\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x9f\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x9c\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\x47\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\xd4\xd2\x06\x61\x6f\x92\x93\xf6\x5b\x45\xdb\xbc\x74\xe7\xc2\xed\xfb\xcb\xbf\x1c\xfb\x67\x9b\xb7\x39\xa5\x86\x2d\xe2\xbc\xb9\x37\xf7\x4d\x5b\xf8\x67\x1c\x5a\x8a\x50\x92\xf6\x1d\x54\xc9\xaa\x5b",
+ .ilen = 128,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x93\x3a\x51\x63\xc7\xf6\x23\x68\x32\x7b\x3f\xbc\x10\x36\xc9\x43"
+}, { /* wycheproof - special case tag */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b",
+ .nlen = 12,
+ .assoc = "\x85\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xa6\x90\x2f\xcb\xc8\x83\xbb\xc1\x80\xb2\x56\xae\x34\xad\x7f\x00",
+ .alen = 32,
+ .input = "\x9a\x49\xc4\x0f\x8b\x48\xd7\xc6\x6d\x1d\xb4\xe5\x3f\x20\xf2\xdd\x4a\xaa\x24\x1d\xda\xb2\x6b\x5b\xc0\xe2\x18\xb7\x2c\x33\x90\xf2\xdf\x3e\xbd\x01\x76\x70\x44\x19\x97\x2b\xcd\xbc\x6b\xbc\xb3\xe4\xe7\x4a\x71\x52\x8e\xf5\x12\x63\xce\x24\xe0\xd5\x75\xe0\xe4\x4d",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b\x0c\x0d\x0e\x0f"
+}, { /* wycheproof - special case tag */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b",
+ .nlen = 12,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x24\x7e\x50\x64\x2a\x1c\x0a\x2f\x8f\x77\x21\x96\x09\xdb\xa9\x58",
+ .alen = 32,
+ .input = "\x9a\x49\xc4\x0f\x8b\x48\xd7\xc6\x6d\x1d\xb4\xe5\x3f\x20\xf2\xdd\x4a\xaa\x24\x1d\xda\xb2\x6b\x5b\xc0\xe2\x18\xb7\x2c\x33\x90\xf2\xdf\x3e\xbd\x01\x76\x70\x44\x19\x97\x2b\xcd\xbc\x6b\xbc\xb3\xe4\xe7\x4a\x71\x52\x8e\xf5\x12\x63\xce\x24\xe0\xd5\x75\xe0\xe4\x4d",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
+}, { /* wycheproof - special case tag */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b",
+ .nlen = 12,
+ .assoc = "\x7c\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xd9\xe7\x2c\x06\x4a\xc8\x96\x1f\x3f\xa5\x85\xe0\xe2\xab\xd6\x00",
+ .alen = 32,
+ .input = "\x9a\x49\xc4\x0f\x8b\x48\xd7\xc6\x6d\x1d\xb4\xe5\x3f\x20\xf2\xdd\x4a\xaa\x24\x1d\xda\xb2\x6b\x5b\xc0\xe2\x18\xb7\x2c\x33\x90\xf2\xdf\x3e\xbd\x01\x76\x70\x44\x19\x97\x2b\xcd\xbc\x6b\xbc\xb3\xe4\xe7\x4a\x71\x52\x8e\xf5\x12\x63\xce\x24\xe0\xd5\x75\xe0\xe4\x4d",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff"
+}, { /* wycheproof - special case tag */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b",
+ .nlen = 12,
+ .assoc = "\x65\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x95\xaf\x0f\x4d\x0b\x68\x6e\xae\xcc\xca\x43\x07\xd5\x96\xf5\x02",
+ .alen = 32,
+ .input = "\x9a\x49\xc4\x0f\x8b\x48\xd7\xc6\x6d\x1d\xb4\xe5\x3f\x20\xf2\xdd\x4a\xaa\x24\x1d\xda\xb2\x6b\x5b\xc0\xe2\x18\xb7\x2c\x33\x90\xf2\xdf\x3e\xbd\x01\x76\x70\x44\x19\x97\x2b\xcd\xbc\x6b\xbc\xb3\xe4\xe7\x4a\x71\x52\x8e\xf5\x12\x63\xce\x24\xe0\xd5\x75\xe0\xe4\x4d",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80\x00\x00\x00\x80"
+}, { /* wycheproof - special case tag */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b",
+ .nlen = 12,
+ .assoc = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x85\x40\xb4\x64\x35\x77\x07\xbe\x3a\x39\xd5\x5c\x34\xf8\xbc\xb3",
+ .alen = 32,
+ .input = "\x9a\x49\xc4\x0f\x8b\x48\xd7\xc6\x6d\x1d\xb4\xe5\x3f\x20\xf2\xdd\x4a\xaa\x24\x1d\xda\xb2\x6b\x5b\xc0\xe2\x18\xb7\x2c\x33\x90\xf2\xdf\x3e\xbd\x01\x76\x70\x44\x19\x97\x2b\xcd\xbc\x6b\xbc\xb3\xe4\xe7\x4a\x71\x52\x8e\xf5\x12\x63\xce\x24\xe0\xd5\x75\xe0\xe4\x4d",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f\xff\xff\xff\x7f"
+}, { /* wycheproof - special case tag */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b",
+ .nlen = 12,
+ .assoc = "\x4f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x66\x23\xd9\x90\xb8\x98\xd8\x30\xd2\x12\xaf\x23\x83\x33\x07\x01",
+ .alen = 32,
+ .input = "\x9a\x49\xc4\x0f\x8b\x48\xd7\xc6\x6d\x1d\xb4\xe5\x3f\x20\xf2\xdd\x4a\xaa\x24\x1d\xda\xb2\x6b\x5b\xc0\xe2\x18\xb7\x2c\x33\x90\xf2\xdf\x3e\xbd\x01\x76\x70\x44\x19\x97\x2b\xcd\xbc\x6b\xbc\xb3\xe4\xe7\x4a\x71\x52\x8e\xf5\x12\x63\xce\x24\xe0\xd5\x75\xe0\xe4\x4d",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00"
+}, { /* wycheproof - special case tag */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b",
+ .nlen = 12,
+ .assoc = "\x83\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x16\xd0\x9f\x17\x78\x72\x11\xb7\xd4\x84\xe0\x24\xf8\x97\x01",
+ .alen = 32,
+ .input = "\x9a\x49\xc4\x0f\x8b\x48\xd7\xc6\x6d\x1d\xb4\xe5\x3f\x20\xf2\xdd\x4a\xaa\x24\x1d\xda\xb2\x6b\x5b\xc0\xe2\x18\xb7\x2c\x33\x90\xf2\xdf\x3e\xbd\x01\x76\x70\x44\x19\x97\x2b\xcd\xbc\x6b\xbc\xb3\xe4\xe7\x4a\x71\x52\x8e\xf5\x12\x63\xce\x24\xe0\xd5\x75\xe0\xe4\x4d",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x00\x52\x35\xd2\xa9\x19\xf2\x8d\x3d\xb7\x66\x4a\x34\xae\x6b\x44\x4d\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x5b\x8b\x94\x50\x9e\x2b\x74\xa3\x6d\x34\x6e\x33\xd5\x72\x65\x9b\xa9\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\x83\xdc\xe9\xf3\x07\x3e\xfa\xdb\x7d\x23\xb8\x7a\xce\x35\x16\x8c",
+ .ilen = 80,
+ .result = "\x00\x39\xe2\xfd\x2f\xd3\x12\x14\x9e\x98\x98\x80\x88\x48\x13\xe7\xca\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3b\x0e\x86\x9a\xaa\x8e\xa4\x96\x32\xff\xff\x37\xb9\xe8\xce\x00\xca\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3b\x0e\x86\x9a\xaa\x8e\xa4\x96\x32\xff\xff\x37\xb9\xe8\xce\x00\xa5\x19\xac\x1a\x35\xb4\xa5\x77\x87\x51\x0a\xf7\x8d\x8d\x20\x0a"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xd3\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xe5\xda\x78\x76\x6f\xa1\x92\x90\xc0\x31\xf7\x52\x08\x50\x67\x45\xae\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x49\x6d\xde\xb0\x55\x09\xc6\xef\xff\xab\x75\xeb\x2d\xf4\xab\x09\x76\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x01\x49\xef\x50\x4b\x71\xb1\x20\xca\x4f\xf3\x95\x19\xc2\xc2\x10",
+ .ilen = 96,
+ .result = "\xd3\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x62\x18\xb2\x7f\x83\xb8\xb4\x66\x02\xf6\xe1\xd8\x34\x20\x7b\x02\xce\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2a\x64\x16\xce\xdb\x1c\xdd\x29\x6e\xf5\xd7\xd6\x92\xda\xff\x02\xce\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2a\x64\x16\xce\xdb\x1c\xdd\x29\x6e\xf5\xd7\xd6\x92\xda\xff\x02\x30\x2f\xe8\x2a\xb0\xa0\x9a\xf6\x44\x00\xd0\x15\xae\x83\xd9\xcc"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xe9\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x6d\xf1\x39\x4e\xdc\x53\x9b\x5b\x3a\x09\x57\xbe\x0f\xb8\x59\x46\x80\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\xd1\x76\x9f\xe8\x06\xbb\xfe\xb6\xf5\x90\x95\x0f\x2e\xac\x9e\x0a\x58\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x99\x52\xae\x08\x18\xc3\x89\x79\xc0\x74\x13\x71\x1a\x9a\xf7\x13",
+ .ilen = 96,
+ .result = "\xe9\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xea\x33\xf3\x47\x30\x4a\xbd\xad\xf8\xce\x41\x34\x33\xc8\x45\x01\xe0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xb2\x7f\x57\x96\x88\xae\xe5\x70\x64\xce\x37\x32\x91\x82\xca\x01\xe0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xb2\x7f\x57\x96\x88\xae\xe5\x70\x64\xce\x37\x32\x91\x82\xca\x01\x98\xa7\xe8\x36\xe0\xee\x4d\x02\x35\x00\xd0\x55\x7e\xc2\xcb\xe0"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x64\xf9\x0f\x5b\x26\x92\xb8\x60\xd4\x59\x6f\xf4\xb3\x40\x2c\x5c\x00\xb9\xbb\x53\x70\x7a\xa6\x67\xd3\x56\xfe\x50\xc7\x19\x96\x94\x03\x35\x61\xe7\xca\xca\x6d\x94\x1d\xc3\xcd\x69\x14\xad\x69\x04",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xe3\x3b\xc5\x52\xca\x8b\x9e\x96\x16\x9e\x79\x7e\x8f\x30\x30\x1b\x60\x3c\xa9\x99\x44\xdf\x76\x52\x8c\x9d\x6f\x54\xab\x83\x3d\x0f\x60\x3c\xa9\x99\x44\xdf\x76\x52\x8c\x9d\x6f\x54\xab\x83\x3d\x0f\x6a\xb8\xdc\xe2\xc5\x9d\xa4\x73\x71\x30\xb0\x25\x2f\x68\xa8\xd8"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x68\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xb0\x8f\x25\x67\x5b\x9b\xcb\xf6\xe3\x84\x07\xde\x2e\xc7\x5a\x47\x9f\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x2d\x2a\xf7\xcd\x6b\x08\x05\x01\xd3\x1b\xa5\x4f\xb2\xeb\x75\x96\x47\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x65\x0e\xc6\x2d\x75\x70\x72\xce\xe6\xff\x23\x31\x86\xdd\x1c\x8f",
+ .ilen = 96,
+ .result = "\x68\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x37\x4d\xef\x6e\xb7\x82\xed\x00\x21\x43\x11\x54\x12\xb7\x46\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x4e\x23\x3f\xb3\xe5\x1d\x1e\xc7\x42\x45\x07\x72\x0d\xc5\x21\x9d\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x4e\x23\x3f\xb3\xe5\x1d\x1e\xc7\x42\x45\x07\x72\x0d\xc5\x21\x9d\x04\x4d\xea\x60\x88\x80\x41\x2b\xfd\xff\xcf\x35\x57\x9e\x9b\x26"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x6d\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xa1\x61\xb5\xab\x04\x09\x00\x62\x9e\xfe\xff\x78\xd7\xd8\x6b\x45\x9f\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\xc6\xf8\x07\x8c\xc8\xef\x12\xa0\xff\x65\x7d\x6d\x08\xdb\x10\xb8\x47\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x8e\xdc\x36\x6c\xd6\x97\x65\x6f\xca\x81\xfb\x13\x3c\xed\x79\xa1",
+ .ilen = 96,
+ .result = "\x6d\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x26\xa3\x7f\xa2\xe8\x10\x26\x94\x5c\x39\xe9\xf2\xeb\xa8\x77\x02\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xa5\xf1\xcf\xf2\x46\xfa\x09\x66\x6e\x3b\xdf\x50\xb7\xf5\x44\xb3\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xa5\xf1\xcf\xf2\x46\xfa\x09\x66\x6e\x3b\xdf\x50\xb7\xf5\x44\xb3\x1e\x6b\xea\x63\x14\x54\x2e\x2e\xf9\xff\xcf\x45\x0b\x2e\x98\x2b"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xfc\x01\xb8\x91\xe5\xf0\xf9\x12\x8d\x7d\x1c\x57\x91\x92\xb6\x98\x63\x41\x44\x15\xb6\x99\x68\x95\x9a\x72\x91\xb7\xa5\xaf\x13\x48\x60\xcd\x9e\xa1\x0c\x29\xa3\x66\x54\xe7\xa2\x8e\x76\x1b\xec\xd8",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x7b\xc3\x72\x98\x09\xe9\xdf\xe4\x4f\xba\x0a\xdd\xad\xe2\xaa\xdf\x03\xc4\x56\xdf\x82\x3c\xb8\xa0\xc5\xb9\x00\xb3\xc9\x35\xb8\xd3\x03\xc4\x56\xdf\x82\x3c\xb8\xa0\xc5\xb9\x00\xb3\xc9\x35\xb8\xd3\xed\x20\x17\xc8\xdb\xa4\x77\x56\x29\x04\x9d\x78\x6e\x3b\xce\xb1"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x6b\x6d\xc9\xd2\x1a\x81\x9e\x70\xb5\x77\xf4\x41\x37\xd3\xd6\xbd\x13\x35\xf5\xeb\x44\x49\x40\x77\xb2\x64\x49\xa5\x4b\x6c\x7c\x75\x10\xb9\x2f\x5f\xfe\xf9\x8b\x84\x7c\xf1\x7a\x9c\x98\xd8\x83\xe5",
+ .ilen = 64,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xec\xaf\x03\xdb\xf6\x98\xb8\x86\x77\xb0\xe2\xcb\x0b\xa3\xca\xfa\x73\xb0\xe7\x21\x70\xec\x90\x42\xed\xaf\xd8\xa1\x27\xf6\xd7\xee\x73\xb0\xe7\x21\x70\xec\x90\x42\xed\xaf\xd8\xa1\x27\xf6\xd7\xee\x07\x3f\x17\xcb\x67\x78\x64\x59\x25\x04\x9d\x88\x22\xcb\xca\xb6"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\xcb\x2b\x11\x06\xf8\x23\x4c\x5e\x99\xd4\xdb\x4c\x70\x48\xde\x32\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x16\xe9\x88\x4a\x11\x4f\x0e\x92\x66\xce\xa3\x88\x5f\xe3\x6b\x9f\xd6\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\xce\xbe\xf5\xe9\x88\x5a\x80\xea\x76\xd9\x75\xc1\x44\xa4\x18\x88",
+ .ilen = 80,
+ .result = "\xff\xa0\xfc\x3e\x80\x32\xc3\xd5\xfd\xb6\x2a\x11\xf0\x96\x30\x7d\xb5\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x76\x6c\x9a\x80\x25\xea\xde\xa7\x39\x05\x32\x8c\x33\x79\xc0\x04\xb5\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x76\x6c\x9a\x80\x25\xea\xde\xa7\x39\x05\x32\x8c\x33\x79\xc0\x04\x8b\x9b\xb4\xb4\x86\x12\x89\x65\x8c\x69\x6a\x83\x40\x15\x04\x05"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x6f\x9e\x70\xed\x3b\x8b\xac\xa0\x26\xe4\x6a\x5a\x09\x43\x15\x8d\x21\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x0c\x61\x2c\x5e\x8d\x89\xa8\x73\xdb\xca\xad\x5b\x73\x46\x42\x9b\xc5\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\xd4\x36\x51\xfd\x14\x9c\x26\x0b\xcb\xdd\x7b\x12\x68\x01\x31\x8c",
+ .ilen = 80,
+ .result = "\x6f\xf5\xa7\xc2\xbd\x41\x4c\x39\x85\xcb\x94\x90\xb5\xa5\x6d\x2e\xa6\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x6c\xe4\x3e\x94\xb9\x2c\x78\x46\x84\x01\x3c\x5f\x1f\xdc\xe9\x00\xa6\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x6c\xe4\x3e\x94\xb9\x2c\x78\x46\x84\x01\x3c\x5f\x1f\xdc\xe9\x00\x8b\x3b\xbd\x51\x64\x44\x59\x56\x8d\x81\xca\x1f\xa7\x2c\xe4\x04"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x41\x2b\x08\x0a\x3e\x19\xc1\x0d\x44\xa1\xaf\x1e\xab\xde\xb4\xce\x35\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x6b\x83\x94\x33\x09\x21\x48\x6c\xa1\x1d\x29\x1c\x3e\x97\xee\x9a\xd1\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\xb3\xd4\xe9\x90\x90\x34\xc6\x14\xb1\x0a\xff\x55\x25\xd0\x9d\x8d",
+ .ilen = 80,
+ .result = "\x41\x40\xdf\x25\xb8\xd3\x21\x94\xe7\x8e\x51\xd4\x17\x38\xcc\x6d\xb2\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x0b\x06\x86\xf9\x3d\x84\x98\x59\xfe\xd6\xb8\x18\x52\x0d\x45\x01\xb2\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x0b\x06\x86\xf9\x3d\x84\x98\x59\xfe\xd6\xb8\x18\x52\x0d\x45\x01\x86\xfb\xab\x2b\x4a\x94\xf4\x7a\xa5\x6f\x0a\xea\x65\xd1\x10\x08"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xb2\x47\xa7\x47\x23\x49\x1a\xac\xac\xaa\xd7\x09\xc9\x1e\x93\x2b\x31\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x9a\xde\x04\xe7\x5b\xb7\x01\xd9\x66\x06\x01\xb3\x47\x65\xde\x98\xd5\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\x42\x89\x79\x44\xc2\xa2\x8f\xa1\x76\x11\xd7\xfa\x5c\x22\xad\x8f",
+ .ilen = 80,
+ .result = "\xb2\x2c\x70\x68\xa5\x83\xfa\x35\x0f\x85\x29\xc3\x75\xf8\xeb\x88\xb6\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xfa\x5b\x16\x2d\x6f\x12\xd1\xec\x39\xcd\x90\xb7\x2b\xff\x75\x03\xb6\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xfa\x5b\x16\x2d\x6f\x12\xd1\xec\x39\xcd\x90\xb7\x2b\xff\x75\x03\xa0\x19\xac\x2e\xd6\x67\xe1\x7d\xa1\x6f\x0a\xfa\x19\x61\x0d\x0d"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x74\x0f\x9e\x49\xf6\x10\xef\xa5\x85\xb6\x59\xca\x6e\xd8\xb4\x99\x2d\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x41\x2d\x96\xaf\xbe\x80\xec\x3e\x79\xd4\x51\xb0\x0a\x2d\xb2\x9a\xc9\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\x99\x7a\xeb\x0c\x27\x95\x62\x46\x69\xc3\x87\xf9\x11\x6a\xc1\x8d",
+ .ilen = 80,
+ .result = "\x74\x64\x49\x66\x70\xda\x0f\x3c\x26\x99\xa7\x00\xd2\x3e\xcc\x3a\xaa\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x21\xa8\x84\x65\x8a\x25\x3c\x0b\x26\x1f\xc0\xb4\x66\xb7\x19\x01\xaa\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x21\xa8\x84\x65\x8a\x25\x3c\x0b\x26\x1f\xc0\xb4\x66\xb7\x19\x01\x73\x6e\x18\x18\x16\x96\xa5\x88\x9c\x31\x59\xfa\xab\xab\x20\xfd"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xad\xba\x5d\x10\x5b\xc8\xaa\x06\x2c\x23\x36\xcb\x88\x9d\xdb\xd5\x37\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x17\x7c\x5f\xfe\x28\x75\xf4\x68\xf6\xc2\x96\x57\x48\xf3\x59\x9a\xd3\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\xcf\x2b\x22\x5d\xb1\x60\x7a\x10\xe6\xd5\x40\x1e\x53\xb4\x2a\x8d",
+ .ilen = 80,
+ .result = "\xad\xd1\x8a\x3f\xdd\x02\x4a\x9f\x8f\x0c\xc8\x01\x34\x7b\xa3\x76\xb0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x77\xf9\x4d\x34\x1c\xd0\x24\x5d\xa9\x09\x07\x53\x24\x69\xf2\x01\xb0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x77\xf9\x4d\x34\x1c\xd0\x24\x5d\xa9\x09\x07\x53\x24\x69\xf2\x01\xba\xd5\x8f\x10\xa9\x1e\x6a\x88\x9a\xba\x32\xfd\x17\xd8\x33\x1a"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xfe\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xc0\x01\xed\xc5\xda\x44\x2e\x71\x9b\xce\x9a\xbe\x27\x3a\xf1\x44\xb4\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x48\x02\x5f\x41\xfa\x4e\x33\x6c\x78\x69\x57\xa2\xa7\xc4\x93\x0a\x6c\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x00\x26\x6e\xa1\xe4\x36\x44\xa3\x4d\x8d\xd1\xdc\x93\xf2\xfa\x13",
+ .ilen = 96,
+ .result = "\xfe\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x47\xc3\x27\xcc\x36\x5d\x08\x87\x59\x09\x8c\x34\x1b\x4a\xed\x03\xd4\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2b\x0b\x97\x3f\x74\x5b\x28\xaa\xe9\x37\xf5\x9f\x18\xea\xc7\x01\xd4\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2b\x0b\x97\x3f\x74\x5b\x28\xaa\xe9\x37\xf5\x9f\x18\xea\xc7\x01\xd6\x8c\xe1\x74\x07\x9a\xdd\x02\x8d\xd0\x5c\xf8\x14\x63\x04\x88"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xb5\x13\xb0\x6a\xb9\xac\x14\x43\x5a\xcb\x8a\xa3\xa3\x7a\xfd\xb6\x54\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x61\x95\x01\x93\xb1\xbf\x03\x11\xff\x11\x79\x89\xae\xd9\xa9\x99\xb0\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\xb9\xc2\x7c\x30\x28\xaa\x8d\x69\xef\x06\xaf\xc0\xb5\x9e\xda\x8e",
+ .ilen = 80,
+ .result = "\xb5\x78\x67\x45\x3f\x66\xf4\xda\xf9\xe4\x74\x69\x1f\x9c\x85\x15\xd3\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x01\x10\x13\x59\x85\x1a\xd3\x24\xa0\xda\xe8\x8d\xc2\x43\x02\x02\xd3\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x01\x10\x13\x59\x85\x1a\xd3\x24\xa0\xda\xe8\x8d\xc2\x43\x02\x02\xaa\x48\xa3\x88\x7d\x4b\x05\x96\x99\xc2\xfd\xf9\xc6\x78\x7e\x0a"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xd4\xf1\x09\xe8\x14\xce\xa8\x5a\x08\xc0\x11\xd8\x50\xdd\x1d\xcb\xcf\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x53\x40\xb8\x5a\x9a\xa0\x82\x96\xb7\x7a\x5f\xc3\x96\x1f\x66\x0f\x17\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x1b\x64\x89\xba\x84\xd8\xf5\x59\x82\x9e\xd9\xbd\xa2\x29\x0f\x16",
+ .ilen = 96,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x53\x33\xc3\xe1\xf8\xd7\x8e\xac\xca\x07\x07\x52\x6c\xad\x01\x8c\xaf\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x30\x49\x70\x24\x14\xb5\x99\x50\x26\x24\xfd\xfe\x29\x31\x32\x04\xaf\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x30\x49\x70\x24\x14\xb5\x99\x50\x26\x24\xfd\xfe\x29\x31\x32\x04\xb9\x36\xa8\x17\xf2\x21\x1a\xf1\x29\xe2\xcf\x16\x0f\xd4\x2b\xcb"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xdf\x4c\x62\x03\x2d\x41\x19\xb5\x88\x47\x7e\x99\x92\x5a\x56\xd9\xd6\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\xfa\x84\xf0\x64\x55\x36\x42\x1b\x2b\xb9\x24\x6e\xc2\x19\xed\x0b\x0e\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\xb2\xa0\xc1\x84\x4b\x4e\x35\xd4\x1e\x5d\xa2\x10\xf6\x2f\x84\x12",
+ .ilen = 96,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x58\x8e\xa8\x0a\xc1\x58\x3f\x43\x4a\x80\x68\x13\xae\x2a\x4a\x9e\xb6\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x99\x8d\x38\x1a\xdb\x23\x59\xdd\xba\xe7\x86\x53\x7d\x37\xb9\x00\xb6\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x99\x8d\x38\x1a\xdb\x23\x59\xdd\xba\xe7\x86\x53\x7d\x37\xb9\x00\x9f\x7a\xc4\x35\x1f\x6b\x91\xe6\x30\x97\xa7\x13\x11\x5d\x05\xbe"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x13\xf8\x0a\x00\x6d\xc1\xbb\xda\xd6\x39\xa9\x2f\xc7\xec\xa6\x55\xf7\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x63\x48\xb8\xfd\x29\xbf\x96\xd5\x63\xa5\x17\xe2\x7d\x7b\xfc\x0f\x2f\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x2b\x6c\x89\x1d\x37\xc7\xe1\x1a\x56\x41\x91\x9c\x49\x4d\x95\x16",
+ .ilen = 96,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x94\x3a\xc0\x09\x81\xd8\x9d\x2c\x14\xfe\xbf\xa5\xfb\x9c\xba\x12\x97\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x41\x70\x83\xa7\xaa\x8d\x13\xf2\xfb\xb5\xdf\xc2\x55\xa8\x04\x97\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x41\x70\x83\xa7\xaa\x8d\x13\xf2\xfb\xb5\xdf\xc2\x55\xa8\x04\x9a\x18\xa8\x28\x07\x02\x69\xf4\x47\x00\xd0\x09\xe7\x17\x1c\xc9"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x82\xe5\x9b\x45\x82\x91\x50\x38\xf9\x33\x81\x1e\x65\x2d\xc6\x6a\xfc\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\xb6\x71\xc8\xca\xc2\x70\xc2\x65\xa0\xac\x2f\x53\x57\x99\x88\x0a\x24\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\xfe\x55\xf9\x2a\xdc\x08\xb5\xaa\x95\x48\xa9\x2d\x63\xaf\xe1\x13",
+ .ilen = 96,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x05\x27\x51\x4c\x6e\x88\x76\xce\x3b\xf4\x97\x94\x59\x5d\xda\x2d\x9c\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xd5\x78\x00\xb4\x4c\x65\xd9\xa3\x31\xf2\x8d\x6e\xe8\xb7\xdc\x01\x9c\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xd5\x78\x00\xb4\x4c\x65\xd9\xa3\x31\xf2\x8d\x6e\xe8\xb7\xdc\x01\xb4\x36\xa8\x2b\x93\xd5\x55\xf7\x43\x00\xd0\x19\x9b\xa7\x18\xce"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xff\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\xf1\xd1\x28\x87\xb7\x21\x69\x86\xa1\x2d\x79\x09\x8b\x6d\xe6\x0f\xc0\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\xa7\xc7\x58\x99\xf3\xe6\x0a\xf1\xfc\xb6\xc7\x30\x7d\x87\x59\x0f\x18\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\xef\xe3\x69\x79\xed\x9e\x7d\x3e\xc9\x52\x41\x4e\x49\xb1\x30\x16",
+ .ilen = 96,
+ .result = "\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x76\x13\xe2\x8e\x5b\x38\x4f\x70\x63\xea\x6f\x83\xb7\x1d\xfa\x48\xa0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xc4\xce\x90\xe7\x7d\xf3\x11\x37\x6d\xe8\x65\x0d\xc2\xa9\x0d\x04\xa0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xc4\xce\x90\xe7\x7d\xf3\x11\x37\x6d\xe8\x65\x0d\xc2\xa9\x0d\x04\xce\x54\xa8\x2e\x1f\xa9\x42\xfa\x3f\x00\xd0\x29\x4f\x37\x15\xd3"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xcb\xf1\xda\x9e\x0b\xa9\x37\x73\x74\xe6\x9e\x1c\x0e\x60\x0c\xfc\x34\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\xbe\x3f\xa6\x6b\x6c\xe7\x80\x8a\xa3\xe4\x59\x49\xf9\x44\x64\x9f\xd0\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\x66\x68\xdb\xc8\xf5\xf2\x0e\xf2\xb3\xf3\x8f\x00\xe2\x03\x17\x88",
+ .ilen = 80,
+ .result = "\xcb\x9a\x0d\xb1\x8d\x63\xd7\xea\xd7\xc9\x60\xd6\xb2\x86\x74\x5f\xb3\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xde\xba\xb4\xa1\x58\x42\x50\xbf\xfc\x2f\xc8\x4d\x95\xde\xcf\x04\xb3\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xde\xba\xb4\xa1\x58\x42\x50\xbf\xfc\x2f\xc8\x4d\x95\xde\xcf\x04\x23\x83\xab\x0b\x79\x92\x05\x69\x9b\x51\x0a\xa7\x09\xbf\x31\xf1"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x8f\x27\x86\x94\xc4\xe9\xda\xeb\xd5\x8d\x3e\x5b\x96\x6e\x8b\x68\x42\x3d\x35\xf6\x13\xe6\xd9\x09\x3d\x38\xe9\x75\xc3\x8f\xe3\xb8\x06\x53\xe7\xa3\x31\x71\x88\x33\xac\xc3\xb9\xad\xff\x1c\x31\x98\xa6\xf6\x37\x81\x71\xea\xe4\x39\x6e\xa1\x5d\xc2\x40\xd1\xab\xf4\xde\x04\x9a\x00\xa8\x64\x06\x4b\xbc\xd4\x6f\xe4\xe4\x5b\x42\x8f",
+ .ilen = 80,
+ .result = "\x8f\x4c\x51\xbb\x42\x23\x3a\x72\x76\xa2\xc0\x91\x2a\x88\xf3\xcb\xc5\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x66\xd6\xf5\x69\x05\xd4\x58\x06\xf3\x08\x28\xa9\x93\x86\x9a\x03\xc5\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x66\xd6\xf5\x69\x05\xd4\x58\x06\xf3\x08\x28\xa9\x93\x86\x9a\x03\x8b\xfb\xab\x17\xa9\xe0\xb8\x74\x8b\x51\x0a\xe7\xd9\xfd\x23\x05"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xd5\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x9a\x22\xd7\x0a\x48\xe2\x4f\xdd\xcd\xd4\x41\x9d\xe6\x4c\x8f\x44\xfc\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x77\xb5\xc9\x07\xd9\xc9\xe1\xea\x51\x85\x1a\x20\x4a\xad\x9f\x0a\x24\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x3f\x91\xf8\xe7\xc7\xb1\x96\x25\x64\x61\x9c\x5e\x7e\x9b\xf6\x13",
+ .ilen = 96,
+ .result = "\xd5\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x1d\xe0\x1d\x03\xa4\xfb\x69\x2b\x0f\x13\x57\x17\xda\x3c\x93\x03\x9c\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x14\xbc\x01\x79\x57\xdc\xfa\x2c\xc0\xdb\xb8\x1d\xf5\x83\xcb\x01\x9c\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x14\xbc\x01\x79\x57\xdc\xfa\x2c\xc0\xdb\xb8\x1d\xf5\x83\xcb\x01\x49\xbc\x6e\x9f\xc5\x1c\x4d\x50\x30\x36\x64\x4d\x84\x27\x73\xd2"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\xdb\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x75\xd5\x64\x3a\xa5\xaf\x93\x4d\x8c\xce\x39\x2c\xc3\xee\xdb\x47\xc0\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\x60\x1b\x5a\xd2\x06\x7f\x28\x06\x6a\x8f\x32\x81\x71\x5b\xa8\x08\x18\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x28\x3f\x6b\x32\x18\x07\x5f\xc9\x5f\x6b\xb4\xff\x45\x6d\xc1\x11",
+ .ilen = 96,
+ .result = "\xdb\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf2\x17\xae\x33\x49\xb6\xb5\xbb\x4e\x09\x2f\xa6\xff\x9e\xc7\x00\xa0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x03\x12\x92\xac\x88\x6a\x33\xc0\xfb\xd1\x90\xbc\xce\x75\xfc\x03\xa0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x03\x12\x92\xac\x88\x6a\x33\xc0\xfb\xd1\x90\xbc\xce\x75\xfc\x03\x63\xda\x6e\xa2\x51\xf0\x39\x53\x2c\x36\x64\x5d\x38\xb7\x6f\xd7"
+}, { /* wycheproof - edge case intermediate sums in poly1305 */
+ .key = "\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x90\x91\x92\x93\x94\x95\x96\x97\x98\x99\x9a\x9b\x9c\x9d\x9e\x9f",
+ .nonce = "\x00\x00\x00\x00\x06\x4c\x2d\x52",
+ .nlen = 8,
+ .assoc = "\xff\xff\xff\xff",
+ .alen = 4,
+ .input = "\x93\x94\x28\xd0\x79\x35\x1f\x66\x5c\xd0\x01\x35\x43\x19\x87\x5c\x62\x48\x39\x60\x42\x16\xe4\x03\xeb\xcc\x6a\xf5\x59\xec\x8b\x43\x97\x7a\xed\x35\xcb\x5a\x2f\xca\xa0\x34\x6e\xfb\x93\x65\x54\x64\xd8\xc8\xc3\xfa\x1a\x9e\x47\x4a\xbe\x52\xd0\x2c\x81\x87\xe9\x0f\x4f\x2d\x90\x96\x52\x4f\xa1\xb2\xb0\x23\xb8\xb2\x88\x22\x27\x73\x90\xec\xf2\x1a\x04\xe6\x30\x85\x8b\xb6\x56\x52\xb5\xb1\x80\x16",
+ .ilen = 96,
+ .result = "\x93\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xe5\x8a\xf3\x69\xae\x0f\xc2\xf5\x29\x0b\x7c\x7f\x65\x9c\x97\x04\xf7\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xbb\xc1\x0b\x84\x94\x8b\x5c\x8c\x2f\x0c\x72\x11\x3e\xa9\xbd\x04\xf7\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xbb\xc1\x0b\x84\x94\x8b\x5c\x8c\x2f\x0c\x72\x11\x3e\xa9\xbd\x04\x73\xeb\x27\x24\xb5\xc4\x05\xf0\x4d\x00\xd0\xf1\x58\x40\xa1\xc1"
+} };
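+/*
+ * Illustrative only, not part of the original vector tables: a
+ * decryption selftest walking a table like the one below might look
+ * roughly like this. decrypt_helper() is a hypothetical wrapper that
+ * dispatches on v->nlen to whichever nonce-width variant applies; the
+ * real harness may differ. The layout assumptions mirror the fields
+ * above: .input/.ilen carry the ciphertext followed by the 16-byte
+ * Poly1305 tag, and .result holds the expected plaintext, so a
+ * successful decryption is v->ilen - 16 bytes long.
+ *
+ *	for (i = 0; i < ARRAY_SIZE(chacha20poly1305_dec_vectors); ++i) {
+ *		const struct chacha20poly1305_testvec *v =
+ *			&chacha20poly1305_dec_vectors[i];
+ *		bool ok = decrypt_helper(out, v->input, v->ilen,
+ *					 v->assoc, v->alen,
+ *					 v->nonce, v->nlen, v->key);
+ *		if (!ok || memcmp(out, v->result, v->ilen - 16))
+ *			pr_err("dec vector %zu failed\n", i);
+ *	}
+ */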
+static const struct chacha20poly1305_testvec chacha20poly1305_dec_vectors[] __initconst = { {
+ .key = "\x1c\x92\x40\xa5\xeb\x55\xd3\x8a\xf3\x33\x88\x86\x04\xf6\xb5\xf0\x47\x39\x17\xc1\x40\x2b\x80\x09\x9d\xca\x5c\xbc\x20\x70\x75\xc0",
+ .nonce = "\x01\x02\x03\x04\x05\x06\x07\x08",
+ .nlen = 8,
+ .assoc = "\xf3\x33\x88\x86\x00\x00\x00\x00\x00\x00\x4e\x91",
+ .alen = 12,
+ .input = "\x64\xa0\x86\x15\x75\x86\x1a\xf4\x60\xf0\x62\xc7\x9b\xe6\x43\xbd\x5e\x80\x5c\xfd\x34\x5c\xf3\x89\xf1\x08\x67\x0a\xc7\x6c\x8c\xb2\x4c\x6c\xfc\x18\x75\x5d\x43\xee\xa0\x9e\xe9\x4e\x38\x2d\x26\xb0\xbd\xb7\xb7\x3c\x32\x1b\x01\x00\xd4\xf0\x3b\x7f\x35\x58\x94\xcf\x33\x2f\x83\x0e\x71\x0b\x97\xce\x98\xc8\xa8\x4a\xbd\x0b\x94\x81\x14\xad\x17\x6e\x00\x8d\x33\xbd\x60\xf9\x82\xb1\xff\x37\xc8\x55\x97\x97\xa0\x6e\xf4\xf0\xef\x61\xc1\x86\x32\x4e\x2b\x35\x06\x38\x36\x06\x90\x7b\x6a\x7c\x02\xb0\xf9\xf6\x15\x7b\x53\xc8\x67\xe4\xb9\x16\x6c\x76\x7b\x80\x4d\x46\xa5\x9b\x52\x16\xcd\xe7\xa4\xe9\x90\x40\xc5\xa4\x04\x33\x22\x5e\xe2\x82\xa1\xb0\xa0\x6c\x52\x3e\xaf\x45\x34\xd7\xf8\x3f\xa1\x15\x5b\x00\x47\x71\x8c\xbc\x54\x6a\x0d\x07\x2b\x04\xb3\x56\x4e\xea\x1b\x42\x22\x73\xf5\x48\x27\x1a\x0b\xb2\x31\x60\x53"
+ "\xfa\x76\x99\x19\x55\xeb\xd6\x31\x59\x43\x4e\xce\xbb\x4e\x46\x6d\xae\x5a\x10\x73\xa6\x72\x76\x27\x09\x7a\x10\x49\xe6\x17\xd9\x1d\x36\x10\x94\xfa\x68\xf0\xff\x77\x98\x71\x30\x30\x5b\xea\xba\x2e\xda\x04\xdf\x99\x7b\x71\x4d\x6c\x6f\x2c\x29\xa6\xad\x5c\xb4\x02\x2b\x02\x70\x9b\xee\xad\x9d\x67\x89\x0c\xbb\x22\x39\x23\x36\xfe\xa1\x85\x1f\x38",
+ .ilen = 281,
+ .result = "\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x72\x65\x20\x64\x72\x61\x66\x74\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x76\x61\x6c\x69\x64\x20\x66\x6f\x72\x20\x61\x20\x6d\x61\x78\x69\x6d\x75\x6d\x20\x6f\x66\x20\x73\x69\x78\x20\x6d\x6f\x6e\x74\x68\x73\x20\x61\x6e\x64\x20\x6d\x61\x79\x20\x62\x65\x20\x75\x70\x64\x61\x74\x65\x64\x2c\x20\x72\x65\x70\x6c\x61\x63\x65\x64\x2c\x20\x6f\x72\x20\x6f\x62\x73\x6f\x6c\x65\x74\x65\x64\x20\x62\x79\x20\x6f\x74\x68\x65\x72\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x61\x74\x20\x61\x6e\x79\x20\x74\x69\x6d\x65\x2e\x20\x49\x74\x20\x69\x73\x20\x69\x6e\x61\x70\x70\x72\x6f\x70\x72\x69\x61\x74\x65\x20\x74\x6f\x20\x75\x73\x65\x20\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x73\x20\x72\x65\x66\x65\x72\x65"
+ "\x6e\x63\x65\x20\x6d\x61\x74\x65\x72\x69\x61\x6c\x20\x6f\x72\x20\x74\x6f\x20\x63\x69\x74\x65\x20\x74\x68\x65\x6d\x20\x6f\x74\x68\x65\x72\x20\x74\x68\x61\x6e\x20\x61\x73\x20\x2f\xe2\x80\x9c\x77\x6f\x72\x6b\x20\x69\x6e\x20\x70\x72\x6f\x67\x72\x65\x73\x73\x2e\x2f\xe2\x80\x9d"
+}, {
+ .key = "\x4c\xf5\x96\x83\x38\xe6\xae\x7f\x2d\x29\x25\x76\xd5\x75\x27\x86\x91\x9a\x27\x7a\xfb\x46\xc5\xef\x94\x81\x79\x57\x14\x59\x40\x68",
+ .nonce = "\xca\xbf\x33\x71\x32\x45\x77\x8e",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xea\xe0\x1e\x9e\x2c\x91\xaa\xe1\xdb\x5d\x99\x3f\x8a\xf7\x69\x92",
+ .ilen = 16,
+ .result = ""
+}, {
+ .key = "\x2d\xb0\x5d\x40\xc8\xed\x44\x88\x34\xd1\x13\xaf\x57\xa1\xeb\x3a\x2a\x80\x51\x36\xec\x5b\xbc\x08\x93\x84\x21\xb5\x13\x88\x3c\x0d",
+ .nonce = "\x3d\x86\xb5\x6b\xc8\xa3\x1f\x1d",
+ .nlen = 8,
+ .assoc = "\x33\x10\x41\x12\x1f\xf3\xd2\x6b",
+ .alen = 8,
+ .input = "\xdd\x6b\x3b\x82\xce\x5a\xbd\xd6\xa9\x35\x83\xd8\x8c\x3d\x85\x77",
+ .ilen = 16,
+ .result = ""
+}, {
+ .key = "\x4b\x28\x4b\xa3\x7b\xbe\xe9\xf8\x31\x80\x82\xd7\xd8\xe8\xb5\xa1\xe2\x18\x18\x8a\x9c\xfa\xa3\x3d\x25\x71\x3e\x40\xbc\x54\x7a\x3e",
+ .nonce = "\xd2\x32\x1f\x29\x28\xc6\xc4\xc4",
+ .nlen = 8,
+ .assoc = "\x6a\xe2\xad\x3f\x88\x39\x5a\x40",
+ .alen = 8,
+ .input = "\xb7\x1b\xb0\x73\x59\xb0\x84\xb2\x6d\x8e\xab\x94\x31\xa1\xae\xac\x89",
+ .ilen = 17,
+ .result = "\xa4"
+}, {
+ .key = "\x66\xca\x9c\x23\x2a\x4b\x4b\x31\x0e\x92\x89\x8b\xf4\x93\xc7\x87\x98\xa3\xd8\x39\xf8\xf4\xa7\x01\xc0\x2e\x0a\xa6\x7e\x5a\x78\x87",
+ .nonce = "\x20\x1c\xaa\x5f\x9c\xbf\x92\x30",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "\xbf\xe1\x5b\x0b\xdb\x6b\xf5\x5e\x6c\x5d\x84\x44\x39\x81\xc1\x9c\xac",
+ .ilen = 17,
+ .result = "\x2d"
+}, {
+ .key = "\x68\x7b\x8d\x8e\xe3\xc4\xdd\xae\xdf\x72\x7f\x53\x72\x25\x1e\x78\x91\xcb\x69\x76\x1f\x49\x93\xf9\x6f\x21\xcc\x39\x9c\xad\xb1\x01",
+ .nonce = "\xdf\x51\x84\x82\x42\x0c\x75\x9c",
+ .nlen = 8,
+ .assoc = "\x70\xd3\x33\xf3\x8b\x18\x0b",
+ .alen = 7,
+ .input = "\x8b\x06\xd3\x31\xb0\x93\x45\xb1\x75\x6e\x26\xf9\x67\xbc\x90\x15\x81\x2c\xb5\xf0\xc6\x2b\xc7\x8c\x56\xd1\xbf\x69\x6c\x07\xa0\xda\x65\x27\xc9\x90\x3d\xef\x4b\x11\x0f\x19\x07\xfd\x29\x92\xd9\xc8\xf7\x99\x2e\x4a\xd0\xb8\x2c\xdc\x93\xf5\x9e\x33\x78\xd1\x37\xc3\x66\xd7\x5e\xbc\x44\xbf\x53\xa5\xbc\xc4\xcb\x7b\x3a\x8e\x7f\x02\xbd\xbb\xe7\xca\xa6\x6c\x6b\x93\x21\x93\x10\x61\xe7\x69\xd0\x78\xf3\x07\x5a\x1a\x8f\x73\xaa\xb1\x4e\xd3\xda\x4f\xf3\x32\xe1\x66\x3e\x6c\xc6\x13\xba\x06\x5b\xfc\x6a\xe5\x6f\x60\xfb\x07\x40\xb0\x8c\x9d\x84\x43\x6b\xc1\xf7\x8d\x8d\x31\xf7\x7a\x39\x4d\x8f\x9a\xeb",
+ .ilen = 145,
+ .result = "\x33\x2f\x94\xc1\xa4\xef\xcc\x2a\x5b\xa6\xe5\x8f\x1d\x40\xf0\x92\x3c\xd9\x24\x11\xa9\x71\xf9\x37\x14\x99\xfa\xbe\xe6\x80\xde\x50\xc9\x96\xd4\xb0\xec\x9e\x17\xec\xd2\x5e\x72\x99\xfc\x0a\xe1\xcb\x48\xd2\x85\xdd\x2f\x90\xe0\x66\x3b\xe6\x20\x74\xbe\x23\x8f\xcb\xb4\xe4\xda\x48\x40\xa6\xd1\x1b\xc7\x42\xce\x2f\x0c\xa6\x85\x6e\x87\x37\x03\xb1\x7c\x25\x96\xa3\x05\xd8\xb0\xf4\xed\xea\xc2\xf0\x31\x98\x6c\xd1\x14\x25\xc0\xcb\x01\x74\xd0\x82\xf4\x36\xf5\x41\xd5\xdc\xca\xc5\xbb\x98\xfe\xfc\x69\x21\x70\xd8\xa4\x4b\xc8\xde\x8f"
+}, {
+ .key = "\x8d\xb8\x91\x48\xf0\xe7\x0a\xbd\xf9\x3f\xcd\xd9\xa0\x1e\x42\x4c\xe7\xde\x25\x3d\xa3\xd7\x05\x80\x8d\xf2\x82\xac\x44\x16\x51\x01",
+ .nonce = "\xde\x7b\xef\xc3\x65\x1b\x68\xb0",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x85\x04\xc2\xed\x8d\xfd\x97\x5c\xd2\xb7\xe2\xc1\x6b\xa3\xba\xf8\xc9\x50\xc3\xc6\xa5\xe3\xa4\x7c\xc3\x23\x49\x5e\xa9\xb9\x32\xeb\x8a\x7c\xca\xe5\xec\xfb\x7c\xc0\xcb\x7d\xdc\x2c\x9d\x92\x55\x21\x0a\xc8\x43\x63\x59\x0a\x31\x70\x82\x67\x41\x03\xf8\xdf\xf2\xac\xa7\x02\xd4\xd5\x8a\x2d\xc8\x99\x19\x66\xd0\xf6\x88\x2c\x77\xd9\xd4\x0d\x6c\xbd\x98\xde\xe7\x7f\xad\x7e\x8a\xfb\xe9\x4b\xe5\xf7\xe5\x50\xa0\x90\x3f\xd6\x22\x53\xe3\xfe\x1b\xcc\x79\x3b\xec\x12\x47\x52\xa7\xd6\x04\xe3\x52\xe6\x93\x90\x91\x32\x73\x79\xb8\xd0\x31\xde\x1f\x9f\x2f\x05\x38\x54\x2f\x35\x04\x39\xe0\xa7\xba\xc6\x52\xf6\x37\x65\x4c\x07\xa9\x7e\xb3\x21\x6f\x74\x8c\xc9\xde\xdb\x65\x1b\x9b\xaa\x60\xb1\x03\x30\x6b\xb2\x03\xc4\x1c\x04\xf8\x0f\x64\xaf\x46\xe4\x65\x99\x49\xe2\xea\xce\x78\x00\xd8\x8b\xd5\x2e\xcf\xfc\x40\x49\xe8\x58\xdc\x34\x9c"
+ "\x8c\x61\xbf\x0a\x8e\xec\x39\xa9\x30\x05\x5a\xd2\x56\x01\xc7\xda\x8f\x4e\xbb\x43\xa3\x3a\xf9\x15\x2a\xd0\xa0\x7a\x87\x34\x82\xfe\x8a\xd1\x2d\x5e\xc7\xbf\x04\x53\x5f\x3b\x36\xd4\x25\x5c\x34\x7a\x8d\xd5\x05\xce\x72\xca\xef\x7a\x4b\xbc\xb0\x10\x5c\x96\x42\x3a\x00\x98\xcd\x15\xe8\xb7\x53",
+ .ilen = 272,
+ .result = "\x9b\x18\xdb\xdd\x9a\x0f\x3e\xa5\x15\x17\xde\xdf\x08\x9d\x65\x0a\x67\x30\x12\xe2\x34\x77\x4b\xc1\xd9\xc6\x1f\xab\xc6\x18\x50\x17\xa7\x9d\x3c\xa6\xc5\x35\x8c\x1c\xc0\xa1\x7c\x9f\x03\x89\xca\xe1\xe6\xe9\xd4\xd3\x88\xdb\xb4\x51\x9d\xec\xb4\xfc\x52\xee\x6d\xf1\x75\x42\xc6\xfd\xbd\x7a\x8e\x86\xfc\x44\xb3\x4f\xf3\xea\x67\x5a\x41\x13\xba\xb0\xdc\xe1\xd3\x2a\x7c\x22\xb3\xca\xac\x6a\x37\x98\x3e\x1d\x40\x97\xf7\x9b\x1d\x36\x6b\xb3\x28\xbd\x60\x82\x47\x34\xaa\x2f\x7d\xe9\xa8\x70\x81\x57\xd4\xb9\x77\x0a\x9d\x29\xa7\x84\x52\x4f\xc2\x4a\x40\x3b\x3c\xd4\xc9\x2a\xdb\x4a\x53\xc4\xbe\x80\xe9\x51\x7f\x8f\xc7\xa2\xce\x82\x5c\x91\x1e\x74\xd9\xd0\xbd\xd5\xf3\xfd\xda\x4d\x25\xb4\xbb\x2d\xac\x2f\x3d\x71\x85\x7b\xcf\x3c\x7b\x3e\x0e\x22\x78\x0c\x29\xbf\xe4\xf4\x57\xb3\xcb\x49\xa0\xfc\x1e\x05\x4e\x16\xbc\xd5\xa8"
+ "\xa3\xee\x05\x35\xc6\x7c\xab\x60\x14\x55\x1a\x8e\xc5\x88\x5d\xd5\x81\xc2\x81\xa5\xc4\x60\xdb\xaf\x77\x91\xe1\xce\xa2\x7e\x7f\x42\xe3\xb0\x13\x1c\x1f\x25\x60\x21\xe2\x40\x5f\x99\xb7\x73\xec\x9b\x2b\xf0\x65\x11\xc8\xd0\x0a\x9f\xd3"
+}, {
+ .key = "\xf2\xaa\x4f\x99\xfd\x3e\xa8\x53\xc1\x44\xe9\x81\x18\xdc\xf5\xf0\x3e\x44\x15\x59\xe0\xc5\x44\x86\xc3\x91\xa8\x75\xc0\x12\x46\xba",
+ .nonce = "\x0e\x0d\x57\xbb\x7b\x40\x54\x02",
+ .nlen = 8,
+ .assoc = "",
+ .alen = 0,
+ .input = "\x14\xf6\x41\x37\xa6\xd4\x27\xcd\xdb\x06\x3e\x9a\x4e\xab\xd5\xb1\x1e\x6b\xd2\xbc\x11\xf4\x28\x93\x63\x54\xef\xbb\x5e\x1d\x3a\x1d\x37\x3c\x0a\x6c\x1e\xc2\xd1\x2c\xb5\xa3\xb5\x7b\xb8\x8f\x25\xa6\x1b\x61\x1c\xec\x28\x58\x26\xa4\xa8\x33\x28\x25\x5c\x45\x05\xe5\x6c\x99\xe5\x45\xc4\xa2\x03\x84\x03\x73\x1e\x8c\x49\xac\x20\xdd\x8d\xb3\xc4\xf5\xe7\x4f\xf1\xed\xa1\x98\xde\xa4\x96\xdd\x2f\xab\xab\x97\xcf\x3e\xd2\x9e\xb8\x13\x07\x28\x29\x19\xaf\xfd\xf2\x49\x43\xea\x49\x26\x91\xc1\x07\xd6\xbb\x81\x75\x35\x0d\x24\x7f\xc8\xda\xd4\xb7\xeb\xe8\x5c\x09\xa2\x2f\xdc\x28\x7d\x3a\x03\xfa\x94\xb5\x1d\x17\x99\x36\xc3\x1c\x18\x34\xe3\x9f\xf5\x55\x7c\xb0\x60\x9d\xff\xac\xd4\x61\xf2\xad\xf8\xce\xc7\xbe\x5c\xd2\x95\xa8\x4b\x77\x13\x19\x59\x26\xc9\xb7\x8f\x6a\xcb\x2d\x37\x91\xea\x92\x9c\x94\x5b\xda\x0b\xce\xfe\x30\x20"
+ "\xf8\x51\xad\xf2\xbe\xe7\xc7\xff\xb3\x33\x91\x6a\xc9\x1a\x41\xc9\x0f\xf3\x10\x0e\xfd\x53\xff\x6c\x16\x52\xd9\xf3\xf7\x98\x2e\xc9\x07\x31\x2c\x0c\x72\xd7\xc5\xc6\x08\x2a\x7b\xda\xbd\x7e\x02\xea\x1a\xbb\xf2\x04\x27\x61\x28\x8e\xf5\x04\x03\x1f\x4c\x07\x55\x82\xec\x1e\xd7\x8b\x2f\x65\x56\xd1\xd9\x1e\x3c\xe9\x1f\x5e\x98\x70\x38\x4a\x8c\x49\xc5\x43\xa0\xa1\x8b\x74\x9d\x4c\x62\x0d\x10\x0c\xf4\x6c\x8f\xe0\xaa\x9a\x8d\xb7\xe0\xbe\x4c\x87\xf1\x98\x2f\xcc\xed\xc0\x52\x29\xdc\x83\xf8\xfc\x2c\x0e\xa8\x51\x4d\x80\x0d\xa3\xfe\xd8\x37\xe7\x41\x24\xfc\xfb\x75\xe3\x71\x7b\x57\x45\xf5\x97\x73\x65\x63\x14\x74\xb8\x82\x9f\xf8\x60\x2f\x8a\xf2\x4e\xf1\x39\xda\x33\x91\xf8\x36\xe0\x8d\x3f\x1f\x3b\x56\xdc\xa0\x8f\x3c\x9d\x71\x52\xa7\xb8\xc0\xa5\xc6\xa2\x73\xda\xf4\x4b\x74\x5b\x00\x3d\x99\xd7\x96\xba\xe6\xe1\xa6\x96\x38\xad"
+ "\xb3\xc0\xd2\xba\x91\x6b\xf9\x19\xdd\x3b\xbe\xbe\x9c\x20\x50\xba\xa1\xd0\xce\x11\xbd\x95\xd8\xd1\xdd\x33\x85\x74\xdc\xdb\x66\x76\x44\xdc\x03\x74\x48\x35\x98\xb1\x18\x47\x94\x7d\xff\x62\xe4\x58\x78\xab\xed\x95\x36\xd9\x84\x91\x82\x64\x41\xbb\x58\xe6\x1c\x20\x6d\x15\x6b\x13\x96\xe8\x35\x7f\xdc\x40\x2c\xe9\xbc\x8a\x4f\x92\xec\x06\x2d\x50\xdf\x93\x5d\x65\x5a\xa8\xfc\x20\x50\x14\xa9\x8a\x7e\x1d\x08\x1f\xe2\x99\xd0\xbe\xfb\x3a\x21\x9d\xad\x86\x54\xfd\x0d\x98\x1c\x5a\x6f\x1f\x9a\x40\xcd\xa2\xff\x6a\xf1\x54",
+ .ilen = 528,
+ .result = "\xc3\x09\x94\x62\xe6\x46\x2e\x10\xbe\x00\xe4\xfc\xf3\x40\xa3\xe2\x0f\xc2\x8b\x28\xdc\xba\xb4\x3c\xe4\x21\x58\x61\xcd\x8b\xcd\xfb\xac\x94\xa1\x45\xf5\x1c\xe1\x12\xe0\x3b\x67\x21\x54\x5e\x8c\xaa\xcf\xdb\xb4\x51\xd4\x13\xda\xe6\x83\x89\xb6\x92\xe9\x21\x76\xa4\x93\x7d\x0e\xfd\x96\x36\x03\x91\x43\x5c\x92\x49\x62\x61\x7b\xeb\x43\x89\xb8\x12\x20\x43\xd4\x47\x06\x84\xee\x47\xe9\x8a\x73\x15\x0f\x72\xcf\xed\xce\x96\xb2\x7f\x21\x45\x76\xeb\x26\x28\x83\x6a\xad\xaa\xa6\x81\xd8\x55\xb1\xa3\x85\xb3\x0c\xdf\xf1\x69\x2d\x97\x05\x2a\xbc\x7c\x7b\x25\xf8\x80\x9d\x39\x25\xf3\x62\xf0\x66\x5e\xf4\xa0\xcf\xd8\xfd\x4f\xb1\x1f\x60\x3a\x08\x47\xaf\xe1\xf6\x10\x77\x09\xa7\x27\x8f\x9a\x97\x5a\x26\xfa\xfe\x41\x32\x83\x10\xe0\x1d\xbf\x64\x0d\xf4\x1c\x32\x35\xe5\x1b\x36\xef\xd4\x4a\x93\x4d\x00\x7c\xec\x02\x07\x8b\x5d\x7d"
+ "\x1b\x0e\xd1\xa6\xa5\x5d\x7d\x57\x88\xa8\xcc\x81\xb4\x86\x4e\xb4\x40\xe9\x1d\xc3\xb1\x24\x3e\x7f\xcc\x8a\x24\x9b\xdf\x6d\xf0\x39\x69\x3e\x4c\xc0\x96\xe4\x13\xda\x90\xda\xf4\x95\x66\x8b\x17\x17\xfe\x39\x43\x25\xaa\xda\xa0\x43\x3c\xb1\x41\x02\xa3\xf0\xa7\x19\x59\xbc\x1d\x7d\x6c\x6d\x91\x09\x5c\xb7\x5b\x01\xd1\x6f\x17\x21\x97\xbf\x89\x71\xa5\xb0\x6e\x07\x45\xfd\x9d\xea\x07\xf6\x7a\x9f\x10\x18\x22\x30\x73\xac\xd4\x6b\x72\x44\xed\xd9\x19\x9b\x2d\x4a\x41\xdd\xd1\x85\x5e\x37\x19\xed\xd2\x15\x8f\x5e\x91\xdb\x33\xf2\xe4\xdb\xff\x98\xfb\xa3\xb5\xca\x21\x69\x08\xe7\x8a\xdf\x90\xff\x3e\xe9\x20\x86\x3c\xe9\xfc\x0b\xfe\x5c\x61\xaa\x13\x92\x7f\x7b\xec\xe0\x6d\xa8\x23\x22\xf6\x6b\x77\xc4\xfe\x40\x07\x3b\xb6\xf6\x8e\x5f\xd4\xb9\xb7\x0f\x21\x04\xef\x83\x63\x91\x69\x40\xa3\x48\x5c\xd2\x60\xf9\x4f\x6c\x47\x8b\x3b"
+ "\xb1\x9f\x8e\xee\x16\x8a\x13\xfc\x46\x17\xc3\xc3\x32\x56\xf8\x3c\x85\x3a\xb6\x3e\xaa\x89\x4f\xb3\xdf\x38\xfd\xf1\xe4\x3a\xc0\xe6\x58\xb5\x8f\xc5\x29\xa2\x92\x4a\xb6\xa0\x34\x7f\xab\xb5\x8a\x90\xa1\xdb\x4d\xca\xb6\x2c\x41\x3c\xf7\x2b\x21\xc3\xfd\xf4\x17\x5c\xb5\x33\x17\x68\x2b\x08\x30\xf3\xf7\x30\x3c\x96\xe6\x6a\x20\x97\xe7\x4d\x10\x5f\x47\x5f\x49\x96\x09\xf0\x27\x91\xc8\xf8\x5a\x2e\x79\xb5\xe2\xb8\xe8\xb9\x7b\xd5\x10\xcb\xff\x5d\x14\x73\xf3"
+}, {
+ .key = "\xea\xbc\x56\x99\xe3\x50\xff\xc5\xcc\x1a\xd7\xc1\x57\x72\xea\x86\x5b\x89\x88\x61\x3d\x2f\x9b\xb2\xe7\x9c\xec\x74\x6e\x3e\xf4\x3b",
+ .nonce = "\xef\x2d\x63\xee\x6b\x80\x8b\x78",
+ .nlen = 8,
+ .assoc = "\x5a\x27\xff\xeb\xdf\x84\xb2\x9e\xef",
+ .alen = 9,
+ .input = "\xfd\x81\x8d\xd0\x3d\xb4\xd5\xdf\xd3\x42\x47\x5a\x6d\x19\x27\x66\x4b\x2e\x0c\x27\x9c\x96\x4c\x72\x02\xa3\x65\xc3\xb3\x6f\x2e\xbd\x63\x8a\x4a\x5d\x29\xa2\xd0\x28\x48\xc5\x3d\x98\xa3\xbc\xe0\xbe\x3b\x3f\xe6\x8a\xa4\x7f\x53\x06\xfa\x7f\x27\x76\x72\x31\xa1\xf5\xd6\x0c\x52\x47\xba\xcd\x4f\xd7\xeb\x05\x48\x0d\x7c\x35\x4a\x09\xc9\x76\x71\x02\xa3\xfb\xb7\x1a\x65\xb7\xed\x98\xc6\x30\x8a\x00\xae\xa1\x31\xe5\xb5\x9e\x6d\x62\xda\xda\x07\x0f\x38\x38\xd3\xcb\xc1\xb0\xad\xec\x72\xec\xb1\xa2\x7b\x59\xf3\x3d\x2b\xef\xcd\x28\x5b\x83\xcc\x18\x91\x88\xb0\x2e\xf9\x29\x31\x18\xf9\x4e\xe9\x0a\x91\x92\x9f\xae\x2d\xad\xf4\xe6\x1a\xe2\xa4\xee\x47\x15\xbf\x83\x6e\xd7\x72\x12\x3b\x2d\x24\xe9\xb2\x55\xcb\x3c\x10\xf0\x24\x8a\x4a\x02\xea\x90\x25\xf0\xb4\x79\x3a\xef\x6e\xf5\x52\xdf\xb0\x0a\xcd\x24\x1c\xd3\x2e\x22\x74\xea"
+ "\x21\x6f\xe9\xbd\xc8\x3e\x36\x5b\x19\xf1\xca\x99\x0a\xb4\xa7\x52\x1a\x4e\xf2\xad\x8d\x56\x85\xbb\x64\x89\xba\x26\xf9\xc7\xe1\x89\x19\x22\x77\xc3\xa8\xfc\xff\xad\xfe\xb9\x48\xae\x12\x30\x9f\x19\xfb\x1b\xef\x14\x87\x8a\x78\x71\xf3\xf4\xb7\x00\x9c\x1d\xb5\x3d\x49\x00\x0c\x06\xd4\x50\xf9\x54\x45\xb2\x5b\x43\xdb\x6d\xcf\x1a\xe9\x7a\x7a\xcf\xfc\x8a\x4e\x4d\x0b\x07\x63\x28\xd8\xe7\x08\x95\xdf\xa6\x72\x93\x2e\xbb\xa0\x42\x89\x16\xf1\xd9\x0c\xf9\xa1\x16\xfd\xd9\x03\xb4\x3b\x8a\xf5\xf6\xe7\x6b\x2e\x8e\x4c\x3d\xe2\xaf\x08\x45\x03\xff\x09\xb6\xeb\x2d\xc6\x1b\x88\x94\xac\x3e\xf1\x9f\x0e\x0e\x2b\xd5\x00\x4d\x3f\x3b\x53\xae\xaf\x1c\x33\x5f\x55\x6e\x8d\xaf\x05\x7a\x10\x34\xc9\xf4\x66\xcb\x62\x12\xa6\xee\xe8\x1c\x5d\x12\x86\xdb\x6f\x1c\x33\xc4\x1c\xda\x82\x2d\x3b\x59\xfe\xb1\xa4\x59\x41\x86\xd0\xef\xae\xfb\xda\x6d\x11"
+ "\xb8\xca\xe9\x6e\xff\xf7\xa9\xd9\x70\x30\xfc\x53\xe2\xd7\xa2\x4e\xc7\x91\xd9\x07\x06\xaa\xdd\xb0\x59\x28\x1d\x00\x66\xc5\x54\xc2\xfc\x06\xda\x05\x90\x52\x1d\x37\x66\xee\xf0\xb2\x55\x8a\x5d\xd2\x38\x86\x94\x9b\xfc\x10\x4c\xa1\xb9\x64\x3e\x44\xb8\x5f\xb0\x0c\xec\xe0\xc9\xe5\x62\x75\x3f\x09\xd5\xf5\xd9\x26\xba\x9e\xd2\xf4\xb9\x48\x0a\xbc\xa2\xd6\x7c\x36\x11\x7d\x26\x81\x89\xcf\xa4\xad\x73\x0e\xee\xcc\x06\xa9\xdb\xb1\xfd\xfb\x09\x7f\x90\x42\x37\x2f\xe1\x9c\x0f\x6f\xcf\x43\xb5\xd9\x90\xe1\x85\xf5\xa8\xae",
+ .ilen = 529,
+ .result = "\xe6\xc3\xdb\x63\x55\x15\xe3\x5b\xb7\x4b\x27\x8b\x5a\xdd\xc2\xe8\x3a\x6b\xd7\x81\x96\x35\x97\xca\xd7\x68\xe8\xef\xce\xab\xda\x09\x6e\xd6\x8e\xcb\x55\xb5\xe1\xe5\x57\xfd\xc4\xe3\xe0\x18\x4f\x85\xf5\x3f\x7e\x4b\x88\xc9\x52\x44\x0f\xea\xaf\x1f\x71\x48\x9f\x97\x6d\xb9\x6f\x00\xa6\xde\x2b\x77\x8b\x15\xad\x10\xa0\x2b\x7b\x41\x90\x03\x2d\x69\xae\xcc\x77\x7c\xa5\x9d\x29\x22\xc2\xea\xb4\x00\x1a\xd2\x7a\x98\x8a\xf9\xf7\x82\xb0\xab\xd8\xa6\x94\x8d\x58\x2f\x01\x9e\x00\x20\xfc\x49\xdc\x0e\x03\xe8\x45\x10\xd6\xa8\xda\x55\x10\x9a\xdf\x67\x22\x8b\x43\xab\x00\xbb\x02\xc8\xdd\x7b\x97\x17\xd7\x1d\x9e\x02\x5e\x48\xde\x8e\xcf\x99\x07\x95\x92\x3c\x5f\x9f\xc5\x8a\xc0\x23\xaa\xd5\x8c\x82\x6e\x16\x92\xb1\x12\x17\x07\xc3\xfb\x36\xf5\x6c\x35\xd6\x06\x1f\x9f\xa7\x94\xa2\x38\x63\x9c\xb0\x71\xb3\xa5\xd2\xd8\xba\x9f\x08\x01"
+ "\xb3\xff\x04\x97\x73\x45\x1b\xd5\xa9\x9c\x80\xaf\x04\x9a\x85\xdb\x32\x5b\x5d\x1a\xc1\x36\x28\x10\x79\xf1\x3c\xbf\x1a\x41\x5c\x4e\xdf\xb2\x7c\x79\x3b\x7a\x62\x3d\x4b\xc9\x9b\x2a\x2e\x7c\xa2\xb1\x11\x98\xa7\x34\x1a\x00\xf3\xd1\xbc\x18\x22\xba\x02\x56\x62\x31\x10\x11\x6d\xe0\x54\x9d\x40\x1f\x26\x80\x41\xca\x3f\x68\x0f\x32\x1d\x0a\x8e\x79\xd8\xa4\x1b\x29\x1c\x90\x8e\xc5\xe3\xb4\x91\x37\x9a\x97\x86\x99\xd5\x09\xc5\xbb\xa3\x3f\x21\x29\x82\x14\x5c\xab\x25\xfb\xf2\x4f\x58\x26\xd4\x83\xaa\x66\x89\x67\x7e\xc0\x49\xe1\x11\x10\x7f\x7a\xda\x29\x04\xff\xf0\xcb\x09\x7c\x9d\xfa\x03\x6f\x81\x09\x31\x60\xfb\x08\xfa\x74\xd3\x64\x44\x7c\x55\x85\xec\x9c\x6e\x25\xb7\x6c\xc5\x37\xb6\x83\x87\x72\x95\x8b\x9d\xe1\x69\x5c\x31\x95\x42\xa6\x2c\xd1\x36\x47\x1f\xec\x54\xab\xa2\x1c\xd8\x00\xcc\xbc\x0d\x65\xe2\x67\xbf\xbc\xea\xee"
+ "\x9e\xe4\x36\x95\xbe\x73\xd9\xa6\xd9\x0f\xa0\xcc\x82\x76\x26\xad\x5b\x58\x6c\x4e\xab\x29\x64\xd3\xd9\xa9\x08\x8c\x1d\xa1\x4f\x80\xd8\x3f\x94\xfb\xd3\x7b\xfc\xd1\x2b\xc3\x21\xeb\xe5\x1c\x84\x23\x7f\x4b\xfa\xdb\x34\x18\xa2\xc2\xe5\x13\xfe\x6c\x49\x81\xd2\x73\xe7\xe2\xd7\xe4\x4f\x4b\x08\x6e\xb1\x12\x22\x10\x9d\xac\x51\x1e\x17\xd9\x8a\x0b\x42\x88\x16\x81\x37\x7c\x6a\xf7\xef\x2d\xe3\xd9\xf8\x5f\xe0\x53\x27\x74\xb9\xe2\xd6\x1c\x80\x2c\x52\x65"
+}, {
+ .key = "\x47\x11\xeb\x86\x2b\x2c\xab\x44\x34\xda\x7f\x57\x03\x39\x0c\xaf\x2c\x14\xfd\x65\x23\xe9\x8e\x74\xd5\x08\x68\x08\xe7\xb4\x72\xd7",
+ .nonce = "\xdb\x92\x0f\x7f\x17\x54\x0c\x30",
+ .nlen = 8,
+ .assoc = "\xd2\xa1\x70\xdb\x7a\xf8\xfa\x27\xba\x73\x0f\xbf\x3d\x1e\x82\xb2",
+ .alen = 16,
+ .input = "\xe5\x26\xa4\x3d\xbd\x33\xd0\x4b\x6f\x05\xa7\x6e\x12\x7a\xd2\x74\xa6\xdd\xbd\x95\xeb\xf9\xa4\xf1\x59\x93\x91\x70\xd9\xfe\x9a\xcd\x53\x1f\x3a\xab\xa6\x7c\x9f\xa6\x9e\xbd\x99\xd9\xb5\x97\x44\xd5\x14\x48\x4d\x9d\xc0\xd0\x05\x96\xeb\x4c\x78\x55\x09\x08\x01\x02\x30\x90\x7b\x96\x7a\x7b\x5f\x30\x41\x24\xce\x68\x61\x49\x86\x57\x82\xdd\x53\x1c\x51\x28\x2b\x53\x6e\x2d\xc2\x20\x4c\xdd\x8f\x65\x10\x20\x50\xdd\x9d\x50\xe5\x71\x40\x53\x69\xfc\x77\x48\x11\xb9\xde\xa4\x8d\x58\xe4\xa6\x1a\x18\x47\x81\x7e\xfc\xdd\xf6\xef\xce\x2f\x43\x68\xd6\x06\xe2\x74\x6a\xad\x90\xf5\x37\xf3\x3d\x82\x69\x40\xe9\x6b\xa7\x3d\xa8\x1e\xd2\x02\x7c\xb7\x9b\xe4\xda\x8f\x95\x06\xc5\xdf\x73\xa3\x20\x9a\x49\xde\x9c\xbc\xee\x14\x3f\x81\x5e\xf8\x3b\x59\x3c\xe1\x68\x12\x5a\x3a\x76\x3a\x3f\xf7\x87\x33\x0a\x01\xb8\xd4\xed\xb6\xbe\x94\x5e"
+ "\x70\x40\x56\x67\x1f\x50\x44\x19\xce\x82\x70\x10\x87\x13\x20\x0b\x4c\x5a\xb6\xf6\xa7\xae\x81\x75\x01\x81\xe6\x4b\x57\x7c\xdd\x6d\xf8\x1c\x29\x32\xf7\xda\x3c\x2d\xf8\x9b\x25\x6e\x00\xb4\xf7\x2f\xf7\x04\xf7\xa1\x56\xac\x4f\x1a\x64\xb8\x47\x55\x18\x7b\x07\x4d\xbd\x47\x24\x80\x5d\xa2\x70\xc5\xdd\x8e\x82\xd4\xeb\xec\xb2\x0c\x39\xd2\x97\xc1\xcb\xeb\xf4\x77\x59\xb4\x87\xef\xcb\x43\x2d\x46\x54\xd1\xa7\xd7\x15\x99\x0a\x43\xa1\xe0\x99\x33\x71\xc1\xed\xfe\x72\x46\x33\x8e\x91\x08\x9f\xc8\x2e\xca\xfa\xdc\x59\xd5\xc3\x76\x84\x9f\xa3\x37\x68\xc3\xf0\x47\x2c\x68\xdb\x5e\xc3\x49\x4c\xe8\x92\x85\xe2\x23\xd3\x3f\xad\x32\xe5\x2b\x82\xd7\x8f\x99\x0a\x59\x5c\x45\xd9\xb4\x51\x52\xc2\xae\xbf\x80\xcf\xc9\xc9\x51\x24\x2a\x3b\x3a\x4d\xae\xeb\xbd\x22\xc3\x0e\x0f\x59\x25\x92\x17\xe9\x74\xc7\x8b\x70\x70\x36\x55\x95\x75\x4b"
+ "\xad\x61\x2b\x09\xbc\x82\xf2\x6e\x94\x43\xae\xc3\xd5\xcd\x8e\xfe\x5b\x9a\x88\x43\x01\x75\xb2\x23\x09\xf7\x89\x83\xe7\xfa\xf9\xb4\x9b\xf8\xef\xbd\x1c\x92\xc1\xda\x7e\xfe\x05\xba\x5a\xcd\x07\x6a\x78\x9e\x5d\xfb\x11\x2f\x79\x38\xb6\xc2\x5b\x6b\x51\xb4\x71\xdd\xf7\x2a\xe4\xf4\x72\x76\xad\xc2\xdd\x64\x5d\x79\xb6\xf5\x7a\x77\x20\x05\x3d\x30\x06\xd4\x4c\x0a\x2c\x98\x5a\xb9\xd4\x98\xa9\x3f\xc6\x12\xea\x3b\x4b\xc5\x79\x64\x63\x6b\x09\x54\x3b\x14\x27\xba\x99\x80\xc8\x72\xa8\x12\x90\x29\xba\x40\x54\x97\x2b\x7b\xfe\xeb\xcd\x01\x05\x44\x72\xdb\x99\xe4\x61\xc9\x69\xd6\xb9\x28\xd1\x05\x3e\xf9\x0b\x49\x0a\x49\xe9\x8d\x0e\xa7\x4a\x0f\xaf\x32\xd0\xe0\xb2\x3a\x55\x58\xfe\x5c\x28\x70\x51\x23\xb0\x7b\x6a\x5f\x1e\xb8\x17\xd7\x94\x15\x8f\xee\x20\xc7\x42\x25\x3e\x9a\x14\xd7\x60\x72\x39\x47\x48\xa9\xfe\xdd\x47\x0a\xb1\xe6\x60"
+ "\x28\x8c\x11\x68\xe1\xff\xd7\xce\xc8\xbe\xb3\xfe\x27\x30\x09\x70\xd7\xfa\x02\x33\x3a\x61\x2e\xc7\xff\xa4\x2a\xa8\x6e\xb4\x79\x35\x6d\x4c\x1e\x38\xf8\xee\xd4\x84\x4e\x6e\x28\xa7\xce\xc8\xc1\xcf\x80\x05\xf3\x04\xef\xc8\x18\x28\x2e\x8d\x5e\x0c\xdf\xb8\x5f\x96\xe8\xc6\x9c\x2f\xe5\xa6\x44\xd7\xe7\x99\x44\x0c\xec\xd7\x05\x60\x97\xbb\x74\x77\x58\xd5\xbb\x48\xde\x5a\xb2\x54\x7f\x0e\x46\x70\x6a\x6f\x78\xa5\x08\x89\x05\x4e\x7e\xa0\x69\xb4\x40\x60\x55\x77\x75\x9b\x19\xf2\xd5\x13\x80\x77\xf9\x4b\x3f\x1e\xee\xe6\x76\x84\x7b\x8c\xe5\x27\xa8\x0a\x91\x01\x68\x71\x8a\x3f\x06\xab\xf6\xa9\xa5\xe6\x72\x92\xe4\x67\xe2\xa2\x46\x35\x84\x55\x7d\xca\xa8\x85\xd0\xf1\x3f\xbe\xd7\x34\x64\xfc\xae\xe3\xe4\x04\x9f\x66\x02\xb9\x88\x10\xd9\xc4\x4c\x31\x43\x7a\x93\xe2\x9b\x56\x43\x84\xdc\xdc\xde\x1d\xa4\x02\x0e\xc2\xef\xc3\xf8\x78\xd1"
+ "\xb2\x6b\x63\x18\xc9\xa9\xe5\x72\xd8\xf3\xb9\xd1\x8a\xc7\x1a\x02\x27\x20\x77\x10\xe5\xc8\xd4\x4a\x47\xe5\xdf\x5f\x01\xaa\xb0\xd4\x10\xbb\x69\xe3\x36\xc8\xe1\x3d\x43\xfb\x86\xcd\xcc\xbf\xf4\x88\xe0\x20\xca\xb7\x1b\xf1\x2f\x5c\xee\xd4\xd3\xa3\xcc\xa4\x1e\x1c\x47\xfb\xbf\xfc\xa2\x41\x55\x9d\xf6\x5a\x5e\x65\x32\x34\x7b\x52\x8d\xd5\xd0\x20\x60\x03\xab\x3f\x8c\xd4\x21\xea\x2a\xd9\xc4\xd0\xd3\x65\xd8\x7a\x13\x28\x62\x32\x4b\x2c\x87\x93\xa8\xb4\x52\x45\x09\x44\xec\xec\xc3\x17\xdb\x9a\x4d\x5c\xa9\x11\xd4\x7d\xaf\x9e\xf1\x2d\xb2\x66\xc5\x1d\xed\xb7\xcd\x0b\x25\x5e\x30\x47\x3f\x40\xf4\xa1\xa0\x00\x94\x10\xc5\x6a\x63\x1a\xd5\x88\x92\x8e\x82\x39\x87\x3c\x78\x65\x58\x42\x75\x5b\xdd\x77\x3e\x09\x4e\x76\x5b\xe6\x0e\x4d\x38\xb2\xc0\xb8\x95\x01\x7a\x10\xe0\xfb\x07\xf2\xab\x2d\x8c\x32\xed\x2b\xc0\x46\xc2\xf5\x38"
+ "\x83\xf0\x17\xec\xc1\x20\x6a\x9a\x0b\x00\xa0\x98\x22\x50\x23\xd5\x80\x6b\xf6\x1f\xc3\xcc\x97\xc9\x24\x9f\xf3\xaf\x43\x14\xd5\xa0",
+ .ilen = 1040,
+ .result = "\x42\x93\xe4\xeb\x97\xb0\x57\xbf\x1a\x8b\x1f\xe4\x5f\x36\x20\x3c\xef\x0a\xa9\x48\x5f\x5f\x37\x22\x3a\xde\xe3\xae\xbe\xad\x07\xcc\xb1\xf6\xf5\xf9\x56\xdd\xe7\x16\x1e\x7f\xdf\x7a\x9e\x75\xb7\xc7\xbe\xbe\x8a\x36\x04\xc0\x10\xf4\x95\x20\x03\xec\xdc\x05\xa1\x7d\xc4\xa9\x2c\x82\xd0\xbc\x8b\xc5\xc7\x45\x50\xf6\xa2\x1a\xb5\x46\x3b\x73\x02\xa6\x83\x4b\x73\x82\x58\x5e\x3b\x65\x2f\x0e\xfd\x2b\x59\x16\xce\xa1\x60\x9c\xe8\x3a\x99\xed\x8d\x5a\xcf\xf6\x83\xaf\xba\xd7\x73\x73\x40\x97\x3d\xca\xef\x07\x57\xe6\xd9\x70\x0e\x95\xae\xa6\x8d\x04\xcc\xee\xf7\x09\x31\x77\x12\xa3\x23\x97\x62\xb3\x7b\x32\xfb\x80\x14\x48\x81\xc3\xe5\xea\x91\x39\x52\x81\xa2\x4f\xe4\xb3\x09\xff\xde\x5e\xe9\x58\x84\x6e\xf9\x3d\xdf\x25\xea\xad\xae\xe6\x9a\xd1\x89\x55\xd3\xde\x6c\x52\xdb\x70\xfe\x37\xce\x44\x0a\xa8\x25\x5f\x92\xc1\x33\x4a"
+ "\x4f\x9b\x62\x35\xff\xce\xc0\xa9\x60\xce\x52\x00\x97\x51\x35\x26\x2e\xb9\x36\xa9\x87\x6e\x1e\xcc\x91\x78\x53\x98\x86\x5b\x9c\x74\x7d\x88\x33\xe1\xdf\x37\x69\x2b\xbb\xf1\x4d\xf4\xd1\xf1\x39\x93\x17\x51\x19\xe3\x19\x1e\x76\x37\x25\xfb\x09\x27\x6a\xab\x67\x6f\x14\x12\x64\xe7\xc4\x07\xdf\x4d\x17\xbb\x6d\xe0\xe9\xb9\xab\xca\x10\x68\xaf\x7e\xb7\x33\x54\x73\x07\x6e\xf7\x81\x97\x9c\x05\x6f\x84\x5f\xd2\x42\xfb\x38\xcf\xd1\x2f\x14\x30\x88\x98\x4d\x5a\xa9\x76\xd5\x4f\x3e\x70\x6c\x85\x76\xd7\x01\xa0\x1a\xc8\x4e\xaa\xac\x78\xfe\x46\xde\x6a\x05\x46\xa7\x43\x0c\xb9\xde\xb9\x68\xfb\xce\x42\x99\x07\x4d\x0b\x3b\x5a\x30\x35\xa8\xf9\x3a\x73\xef\x0f\xdb\x1e\x16\x42\xc4\xba\xae\x58\xaa\xf8\xe5\x75\x2f\x1b\x15\x5c\xfd\x0a\x97\xd0\xe4\x37\x83\x61\x5f\x43\xa6\xc7\x3f\x38\x59\xe6\xeb\xa3\x90\xc3\xaa\xaa\x5a\xd3\x34\xd4"
+ "\x17\xc8\x65\x3e\x57\xbc\x5e\xdd\x9e\xb7\xf0\x2e\x5b\xb2\x1f\x8a\x08\x0d\x45\x91\x0b\x29\x53\x4f\x4c\x5a\x73\x56\xfe\xaf\x41\x01\x39\x0a\x24\x3c\x7e\xbe\x4e\x53\xf3\xeb\x06\x66\x51\x28\x1d\xbd\x41\x0a\x01\xab\x16\x47\x27\x47\x47\xf7\xcb\x46\x0a\x70\x9e\x01\x9c\x09\xe1\x2a\x00\x1a\xd8\xd4\x79\x9d\x80\x15\x8e\x53\x2a\x65\x83\x78\x3e\x03\x00\x07\x12\x1f\x33\x3e\x7b\x13\x37\xf1\xc3\xef\xb7\xc1\x20\x3c\x3e\x67\x66\x5d\x88\xa7\x7d\x33\x50\x77\xb0\x28\x8e\xe7\x2c\x2e\x7a\xf4\x3c\x8d\x74\x83\xaf\x8e\x87\x0f\xe4\x50\xff\x84\x5c\x47\x0c\x6a\x49\xbf\x42\x86\x77\x15\x48\xa5\x90\x5d\x93\xd6\x2a\x11\xd5\xd5\x11\xaa\xce\xe7\x6f\xa5\xb0\x09\x2c\x8d\xd3\x92\xf0\x5a\x2a\xda\x5b\x1e\xd5\x9a\xc4\xc4\xf3\x49\x74\x41\xca\xe8\xc1\xf8\x44\xd6\x3c\xae\x6c\x1d\x9a\x30\x04\x4d\x27\x0e\xb1\x5f\x59\xa2\x24\xe8\xe1\x98\xc5\x6a\x4c"
+ "\xfe\x41\xd2\x27\x42\x52\xe1\xe9\x7d\x62\xe4\x88\x0f\xad\xb2\x70\xcb\x9d\x4c\x27\x2e\x76\x1e\x1a\x63\x65\xf5\x3b\xf8\x57\x69\xeb\x5b\x38\x26\x39\x33\x25\x45\x3e\x91\xb8\xd8\xc7\xd5\x42\xc0\x22\x31\x74\xf4\xbc\x0c\x23\xf1\xca\xc1\x8d\xd7\xbe\xc9\x62\xe4\x08\x1a\xcf\x36\xd5\xfe\x55\x21\x59\x91\x87\x87\xdf\x06\xdb\xdf\x96\x45\x58\xda\x05\xcd\x50\x4d\xd2\x7d\x05\x18\x73\x6a\x8d\x11\x85\xa6\x88\xe8\xda\xe6\x30\x33\xa4\x89\x31\x75\xbe\x69\x43\x84\x43\x50\x87\xdd\x71\x36\x83\xc3\x78\x74\x24\x0a\xed\x7b\xdb\xa4\x24\x0b\xb9\x7e\x5d\xff\xde\xb1\xef\x61\x5a\x45\x33\xf6\x17\x07\x08\x98\x83\x92\x0f\x23\x6d\xe6\xaa\x17\x54\xad\x6a\xc8\xdb\x26\xbe\xb8\xb6\x08\xfa\x68\xf1\xd7\x79\x6f\x18\xb4\x9e\x2d\x3f\x1b\x64\xaf\x8d\x06\x0e\x49\x28\xe0\x5d\x45\x68\x13\x87\xfa\xde\x40\x7b\xd2\xc3\x94\xd5\xe1\xd9\xc2\xaf\x55\x89"
+ "\xeb\xb4\x12\x59\xa8\xd4\xc5\x29\x66\x38\xe6\xac\x22\x22\xd9\x64\x9b\x34\x0a\x32\x9f\xc2\xbf\x17\x6c\x3f\x71\x7a\x38\x6b\x98\xfb\x49\x36\x89\xc9\xe2\xd6\xc7\x5d\xd0\x69\x5f\x23\x35\xc9\x30\xe2\xfd\x44\x58\x39\xd7\x97\xfb\x5c\x00\xd5\x4f\x7a\x1a\x95\x8b\x62\x4b\xce\xe5\x91\x21\x7b\x30\x00\xd6\xdd\x6d\x02\x86\x49\x0f\x3c\x1a\x27\x3c\xd3\x0e\x71\xf2\xff\xf5\x2f\x87\xac\x67\x59\x81\xa3\xf7\xf8\xd6\x11\x0c\x84\xa9\x03\xee\x2a\xc4\xf3\x22\xab\x7c\xe2\x25\xf5\x67\xa3\xe4\x11\xe0\x59\xb3\xca\x87\xa0\xae\xc9\xa6\x62\x1b\x6e\x4d\x02\x6b\x07\x9d\xfd\xd0\x92\x06\xe1\xb2\x9a\x4a\x1f\x1f\x13\x49\x99\x97\x08\xde\x7f\x98\xaf\x51\x98\xee\x2c\xcb\xf0\x0b\xc6\xb6\xb7\x2d\x9a\xb1\xac\xa6\xe3\x15\x77\x9d\x6b\x1a\xe4\xfc\x8b\xf2\x17\x59\x08\x04\x58\x81\x9d\x1b\x1b\x69\x55\xc2\xb4\x3c\x1f\x50\xf1\x7f\x77\x90\x4c\x66\x40"
+ "\x5a\xc0\x33\x1f\xcb\x05\x6d\x5c\x06\x87\x52\xa2\x8f\x26\xd5\x4f"
+}, {
+ .key = "\x35\x4e\xb5\x70\x50\x42\x8a\x85\xf2\xfb\xed\x7b\xd0\x9e\x97\xca\xfa\x98\x66\x63\xee\x37\xcc\x52\xfe\xd1\xdf\x95\x15\x34\x29\x38",
+ .nonce = "\xfd\x87\xd4\xd8\x62\xfd\xec\xaa",
+ .nlen = 8,
+ .assoc = "\xd6\x31\xda\x5d\x42\x5e\xd7",
+ .alen = 7,
+ .input = "\x6a\xfc\x4b\x25\xdf\xc0\xe4\xe8\x17\x4d\x4c\xc9\x7e\xde\x3a\xcc\x3c\xba\x6a\x77\x47\xdb\xe3\x74\x7a\x4d\x5f\x8d\x37\x55\x80\x73\x90\x66\x5d\x3a\x7d\x5d\x86\x5e\x8d\xfd\x83\xff\x4e\x74\x6f\xf9\xe6\x70\x17\x70\x3e\x96\xa7\x7e\xcb\xab\x8f\x58\x24\x9b\x01\xfd\xcb\xe6\x4d\x9b\xf0\x88\x94\x57\x66\xef\x72\x4c\x42\x6e\x16\x19\x15\xea\x70\x5b\xac\x13\xdb\x9f\x18\xe2\x3c\x26\x97\xbc\xdc\x45\x8c\x6c\x24\x69\x9c\xf7\x65\x1e\x18\x59\x31\x7c\xe4\x73\xbc\x39\x62\xc6\x5c\x9f\xbf\xfa\x90\x03\xc9\x72\x26\xb6\x1b\xc2\xb7\x3f\xf2\x13\x77\xf2\x8d\xb9\x47\xd0\x53\xdd\xc8\x91\x83\x8b\xb1\xce\xa3\xfe\xcd\xd9\xdd\x92\x7b\xdb\xb8\xfb\xc9\x2d\x01\x59\x39\x52\xad\x1b\xec\xcf\xd7\x70\x13\x21\xf5\x47\xaa\x18\x21\x5c\xc9\x9a\xd2\x6b\x05\x9c\x01\xa1\xda\x35\x5d\xb3\x70\xe6\xa9\x80\x8b\x91\xb7\xb3\x5f\x24\x9a\xb7"
+ "\xd1\x6b\xa1\x1c\x50\xba\x49\xe0\xee\x2e\x75\xac\x69\xc0\xeb\x03\xdd\x19\xe5\xf6\x06\xdd\xc3\xd7\x2b\x07\x07\x30\xa7\x19\x0c\xbf\xe6\x18\xcc\xb1\x01\x11\x85\x77\x1d\x96\xa7\xa3\x00\x84\x02\xa2\x83\x68\xda\x17\x27\xc8\x7f\x23\xb7\xf4\x13\x85\xcf\xdd\x7a\x7d\x24\x57\xfe\x05\x93\xf5\x74\xce\xed\x0c\x20\x98\x8d\x92\x30\xa1\x29\x23\x1a\xa0\x4f\x69\x56\x4c\xe1\xc8\xce\xf6\x9a\x0c\xa4\xfa\x04\xf6\x62\x95\xf2\xfa\xc7\x40\x68\x40\x8f\x41\xda\xb4\x26\x6f\x70\xab\x40\x61\xa4\x0e\x75\xfb\x86\xeb\x9d\x9a\x1f\xec\x76\x99\xe7\xea\xaa\x1e\x2d\xb5\xd4\xa6\x1a\xb8\x61\x0a\x1d\x16\x5b\x98\xc2\x31\x40\xe7\x23\x1d\x66\x99\xc8\xc0\xd7\xce\xf3\x57\x40\x04\x3f\xfc\xea\xb3\xfc\xd2\xd3\x99\xa4\x94\x69\xa0\xef\xd1\x85\xb3\xa6\xb1\x28\xbf\x94\x67\x22\xc3\x36\x46\xf8\xd2\x0f\x5f\xf4\x59\x80\xe6\x2d\x43\x08\x7d\x19\x09\x97\xa7\x4c"
+ "\x3d\x8d\xba\x65\x62\xa3\x71\x33\x29\x62\xdb\xc1\x33\x34\x1a\x63\x33\x16\xb6\x64\x7e\xab\x33\xf0\xe6\x26\x68\xba\x1d\x2e\x38\x08\xe6\x02\xd3\x25\x2c\x47\x23\x58\x34\x0f\x9d\x63\x4f\x63\xbb\x7f\x3b\x34\x38\xa7\xb5\x8d\x65\xd9\x9f\x79\x55\x3e\x4d\xe7\x73\xd8\xf6\x98\x97\x84\x60\x9c\xc8\xa9\x3c\xf6\xdc\x12\x5c\xe1\xbb\x0b\x8b\x98\x9c\x9d\x26\x7c\x4a\xe6\x46\x36\x58\x21\x4a\xee\xca\xd7\x3b\xc2\x6c\x49\x2f\xe5\xd5\x03\x59\x84\x53\xcb\xfe\x92\x71\x2e\x7c\x21\xcc\x99\x85\x7f\xb8\x74\x90\x13\x42\x3f\xe0\x6b\x1d\xf2\x4d\x54\xd4\xfc\x3a\x05\xe6\x74\xaf\xa6\xa0\x2a\x20\x23\x5d\x34\x5c\xd9\x3e\x4e\xfa\x93\xe7\xaa\xe9\x6f\x08\x43\x67\x41\xc5\xad\xfb\x31\x95\x82\x73\x32\xd8\xa6\xa3\xed\x0e\x2d\xf6\x5f\xfd\x80\xa6\x7a\xe0\xdf\x78\x15\x29\x74\x33\xd0\x9e\x83\x86\x72\x22\x57\x29\xb9\x9e\x5d\xd3\x1a\xb5\x96"
+ "\x72\x41\x3d\xf1\x64\x43\x67\xee\xaa\x5c\xd3\x9a\x96\x13\x11\x5d\xf3\x0c\x87\x82\x1e\x41\x9e\xd0\x27\xd7\x54\x3b\x67\x73\x09\x91\xe9\xd5\x36\xa7\xb5\x55\xe4\xf3\x21\x51\x49\x22\x07\x55\x4f\x44\x4b\xd2\x15\x93\x17\x2a\xfa\x4d\x4a\x57\xdb\x4c\xa6\xeb\xec\x53\x25\x6c\x21\xed\x00\x4c\x3b\xca\x14\x57\xa9\xd6\x6a\xcd\x8d\x5e\x74\xac\x72\xc1\x97\xe5\x1b\x45\x4e\xda\xfc\xcc\x40\xe8\x48\x88\x0b\xa3\xe3\x8d\x83\x42\xc3\x23\xfd\x68\xb5\x8e\xf1\x9d\x63\x77\xe9\xa3\x8e\x8c\x26\x6b\xbd\x72\x73\x35\x0c\x03\xf8\x43\x78\x52\x71\x15\x1f\x71\x5d\x6e\xed\xb9\xcc\x86\x30\xdb\x2b\xd3\x82\x88\x23\x71\x90\x53\x5c\xa9\x2f\x76\x01\xb7\x9a\xfe\x43\x55\xa3\x04\x9b\x0e\xe4\x59\xdf\xc9\xe9\xb1\xea\x29\x28\x3c\x5c\xae\x72\x84\xb6\xc6\xeb\x0c\x27\x07\x74\x90\x0d\x31\xb0\x00\x77\xe9\x40\x70\x6f\x68\xa7\xfd\x06\xec\x4b\xc0\xb7\xac"
+ "\xbc\x33\xb7\x6d\x0a\xbd\x12\x1b\x59\xcb\xdd\x32\xf5\x1d\x94\x57\x76\x9e\x0c\x18\x98\x71\xd7\x2a\xdb\x0b\x7b\xa7\x71\xb7\x67\x81\x23\x96\xae\xb9\x7e\x32\x43\x92\x8a\x19\xa0\xc4\xd4\x3b\x57\xf9\x4a\x2c\xfb\x51\x46\xbb\xcb\x5d\xb3\xef\x13\x93\x6e\x68\x42\x54\x57\xd3\x6a\x3a\x8f\x9d\x66\xbf\xbd\x36\x23\xf5\x93\x83\x7b\x9c\xc0\xdd\xc5\x49\xc0\x64\xed\x07\x12\xb3\xe6\xe4\xe5\x38\x95\x23\xb1\xa0\x3b\x1a\x61\xda\x17\xac\xc3\x58\xdd\x74\x64\x22\x11\xe8\x32\x1d\x16\x93\x85\x99\xa5\x9c\x34\x55\xb1\xe9\x20\x72\xc9\x28\x7b\x79\x00\xa1\xa6\xa3\x27\x40\x18\x8a\x54\xe0\xcc\xe8\x4e\x8e\x43\x96\xe7\x3f\xc8\xe9\xb2\xf9\xc9\xda\x04\x71\x50\x47\xe4\xaa\xce\xa2\x30\xc8\xe4\xac\xc7\x0d\x06\x2e\xe6\xe8\x80\x36\x29\x9e\x01\xb8\xc3\xf0\xa0\x5d\x7a\xca\x4d\xa0\x57\xbd\x2a\x45\xa7\x7f\x9c\x93\x07\x8f\x35\x67\x92\xe3\xe9"
+ "\x7f\xa8\x61\x43\x9e\x25\x4f\x33\x76\x13\x6e\x12\xb9\xdd\xa4\x7c\x08\x9f\x7c\xe7\x0a\x8d\x84\x06\xa4\x33\x17\x34\x5e\x10\x7c\xc0\xa8\x3d\x1f\x42\x20\x51\x65\x5d\x09\xc3\xaa\xc0\xc8\x0d\xf0\x79\xbc\x20\x1b\x95\xe7\x06\x7d\x47\x20\x03\x1a\x74\xdd\xe2\xd4\xae\x38\x71\x9b\xf5\x80\xec\x08\x4e\x56\xba\x76\x12\x1a\xdf\x48\xf3\xae\xb3\xe6\xe6\xbe\xc0\x91\x2e\x01\xb3\x01\x86\xa2\xb9\x52\xd1\x21\xae\xd4\x97\x1d\xef\x41\x12\x95\x3d\x48\x45\x1c\x56\x32\x8f\xb8\x43\xbb\x19\xf3\xca\xe9\xeb\x6d\x84\xbe\x86\x06\xe2\x36\xb2\x62\x9d\xd3\x4c\x48\x18\x54\x13\x4e\xcf\xfd\xba\x84\xb9\x30\x53\xcf\xfb\xb9\x29\x8f\xdc\x9f\xef\x60\x0b\x64\xf6\x8b\xee\xa6\x91\xc2\x41\x6c\xf6\xfa\x79\x67\x4b\xc1\x3f\xaf\x09\x81\xd4\x5d\xcb\x09\xdf\x36\x31\xc0\x14\x3c\x7c\x0e\x65\x95\x99\x6d\xa3\xf4\xd7\x38\xee\x1a\x2b\x37\xe2\xa4\x3b\x4b\xd0"
+ "\x65\xca\xf8\xc3\xe8\x15\x20\xef\xf2\x00\xfd\x01\x09\xc5\xc8\x17\x04\x93\xd0\x93\x03\x55\xc5\xfe\x32\xa3\x3e\x28\x2d\x3b\x93\x8a\xcc\x07\x72\x80\x8b\x74\x16\x24\xbb\xda\x94\x39\x30\x8f\xb1\xcd\x4a\x90\x92\x7c\x14\x8f\x95\x4e\xac\x9b\xd8\x8f\x1a\x87\xa4\x32\x27\x8a\xba\xf7\x41\xcf\x84\x37\x19\xe6\x06\xf5\x0e\xcf\x36\xf5\x9e\x6c\xde\xbc\xff\x64\x7e\x4e\x59\x57\x48\xfe\x14\xf7\x9c\x93\x5d\x15\xad\xcc\x11\xb1\x17\x18\xb2\x7e\xcc\xab\xe9\xce\x7d\x77\x5b\x51\x1b\x1e\x20\xa8\x32\x06\x0e\x75\x93\xac\xdb\x35\x37\x1f\xe9\x19\x1d\xb4\x71\x97\xd6\x4e\x2c\x08\xa5\x13\xf9\x0e\x7e\x78\x6e\x14\xe0\xa9\xb9\x96\x4c\x80\x82\xba\x17\xb3\x9d\x69\xb0\x84\x46\xff\xf9\x52\x79\x94\x58\x3a\x62\x90\x15\x35\x71\x10\x37\xed\xa1\x8e\x53\x6e\xf4\x26\x57\x93\x15\x93\xf6\x81\x2c\x5a\x10\xda\x92\xad\x2f\xdb\x28\x31\x2d\x55\x04\xd2"
+ "\x06\x28\x8c\x1e\xdc\xea\x54\xac\xff\xb7\x6c\x30\x15\xd4\xb4\x0d\x00\x93\x57\xdd\xd2\x07\x07\x06\xd9\x43\x9b\xcd\x3a\xf4\x7d\x4c\x36\x5d\x23\xa2\xcc\x57\x40\x91\xe9\x2c\x2f\x2c\xd5\x30\x9b\x17\xb0\xc9\xf7\xa7\x2f\xd1\x93\x20\x6b\xc6\xc1\xe4\x6f\xcb\xd1\xe7\x09\x0f\x9e\xdc\xaa\x9f\x2f\xdf\x56\x9f\xd4\x33\x04\xaf\xd3\x6c\x58\x61\xf0\x30\xec\xf2\x7f\xf2\x9c\xdf\x39\xbb\x6f\xa2\x8c\x7e\xc4\x22\x51\x71\xc0\x4d\x14\x1a\xc4\xcd\x04\xd9\x87\x08\x50\x05\xcc\xaf\xf6\xf0\x8f\x92\x54\x58\xc2\xc7\x09\x7a\x59\x02\x05\xe8\xb0\x86\xd9\xbf\x7b\x35\x51\x4d\xaf\x08\x97\x2c\x65\xda\x2a\x71\x3a\xa8\x51\xcc\xf2\x73\x27\xc3\xfd\x62\xcf\xe3\xb2\xca\xcb\xbe\x1a\x0a\xa1\x34\x7b\x77\xc4\x62\x68\x78\x5f\x94\x07\x04\x65\x16\x4b\x61\xcb\xff\x75\x26\x50\x66\x1f\x6e\x93\xf8\xc5\x51\xeb\xa4\x4a\x48\x68\x6b\xe2\x5e\x44\xb2\x50\x2c\x6c"
+ "\xae\x79\x4e\x66\x35\x81\x50\xac\xbc\x3f\xb1\x0c\xf3\x05\x3c\x4a\xa3\x6c\x2a\x79\xb4\xb7\xab\xca\xc7\x9b\x8e\xcd\x5f\x11\x03\xcb\x30\xa3\xab\xda\xfe\x64\xb9\xbb\xd8\x5e\x3a\x1a\x56\xe5\x05\x48\x90\x1e\x61\x69\x1b\x22\xe6\x1a\x3c\x75\xad\x1f\x37\x28\xdc\xe4\x6d\xbd\x42\xdc\xd3\xc8\xb6\x1c\x48\xfe\x94\x77\x7f\xbd\x62\xac\xa3\x47\x27\xcf\x5f\xd9\xdb\xaf\xec\xf7\x5e\xc1\xb0\x9d\x01\x26\x99\x7e\x8f\x03\x70\xb5\x42\xbe\x67\x28\x1b\x7c\xbd\x61\x21\x97\xcc\x5c\xe1\x97\x8f\x8d\xde\x2b\xaa\xa7\x71\x1d\x1e\x02\x73\x70\x58\x32\x5b\x1d\x67\x3d\xe0\x74\x4f\x03\xf2\x70\x51\x79\xf1\x61\x70\x15\x74\x9d\x23\x89\xde\xac\xfd\xde\xd0\x1f\xc3\x87\x44\x35\x4b\xe5\xb0\x60\xc5\x22\xe4\x9e\xca\xeb\xd5\x3a\x09\x45\xa4\xdb\xfa\x3f\xeb\x1b\xc7\xc8\x14\x99\x51\x92\x10\xed\xed\x28\xe0\xa1\xf8\x26\xcf\xcd\xcb\x63\xa1\x3b\xe3"
+ "\xdf\x7e\xfe\xa6\xf0\x81\x9a\xbf\x55\xde\x54\xd5\x56\x60\x98\x10\x68\xf4\x38\x96\x8e\x6f\x1d\x44\x7f\xd6\x2f\xfe\x55\xfb\x0c\x7e\x67\xe2\x61\x44\xed\xf2\x35\x30\x5d\xe9\xc7\xd6\x6d\xe0\xa0\xed\xf3\xfc\xd8\x3e\x0a\x7b\xcd\xaf\x65\x68\x18\xc0\xec\x04\x1c\x74\x6d\xe2\x6e\x79\xd4\x11\x2b\x62\xd5\x27\xad\x4f\x01\x59\x73\xcc\x6a\x53\xfb\x2d\xd5\x4e\x99\x21\x65\x4d\xf5\x82\xf7\xd8\x42\xce\x6f\x3d\x36\x47\xf1\x05\x16\xe8\x1b\x6a\x8f\x93\xf2\x8f\x37\x40\x12\x28\xa3\xe6\xb9\x17\x4a\x1f\xb1\xd1\x66\x69\x86\xc4\xfc\x97\xae\x3f\x8f\x1e\x2b\xdf\xcd\xf9\x3c",
+ .ilen = 1949,
+ .result = "\x7a\x57\xf2\xc7\x06\x3f\x50\x7b\x36\x1a\x66\x5c\xb9\x0e\x5e\x3b\x45\x60\xbe\x9a\x31\x9f\xff\x5d\x66\x34\xb4\xdc\xfb\x9d\x8e\xee\x6a\x33\xa4\x07\x3c\xf9\x4c\x30\xa1\x24\x52\xf9\x50\x46\x88\x20\x02\x32\x3a\x0e\x99\x63\xaf\x1f\x15\x28\x2a\x05\xff\x57\x59\x5e\x18\xa1\x1f\xd0\x92\x5c\x88\x66\x1b\x00\x64\xa5\x93\x8d\x06\x46\xb0\x64\x8b\x8b\xef\x99\x05\x35\x85\xb3\xf3\x33\xbb\xec\x66\xb6\x3d\x57\x42\xe3\xb4\xc6\xaa\xb0\x41\x2a\xb9\x59\xa9\xf6\x3e\x15\x26\x12\x03\x21\x4c\x74\x43\x13\x2a\x03\x27\x09\xb4\xfb\xe7\xb7\x40\xff\x5e\xce\x48\x9a\x60\xe3\x8b\x80\x8c\x38\x2d\xcb\x93\x37\x74\x05\x52\x6f\x73\x3e\xc3\xbc\xca\x72\x0a\xeb\xf1\x3b\xa0\x95\xdc\x8a\xc4\xa9\xdc\xca\x44\xd8\x08\x63\x6a\x36\xd3\x3c\xb8\xac\x46\x7d\xfd\xaa\xeb\x3e\x0f\x45\x8f\x49\xda\x2b\xf2\x12\xbd\xaf\x67\x8a\x63\x48\x4b\x55\x5f\x6d\x8c"
+ "\xb9\x76\x34\x84\xae\xc2\xfc\x52\x64\x82\xf7\xb0\x06\xf0\x45\x73\x12\x50\x30\x72\xea\x78\x9a\xa8\xaf\xb5\xe3\xbb\x77\x52\xec\x59\x84\xbf\x6b\x8f\xce\x86\x5e\x1f\x23\xe9\xfb\x08\x86\xf7\x10\xb9\xf2\x44\x96\x44\x63\xa9\xa8\x78\x00\x23\xd6\xc7\xe7\x6e\x66\x4f\xcc\xee\x15\xb3\xbd\x1d\xa0\xe5\x9c\x1b\x24\x2c\x4d\x3c\x62\x35\x9c\x88\x59\x09\xdd\x82\x1b\xcf\x0a\x83\x6b\x3f\xae\x03\xc4\xb4\xdd\x7e\x5b\x28\x76\x25\x96\xd9\xc9\x9d\x5f\x86\xfa\xf6\xd7\xd2\xe6\x76\x1d\x0f\xa1\xdc\x74\x05\x1b\x1d\xe0\xcd\x16\xb0\xa8\x8a\x34\x7b\x15\x11\x77\xe5\x7b\x7e\x20\xf7\xda\x38\xda\xce\x70\xe9\xf5\x6c\xd9\xbe\x0c\x4c\x95\x4c\xc2\x9b\x34\x55\x55\xe1\xf3\x46\x8e\x48\x74\x14\x4f\x9d\xc9\xf5\xe8\x1a\xf0\x11\x4a\xc1\x8d\xe0\x93\xa0\xbe\x09\x1c\x2b\x4e\x0f\xb2\x87\x8b\x84\xfe\x92\x32\x14\xd7\x93\xdf\xe7\x44\xbc\xc5\xae\x53"
+ "\x69\xd8\xb3\x79\x37\x80\xe3\x17\x5c\xec\x53\x00\x9a\xe3\x8e\xdc\x38\xb8\x66\xf0\xd3\xad\x1d\x02\x96\x86\x3e\x9d\x3b\x5d\xa5\x7f\x21\x10\xf1\x1f\x13\x20\xf9\x57\x87\x20\xf5\x5f\xf1\x17\x48\x0a\x51\x5a\xcd\x19\x03\xa6\x5a\xd1\x12\x97\xe9\x48\xe2\x1d\x83\x75\x50\xd9\x75\x7d\x6a\x82\xa1\xf9\x4e\x54\x87\x89\xc9\x0c\xb7\x5b\x6a\x91\xc1\x9c\xb2\xa9\xdc\x9a\xa4\x49\x0a\x6d\x0d\xbb\xde\x86\x44\xdd\x5d\x89\x2b\x96\x0f\x23\x95\xad\xcc\xa2\xb3\xb9\x7e\x74\x38\xba\x9f\x73\xae\x5f\xf8\x68\xa2\xe0\xa9\xce\xbd\x40\xd4\x4c\x6b\xd2\x56\x62\xb0\xcc\x63\x7e\x5b\xd3\xae\xd1\x75\xce\xbb\xb4\x5b\xa8\xf8\xb4\xac\x71\x75\xaa\xc9\x9f\xbb\x6c\xad\x0f\x55\x5d\xe8\x85\x7d\xf9\x21\x35\xea\x92\x85\x2b\x00\xec\x84\x90\x0a\x63\x96\xe4\x6b\xa9\x77\xb8\x91\xf8\x46\x15\x72\x63\x70\x01\x40\xa3\xa5\x76\x62\x2b\xbf\xf1\xe5\x8d\x9f"
+ "\xa3\xfa\x9b\x03\xbe\xfe\x65\x6f\xa2\x29\x0d\x54\xb4\x71\xce\xa9\xd6\x3d\x88\xf9\xaf\x6b\xa8\x9e\xf4\x16\x96\x36\xb9\x00\xdc\x10\xab\xb5\x08\x31\x1f\x00\xb1\x3c\xd9\x38\x3e\xc6\x04\xa7\x4e\xe8\xae\xed\x98\xc2\xf7\xb9\x00\x5f\x8c\x60\xd1\xe5\x15\xf7\xae\x1e\x84\x88\xd1\xf6\xbc\x3a\x89\x35\x22\x83\x7c\xca\xf0\x33\x82\x4c\x79\x3c\xfd\xb1\xae\x52\x62\x55\xd2\x41\x60\xc6\xbb\xfa\x0e\x59\xd6\xa8\xfe\x5d\xed\x47\x3d\xe0\xea\x1f\x6e\x43\x51\xec\x10\x52\x56\x77\x42\x6b\x52\x87\xd8\xec\xe0\xaa\x76\xa5\x84\x2a\x22\x24\xfd\x92\x40\x88\xd5\x85\x1c\x1f\x6b\x47\xa0\xc4\xe4\xef\xf4\xea\xd7\x59\xac\x2a\x9e\x8c\xfa\x1f\x42\x08\xfe\x4f\x74\xa0\x26\xf5\xb3\x84\xf6\x58\x5f\x26\x66\x3e\xd7\xe4\x22\x91\x13\xc8\xac\x25\x96\x23\xd8\x09\xea\x45\x75\x23\xb8\x5f\xc2\x90\x8b\x09\xc4\xfc\x47\x6c\x6d\x0a\xef\x69\xa4\x38\x19"
+ "\xcf\x7d\xf9\x09\x73\x9b\x60\x5a\xf7\x37\xb5\xfe\x9f\xe3\x2b\x4c\x0d\x6e\x19\xf1\xd6\xc0\x70\xf3\x9d\x22\x3c\xf9\x49\xce\x30\x8e\x44\xb5\x76\x15\x8f\x52\xfd\xa5\x04\xb8\x55\x6a\x36\x59\x7c\xc4\x48\xb8\xd7\xab\x05\x66\xe9\x5e\x21\x6f\x6b\x36\x29\xbb\xe9\xe3\xa2\x9a\xa8\xcd\x55\x25\x11\xba\x5a\x58\xa0\xde\xae\x19\x2a\x48\x5a\xff\x36\xcd\x6d\x16\x7a\x73\x38\x46\xe5\x47\x59\xc8\xa2\xf6\xe2\x6c\x83\xc5\x36\x2c\x83\x7d\xb4\x01\x05\x69\xe7\xaf\x5c\xc4\x64\x82\x12\x21\xef\xf7\xd1\x7d\xb8\x8d\x8c\x98\x7c\x5f\x7d\x92\x88\xb9\x94\x07\x9c\xd8\xe9\x9c\x17\x38\xe3\x57\x6c\xe0\xdc\xa5\x92\x42\xb3\xbd\x50\xa2\x7e\xb5\xb1\x52\x72\x03\x97\xd8\xaa\x9a\x1e\x75\x41\x11\xa3\x4f\xcc\xd4\xe3\x73\xad\x96\xdc\x47\x41\x9f\xb0\xbe\x79\x91\xf5\xb6\x18\xfe\xc2\x83\x18\x7d\x73\xd9\x4f\x83\x84\x03\xb3\xf0\x77\x66\x3d\x83\x63"
+ "\x2e\x2c\xf9\xdd\xa6\x1f\x89\x82\xb8\x23\x42\xeb\xe2\xca\x70\x82\x61\x41\x0a\x6d\x5f\x75\xc5\xe2\xc4\x91\x18\x44\x22\xfa\x34\x10\xf5\x20\xdc\xb7\xdd\x2a\x20\x77\xf5\xf9\xce\xdb\xa0\x0a\x52\x2a\x4e\xdd\xcc\x97\xdf\x05\xe4\x5e\xb7\xaa\xf0\xe2\x80\xff\xba\x1a\x0f\xac\xdf\x02\x32\xe6\xf7\xc7\x17\x13\xb7\xfc\x98\x48\x8c\x0d\x82\xc9\x80\x7a\xe2\x0a\xc5\xb4\xde\x7c\x3c\x79\x81\x0e\x28\x65\x79\x67\x82\x69\x44\x66\x09\xf7\x16\x1a\xf9\x7d\x80\xa1\x79\x14\xa9\xc8\x20\xfb\xa2\x46\xbe\x08\x35\x17\x58\xc1\x1a\xda\x2a\x6b\x2e\x1e\xe6\x27\x55\x7b\x19\xe2\xfb\x64\xfc\x5e\x15\x54\x3c\xe7\xc2\x11\x50\x30\xb8\x72\x03\x0b\x1a\x9f\x86\x27\x11\x5c\x06\x2b\xbd\x75\x1a\x0a\xda\x01\xfa\x5c\x4a\xc1\x80\x3a\x6e\x30\xc8\x2c\xeb\x56\xec\x89\xfa\x35\x7b\xb2\xf0\x97\x08\x86\x53\xbe\xbd\x40\x41\x38\x1c\xb4\x8b\x79\x2e\x18\x96\x94"
+ "\xde\xe8\xca\xe5\x9f\x92\x9f\x15\x5d\x56\x60\x5c\x09\xf9\x16\xf4\x17\x0f\xf6\x4c\xda\xe6\x67\x89\x9f\xca\x6c\xe7\x9b\x04\x62\x0e\x26\xa6\x52\xbd\x29\xff\xc7\xa4\x96\xe6\x6a\x02\xa5\x2e\x7b\xfe\x97\x68\x3e\x2e\x5f\x3b\x0f\x36\xd6\x98\x19\x59\x48\xd2\xc6\xe1\x55\x1a\x6e\xd6\xed\x2c\xba\xc3\x9e\x64\xc9\x95\x86\x35\x5e\x3e\x88\x69\x99\x4b\xee\xbe\x9a\x99\xb5\x6e\x58\xae\xdd\x22\xdb\xdd\x6b\xfc\xaf\x90\xa3\x3d\xa4\xc1\x15\x92\x18\x8d\xd2\x4b\x7b\x06\xd1\x37\xb5\xe2\x7c\x2c\xf0\x25\xe4\x94\x2a\xbd\xe3\x82\x70\x78\xa3\x82\x10\x5a\x90\xd7\xa4\xfa\xaf\x1a\x88\x59\xdc\x74\x12\xb4\x8e\xd7\x19\x46\xf4\x84\x69\x9f\xbb\x70\xa8\x4c\x52\x81\xa9\xff\x76\x1c\xae\xd8\x11\x3d\x7f\x7d\xc5\x12\x59\x28\x18\xc2\xa2\xb7\x1c\x88\xf8\xd6\x1b\xa6\x7d\x9e\xde\x29\xf8\xed\xff\xeb\x92\x24\x4f\x05\xaa\xd9\x49\xba\x87\x59"
+ "\x51\xc9\x20\x5c\x9b\x74\xcf\x03\xd9\x2d\x34\xc7\x5b\xa5\x40\xb2\x99\xf5\xcb\xb4\xf6\xb7\x72\x4a\xd6\xbd\xb0\xf3\x93\xe0\x1b\xa8\x04\x1e\x35\xd4\x80\x20\xf4\x9c\x31\x6b\x45\xb9\x15\xb0\x5e\xdd\x0a\x33\x9c\x83\xcd\x58\x89\x50\x56\xbb\x81\x00\x91\x32\xf3\x1b\x3e\xcf\x45\xe1\xf9\xe1\x2c\x26\x78\x93\x9a\x60\x46\xc9\xb5\x5e\x6a\x28\x92\x87\x3f\x63\x7b\xdb\xf7\xd0\x13\x9d\x32\x40\x5e\xcf\xfb\x79\x68\x47\x4c\xfd\x01\x17\xe6\x97\x93\x78\xbb\xa6\x27\xa3\xe8\x1a\xe8\x94\x55\x7d\x08\xe5\xdc\x66\xa3\x69\xc8\xca\xc5\xa1\x84\x55\xde\x08\x91\x16\x3a\x0c\x86\xab\x27\x2b\x64\x34\x02\x6c\x76\x8b\xc6\xaf\xcc\xe1\xd6\x8c\x2a\x18\x3d\xa6\x1b\x37\x75\x45\x73\xc2\x75\xd7\x53\x78\x3a\xd6\xe8\x29\xd2\x4a\xa8\x1e\x82\xf6\xb6\x81\xde\x21\xed\x2b\x56\xbb\xf2\xd0\x57\xc1\x7c\xd2\x6a\xd2\x56\xf5\x13\x5f\x1c\x6a\x0b\x74\xfb"
+ "\xe9\xfe\x9e\xea\x95\xb2\x46\xab\x0a\xfc\xfd\xf3\xbb\x04\x2b\x76\x1b\xa4\x74\xb0\xc1\x78\xc3\x69\xe2\xb0\x01\xe1\xde\x32\x4c\x8d\x1a\xb3\x38\x08\xd5\xfc\x1f\xdc\x0e\x2c\x9c\xb1\xa1\x63\x17\x22\xf5\x6c\x93\x70\x74\x00\xf8\x39\x01\x94\xd1\x32\x23\x56\x5d\xa6\x02\x76\x76\x93\xce\x2f\x19\xe9\x17\x52\xae\x6e\x2c\x6d\x61\x7f\x3b\xaa\xe0\x52\x85\xc5\x65\xc1\xbb\x8e\x5b\x21\xd5\xc9\x78\x83\x07\x97\x4c\x62\x61\x41\xd4\xfc\xc9\x39\xe3\x9b\xd0\xcc\x75\xc4\x97\xe6\xdd\x2a\x5f\xa6\xe8\x59\x6c\x98\xb9\x02\xe2\xa2\xd6\x68\xee\x3b\x1d\xe3\x4d\x5b\x30\xef\x03\xf2\xeb\x18\x57\x36\xe8\xa1\xf4\x47\xfb\xcb\x8f\xcb\xc8\xf3\x4f\x74\x9d\x9d\xb1\x8d\x14\x44\xd9\x19\xb4\x54\x4f\x75\x19\x09\xa0\x75\xbc\x3b\x82\xc6\x3f\xb8\x83\x19\x6e\xd6\x37\xfe\x6e\x8a\x4e\xe0\x4a\xab\x7b\xc8\xb4\x1d\xf4\xed\x27\x03\x65\xa2\xa1\xae\x11\xe7"
+ "\x98\x78\x48\x91\xd2\xd2\xd4\x23\x78\x50\xb1\x5b\x85\x10\x8d\xca\x5f\x0f\x71\xae\x72\x9a\xf6\x25\x19\x60\x06\xf7\x10\x34\x18\x0d\xc9\x9f\x7b\x0c\x9b\x8f\x91\x1b\x9f\xcd\x10\xee\x75\xf9\x97\x66\xfc\x4d\x33\x6e\x28\x2b\x92\x85\x4f\xab\x43\x8d\x8f\x7d\x86\xa7\xc7\xd8\xd3\x0b\x8b\x57\xb6\x1d\x95\x0d\xe9\xbc\xd9\x03\xd9\x10\x19\xc3\x46\x63\x55\x87\x61\x79\x6c\x95\x0e\x9c\xdd\xca\xc3\xf3\x64\xf0\x7d\x76\xb7\x53\x67\x2b\x1e\x44\x56\x81\xea\x8f\x5c\x42\x16\xb8\x28\xeb\x1b\x61\x10\x1e\xbf\xec\xa8"
+}, {
+ .key = "\xb3\x35\x50\x03\x54\x2e\x40\x5e\x8f\x59\x8e\xc5\x90\xd5\x27\x2d\xba\x29\x2e\xcb\x1b\x70\x44\x1e\x65\x91\x6e\x2a\x79\x22\xda\x64",
+ .nonce = "\x05\xa3\x93\xed\x30\xc5\xa2\x06",
+ .nlen = 8,
+ .assoc = "\xb1\x69\x83\x87\x30\xaa\x5d\xb8\x77\xe8\x21\xff\x06\x59\x35\xce\x75\xfe\x38\xef\xb8\x91\x43\x8c\xcf\x70\xdd\x0a\x68\xbf\xd4\xbc\x16\x76\x99\x36\x1e\x58\x79\x5e\xd4\x29\xf7\x33\x93\x48\xdb\x5f\x01\xae\x9c\xb6\xe4\x88\x6d\x2b\x76\x75\xe0\xf3\x74\xe2\xc9",
+ .alen = 63,
+ .input = "\x52\x34\xb3\x65\x3b\xb7\xe5\xd3\xab\x49\x17\x60\xd2\x52\x56\xdf\xdf\x34\x56\x82\xe2\xbe\xe5\xe1\x28\xd1\x4e\x5f\x4f\x01\x7d\x3f\x99\x6b\x30\x6e\x1a\x7c\x4c\x8e\x62\x81\xae\x86\x3f\x6b\xd0\xb5\xa9\xcf\x50\xf1\x02\x12\xa0\x0b\x24\xe9\xe6\x72\x89\x2c\x52\x1b\x34\x38\xf8\x75\x5f\xa0\x74\xe2\x99\xdd\xa6\x4b\x14\x50\x4e\xf1\xbe\xd6\x9e\xdb\xb2\x24\x27\x74\x12\x4a\x78\x78\x17\xa5\x58\x8e\x2f\xf9\xf4\x8d\xee\x03\x88\xae\xb8\x29\xa1\x2f\x4b\xee\x92\xbd\x87\xb3\xce\x34\x21\x57\x46\x04\x49\x0c\x80\xf2\x01\x13\xa1\x55\xb3\xff\x44\x30\x3c\x1c\xd0\xef\xbc\x18\x74\x26\xad\x41\x5b\x5b\x3e\x9a\x7a\x46\x4f\x16\xd6\x74\x5a\xb7\x3a\x28\x31\xd8\xae\x26\xac\x50\x53\x86\xf2\x56\xd7\x3f\x29\xbc\x45\x68\x8e\xcb\x98\x64\xdd\xc9\xba\xb8\x4b\x7b\x82\xdd\x14\xa7\xcb\x71\x72\x00\x5c\xad\x7b\x6a\x89\xa4\x3d\xbf\xb5"
+ "\x4b\x3e\x7c\x5a\xcf\xb8\xa1\xc5\x6e\xc8\xb6\x31\x57\x7b\xdf\xa5\x7e\xb1\xd6\x42\x2a\x31\x36\xd1\xd0\x3f\x7a\xe5\x94\xd6\x36\xa0\x6f\xb7\x40\x7d\x37\xc6\x55\x7c\x50\x40\x6d\x29\x89\xe3\x5a\xae\x97\xe7\x44\x49\x6e\xbd\x81\x3d\x03\x93\x06\x12\x06\xe2\x41\x12\x4a\xf1\x6a\xa4\x58\xa2\xfb\xd2\x15\xba\xc9\x79\xc9\xce\x5e\x13\xbb\xf1\x09\x04\xcc\xfd\xe8\x51\x34\x6a\xe8\x61\x88\xda\xed\x01\x47\x84\xf5\x73\x25\xf9\x1c\x42\x86\x07\xf3\x5b\x1a\x01\xb3\xeb\x24\x32\x8d\xf6\xed\x7c\x4b\xeb\x3c\x36\x42\x28\xdf\xdf\xb6\xbe\xd9\x8c\x52\xd3\x2b\x08\x90\x8c\xe7\x98\x31\xe2\x32\x8e\xfc\x11\x48\x00\xa8\x6a\x42\x4a\x02\xc6\x4b\x09\xf1\xe3\x49\xf3\x45\x1f\x0e\xbc\x56\xe2\xe4\xdf\xfb\xeb\x61\xfa\x24\xc1\x63\x75\xbb\x47\x75\xaf\xe1\x53\x16\x96\x21\x85\x26\x11\xb3\x76\xe3\x23\xa1\x6b\x74\x37\xd0\xde\x06\x90\x71\x5d\x43\x88"
+ "\x9b\x00\x54\xa6\x75\x2f\xa1\xc2\x0b\x73\x20\x1d\xb6\x21\x79\x57\x3f\xfa\x09\xbe\x8a\x33\xc3\x52\xf0\x1d\x82\x31\xd1\x55\xb5\x6c\x99\x25\xcf\x5c\x32\xce\xe9\x0d\xfa\x69\x2c\xd5\x0d\xc5\x6d\x86\xd0\x0c\x3b\x06\x50\x79\xe8\xc3\xae\x04\xe6\xcd\x51\xe4\x26\x9b\x4f\x7e\xa6\x0f\xab\xd8\xe5\xde\xa9\x00\x95\xbe\xa3\x9d\x5d\xb2\x09\x70\x18\x1c\xf0\xac\x29\x23\x02\x29\x28\xd2\x74\x35\x57\x62\x0f\x24\xea\x5e\x33\xc2\x92\xf3\x78\x4d\x30\x1e\xa1\x99\xa9\x82\xb0\x42\x31\x8d\xad\x8a\xbc\xfc\xd4\x57\x47\x3e\xb4\x50\xdd\x6e\x2c\x80\x4d\x22\xf1\xfb\x57\xc4\xdd\x17\xe1\x8a\x36\x4a\xb3\x37\xca\xc9\x4e\xab\xd5\x69\xc4\xf4\xbc\x0b\x3b\x44\x4b\x29\x9c\xee\xd4\x35\x22\x21\xb0\x1f\x27\x64\xa8\x51\x1b\xf0\x9f\x19\x5c\xfb\x5a\x64\x74\x70\x45\x09\xf5\x64\xfe\x1a\x2d\xc9\x14\x04\x14\xcf\xd5\x7d\x60\xaf\x94\x39\x94\xe2\x7d\x79"
+ "\x82\xd0\x65\x3b\x6b\x9c\x19\x84\xb4\x6d\xb3\x0c\x99\xc0\x56\xa8\xbd\x73\xce\x05\x84\x3e\x30\xaa\xc4\x9b\x1b\x04\x2a\x9f\xd7\x43\x2b\x23\xdf\xbf\xaa\xd5\xc2\x43\x2d\x70\xab\xdc\x75\xad\xac\xf7\xc0\xbe\x67\xb2\x74\xed\x67\x10\x4a\x92\x60\xc1\x40\x50\x19\x8a\x8a\x8c\x09\x0e\x72\xe1\x73\x5e\xe8\x41\x85\x63\x9f\x3f\xd7\x7d\xc4\xfb\x22\x5d\x92\x6c\xb3\x1e\xe2\x50\x2f\x82\xa8\x28\xc0\xb5\xd7\x5f\x68\x0d\x2c\x2d\xaf\x7e\xfa\x2e\x08\x0f\x1f\x70\x9f\xe9\x19\x72\x55\xf8\xfb\x51\xd2\x33\x5d\xa0\xd3\x2b\x0a\x6c\xbc\x4e\xcf\x36\x4d\xdc\x3b\xe9\x3e\x81\x7c\x61\xdb\x20\x2d\x3a\xc3\xb3\x0c\x1e\x00\xb9\x7c\xf5\xca\x10\x5f\x3a\x71\xb3\xe4\x20\xdb\x0c\x2a\x98\x63\x45\x00\x58\xf6\x68\xe4\x0b\xda\x13\x3b\x60\x5c\x76\xdb\xb9\x97\x71\xe4\xd9\xb7\xdb\xbd\x68\xc7\x84\x84\xaa\x7c\x68\x62\x5e\x16\xfc\xba\x72\xaa\x9a\xa9\xeb"
+ "\x7c\x75\x47\x97\x7e\xad\xe2\xd9\x91\xe8\xe4\xa5\x31\xd7\x01\x8e\xa2\x11\x88\x95\xb9\xf2\x9b\xd3\x7f\x1b\x81\x22\xf7\x98\x60\x0a\x64\xa6\xc1\xf6\x49\xc7\xe3\x07\x4d\x94\x7a\xcf\x6e\x68\x0c\x1b\x3f\x6e\x2e\xee\x92\xfa\x52\xb3\x59\xf8\xf1\x8f\x6a\x66\xa3\x82\x76\x4a\x07\x1a\xc7\xdd\xf5\xda\x9c\x3c\x24\xbf\xfd\x42\xa1\x10\x64\x6a\x0f\x89\xee\x36\xa5\xce\x99\x48\x6a\xf0\x9f\x9e\x69\xa4\x40\x20\xe9\x16\x15\xf7\xdb\x75\x02\xcb\xe9\x73\x8b\x3b\x49\x2f\xf0\xaf\x51\x06\x5c\xdf\x27\x27\x49\x6a\xd1\xcc\xc7\xb5\x63\xb5\xfc\xb8\x5c\x87\x7f\x84\xb4\xcc\x14\xa9\x53\xda\xa4\x56\xf8\xb6\x1b\xcc\x40\x27\x52\x06\x5a\x13\x81\xd7\x3a\xd4\x3b\xfb\x49\x65\x31\x33\xb2\xfa\xcd\xad\x58\x4e\x2b\xae\xd2\x20\xfb\x1a\x48\xb4\x3f\x9a\xd8\x7a\x35\x4a\xc8\xee\x88\x5e\x07\x66\x54\xb9\xec\x9f\xa3\xe3\xb9\x37\xaa\x49\x76\x31\xda"
+ "\x74\x2d\x3c\xa4\x65\x10\x32\x38\xf0\xde\xd3\x99\x17\xaa\x71\xaa\x8f\x0f\x8c\xaf\xa2\xf8\x5d\x64\xba\x1d\xa3\xef\x96\x73\xe8\xa1\x02\x8d\x0c\x6d\xb8\x06\x90\xb8\x08\x56\x2c\xa7\x06\xc9\xc2\x38\xdb\x7c\x63\xb1\x57\x8e\xea\x7c\x79\xf3\x49\x1d\xfe\x9f\xf3\x6e\xb1\x1d\xba\x19\x80\x1a\x0a\xd3\xb0\x26\x21\x40\xb1\x7c\xf9\x4d\x8d\x10\xc1\x7e\xf4\xf6\x3c\xa8\xfd\x7c\xa3\x92\xb2\x0f\xaa\xcc\xa6\x11\xfe\x04\xe3\xd1\x7a\x32\x89\xdf\x0d\xc4\x8f\x79\x6b\xca\x16\x7c\x6e\xf9\xad\x0f\xf6\xfe\x27\xdb\xc4\x13\x70\xf1\x62\x1a\x4f\x79\x40\xc9\x9b\x8b\x21\xea\x84\xfa\xf5\xf1\x89\xce\xb7\x55\x0a\x80\x39\x2f\x55\x36\x16\x9c\x7b\x08\xbd\x87\x0d\xa5\x32\xf1\x52\x7c\xe8\x55\x60\x5b\xd7\x69\xe4\xfc\xfa\x12\x85\x96\xea\x50\x28\xab\x8a\xf7\xbb\x0e\x53\x74\xca\xa6\x27\x09\xc2\xb5\xde\x18\x14\xd9\xea\xe5\x29\x1c\x40\x56\xcf\xd7"
+ "\xae\x05\x3f\x65\xaf\x05\x73\xe2\x35\x96\x27\x07\x14\xc0\xad\x33\xf1\xdc\x44\x7a\x89\x17\x77\xd2\x9c\x58\x60\xf0\x3f\x7b\x2d\x2e\x57\x95\x54\x87\xed\xf2\xc7\x4c\xf0\xae\x56\x29\x19\x7d\x66\x4b\x9b\x83\x84\x42\x3b\x01\x25\x66\x8e\x02\xde\xb9\x83\x54\x19\xf6\x9f\x79\x0d\x67\xc5\x1d\x7a\x44\x02\x98\xa7\x16\x1c\x29\x0d\x74\xff\x85\x40\x06\xef\x2c\xa9\xc6\xf5\x53\x07\x06\xae\xe4\xfa\x5f\xd8\x39\x4d\xf1\x9b\x6b\xd9\x24\x84\xfe\x03\x4c\xb2\x3f\xdf\xa1\x05\x9e\x50\x14\x5a\xd9\x1a\xa2\xa7\xfa\xfa\x17\xf7\x78\xd6\xb5\x92\x61\x91\xac\x36\xfa\x56\x0d\x38\x32\x18\x85\x08\x58\x37\xf0\x4b\xdb\x59\xe7\xa4\x34\xc0\x1b\x01\xaf\x2d\xde\xa1\xaa\x5d\xd3\xec\xe1\xd4\xf7\xe6\x54\x68\xf0\x51\x97\xa7\x89\xea\x24\xad\xd3\x6e\x47\x93\x8b\x4b\xb4\xf7\x1c\x42\x06\x67\xe8\x99\xf6\xf5\x7b\x85\xb5\x65\xb5\xb5\xd2\x37\xf5\xf3\x02\xa6"
+ "\x4d\x11\xa7\xdc\x51\x09\x7f\xa0\xd8\x88\x1c\x13\x71\xae\x9c\xb7\x7b\x34\xd6\x4e\x68\x26\x83\x51\xaf\x1d\xee\x8b\xbb\x69\x43\x2b\x9e\x8a\xbc\x02\x0e\xa0\x1b\xe0\xa8\x5f\x6f\xaf\x1b\x8f\xe7\x64\x71\x74\x11\x7e\xa8\xd8\xf9\x97\x06\xc3\xb6\xfb\xfb\xb7\x3d\x35\x9d\x3b\x52\xed\x54\xca\xf4\x81\x01\x2d\x1b\xc3\xa7\x00\x3d\x1a\x39\x54\xe1\xf6\xff\xed\x6f\x0b\x5a\x68\xda\x58\xdd\xa9\xcf\x5c\x4a\xe5\x09\x4e\xde\x9d\xbc\x3e\xee\x5a\x00\x3b\x2c\x87\x10\x65\x60\xdd\xd7\x56\xd1\x4c\x64\x45\xe4\x21\xec\x78\xf8\x25\x7a\x3e\x16\x5d\x09\x53\x14\xbe\x4f\xae\x87\xd8\xd1\xaa\x3c\xf6\x3e\xa4\x70\x8c\x5e\x70\xa4\xb3\x6b\x66\x73\xd3\xbf\x31\x06\x19\x62\x93\x15\xf2\x86\xe4\x52\x7e\x53\x4c\x12\x38\xcc\x34\x7d\x57\xf6\x42\x93\x8a\xc4\xee\x5c\x8a\xe1\x52\x8f\x56\x64\xf6\xa6\xd1\x91\x57\x70\xcd\x11\x76\xf5\x59\x60\x60\x3c"
+ "\xc1\xc3\x0b\x7f\x58\x1a\x50\x91\xf1\x68\x8f\x6e\x74\x74\xa8\x51\x0b\xf7\x7a\x98\x37\xf2\x0a\x0e\xa4\x97\x04\xb8\x9b\xfd\xa0\xea\xf7\x0d\xe1\xdb\x03\xf0\x31\x29\xf8\xdd\x6b\x8b\x5d\xd8\x59\xa9\x29\xcf\x9a\x79\x89\x19\x63\x46\x09\x79\x6a\x11\xda\x63\x68\x48\x77\x23\xfb\x7d\x3a\x43\xcb\x02\x3b\x7a\x6d\x10\x2a\x9e\xac\xf1\xd4\x19\xf8\x23\x64\x1d\x2c\x5f\xf2\xb0\x5c\x23\x27\xf7\x27\x30\x16\x37\xb1\x90\xab\x38\xfb\x55\xcd\x78\x58\xd4\x7d\x43\xf6\x45\x5e\x55\x8d\xb1\x02\x65\x58\xb4\x13\x4b\x36\xf7\xcc\xfe\x3d\x0b\x82\xe2\x12\x11\xbb\xe6\xb8\x3a\x48\x71\xc7\x50\x06\x16\x3a\xe6\x7c\x05\xc7\xc8\x4d\x2f\x08\x6a\x17\x9a\x95\x97\x50\x68\xdc\x28\x18\xc4\x61\x38\xb9\xe0\x3e\x78\xdb\x29\xe0\x9f\x52\xdd\xf8\x4f\x91\xc1\xd0\x33\xa1\x7a\x8e\x30\x13\x82\x07\x9f\xd3\x31\x0f\x23\xbe\x32\x5a\x75\xcf\x96\xb2\xec\xb5\x32"
+ "\xac\x21\xd1\x82\x33\xd3\x15\x74\xbd\x90\xf1\x2c\xe6\x5f\x8d\xe3\x02\xe8\xe9\xc4\xca\x96\xeb\x0e\xbc\x91\xf4\xb9\xea\xd9\x1b\x75\xbd\xe1\xac\x2a\x05\x37\x52\x9b\x1b\x3f\x5a\xdc\x21\xc3\x98\xbb\xaf\xa3\xf2\x00\xbf\x0d\x30\x89\x05\xcc\xa5\x76\xf5\x06\xf0\xc6\x54\x8a\x5d\xd4\x1e\xc1\xf2\xce\xb0\x62\xc8\xfc\x59\x42\x9a\x90\x60\x55\xfe\x88\xa5\x8b\xb8\x33\x0c\x23\x24\x0d\x15\x70\x37\x1e\x3d\xf6\xd2\xea\x92\x10\xb2\xc4\x51\xac\xf2\xac\xf3\x6b\x6c\xaa\xcf\x12\xc5\x6c\x90\x50\xb5\x0c\xfc\x1a\x15\x52\xe9\x26\xc6\x52\xa4\xe7\x81\x69\xe1\xe7\x9e\x30\x01\xec\x84\x89\xb2\x0d\x66\xdd\xce\x28\x5c\xec\x98\x46\x68\x21\x9f\x88\x3f\x1f\x42\x77\xce\xd0\x61\xd4\x20\xa7\xff\x53\xad\x37\xd0\x17\x35\xc9\xfc\xba\x0a\x78\x3f\xf2\xcc\x86\x89\xe8\x4b\x3c\x48\x33\x09\x7f\xc6\xc0\xdd\xb8\xfd\x7a\x66\x66\x65\xeb\x47\xa7\x04\x28"
+ "\xa3\x19\x8e\xa9\xb1\x13\x67\x62\x70\xcf\xd6",
+ .ilen = 2027,
+ .result = "\x74\xa6\x3e\xe4\xb1\xcb\xaf\xb0\x40\xe5\x0f\x9e\xf1\xf2\x89\xb5\x42\x34\x8a\xa1\x03\xb7\xe9\x57\x46\xbe\x20\xe4\x6e\xb0\xeb\xff\xea\x07\x7e\xef\xe2\x55\x9f\xe5\x78\x3a\xb7\x83\xc2\x18\x40\x7b\xeb\xcd\x81\xfb\x90\x12\x9e\x46\xa9\xd6\x4a\xba\xb0\x62\xdb\x6b\x99\xc4\xdb\x54\x4b\xb8\xa5\x71\xcb\xcd\x63\x32\x55\xfb\x31\xf0\x38\xf5\xbe\x78\xe4\x45\xce\x1b\x6a\x5b\x0e\xf4\x16\xe4\xb1\x3d\xf6\x63\x7b\xa7\x0c\xde\x6f\x8f\x74\xdf\xe0\x1e\x9d\xce\x8f\x24\xef\x23\x35\x33\x7b\x83\x34\x23\x58\x74\x14\x77\x1f\xc2\x4f\x4e\xc6\x89\xf9\x52\x09\x37\x64\x14\xc4\x01\x6b\x9d\x77\xe8\x90\x5d\xa8\x4a\x2a\xef\x5c\x7f\xeb\xbb\xb2\xc6\x93\x99\x66\xdc\x7f\xd4\x9e\x2a\xca\x8d\xdb\xe7\x20\xcf\xe4\x73\xae\x49\x7d\x64\x0f\x0e\x28\x46\xa9\xa8\x32\xe4\x0e\xf6\x51\x53\xb8\x3c\xb1\xff\xa3\x33\x41\x75\xff\xf1\x6f\xf1\xfb"
+ "\xbb\x83\x7f\x06\x9b\xe7\x1b\x0a\xe0\x5c\x33\x60\x5b\xdb\x5b\xed\xfe\xa5\x16\x19\x72\xa3\x64\x23\x00\x02\xc7\xf3\x6a\x81\x3e\x44\x1d\x79\x15\x5f\x9a\xde\xe2\xfd\x1b\x73\xc1\xbc\x23\xba\x31\xd2\x50\xd5\xad\x7f\x74\xa7\xc9\xf8\x3e\x2b\x26\x10\xf6\x03\x36\x74\xe4\x0e\x6a\x72\xb7\x73\x0a\x42\x28\xc2\xad\x5e\x03\xbe\xb8\x0b\xa8\x5b\xd4\xb8\xba\x52\x89\xb1\x9b\xc1\xc3\x65\x87\xed\xa5\xf4\x86\xfd\x41\x80\x91\x27\x59\x53\x67\x15\x78\x54\x8b\x2d\x3d\xc7\xff\x02\x92\x07\x5f\x7a\x4b\x60\x59\x3c\x6f\x5c\xd8\xec\x95\xd2\xfe\xa0\x3b\xd8\x3f\xd1\x69\xa6\xd6\x41\xb2\xf4\x4d\x12\xf4\x58\x3e\x66\x64\x80\x31\x9b\xa8\x4c\x8b\x07\xb2\xec\x66\x94\x66\x47\x50\x50\x5f\x18\x0b\x0e\xd6\xc0\x39\x21\x13\x9e\x33\xbc\x79\x36\x02\x96\x70\xf0\x48\x67\x2f\x26\xe9\x6d\x10\xbb\xd6\x3f\xd1\x64\x7a\x2e\xbe\x0c\x61\xf0\x75\x42\x38\x23"
+ "\xb1\x9e\x9f\x7c\x67\x66\xd9\x58\x9a\xf1\xbb\x41\x2a\x8d\x65\x84\x94\xfc\xdc\x6a\x50\x64\xdb\x56\x33\x76\x00\x10\xed\xbe\xd2\x12\xf6\xf6\x1b\xa2\x16\xde\xae\x31\x95\xdd\xb1\x08\x7e\x4e\xee\xe7\xf9\xa5\xfb\x5b\x61\x43\x00\x40\xf6\x7e\x02\x04\x32\x4e\x0c\xe2\x66\x0d\xd7\x07\x98\x0e\xf8\x72\x34\x6d\x95\x86\xd7\xcb\x31\x54\x47\xd0\x38\x29\x9c\x5a\x68\xd4\x87\x76\xc9\xe7\x7e\xe3\xf4\x81\x6d\x18\xcb\xc9\x05\xaf\xa0\xfb\x66\xf7\xf1\x1c\xc6\x14\x11\x4f\x2b\x79\x42\x8b\xbc\xac\xe7\x6c\xfe\x0f\x58\xe7\x7c\x78\x39\x30\xb0\x66\x2c\x9b\x6d\x3a\xe1\xcf\xc9\xa4\x0e\x6d\x6d\x8a\xa1\x3a\xe7\x28\xd4\x78\x4c\xa6\xa2\x2a\xa6\x03\x30\xd7\xa8\x25\x66\x87\x2f\x69\x5c\x4e\xdd\xa5\x49\x5d\x37\x4a\x59\xc4\xaf\x1f\xa2\xe4\xf8\xa6\x12\x97\xd5\x79\xf5\xe2\x4a\x2b\x5f\x61\xe4\x9e\xe3\xee\xb8\xa7\x5b\x2f\xf4\x9e\x6c\xfb\xd1\xc6"
+ "\x56\x77\xba\x75\xaa\x3d\x1a\xa8\x0b\xb3\x68\x24\x00\x10\x7f\xfd\xd7\xa1\x8d\x83\x54\x4f\x1f\xd8\x2a\xbe\x8a\x0c\x87\xab\xa2\xde\xc3\x39\xbf\x09\x03\xa5\xf3\x05\x28\xe1\xe1\xee\x39\x70\x9c\xd8\x81\x12\x1e\x02\x40\xd2\x6e\xf0\xeb\x1b\x3d\x22\xc6\xe5\xe3\xb4\x5a\x98\xbb\xf0\x22\x28\x8d\xe5\xd3\x16\x48\x24\xa5\xe6\x66\x0c\xf9\x08\xf9\x7e\x1e\xe1\x28\x26\x22\xc7\xc7\x0a\x32\x47\xfa\xa3\xbe\x3c\xc4\xc5\x53\x0a\xd5\x94\x4a\xd7\x93\xd8\x42\x99\xb9\x0a\xdb\x56\xf7\xb9\x1c\x53\x4f\xfa\xd3\x74\xad\xd9\x68\xf1\x1b\xdf\x61\xc6\x5e\xa8\x48\xfc\xd4\x4a\x4c\x3c\x32\xf7\x1c\x96\x21\x9b\xf9\xa3\xcc\x5a\xce\xd5\xd7\x08\x24\xf6\x1c\xfd\xdd\x38\xc2\x32\xe9\xb8\xe7\xb6\xfa\x9d\x45\x13\x2c\x83\xfd\x4a\x69\x82\xcd\xdc\xb3\x76\x0c\x9e\xd8\xf4\x1b\x45\x15\xb4\x97\xe7\x58\x34\xe2\x03\x29\x5a\xbf\xb6\xe0\x5d\x13\xd9\x2b\xb4"
+ "\x80\xb2\x45\x81\x6a\x2e\x6c\x89\x7d\xee\xbb\x52\xdd\x1f\x18\xe7\x13\x6b\x33\x0e\xea\x36\x92\x77\x7b\x6d\x9c\x5a\x5f\x45\x7b\x7b\x35\x62\x23\xd1\xbf\x0f\xd0\x08\x1b\x2b\x80\x6b\x7e\xf1\x21\x47\xb0\x57\xd1\x98\x72\x90\x34\x1c\x20\x04\xff\x3d\x5c\xee\x0e\x57\x5f\x6f\x24\x4e\x3c\xea\xfc\xa5\xa9\x83\xc9\x61\xb4\x51\x24\xf8\x27\x5e\x46\x8c\xb1\x53\x02\x96\x35\xba\xb8\x4c\x71\xd3\x15\x59\x35\x22\x20\xad\x03\x9f\x66\x44\x3b\x9c\x35\x37\x1f\x9b\xbb\xf3\xdb\x35\x63\x30\x64\xaa\xa2\x06\xa8\x5d\xbb\xe1\x9f\x70\xec\x82\x11\x06\x36\xec\x8b\x69\x66\x24\x44\xc9\x4a\x57\xbb\x9b\x78\x13\xce\x9c\x0c\xba\x92\x93\x63\xb8\xe2\x95\x0f\x0f\x16\x39\x52\xfd\x3a\x6d\x02\x4b\xdf\x13\xd3\x2a\x22\xb4\x03\x7c\x54\x49\x96\x68\x54\x10\xfa\xef\xaa\x6c\xe8\x22\xdc\x71\x16\x13\x1a\xf6\x28\xe5\x6d\x77\x3d\xcd\x30\x63\xb1\x70\x52\xa1"
+ "\xc5\x94\x5f\xcf\xe8\xb8\x26\x98\xf7\x06\xa0\x0a\x70\xfa\x03\x80\xac\xc1\xec\xd6\x4c\x54\xd7\xfe\x47\xb6\x88\x4a\xf7\x71\x24\xee\xf3\xd2\xc2\x4a\x7f\xfe\x61\xc7\x35\xc9\x37\x67\xcb\x24\x35\xda\x7e\xca\x5f\xf3\x8d\xd4\x13\x8e\xd6\xcb\x4d\x53\x8f\x53\x1f\xc0\x74\xf7\x53\xb9\x5e\x23\x37\xba\x6e\xe3\x9d\x07\x55\x25\x7b\xe6\x2a\x64\xd1\x32\xdd\x54\x1b\x4b\xc0\xe1\xd7\x69\x58\xf8\x93\x29\xc4\xdd\x23\x2f\xa5\xfc\x9d\x7e\xf8\xd4\x90\xcd\x82\x55\xdc\x16\x16\x9f\x07\x52\x9b\x9d\x25\xed\x32\xc5\x7b\xdf\xf6\x83\x46\x3d\x65\xb7\xef\x87\x7a\x12\x69\x8f\x06\x7c\x51\x15\x4a\x08\xe8\xac\x9a\x0c\x24\xa7\x27\xd8\x46\x2f\xe7\x01\x0e\x1c\xc6\x91\xb0\x6e\x85\x65\xf0\x29\x0d\x2e\x6b\x3b\xfb\x4b\xdf\xe4\x80\x93\x03\x66\x46\x3e\x8a\x6e\xf3\x5e\x4d\x62\x0e\x49\x05\xaf\xd4\xf8\x21\x20\x61\x1d\x39\x17\xf4\x61\x47\x95\xfb\x15"
+ "\x2e\xb3\x4f\xd0\x5d\xf5\x7d\x40\xda\x90\x3c\x6b\xcb\x17\x00\x13\x3b\x64\x34\x1b\xf0\xf2\xe5\x3b\xb2\xc7\xd3\x5f\x3a\x44\xa6\x9b\xb7\x78\x0e\x42\x5d\x4c\xc1\xe9\xd2\xcb\xb7\x78\xd1\xfe\x9a\xb5\x07\xe9\xe0\xbe\xe2\x8a\xa7\x01\x83\x00\x8c\x5c\x08\xe6\x63\x12\x92\xb7\xb7\xa6\x19\x7d\x38\x13\x38\x92\x87\x24\xf9\x48\xb3\x5e\x87\x6a\x40\x39\x5c\x3f\xed\x8f\xee\xdb\x15\x82\x06\xda\x49\x21\x2b\xb5\xbf\x32\x7c\x9f\x42\x28\x63\xcf\xaf\x1e\xf8\xc6\xa0\xd1\x02\x43\x57\x62\xec\x9b\x0f\x01\x9e\x71\xd8\x87\x9d\x01\xc1\x58\x77\xd9\xaf\xb1\x10\x7e\xdd\xa6\x50\x96\xe5\xf0\x72\x00\x6d\x4b\xf8\x2a\x8f\x19\xf3\x22\x88\x11\x4a\x8b\x7c\xfd\xb7\xed\xe1\xf6\x40\x39\xe0\xe9\xf6\x3d\x25\xe6\x74\x3c\x58\x57\x7f\xe1\x22\x96\x47\x31\x91\xba\x70\x85\x28\x6b\x9f\x6e\x25\xac\x23\x66\x2f\x29\x88\x28\xce\x8c\x5c\x88\x53\xd1\x3b"
+ "\xcc\x6a\x51\xb2\xe1\x28\x3f\x91\xb4\x0d\x00\x3a\xe3\xf8\xc3\x8f\xd7\x96\x62\x0e\x2e\xfc\xc8\x6c\x77\xa6\x1d\x22\xc1\xb8\xe6\x61\xd7\x67\x36\x13\x7b\xbb\x9b\x59\x09\xa6\xdf\xf7\x6b\xa3\x40\x1a\xf5\x4f\xb4\xda\xd3\xf3\x81\x93\xc6\x18\xd9\x26\xee\xac\xf0\xaa\xdf\xc5\x9c\xca\xc2\xa2\xcc\x7b\x5c\x24\xb0\xbc\xd0\x6a\x4d\x89\x09\xb8\x07\xfe\x87\xad\x0a\xea\xb8\x42\xf9\x5e\xb3\x3e\x36\x4c\xaf\x75\x9e\x1c\xeb\xbd\xbc\xbb\x80\x40\xa7\x3a\x30\xbf\xa8\x44\xf4\xeb\x38\xad\x29\xba\x23\xed\x41\x0c\xea\xd2\xbb\x41\x18\xd6\xb9\xba\x65\x2b\xa3\x91\x6d\x1f\xa9\xf4\xd1\x25\x8d\x4d\x38\xff\x64\xa0\xec\xde\xa6\xb6\x79\xab\x8e\x33\x6c\x47\xde\xaf\x94\xa4\xa5\x86\x77\x55\x09\x92\x81\x31\x76\xc7\x34\x22\x89\x8e\x3d\x26\x26\xd7\xfc\x1e\x16\x72\x13\x33\x63\xd5\x22\xbe\xb8\x04\x34\x84\x41\xbb\x80\xd0\x9f\x46\x48\x07\xa7"
+ "\xfc\x2b\x3a\x75\x55\x8c\xc7\x6a\xbd\x7e\x46\x08\x84\x0f\xd5\x74\xc0\x82\x8e\xaa\x61\x05\x01\xb2\x47\x6e\x20\x6a\x2d\x58\x70\x48\x32\xa7\x37\xd2\xb8\x82\x1a\x51\xb9\x61\xdd\xfd\x9d\x6b\x0e\x18\x97\xf8\x45\x5f\x87\x10\xcf\x34\x72\x45\x26\x49\x70\xe7\xa3\x78\xe0\x52\x89\x84\x94\x83\x82\xc2\x69\x8f\xe3\xe1\x3f\x60\x74\x88\xc4\xf7\x75\x2c\xfb\xbd\xb6\xc4\x7e\x10\x0a\x6c\x90\x04\x9e\xc3\x3f\x59\x7c\xce\x31\x18\x60\x57\x73\x46\x94\x7d\x06\xa0\x6d\x44\xec\xa2\x0a\x9e\x05\x15\xef\xca\x5c\xbf\x00\xeb\xf7\x3d\x32\xd4\xa5\xef\x49\x89\x5e\x46\xb0\xa6\x63\x5b\x8a\x73\xae\x6f\xd5\x9d\xf8\x4f\x40\xb5\xb2\x6e\xd3\xb6\x01\xa9\x26\xa2\x21\xcf\x33\x7a\x3a\xa4\x23\x13\xb0\x69\x6a\xee\xce\xd8\x9d\x01\x1d\x50\xc1\x30\x6c\xb1\xcd\xa0\xf0\xf0\xa2\x64\x6f\xbb\xbf\x5e\xe6\xab\x87\xb4\x0f\x4f\x15\xaf\xb5\x25\xa1\xb2\xd0\x80"
+ "\x2c\xfb\xf9\xfe\xd2\x33\xbb\x76\xfe\x7c\xa8\x66\xf7\xe7\x85\x9f\x1f\x85\x57\x88\xe1\xe9\x63\xe4\xd8\x1c\xa1\xfb\xda\x44\x05\x2e\x1d\x3a\x1c\xff\xc8\x3b\xc0\xfe\xda\x22\x0b\x43\xd6\x88\x39\x4c\x4a\xa6\x69\x18\x93\x42\x4e\xb5\xcc\x66\x0d\x09\xf8\x1e\x7c\xd3\x3c\x99\x0d\x50\x1d\x62\xe9\x57\x06\xbf\x19\x88\xdd\xad\x7b\x4f\xf9\xc7\x82\x6d\x8d\xc8\xc4\xc5\x78\x17\x20\x15\xc5\x52\x41\xcf\x5b\xd6\x7f\x94\x02\x41\xe0\x40\x22\x03\x5e\xd1\x53\xd4\x86\xd3\x2c\x9f\x0f\x96\xe3\x6b\x9a\x76\x32\x06\x47\x4b\x11\xb3\xdd\x03\x65\xbd\x9b\x01\xda\x9c\xb9\x7e\x3f\x6a\xc4\x7b\xea\xd4\x3c\xb9\xfb\x5c\x6b\x64\x33\x52\xba\x64\x78\x8f\xa4\xaf\x7a\x61\x8d\xbc\xc5\x73\xe9\x6b\x58\x97\x4b\xbf\x63\x22\xd3\x37\x02\x54\xc5\xb9\x16\x4a\xf0\x19\xd8\x94\x57\xb8\x8a\xb3\x16\x3b\xd0\x84\x8e\x67\xa6\xa3\x7d\x78\xec\x00"
+}, {
+ .key = "\xb3\x35\x50\x03\x54\x2e\x40\x5e\x8f\x59\x8e\xc5\x90\xd5\x27\x2d\xba\x29\x2e\xcb\x1b\x70\x44\x1e\x65\x91\x6e\x2a\x79\x22\xda\x64",
+ .nonce = "\x05\xa3\x93\xed\x30\xc5\xa2\x06",
+ .nlen = 8,
+ .assoc = "\xb1\x69\x83\x87\x30\xaa\x5d\xb8\x77\xe8\x21\xff\x06\x59\x35\xce\x75\xfe\x38\xef\xb8\x91\x43\x8c\xcf\x70\xdd\x0a\x68\xbf\xd4\xbc\x16\x76\x99\x36\x1e\x58\x79\x5e\xd4\x29\xf7\x33\x93\x48\xdb\x5f\x01\xae\x9c\xb6\xe4\x88\x6d\x2b\x76\x75\xe0\xf3\x74\xe2\xc9",
+ .alen = 63,
+ .input = "\x52\x34\xb3\x65\x3b\xb7\xe5\xd3\xab\x49\x17\x60\xd2\x52\x56\xdf\xdf\x34\x56\x82\xe2\xbe\xe5\xe1\x28\xd1\x4e\x5f\x4f\x01\x7d\x3f\x99\x6b\x30\x6e\x1a\x7c\x4c\x8e\x62\x81\xae\x86\x3f\x6b\xd0\xb5\xa9\xcf\x50\xf1\x02\x12\xa0\x0b\x24\xe9\xe6\x72\x89\x2c\x52\x1b\x34\x38\xf8\x75\x5f\xa0\x74\xe2\x99\xdd\xa6\x4b\x14\x50\x4e\xf1\xbe\xd6\x9e\xdb\xb2\x24\x27\x74\x12\x4a\x78\x78\x17\xa5\x58\x8e\x2f\xf9\xf4\x8d\xee\x03\x88\xae\xb8\x29\xa1\x2f\x4b\xee\x92\xbd\x87\xb3\xce\x34\x21\x57\x46\x04\x49\x0c\x80\xf2\x01\x13\xa1\x55\xb3\xff\x44\x30\x3c\x1c\xd0\xef\xbc\x18\x74\x26\xad\x41\x5b\x5b\x3e\x9a\x7a\x46\x4f\x16\xd6\x74\x5a\xb7\x3a\x28\x31\xd8\xae\x26\xac\x50\x53\x86\xf2\x56\xd7\x3f\x29\xbc\x45\x68\x8e\xcb\x98\x64\xdd\xc9\xba\xb8\x4b\x7b\x82\xdd\x14\xa7\xcb\x71\x72\x00\x5c\xad\x7b\x6a\x89\xa4\x3d\xbf\xb5"
+ "\x4b\x3e\x7c\x5a\xcf\xb8\xa1\xc5\x6e\xc8\xb6\x31\x57\x7b\xdf\xa5\x7e\xb1\xd6\x42\x2a\x31\x36\xd1\xd0\x3f\x7a\xe5\x94\xd6\x36\xa0\x6f\xb7\x40\x7d\x37\xc6\x55\x7c\x50\x40\x6d\x29\x89\xe3\x5a\xae\x97\xe7\x44\x49\x6e\xbd\x81\x3d\x03\x93\x06\x12\x06\xe2\x41\x12\x4a\xf1\x6a\xa4\x58\xa2\xfb\xd2\x15\xba\xc9\x79\xc9\xce\x5e\x13\xbb\xf1\x09\x04\xcc\xfd\xe8\x51\x34\x6a\xe8\x61\x88\xda\xed\x01\x47\x84\xf5\x73\x25\xf9\x1c\x42\x86\x07\xf3\x5b\x1a\x01\xb3\xeb\x24\x32\x8d\xf6\xed\x7c\x4b\xeb\x3c\x36\x42\x28\xdf\xdf\xb6\xbe\xd9\x8c\x52\xd3\x2b\x08\x90\x8c\xe7\x98\x31\xe2\x32\x8e\xfc\x11\x48\x00\xa8\x6a\x42\x4a\x02\xc6\x4b\x09\xf1\xe3\x49\xf3\x45\x1f\x0e\xbc\x56\xe2\xe4\xdf\xfb\xeb\x61\xfa\x24\xc1\x63\x75\xbb\x47\x75\xaf\xe1\x53\x16\x96\x21\x85\x26\x11\xb3\x76\xe3\x23\xa1\x6b\x74\x37\xd0\xde\x06\x90\x71\x5d\x43\x88"
+ "\x9b\x00\x54\xa6\x75\x2f\xa1\xc2\x0b\x73\x20\x1d\xb6\x21\x79\x57\x3f\xfa\x09\xbe\x8a\x33\xc3\x52\xf0\x1d\x82\x31\xd1\x55\xb5\x6c\x99\x25\xcf\x5c\x32\xce\xe9\x0d\xfa\x69\x2c\xd5\x0d\xc5\x6d\x86\xd0\x0c\x3b\x06\x50\x79\xe8\xc3\xae\x04\xe6\xcd\x51\xe4\x26\x9b\x4f\x7e\xa6\x0f\xab\xd8\xe5\xde\xa9\x00\x95\xbe\xa3\x9d\x5d\xb2\x09\x70\x18\x1c\xf0\xac\x29\x23\x02\x29\x28\xd2\x74\x35\x57\x62\x0f\x24\xea\x5e\x33\xc2\x92\xf3\x78\x4d\x30\x1e\xa1\x99\xa9\x82\xb0\x42\x31\x8d\xad\x8a\xbc\xfc\xd4\x57\x47\x3e\xb4\x50\xdd\x6e\x2c\x80\x4d\x22\xf1\xfb\x57\xc4\xdd\x17\xe1\x8a\x36\x4a\xb3\x37\xca\xc9\x4e\xab\xd5\x69\xc4\xf4\xbc\x0b\x3b\x44\x4b\x29\x9c\xee\xd4\x35\x22\x21\xb0\x1f\x27\x64\xa8\x51\x1b\xf0\x9f\x19\x5c\xfb\x5a\x64\x74\x70\x45\x09\xf5\x64\xfe\x1a\x2d\xc9\x14\x04\x14\xcf\xd5\x7d\x60\xaf\x94\x39\x94\xe2\x7d\x79"
+ "\x82\xd0\x65\x3b\x6b\x9c\x19\x84\xb4\x6d\xb3\x0c\x99\xc0\x56\xa8\xbd\x73\xce\x05\x84\x3e\x30\xaa\xc4\x9b\x1b\x04\x2a\x9f\xd7\x43\x2b\x23\xdf\xbf\xaa\xd5\xc2\x43\x2d\x70\xab\xdc\x75\xad\xac\xf7\xc0\xbe\x67\xb2\x74\xed\x67\x10\x4a\x92\x60\xc1\x40\x50\x19\x8a\x8a\x8c\x09\x0e\x72\xe1\x73\x5e\xe8\x41\x85\x63\x9f\x3f\xd7\x7d\xc4\xfb\x22\x5d\x92\x6c\xb3\x1e\xe2\x50\x2f\x82\xa8\x28\xc0\xb5\xd7\x5f\x68\x0d\x2c\x2d\xaf\x7e\xfa\x2e\x08\x0f\x1f\x70\x9f\xe9\x19\x72\x55\xf8\xfb\x51\xd2\x33\x5d\xa0\xd3\x2b\x0a\x6c\xbc\x4e\xcf\x36\x4d\xdc\x3b\xe9\x3e\x81\x7c\x61\xdb\x20\x2d\x3a\xc3\xb3\x0c\x1e\x00\xb9\x7c\xf5\xca\x10\x5f\x3a\x71\xb3\xe4\x20\xdb\x0c\x2a\x98\x63\x45\x00\x58\xf6\x68\xe4\x0b\xda\x13\x3b\x60\x5c\x76\xdb\xb9\x97\x71\xe4\xd9\xb7\xdb\xbd\x68\xc7\x84\x84\xaa\x7c\x68\x62\x5e\x16\xfc\xba\x72\xaa\x9a\xa9\xeb"
+ "\x7c\x75\x47\x97\x7e\xad\xe2\xd9\x91\xe8\xe4\xa5\x31\xd7\x01\x8e\xa2\x11\x88\x95\xb9\xf2\x9b\xd3\x7f\x1b\x81\x22\xf7\x98\x60\x0a\x64\xa6\xc1\xf6\x49\xc7\xe3\x07\x4d\x94\x7a\xcf\x6e\x68\x0c\x1b\x3f\x6e\x2e\xee\x92\xfa\x52\xb3\x59\xf8\xf1\x8f\x6a\x66\xa3\x82\x76\x4a\x07\x1a\xc7\xdd\xf5\xda\x9c\x3c\x24\xbf\xfd\x42\xa1\x10\x64\x6a\x0f\x89\xee\x36\xa5\xce\x99\x48\x6a\xf0\x9f\x9e\x69\xa4\x40\x20\xe9\x16\x15\xf7\xdb\x75\x02\xcb\xe9\x73\x8b\x3b\x49\x2f\xf0\xaf\x51\x06\x5c\xdf\x27\x27\x49\x6a\xd1\xcc\xc7\xb5\x63\xb5\xfc\xb8\x5c\x87\x7f\x84\xb4\xcc\x14\xa9\x53\xda\xa4\x56\xf8\xb6\x1b\xcc\x40\x27\x52\x06\x5a\x13\x81\xd7\x3a\xd4\x3b\xfb\x49\x65\x31\x33\xb2\xfa\xcd\xad\x58\x4e\x2b\xae\xd2\x20\xfb\x1a\x48\xb4\x3f\x9a\xd8\x7a\x35\x4a\xc8\xee\x88\x5e\x07\x66\x54\xb9\xec\x9f\xa3\xe3\xb9\x37\xaa\x49\x76\x31\xda"
+ "\x74\x2d\x3c\xa4\x65\x10\x32\x38\xf0\xde\xd3\x99\x17\xaa\x71\xaa\x8f\x0f\x8c\xaf\xa2\xf8\x5d\x64\xba\x1d\xa3\xef\x96\x73\xe8\xa1\x02\x8d\x0c\x6d\xb8\x06\x90\xb8\x08\x56\x2c\xa7\x06\xc9\xc2\x38\xdb\x7c\x63\xb1\x57\x8e\xea\x7c\x79\xf3\x49\x1d\xfe\x9f\xf3\x6e\xb1\x1d\xba\x19\x80\x1a\x0a\xd3\xb0\x26\x21\x40\xb1\x7c\xf9\x4d\x8d\x10\xc1\x7e\xf4\xf6\x3c\xa8\xfd\x7c\xa3\x92\xb2\x0f\xaa\xcc\xa6\x11\xfe\x04\xe3\xd1\x7a\x32\x89\xdf\x0d\xc4\x8f\x79\x6b\xca\x16\x7c\x6e\xf9\xad\x0f\xf6\xfe\x27\xdb\xc4\x13\x70\xf1\x62\x1a\x4f\x79\x40\xc9\x9b\x8b\x21\xea\x84\xfa\xf5\xf1\x89\xce\xb7\x55\x0a\x80\x39\x2f\x55\x36\x16\x9c\x7b\x08\xbd\x87\x0d\xa5\x32\xf1\x52\x7c\xe8\x55\x60\x5b\xd7\x69\xe4\xfc\xfa\x12\x85\x96\xea\x50\x28\xab\x8a\xf7\xbb\x0e\x53\x74\xca\xa6\x27\x09\xc2\xb5\xde\x18\x14\xd9\xea\xe5\x29\x1c\x40\x56\xcf\xd7"
+ "\xae\x05\x3f\x65\xaf\x05\x73\xe2\x35\x96\x27\x07\x14\xc0\xad\x33\xf1\xdc\x44\x7a\x89\x17\x77\xd2\x9c\x58\x60\xf0\x3f\x7b\x2d\x2e\x57\x95\x54\x87\xed\xf2\xc7\x4c\xf0\xae\x56\x29\x19\x7d\x66\x4b\x9b\x83\x84\x42\x3b\x01\x25\x66\x8e\x02\xde\xb9\x83\x54\x19\xf6\x9f\x79\x0d\x67\xc5\x1d\x7a\x44\x02\x98\xa7\x16\x1c\x29\x0d\x74\xff\x85\x40\x06\xef\x2c\xa9\xc6\xf5\x53\x07\x06\xae\xe4\xfa\x5f\xd8\x39\x4d\xf1\x9b\x6b\xd9\x24\x84\xfe\x03\x4c\xb2\x3f\xdf\xa1\x05\x9e\x50\x14\x5a\xd9\x1a\xa2\xa7\xfa\xfa\x17\xf7\x78\xd6\xb5\x92\x61\x91\xac\x36\xfa\x56\x0d\x38\x32\x18\x85\x08\x58\x37\xf0\x4b\xdb\x59\xe7\xa4\x34\xc0\x1b\x01\xaf\x2d\xde\xa1\xaa\x5d\xd3\xec\xe1\xd4\xf7\xe6\x54\x68\xf0\x51\x97\xa7\x89\xea\x24\xad\xd3\x6e\x47\x93\x8b\x4b\xb4\xf7\x1c\x42\x06\x67\xe8\x99\xf6\xf5\x7b\x85\xb5\x65\xb5\xb5\xd2\x37\xf5\xf3\x02\xa6"
+ "\x4d\x11\xa7\xdc\x51\x09\x7f\xa0\xd8\x88\x1c\x13\x71\xae\x9c\xb7\x7b\x34\xd6\x4e\x68\x26\x83\x51\xaf\x1d\xee\x8b\xbb\x69\x43\x2b\x9e\x8a\xbc\x02\x0e\xa0\x1b\xe0\xa8\x5f\x6f\xaf\x1b\x8f\xe7\x64\x71\x74\x11\x7e\xa8\xd8\xf9\x97\x06\xc3\xb6\xfb\xfb\xb7\x3d\x35\x9d\x3b\x52\xed\x54\xca\xf4\x81\x01\x2d\x1b\xc3\xa7\x00\x3d\x1a\x39\x54\xe1\xf6\xff\xed\x6f\x0b\x5a\x68\xda\x58\xdd\xa9\xcf\x5c\x4a\xe5\x09\x4e\xde\x9d\xbc\x3e\xee\x5a\x00\x3b\x2c\x87\x10\x65\x60\xdd\xd7\x56\xd1\x4c\x64\x45\xe4\x21\xec\x78\xf8\x25\x7a\x3e\x16\x5d\x09\x53\x14\xbe\x4f\xae\x87\xd8\xd1\xaa\x3c\xf6\x3e\xa4\x70\x8c\x5e\x70\xa4\xb3\x6b\x66\x73\xd3\xbf\x31\x06\x19\x62\x93\x15\xf2\x86\xe4\x52\x7e\x53\x4c\x12\x38\xcc\x34\x7d\x57\xf6\x42\x93\x8a\xc4\xee\x5c\x8a\xe1\x52\x8f\x56\x64\xf6\xa6\xd1\x91\x57\x70\xcd\x11\x76\xf5\x59\x60\x60\x3c"
+ "\xc1\xc3\x0b\x7f\x58\x1a\x50\x91\xf1\x68\x8f\x6e\x74\x74\xa8\x51\x0b\xf7\x7a\x98\x37\xf2\x0a\x0e\xa4\x97\x04\xb8\x9b\xfd\xa0\xea\xf7\x0d\xe1\xdb\x03\xf0\x31\x29\xf8\xdd\x6b\x8b\x5d\xd8\x59\xa9\x29\xcf\x9a\x79\x89\x19\x63\x46\x09\x79\x6a\x11\xda\x63\x68\x48\x77\x23\xfb\x7d\x3a\x43\xcb\x02\x3b\x7a\x6d\x10\x2a\x9e\xac\xf1\xd4\x19\xf8\x23\x64\x1d\x2c\x5f\xf2\xb0\x5c\x23\x27\xf7\x27\x30\x16\x37\xb1\x90\xab\x38\xfb\x55\xcd\x78\x58\xd4\x7d\x43\xf6\x45\x5e\x55\x8d\xb1\x02\x65\x58\xb4\x13\x4b\x36\xf7\xcc\xfe\x3d\x0b\x82\xe2\x12\x11\xbb\xe6\xb8\x3a\x48\x71\xc7\x50\x06\x16\x3a\xe6\x7c\x05\xc7\xc8\x4d\x2f\x08\x6a\x17\x9a\x95\x97\x50\x68\xdc\x28\x18\xc4\x61\x38\xb9\xe0\x3e\x78\xdb\x29\xe0\x9f\x52\xdd\xf8\x4f\x91\xc1\xd0\x33\xa1\x7a\x8e\x30\x13\x82\x07\x9f\xd3\x31\x0f\x23\xbe\x32\x5a\x75\xcf\x96\xb2\xec\xb5\x32"
+ "\xac\x21\xd1\x82\x33\xd3\x15\x74\xbd\x90\xf1\x2c\xe6\x5f\x8d\xe3\x02\xe8\xe9\xc4\xca\x96\xeb\x0e\xbc\x91\xf4\xb9\xea\xd9\x1b\x75\xbd\xe1\xac\x2a\x05\x37\x52\x9b\x1b\x3f\x5a\xdc\x21\xc3\x98\xbb\xaf\xa3\xf2\x00\xbf\x0d\x30\x89\x05\xcc\xa5\x76\xf5\x06\xf0\xc6\x54\x8a\x5d\xd4\x1e\xc1\xf2\xce\xb0\x62\xc8\xfc\x59\x42\x9a\x90\x60\x55\xfe\x88\xa5\x8b\xb8\x33\x0c\x23\x24\x0d\x15\x70\x37\x1e\x3d\xf6\xd2\xea\x92\x10\xb2\xc4\x51\xac\xf2\xac\xf3\x6b\x6c\xaa\xcf\x12\xc5\x6c\x90\x50\xb5\x0c\xfc\x1a\x15\x52\xe9\x26\xc6\x52\xa4\xe7\x81\x69\xe1\xe7\x9e\x30\x01\xec\x84\x89\xb2\x0d\x66\xdd\xce\x28\x5c\xec\x98\x46\x68\x21\x9f\x88\x3f\x1f\x42\x77\xce\xd0\x61\xd4\x20\xa7\xff\x53\xad\x37\xd0\x17\x35\xc9\xfc\xba\x0a\x78\x3f\xf2\xcc\x86\x89\xe8\x4b\x3c\x48\x33\x09\x7f\xc6\xc0\xdd\xb8\xfd\x7a\x66\x66\x65\xeb\x47\xa7\x04\x28"
+ "\xa3\x19\x8e\xa9\xb1\x13\x67\x62\x70\xcf\xd7",
+ .ilen = 2027,
+ .result = "\x74\xa6\x3e\xe4\xb1\xcb\xaf\xb0\x40\xe5\x0f\x9e\xf1\xf2\x89\xb5\x42\x34\x8a\xa1\x03\xb7\xe9\x57\x46\xbe\x20\xe4\x6e\xb0\xeb\xff\xea\x07\x7e\xef\xe2\x55\x9f\xe5\x78\x3a\xb7\x83\xc2\x18\x40\x7b\xeb\xcd\x81\xfb\x90\x12\x9e\x46\xa9\xd6\x4a\xba\xb0\x62\xdb\x6b\x99\xc4\xdb\x54\x4b\xb8\xa5\x71\xcb\xcd\x63\x32\x55\xfb\x31\xf0\x38\xf5\xbe\x78\xe4\x45\xce\x1b\x6a\x5b\x0e\xf4\x16\xe4\xb1\x3d\xf6\x63\x7b\xa7\x0c\xde\x6f\x8f\x74\xdf\xe0\x1e\x9d\xce\x8f\x24\xef\x23\x35\x33\x7b\x83\x34\x23\x58\x74\x14\x77\x1f\xc2\x4f\x4e\xc6\x89\xf9\x52\x09\x37\x64\x14\xc4\x01\x6b\x9d\x77\xe8\x90\x5d\xa8\x4a\x2a\xef\x5c\x7f\xeb\xbb\xb2\xc6\x93\x99\x66\xdc\x7f\xd4\x9e\x2a\xca\x8d\xdb\xe7\x20\xcf\xe4\x73\xae\x49\x7d\x64\x0f\x0e\x28\x46\xa9\xa8\x32\xe4\x0e\xf6\x51\x53\xb8\x3c\xb1\xff\xa3\x33\x41\x75\xff\xf1\x6f\xf1\xfb"
+ "\xbb\x83\x7f\x06\x9b\xe7\x1b\x0a\xe0\x5c\x33\x60\x5b\xdb\x5b\xed\xfe\xa5\x16\x19\x72\xa3\x64\x23\x00\x02\xc7\xf3\x6a\x81\x3e\x44\x1d\x79\x15\x5f\x9a\xde\xe2\xfd\x1b\x73\xc1\xbc\x23\xba\x31\xd2\x50\xd5\xad\x7f\x74\xa7\xc9\xf8\x3e\x2b\x26\x10\xf6\x03\x36\x74\xe4\x0e\x6a\x72\xb7\x73\x0a\x42\x28\xc2\xad\x5e\x03\xbe\xb8\x0b\xa8\x5b\xd4\xb8\xba\x52\x89\xb1\x9b\xc1\xc3\x65\x87\xed\xa5\xf4\x86\xfd\x41\x80\x91\x27\x59\x53\x67\x15\x78\x54\x8b\x2d\x3d\xc7\xff\x02\x92\x07\x5f\x7a\x4b\x60\x59\x3c\x6f\x5c\xd8\xec\x95\xd2\xfe\xa0\x3b\xd8\x3f\xd1\x69\xa6\xd6\x41\xb2\xf4\x4d\x12\xf4\x58\x3e\x66\x64\x80\x31\x9b\xa8\x4c\x8b\x07\xb2\xec\x66\x94\x66\x47\x50\x50\x5f\x18\x0b\x0e\xd6\xc0\x39\x21\x13\x9e\x33\xbc\x79\x36\x02\x96\x70\xf0\x48\x67\x2f\x26\xe9\x6d\x10\xbb\xd6\x3f\xd1\x64\x7a\x2e\xbe\x0c\x61\xf0\x75\x42\x38\x23"
+ "\xb1\x9e\x9f\x7c\x67\x66\xd9\x58\x9a\xf1\xbb\x41\x2a\x8d\x65\x84\x94\xfc\xdc\x6a\x50\x64\xdb\x56\x33\x76\x00\x10\xed\xbe\xd2\x12\xf6\xf6\x1b\xa2\x16\xde\xae\x31\x95\xdd\xb1\x08\x7e\x4e\xee\xe7\xf9\xa5\xfb\x5b\x61\x43\x00\x40\xf6\x7e\x02\x04\x32\x4e\x0c\xe2\x66\x0d\xd7\x07\x98\x0e\xf8\x72\x34\x6d\x95\x86\xd7\xcb\x31\x54\x47\xd0\x38\x29\x9c\x5a\x68\xd4\x87\x76\xc9\xe7\x7e\xe3\xf4\x81\x6d\x18\xcb\xc9\x05\xaf\xa0\xfb\x66\xf7\xf1\x1c\xc6\x14\x11\x4f\x2b\x79\x42\x8b\xbc\xac\xe7\x6c\xfe\x0f\x58\xe7\x7c\x78\x39\x30\xb0\x66\x2c\x9b\x6d\x3a\xe1\xcf\xc9\xa4\x0e\x6d\x6d\x8a\xa1\x3a\xe7\x28\xd4\x78\x4c\xa6\xa2\x2a\xa6\x03\x30\xd7\xa8\x25\x66\x87\x2f\x69\x5c\x4e\xdd\xa5\x49\x5d\x37\x4a\x59\xc4\xaf\x1f\xa2\xe4\xf8\xa6\x12\x97\xd5\x79\xf5\xe2\x4a\x2b\x5f\x61\xe4\x9e\xe3\xee\xb8\xa7\x5b\x2f\xf4\x9e\x6c\xfb\xd1\xc6"
+ "\x56\x77\xba\x75\xaa\x3d\x1a\xa8\x0b\xb3\x68\x24\x00\x10\x7f\xfd\xd7\xa1\x8d\x83\x54\x4f\x1f\xd8\x2a\xbe\x8a\x0c\x87\xab\xa2\xde\xc3\x39\xbf\x09\x03\xa5\xf3\x05\x28\xe1\xe1\xee\x39\x70\x9c\xd8\x81\x12\x1e\x02\x40\xd2\x6e\xf0\xeb\x1b\x3d\x22\xc6\xe5\xe3\xb4\x5a\x98\xbb\xf0\x22\x28\x8d\xe5\xd3\x16\x48\x24\xa5\xe6\x66\x0c\xf9\x08\xf9\x7e\x1e\xe1\x28\x26\x22\xc7\xc7\x0a\x32\x47\xfa\xa3\xbe\x3c\xc4\xc5\x53\x0a\xd5\x94\x4a\xd7\x93\xd8\x42\x99\xb9\x0a\xdb\x56\xf7\xb9\x1c\x53\x4f\xfa\xd3\x74\xad\xd9\x68\xf1\x1b\xdf\x61\xc6\x5e\xa8\x48\xfc\xd4\x4a\x4c\x3c\x32\xf7\x1c\x96\x21\x9b\xf9\xa3\xcc\x5a\xce\xd5\xd7\x08\x24\xf6\x1c\xfd\xdd\x38\xc2\x32\xe9\xb8\xe7\xb6\xfa\x9d\x45\x13\x2c\x83\xfd\x4a\x69\x82\xcd\xdc\xb3\x76\x0c\x9e\xd8\xf4\x1b\x45\x15\xb4\x97\xe7\x58\x34\xe2\x03\x29\x5a\xbf\xb6\xe0\x5d\x13\xd9\x2b\xb4"
+ "\x80\xb2\x45\x81\x6a\x2e\x6c\x89\x7d\xee\xbb\x52\xdd\x1f\x18\xe7\x13\x6b\x33\x0e\xea\x36\x92\x77\x7b\x6d\x9c\x5a\x5f\x45\x7b\x7b\x35\x62\x23\xd1\xbf\x0f\xd0\x08\x1b\x2b\x80\x6b\x7e\xf1\x21\x47\xb0\x57\xd1\x98\x72\x90\x34\x1c\x20\x04\xff\x3d\x5c\xee\x0e\x57\x5f\x6f\x24\x4e\x3c\xea\xfc\xa5\xa9\x83\xc9\x61\xb4\x51\x24\xf8\x27\x5e\x46\x8c\xb1\x53\x02\x96\x35\xba\xb8\x4c\x71\xd3\x15\x59\x35\x22\x20\xad\x03\x9f\x66\x44\x3b\x9c\x35\x37\x1f\x9b\xbb\xf3\xdb\x35\x63\x30\x64\xaa\xa2\x06\xa8\x5d\xbb\xe1\x9f\x70\xec\x82\x11\x06\x36\xec\x8b\x69\x66\x24\x44\xc9\x4a\x57\xbb\x9b\x78\x13\xce\x9c\x0c\xba\x92\x93\x63\xb8\xe2\x95\x0f\x0f\x16\x39\x52\xfd\x3a\x6d\x02\x4b\xdf\x13\xd3\x2a\x22\xb4\x03\x7c\x54\x49\x96\x68\x54\x10\xfa\xef\xaa\x6c\xe8\x22\xdc\x71\x16\x13\x1a\xf6\x28\xe5\x6d\x77\x3d\xcd\x30\x63\xb1\x70\x52\xa1"
+ "\xc5\x94\x5f\xcf\xe8\xb8\x26\x98\xf7\x06\xa0\x0a\x70\xfa\x03\x80\xac\xc1\xec\xd6\x4c\x54\xd7\xfe\x47\xb6\x88\x4a\xf7\x71\x24\xee\xf3\xd2\xc2\x4a\x7f\xfe\x61\xc7\x35\xc9\x37\x67\xcb\x24\x35\xda\x7e\xca\x5f\xf3\x8d\xd4\x13\x8e\xd6\xcb\x4d\x53\x8f\x53\x1f\xc0\x74\xf7\x53\xb9\x5e\x23\x37\xba\x6e\xe3\x9d\x07\x55\x25\x7b\xe6\x2a\x64\xd1\x32\xdd\x54\x1b\x4b\xc0\xe1\xd7\x69\x58\xf8\x93\x29\xc4\xdd\x23\x2f\xa5\xfc\x9d\x7e\xf8\xd4\x90\xcd\x82\x55\xdc\x16\x16\x9f\x07\x52\x9b\x9d\x25\xed\x32\xc5\x7b\xdf\xf6\x83\x46\x3d\x65\xb7\xef\x87\x7a\x12\x69\x8f\x06\x7c\x51\x15\x4a\x08\xe8\xac\x9a\x0c\x24\xa7\x27\xd8\x46\x2f\xe7\x01\x0e\x1c\xc6\x91\xb0\x6e\x85\x65\xf0\x29\x0d\x2e\x6b\x3b\xfb\x4b\xdf\xe4\x80\x93\x03\x66\x46\x3e\x8a\x6e\xf3\x5e\x4d\x62\x0e\x49\x05\xaf\xd4\xf8\x21\x20\x61\x1d\x39\x17\xf4\x61\x47\x95\xfb\x15"
+ "\x2e\xb3\x4f\xd0\x5d\xf5\x7d\x40\xda\x90\x3c\x6b\xcb\x17\x00\x13\x3b\x64\x34\x1b\xf0\xf2\xe5\x3b\xb2\xc7\xd3\x5f\x3a\x44\xa6\x9b\xb7\x78\x0e\x42\x5d\x4c\xc1\xe9\xd2\xcb\xb7\x78\xd1\xfe\x9a\xb5\x07\xe9\xe0\xbe\xe2\x8a\xa7\x01\x83\x00\x8c\x5c\x08\xe6\x63\x12\x92\xb7\xb7\xa6\x19\x7d\x38\x13\x38\x92\x87\x24\xf9\x48\xb3\x5e\x87\x6a\x40\x39\x5c\x3f\xed\x8f\xee\xdb\x15\x82\x06\xda\x49\x21\x2b\xb5\xbf\x32\x7c\x9f\x42\x28\x63\xcf\xaf\x1e\xf8\xc6\xa0\xd1\x02\x43\x57\x62\xec\x9b\x0f\x01\x9e\x71\xd8\x87\x9d\x01\xc1\x58\x77\xd9\xaf\xb1\x10\x7e\xdd\xa6\x50\x96\xe5\xf0\x72\x00\x6d\x4b\xf8\x2a\x8f\x19\xf3\x22\x88\x11\x4a\x8b\x7c\xfd\xb7\xed\xe1\xf6\x40\x39\xe0\xe9\xf6\x3d\x25\xe6\x74\x3c\x58\x57\x7f\xe1\x22\x96\x47\x31\x91\xba\x70\x85\x28\x6b\x9f\x6e\x25\xac\x23\x66\x2f\x29\x88\x28\xce\x8c\x5c\x88\x53\xd1\x3b"
+ "\xcc\x6a\x51\xb2\xe1\x28\x3f\x91\xb4\x0d\x00\x3a\xe3\xf8\xc3\x8f\xd7\x96\x62\x0e\x2e\xfc\xc8\x6c\x77\xa6\x1d\x22\xc1\xb8\xe6\x61\xd7\x67\x36\x13\x7b\xbb\x9b\x59\x09\xa6\xdf\xf7\x6b\xa3\x40\x1a\xf5\x4f\xb4\xda\xd3\xf3\x81\x93\xc6\x18\xd9\x26\xee\xac\xf0\xaa\xdf\xc5\x9c\xca\xc2\xa2\xcc\x7b\x5c\x24\xb0\xbc\xd0\x6a\x4d\x89\x09\xb8\x07\xfe\x87\xad\x0a\xea\xb8\x42\xf9\x5e\xb3\x3e\x36\x4c\xaf\x75\x9e\x1c\xeb\xbd\xbc\xbb\x80\x40\xa7\x3a\x30\xbf\xa8\x44\xf4\xeb\x38\xad\x29\xba\x23\xed\x41\x0c\xea\xd2\xbb\x41\x18\xd6\xb9\xba\x65\x2b\xa3\x91\x6d\x1f\xa9\xf4\xd1\x25\x8d\x4d\x38\xff\x64\xa0\xec\xde\xa6\xb6\x79\xab\x8e\x33\x6c\x47\xde\xaf\x94\xa4\xa5\x86\x77\x55\x09\x92\x81\x31\x76\xc7\x34\x22\x89\x8e\x3d\x26\x26\xd7\xfc\x1e\x16\x72\x13\x33\x63\xd5\x22\xbe\xb8\x04\x34\x84\x41\xbb\x80\xd0\x9f\x46\x48\x07\xa7"
+ "\xfc\x2b\x3a\x75\x55\x8c\xc7\x6a\xbd\x7e\x46\x08\x84\x0f\xd5\x74\xc0\x82\x8e\xaa\x61\x05\x01\xb2\x47\x6e\x20\x6a\x2d\x58\x70\x48\x32\xa7\x37\xd2\xb8\x82\x1a\x51\xb9\x61\xdd\xfd\x9d\x6b\x0e\x18\x97\xf8\x45\x5f\x87\x10\xcf\x34\x72\x45\x26\x49\x70\xe7\xa3\x78\xe0\x52\x89\x84\x94\x83\x82\xc2\x69\x8f\xe3\xe1\x3f\x60\x74\x88\xc4\xf7\x75\x2c\xfb\xbd\xb6\xc4\x7e\x10\x0a\x6c\x90\x04\x9e\xc3\x3f\x59\x7c\xce\x31\x18\x60\x57\x73\x46\x94\x7d\x06\xa0\x6d\x44\xec\xa2\x0a\x9e\x05\x15\xef\xca\x5c\xbf\x00\xeb\xf7\x3d\x32\xd4\xa5\xef\x49\x89\x5e\x46\xb0\xa6\x63\x5b\x8a\x73\xae\x6f\xd5\x9d\xf8\x4f\x40\xb5\xb2\x6e\xd3\xb6\x01\xa9\x26\xa2\x21\xcf\x33\x7a\x3a\xa4\x23\x13\xb0\x69\x6a\xee\xce\xd8\x9d\x01\x1d\x50\xc1\x30\x6c\xb1\xcd\xa0\xf0\xf0\xa2\x64\x6f\xbb\xbf\x5e\xe6\xab\x87\xb4\x0f\x4f\x15\xaf\xb5\x25\xa1\xb2\xd0\x80"
+ "\x2c\xfb\xf9\xfe\xd2\x33\xbb\x76\xfe\x7c\xa8\x66\xf7\xe7\x85\x9f\x1f\x85\x57\x88\xe1\xe9\x63\xe4\xd8\x1c\xa1\xfb\xda\x44\x05\x2e\x1d\x3a\x1c\xff\xc8\x3b\xc0\xfe\xda\x22\x0b\x43\xd6\x88\x39\x4c\x4a\xa6\x69\x18\x93\x42\x4e\xb5\xcc\x66\x0d\x09\xf8\x1e\x7c\xd3\x3c\x99\x0d\x50\x1d\x62\xe9\x57\x06\xbf\x19\x88\xdd\xad\x7b\x4f\xf9\xc7\x82\x6d\x8d\xc8\xc4\xc5\x78\x17\x20\x15\xc5\x52\x41\xcf\x5b\xd6\x7f\x94\x02\x41\xe0\x40\x22\x03\x5e\xd1\x53\xd4\x86\xd3\x2c\x9f\x0f\x96\xe3\x6b\x9a\x76\x32\x06\x47\x4b\x11\xb3\xdd\x03\x65\xbd\x9b\x01\xda\x9c\xb9\x7e\x3f\x6a\xc4\x7b\xea\xd4\x3c\xb9\xfb\x5c\x6b\x64\x33\x52\xba\x64\x78\x8f\xa4\xaf\x7a\x61\x8d\xbc\xc5\x73\xe9\x6b\x58\x97\x4b\xbf\x63\x22\xd3\x37\x02\x54\xc5\xb9\x16\x4a\xf0\x19\xd8\x94\x57\xb8\x8a\xb3\x16\x3b\xd0\x84\x8e\x67\xa6\xa3\x7d\x78\xec\x00",
+ .failure = true
+} };
+
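+/* Known-answer tests for XChaCha20-Poly1305, which extends the nonce to 24
+ * bytes. Note that .nlen is unused for these vectors: the xchacha calls in
+ * the selftest below always take the full 24-byte nonce. */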
+static const struct chacha20poly1305_testvec xchacha20poly1305_enc_vectors[] __initconst = { {
+ .key = "\x1c\x92\x40\xa5\xeb\x55\xd3\x8a\xf3\x33\x88\x86\x04\xf6\xb5\xf0\x47\x39\x17\xc1\x40\x2b\x80\x09\x9d\xca\x5c\xbc\x20\x70\x75\xc0",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b\x0c\x0d\x0e\x0f\x10\x11\x12\x13\x14\x15\x16\x17",
+ .nlen = 8,
+ .assoc = "\xf3\x33\x88\x86\x00\x00\x00\x00\x00\x00\x4e\x91",
+ .alen = 12,
+ .input = "\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x72\x65\x20\x64\x72\x61\x66\x74\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x76\x61\x6c\x69\x64\x20\x66\x6f\x72\x20\x61\x20\x6d\x61\x78\x69\x6d\x75\x6d\x20\x6f\x66\x20\x73\x69\x78\x20\x6d\x6f\x6e\x74\x68\x73\x20\x61\x6e\x64\x20\x6d\x61\x79\x20\x62\x65\x20\x75\x70\x64\x61\x74\x65\x64\x2c\x20\x72\x65\x70\x6c\x61\x63\x65\x64\x2c\x20\x6f\x72\x20\x6f\x62\x73\x6f\x6c\x65\x74\x65\x64\x20\x62\x79\x20\x6f\x74\x68\x65\x72\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x61\x74\x20\x61\x6e\x79\x20\x74\x69\x6d\x65\x2e\x20\x49\x74\x20\x69\x73\x20\x69\x6e\x61\x70\x70\x72\x6f\x70\x72\x69\x61\x74\x65\x20\x74\x6f\x20\x75\x73\x65\x20\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x73\x20\x72\x65\x66\x65\x72\x65"
+ "\x6e\x63\x65\x20\x6d\x61\x74\x65\x72\x69\x61\x6c\x20\x6f\x72\x20\x74\x6f\x20\x63\x69\x74\x65\x20\x74\x68\x65\x6d\x20\x6f\x74\x68\x65\x72\x20\x74\x68\x61\x6e\x20\x61\x73\x20\x2f\xe2\x80\x9c\x77\x6f\x72\x6b\x20\x69\x6e\x20\x70\x72\x6f\x67\x72\x65\x73\x73\x2e\x2f\xe2\x80\x9d",
+ .ilen = 265,
+ .result = "\x1a\x6e\x3a\xd9\xfd\x41\x3f\x77\x54\x72\x0a\x70\x9a\xa0\x29\x92\x2e\xed\x93\xcf\x0f\x71\x88\x18\x7a\x9d\x2d\x24\xe0\xf5\xea\x3d\x55\x64\xd7\xad\x2a\x1a\x1f\x7e\x86\x6d\xb0\xce\x80\x41\x72\x86\x26\xee\x84\xd7\xef\x82\x9e\xe2\x60\x9d\x5a\xfc\xf0\xe4\x19\x85\xea\x09\xc6\xfb\xb3\xa9\x50\x09\xec\x5e\x11\x90\xa1\xc5\x4e\x49\xef\x50\xd8\x8f\xe0\x78\xd7\xfd\xb9\x3b\xc9\xf2\x91\xc8\x25\xc8\xa7\x63\x60\xce\x10\xcd\xc6\x7f\xf8\x16\xf8\xe1\x0a\xd9\xde\x79\x50\x33\xf2\x16\x0f\x17\xba\xb8\x5d\xd8\xdf\x4e\x51\xa8\x39\xd0\x85\xca\x46\x6a\x10\xa7\xa3\x88\xef\x79\xb9\xf8\x24\xf3\xe0\x71\x7b\x76\x28\x46\x3a\x3a\x1b\x91\xb6\xd4\x3e\x23\xe5\x44\x15\xbf\x60\x43\x9d\xa4\xbb\xd5\x5f\x89\xeb\xef\x8e\xfd\xdd\xb4\x0d\x46\xf0\x69\x23\x63\xae\x94\xf5\x5e\xa5\xad\x13\x1c\x41\x76\xe6\x90\xd6\x6d\xa2\x8f\x97"
+ "\x4c\xa8\x0b\xcf\x8d\x43\x2b\x9c\x9b\xc5\x58\xa5\xb6\x95\x9a\xbf\x81\xc6\x54\xc9\x66\x0c\xe5\x4f\x6a\x53\xa1\xe5\x0c\xba\x31\xde\x34\x64\x73\x8a\x3b\xbd\x92\x01\xdb\x71\x69\xf3\x58\x99\xbc\xd1\xcb\x4a\x05\xe2\x58\x9c\x25\x17\xcd\xdc\x83\xb7\xff\xfb\x09\x61\xad\xbf\x13\x5b\x5e\xed\x46\x82\x6f\x22\xd8\x93\xa6\x85\x5b\x40\x39\x5c\xc5\x9c"
+} };
+
+static const struct chacha20poly1305_testvec xchacha20poly1305_dec_vectors[] __initconst = { {
+ .key = "\x1c\x92\x40\xa5\xeb\x55\xd3\x8a\xf3\x33\x88\x86\x04\xf6\xb5\xf0\x47\x39\x17\xc1\x40\x2b\x80\x09\x9d\xca\x5c\xbc\x20\x70\x75\xc0",
+ .nonce = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b\x0c\x0d\x0e\x0f\x10\x11\x12\x13\x14\x15\x16\x17",
+ .nlen = 8,
+ .assoc = "\xf3\x33\x88\x86\x00\x00\x00\x00\x00\x00\x4e\x91",
+ .alen = 12,
+ .input = "\x1a\x6e\x3a\xd9\xfd\x41\x3f\x77\x54\x72\x0a\x70\x9a\xa0\x29\x92\x2e\xed\x93\xcf\x0f\x71\x88\x18\x7a\x9d\x2d\x24\xe0\xf5\xea\x3d\x55\x64\xd7\xad\x2a\x1a\x1f\x7e\x86\x6d\xb0\xce\x80\x41\x72\x86\x26\xee\x84\xd7\xef\x82\x9e\xe2\x60\x9d\x5a\xfc\xf0\xe4\x19\x85\xea\x09\xc6\xfb\xb3\xa9\x50\x09\xec\x5e\x11\x90\xa1\xc5\x4e\x49\xef\x50\xd8\x8f\xe0\x78\xd7\xfd\xb9\x3b\xc9\xf2\x91\xc8\x25\xc8\xa7\x63\x60\xce\x10\xcd\xc6\x7f\xf8\x16\xf8\xe1\x0a\xd9\xde\x79\x50\x33\xf2\x16\x0f\x17\xba\xb8\x5d\xd8\xdf\x4e\x51\xa8\x39\xd0\x85\xca\x46\x6a\x10\xa7\xa3\x88\xef\x79\xb9\xf8\x24\xf3\xe0\x71\x7b\x76\x28\x46\x3a\x3a\x1b\x91\xb6\xd4\x3e\x23\xe5\x44\x15\xbf\x60\x43\x9d\xa4\xbb\xd5\x5f\x89\xeb\xef\x8e\xfd\xdd\xb4\x0d\x46\xf0\x69\x23\x63\xae\x94\xf5\x5e\xa5\xad\x13\x1c\x41\x76\xe6\x90\xd6\x6d\xa2\x8f\x97"
+ "\x4c\xa8\x0b\xcf\x8d\x43\x2b\x9c\x9b\xc5\x58\xa5\xb6\x95\x9a\xbf\x81\xc6\x54\xc9\x66\x0c\xe5\x4f\x6a\x53\xa1\xe5\x0c\xba\x31\xde\x34\x64\x73\x8a\x3b\xbd\x92\x01\xdb\x71\x69\xf3\x58\x99\xbc\xd1\xcb\x4a\x05\xe2\x58\x9c\x25\x17\xcd\xdc\x83\xb7\xff\xfb\x09\x61\xad\xbf\x13\x5b\x5e\xed\x46\x82\x6f\x22\xd8\x93\xa6\x85\x5b\x40\x39\x5c\xc5\x9c",
+ .ilen = 281,
+ .result = "\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x72\x65\x20\x64\x72\x61\x66\x74\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x76\x61\x6c\x69\x64\x20\x66\x6f\x72\x20\x61\x20\x6d\x61\x78\x69\x6d\x75\x6d\x20\x6f\x66\x20\x73\x69\x78\x20\x6d\x6f\x6e\x74\x68\x73\x20\x61\x6e\x64\x20\x6d\x61\x79\x20\x62\x65\x20\x75\x70\x64\x61\x74\x65\x64\x2c\x20\x72\x65\x70\x6c\x61\x63\x65\x64\x2c\x20\x6f\x72\x20\x6f\x62\x73\x6f\x6c\x65\x74\x65\x64\x20\x62\x79\x20\x6f\x74\x68\x65\x72\x20\x64\x6f\x63\x75\x6d\x65\x6e\x74\x73\x20\x61\x74\x20\x61\x6e\x79\x20\x74\x69\x6d\x65\x2e\x20\x49\x74\x20\x69\x73\x20\x69\x6e\x61\x70\x70\x72\x6f\x70\x72\x69\x61\x74\x65\x20\x74\x6f\x20\x75\x73\x65\x20\x49\x6e\x74\x65\x72\x6e\x65\x74\x2d\x44\x72\x61\x66\x74\x73\x20\x61\x73\x20\x72\x65\x66\x65\x72\x65"
+ "\x6e\x63\x65\x20\x6d\x61\x74\x65\x72\x69\x61\x6c\x20\x6f\x72\x20\x74\x6f\x20\x63\x69\x74\x65\x20\x74\x68\x65\x6d\x20\x6f\x74\x68\x65\x72\x20\x74\x68\x61\x6e\x20\x61\x73\x20\x2f\xe2\x80\x9c\x77\x6f\x72\x6b\x20\x69\x6e\x20\x70\x72\x6f\x67\x72\x65\x73\x73\x2e\x2f\xe2\x80\x9d"
+} };
+
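+/*
+ * The chacha20poly1305_encrypt() API takes a 64-bit nonce, but some vectors
+ * use the full 96-bit RFC 7539 nonce, so this helper open-codes the AEAD
+ * construction for that case: the Poly1305 one-time key comes from the first
+ * ChaCha20 block, the associated data and ciphertext are each MAC'd with
+ * zero-padding to a 16-byte boundary, and the two lengths are appended as
+ * little-endian 64-bit words.
+ */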
+static inline void
+chacha20poly1305_selftest_encrypt_bignonce(u8 *dst, const u8 *src,
+        const size_t src_len, const u8 *ad, const size_t ad_len,
+        const u8 nonce[12], const u8 key[CHACHA20POLY1305_KEYLEN])
+{
+ bool have_simd = simd_get();
+ struct poly1305_ctx poly1305_state;
+ struct chacha20_ctx chacha20_state;
+ union {
+ u8 block0[POLY1305_KEY_SIZE];
+ __le64 lens[2];
+ } b = {{ 0 }};
+
+ chacha20_init(&chacha20_state, key, 0);
+ /* The test nonces carry no alignment guarantee, so load them bytewise. */
+ chacha20_state.counter[1] = get_unaligned_le32(nonce + 0);
+ chacha20_state.counter[2] = get_unaligned_le32(nonce + 4);
+ chacha20_state.counter[3] = get_unaligned_le32(nonce + 8);
+ chacha20(&chacha20_state, b.block0, b.block0, sizeof(b.block0), have_simd);
+ poly1305_init(&poly1305_state, b.block0, have_simd);
+ poly1305_update(&poly1305_state, ad, ad_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - ad_len) & 0xf, have_simd);
+ chacha20(&chacha20_state, dst, src, src_len, have_simd);
+ poly1305_update(&poly1305_state, dst, src_len, have_simd);
+ poly1305_update(&poly1305_state, pad0, (0x10 - src_len) & 0xf, have_simd);
+ b.lens[0] = cpu_to_le64(ad_len);
+ b.lens[1] = cpu_to_le64(src_len);
+ poly1305_update(&poly1305_state, (u8 *)b.lens, sizeof(b.lens), have_simd);
+ poly1305_finish(&poly1305_state, dst + src_len, have_simd);
+ simd_put(have_simd);
+ memzero_explicit(&chacha20_state, sizeof(chacha20_state));
+ memzero_explicit(&b, sizeof(b));
+}
+
+static inline void
+chacha20poly1305_selftest_encrypt(u8 *dst, const u8 *src, const size_t src_len,
+        const u8 *ad, const size_t ad_len, const u8 *nonce,
+        const size_t nonce_len, const u8 key[CHACHA20POLY1305_KEYLEN])
+{
+ if (nonce_len == 8)
+         chacha20poly1305_encrypt(dst, src, src_len, ad, ad_len,
+                                  get_unaligned_le64(nonce), key);
+ else if (nonce_len == 12)
+         chacha20poly1305_selftest_encrypt_bignonce(dst, src, src_len,
+                                                    ad, ad_len, nonce, key);
+ else
+         BUG();
+}
+
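+/*
+ * A decryption test passes when a vector marked .failure is rejected, or when
+ * a valid vector is accepted and its output matches the expected plaintext.
+ */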
+static inline bool decryption_success(bool func_ret, bool expect_failure, int memcmp_result)
+{
+ if (expect_failure)
+ return !func_ret;
+ return func_ret && !memcmp_result;
+}
+
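+/* Large enough for the longest vector plus its 16-byte Poly1305 tag. */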
+enum { MAXIMUM_TEST_BUFFER_LEN = 3000 };
+
+bool __init chacha20poly1305_selftest(void)
+{
+ size_t i;
+ u8 computed_result[MAXIMUM_TEST_BUFFER_LEN], *heap_src, *heap_dst;
+ bool success = true, ret, have_simd;
+ struct scatterlist sg_src, sg_dst;
+
+ heap_src = kmalloc(MAXIMUM_TEST_BUFFER_LEN, GFP_KERNEL);
+ heap_dst = kmalloc(MAXIMUM_TEST_BUFFER_LEN, GFP_KERNEL);
+ if (!heap_src || !heap_dst) {
+ kfree(heap_src);
+ kfree(heap_dst);
+ pr_info("chacha20poly1305 self-test malloc: FAIL\n");
+ return false;
+ }
+
+ for (i = 0; i < ARRAY_SIZE(chacha20poly1305_enc_vectors); ++i) {
+ memset(computed_result, 0, sizeof(computed_result));
+ chacha20poly1305_selftest_encrypt(computed_result,
+         chacha20poly1305_enc_vectors[i].input,
+         chacha20poly1305_enc_vectors[i].ilen,
+         chacha20poly1305_enc_vectors[i].assoc,
+         chacha20poly1305_enc_vectors[i].alen,
+         chacha20poly1305_enc_vectors[i].nonce,
+         chacha20poly1305_enc_vectors[i].nlen,
+         chacha20poly1305_enc_vectors[i].key);
+ if (memcmp(computed_result, chacha20poly1305_enc_vectors[i].result, chacha20poly1305_enc_vectors[i].ilen + POLY1305_MAC_SIZE)) {
+ pr_info("chacha20poly1305 encryption self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+ }
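+ /* Repeat the encryption vectors through the scatterlist interface; only
+  * the 64-bit-nonce vectors apply, since that is all the sg API accepts. */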
+ have_simd = simd_get();
+ for (i = 0; i < ARRAY_SIZE(chacha20poly1305_enc_vectors); ++i) {
+ if (chacha20poly1305_enc_vectors[i].nlen != 8)
+ continue;
+ memset(heap_dst, 0, MAXIMUM_TEST_BUFFER_LEN);
+ memcpy(heap_src, chacha20poly1305_enc_vectors[i].input, chacha20poly1305_enc_vectors[i].ilen);
+ sg_init_one(&sg_src, heap_src, chacha20poly1305_enc_vectors[i].ilen);
+ sg_init_one(&sg_dst, heap_dst, chacha20poly1305_enc_vectors[i].ilen + POLY1305_MAC_SIZE);
+ ret = chacha20poly1305_encrypt_sg(&sg_dst, &sg_src,
+         chacha20poly1305_enc_vectors[i].ilen,
+         chacha20poly1305_enc_vectors[i].assoc,
+         chacha20poly1305_enc_vectors[i].alen,
+         get_unaligned_le64(chacha20poly1305_enc_vectors[i].nonce),
+         chacha20poly1305_enc_vectors[i].key, have_simd);
+ if (!ret || memcmp(heap_dst, chacha20poly1305_enc_vectors[i].result, chacha20poly1305_enc_vectors[i].ilen + POLY1305_MAC_SIZE)) {
+ pr_info("chacha20poly1305 sg encryption self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+ }
+ simd_put(have_simd);
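+ /* One-shot decryption, including vectors whose tags must fail to verify. */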
+ for (i = 0; i < ARRAY_SIZE(chacha20poly1305_dec_vectors); ++i) {
+ memset(computed_result, 0, sizeof(computed_result));
+ ret = chacha20poly1305_decrypt(computed_result,
+         chacha20poly1305_dec_vectors[i].input,
+         chacha20poly1305_dec_vectors[i].ilen,
+         chacha20poly1305_dec_vectors[i].assoc,
+         chacha20poly1305_dec_vectors[i].alen,
+         get_unaligned_le64(chacha20poly1305_dec_vectors[i].nonce),
+         chacha20poly1305_dec_vectors[i].key);
+ if (!decryption_success(ret, chacha20poly1305_dec_vectors[i].failure, memcmp(computed_result, chacha20poly1305_dec_vectors[i].result, chacha20poly1305_dec_vectors[i].ilen - POLY1305_MAC_SIZE))) {
+ pr_info("chacha20poly1305 decryption self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+ }
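+ /* And the same decryption vectors through the scatterlist interface. */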
+ have_simd = simd_get();
+ for (i = 0; i < ARRAY_SIZE(chacha20poly1305_dec_vectors); ++i) {
+ memset(heap_dst, 0, MAXIMUM_TEST_BUFFER_LEN);
+ memcpy(heap_src, chacha20poly1305_dec_vectors[i].input, chacha20poly1305_dec_vectors[i].ilen);
+ sg_init_one(&sg_src, heap_src, chacha20poly1305_dec_vectors[i].ilen);
+ sg_init_one(&sg_dst, heap_dst, chacha20poly1305_dec_vectors[i].ilen - POLY1305_MAC_SIZE);
+ ret = chacha20poly1305_decrypt_sg(&sg_dst, &sg_src,
+         chacha20poly1305_dec_vectors[i].ilen,
+         chacha20poly1305_dec_vectors[i].assoc,
+         chacha20poly1305_dec_vectors[i].alen,
+         get_unaligned_le64(chacha20poly1305_dec_vectors[i].nonce),
+         chacha20poly1305_dec_vectors[i].key, have_simd);
+ if (!decryption_success(ret, chacha20poly1305_dec_vectors[i].failure, memcmp(heap_dst, chacha20poly1305_dec_vectors[i].result, chacha20poly1305_dec_vectors[i].ilen - POLY1305_MAC_SIZE))) {
+ pr_info("chacha20poly1305 sg decryption self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+ }
+ simd_put(have_simd);
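+ /* Finally, the XChaCha20-Poly1305 vectors with their 24-byte nonces. */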
+ for (i = 0; i < ARRAY_SIZE(xchacha20poly1305_enc_vectors); ++i) {
+ memset(computed_result, 0, sizeof(computed_result));
+ xchacha20poly1305_encrypt(computed_result,
+         xchacha20poly1305_enc_vectors[i].input,
+         xchacha20poly1305_enc_vectors[i].ilen,
+         xchacha20poly1305_enc_vectors[i].assoc,
+         xchacha20poly1305_enc_vectors[i].alen,
+         xchacha20poly1305_enc_vectors[i].nonce,
+         xchacha20poly1305_enc_vectors[i].key);
+ if (memcmp(computed_result, xchacha20poly1305_enc_vectors[i].result, xchacha20poly1305_enc_vectors[i].ilen + POLY1305_MAC_SIZE)) {
+ pr_info("xchacha20poly1305 encryption self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+ }
+ for (i = 0; i < ARRAY_SIZE(xchacha20poly1305_dec_vectors); ++i) {
+ memset(computed_result, 0, sizeof(computed_result));
+ ret = xchacha20poly1305_decrypt(computed_result,
+         xchacha20poly1305_dec_vectors[i].input,
+         xchacha20poly1305_dec_vectors[i].ilen,
+         xchacha20poly1305_dec_vectors[i].assoc,
+         xchacha20poly1305_dec_vectors[i].alen,
+         xchacha20poly1305_dec_vectors[i].nonce,
+         xchacha20poly1305_dec_vectors[i].key);
+ if (!decryption_success(ret, xchacha20poly1305_dec_vectors[i].failure, memcmp(computed_result, xchacha20poly1305_dec_vectors[i].result, xchacha20poly1305_dec_vectors[i].ilen - POLY1305_MAC_SIZE))) {
+ pr_info("xchacha20poly1305 decryption self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+ }
+ if (success)
+ pr_info("chacha20poly1305 self-tests: pass\n");
+ kfree(heap_src);
+ kfree(heap_dst);
+ return success;
+}
+#endif
diff --git a/lib/zinc/selftest/curve25519.h b/lib/zinc/selftest/curve25519.h
new file mode 100644
index 000000000000..5eeba4ace423
--- /dev/null
+++ b/lib/zinc/selftest/curve25519.h
@@ -0,0 +1,607 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ */
+
+#ifdef CONFIG_ZINC_DEBUG
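+/*
+ * The vectors below are taken from RFC 7748 and from Project Wycheproof, and
+ * cover ordinary points, points on the twist, low-order points, and
+ * non-canonical encodings with coordinates >= p.
+ */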
+struct curve25519_test_vector {
+ u8 private[CURVE25519_POINT_SIZE];
+ u8 public[CURVE25519_POINT_SIZE];
+ u8 result[CURVE25519_POINT_SIZE];
+ bool valid;
+};
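+/* .valid is false for inputs that must be rejected, i.e. those whose X25519
+ * output would be the all-zero value. */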
+static const struct curve25519_test_vector curve25519_test_vectors[] __initconst = {
+ {
+ .private = { 0x77, 0x07, 0x6d, 0x0a, 0x73, 0x18, 0xa5, 0x7d, 0x3c, 0x16, 0xc1, 0x72, 0x51, 0xb2, 0x66, 0x45, 0xdf, 0x4c, 0x2f, 0x87, 0xeb, 0xc0, 0x99, 0x2a, 0xb1, 0x77, 0xfb, 0xa5, 0x1d, 0xb9, 0x2c, 0x2a },
+ .public = { 0xde, 0x9e, 0xdb, 0x7d, 0x7b, 0x7d, 0xc1, 0xb4, 0xd3, 0x5b, 0x61, 0xc2, 0xec, 0xe4, 0x35, 0x37, 0x3f, 0x83, 0x43, 0xc8, 0x5b, 0x78, 0x67, 0x4d, 0xad, 0xfc, 0x7e, 0x14, 0x6f, 0x88, 0x2b, 0x4f },
+ .result = { 0x4a, 0x5d, 0x9d, 0x5b, 0xa4, 0xce, 0x2d, 0xe1, 0x72, 0x8e, 0x3b, 0xf4, 0x80, 0x35, 0x0f, 0x25, 0xe0, 0x7e, 0x21, 0xc9, 0x47, 0xd1, 0x9e, 0x33, 0x76, 0xf0, 0x9b, 0x3c, 0x1e, 0x16, 0x17, 0x42 },
+ .valid = true
+ },
+ {
+ .private = { 0x5d, 0xab, 0x08, 0x7e, 0x62, 0x4a, 0x8a, 0x4b, 0x79, 0xe1, 0x7f, 0x8b, 0x83, 0x80, 0x0e, 0xe6, 0x6f, 0x3b, 0xb1, 0x29, 0x26, 0x18, 0xb6, 0xfd, 0x1c, 0x2f, 0x8b, 0x27, 0xff, 0x88, 0xe0, 0xeb },
+ .public = { 0x85, 0x20, 0xf0, 0x09, 0x89, 0x30, 0xa7, 0x54, 0x74, 0x8b, 0x7d, 0xdc, 0xb4, 0x3e, 0xf7, 0x5a, 0x0d, 0xbf, 0x3a, 0x0d, 0x26, 0x38, 0x1a, 0xf4, 0xeb, 0xa4, 0xa9, 0x8e, 0xaa, 0x9b, 0x4e, 0x6a },
+ .result = { 0x4a, 0x5d, 0x9d, 0x5b, 0xa4, 0xce, 0x2d, 0xe1, 0x72, 0x8e, 0x3b, 0xf4, 0x80, 0x35, 0x0f, 0x25, 0xe0, 0x7e, 0x21, 0xc9, 0x47, 0xd1, 0x9e, 0x33, 0x76, 0xf0, 0x9b, 0x3c, 0x1e, 0x16, 0x17, 0x42 },
+ .valid = true
+ },
+ {
+ .private = { 1 },
+ .public = { 0x25, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .result = { 0x3c, 0x77, 0x77, 0xca, 0xf9, 0x97, 0xb2, 0x64, 0x41, 0x60, 0x77, 0x66, 0x5b, 0x4e, 0x22, 0x9d, 0x0b, 0x95, 0x48, 0xdc, 0x0c, 0xd8, 0x19, 0x98, 0xdd, 0xcd, 0xc5, 0xc8, 0x53, 0x3c, 0x79, 0x7f },
+ .valid = true
+ },
+ {
+ .private = { 1 },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0xb3, 0x2d, 0x13, 0x62, 0xc2, 0x48, 0xd6, 0x2f, 0xe6, 0x26, 0x19, 0xcf, 0xf0, 0x4d, 0xd4, 0x3d, 0xb7, 0x3f, 0xfc, 0x1b, 0x63, 0x08, 0xed, 0xe3, 0x0b, 0x78, 0xd8, 0x73, 0x80, 0xf1, 0xe8, 0x34 },
+ .valid = true
+ },
+ {
+ .private = { 0xa5, 0x46, 0xe3, 0x6b, 0xf0, 0x52, 0x7c, 0x9d, 0x3b, 0x16, 0x15, 0x4b, 0x82, 0x46, 0x5e, 0xdd, 0x62, 0x14, 0x4c, 0x0a, 0xc1, 0xfc, 0x5a, 0x18, 0x50, 0x6a, 0x22, 0x44, 0xba, 0x44, 0x9a, 0xc4 },
+ .public = { 0xe6, 0xdb, 0x68, 0x67, 0x58, 0x30, 0x30, 0xdb, 0x35, 0x94, 0xc1, 0xa4, 0x24, 0xb1, 0x5f, 0x7c, 0x72, 0x66, 0x24, 0xec, 0x26, 0xb3, 0x35, 0x3b, 0x10, 0xa9, 0x03, 0xa6, 0xd0, 0xab, 0x1c, 0x4c },
+ .result = { 0xc3, 0xda, 0x55, 0x37, 0x9d, 0xe9, 0xc6, 0x90, 0x8e, 0x94, 0xea, 0x4d, 0xf2, 0x8d, 0x08, 0x4f, 0x32, 0xec, 0xcf, 0x03, 0x49, 0x1c, 0x71, 0xf7, 0x54, 0xb4, 0x07, 0x55, 0x77, 0xa2, 0x85, 0x52 },
+ .valid = true
+ },
+ {
+ .private = { 1, 2, 3, 4 },
+ .public = { 0 },
+ .result = { 0 },
+ .valid = false
+ },
+ {
+ .private = { 2, 4, 6, 8 },
+ .public = { 0xe0, 0xeb, 0x7a, 0x7c, 0x3b, 0x41, 0xb8, 0xae, 0x16, 0x56, 0xe3, 0xfa, 0xf1, 0x9f, 0xc4, 0x6a, 0xda, 0x09, 0x8d, 0xeb, 0x9c, 0x32, 0xb1, 0xfd, 0x86, 0x62, 0x05, 0x16, 0x5f, 0x49, 0xb8, 0x00 },
+ .result = { 0 },
+ .valid = false
+ },
+ {
+ .private = { 0xff, 0xff, 0xff, 0xff, 0x0a, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x0a, 0x00, 0xfb, 0x9f },
+ .result = { 0x77, 0x52, 0xb6, 0x18, 0xc1, 0x2d, 0x48, 0xd2, 0xc6, 0x93, 0x46, 0x83, 0x81, 0x7c, 0xc6, 0x57, 0xf3, 0x31, 0x03, 0x19, 0x49, 0x48, 0x20, 0x05, 0x42, 0x2b, 0x4e, 0xae, 0x8d, 0x1d, 0x43, 0x23 },
+ .valid = true
+ },
+ {
+ .private = { 0x8e, 0x0a, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .public = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x8e, 0x06 },
+ .result = { 0x5a, 0xdf, 0xaa, 0x25, 0x86, 0x8e, 0x32, 0x3d, 0xae, 0x49, 0x62, 0xc1, 0x01, 0x5c, 0xb3, 0x12, 0xe1, 0xc5, 0xc7, 0x9e, 0x95, 0x3f, 0x03, 0x99, 0xb0, 0xba, 0x16, 0x22, 0xf3, 0xb6, 0xf7, 0x0c },
+ .valid = true
+ },
+ /* wycheproof - normal case */
+ {
+ .private = { 0x48, 0x52, 0x83, 0x4d, 0x9d, 0x6b, 0x77, 0xda, 0xde, 0xab, 0xaa, 0xf2, 0xe1, 0x1d, 0xca, 0x66, 0xd1, 0x9f, 0xe7, 0x49, 0x93, 0xa7, 0xbe, 0xc3, 0x6c, 0x6e, 0x16, 0xa0, 0x98, 0x3f, 0xea, 0xba },
+ .public = { 0x9c, 0x64, 0x7d, 0x9a, 0xe5, 0x89, 0xb9, 0xf5, 0x8f, 0xdc, 0x3c, 0xa4, 0x94, 0x7e, 0xfb, 0xc9, 0x15, 0xc4, 0xb2, 0xe0, 0x8e, 0x74, 0x4a, 0x0e, 0xdf, 0x46, 0x9d, 0xac, 0x59, 0xc8, 0xf8, 0x5a },
+ .result = { 0x87, 0xb7, 0xf2, 0x12, 0xb6, 0x27, 0xf7, 0xa5, 0x4c, 0xa5, 0xe0, 0xbc, 0xda, 0xdd, 0xd5, 0x38, 0x9d, 0x9d, 0xe6, 0x15, 0x6c, 0xdb, 0xcf, 0x8e, 0xbe, 0x14, 0xff, 0xbc, 0xfb, 0x43, 0x65, 0x51 },
+ .valid = true
+ },
+ /* wycheproof - public key on twist */
+ {
+ .private = { 0x58, 0x8c, 0x06, 0x1a, 0x50, 0x80, 0x4a, 0xc4, 0x88, 0xad, 0x77, 0x4a, 0xc7, 0x16, 0xc3, 0xf5, 0xba, 0x71, 0x4b, 0x27, 0x12, 0xe0, 0x48, 0x49, 0x13, 0x79, 0xa5, 0x00, 0x21, 0x19, 0x98, 0xa8 },
+ .public = { 0x63, 0xaa, 0x40, 0xc6, 0xe3, 0x83, 0x46, 0xc5, 0xca, 0xf2, 0x3a, 0x6d, 0xf0, 0xa5, 0xe6, 0xc8, 0x08, 0x89, 0xa0, 0x86, 0x47, 0xe5, 0x51, 0xb3, 0x56, 0x34, 0x49, 0xbe, 0xfc, 0xfc, 0x97, 0x33 },
+ .result = { 0xb1, 0xa7, 0x07, 0x51, 0x94, 0x95, 0xff, 0xff, 0xb2, 0x98, 0xff, 0x94, 0x17, 0x16, 0xb0, 0x6d, 0xfa, 0xb8, 0x7c, 0xf8, 0xd9, 0x11, 0x23, 0xfe, 0x2b, 0xe9, 0xa2, 0x33, 0xdd, 0xa2, 0x22, 0x12 },
+ .valid = true
+ },
+ /* wycheproof - public key on twist */
+ {
+ .private = { 0xb0, 0x5b, 0xfd, 0x32, 0xe5, 0x53, 0x25, 0xd9, 0xfd, 0x64, 0x8c, 0xb3, 0x02, 0x84, 0x80, 0x39, 0x00, 0x0b, 0x39, 0x0e, 0x44, 0xd5, 0x21, 0xe5, 0x8a, 0xab, 0x3b, 0x29, 0xa6, 0x96, 0x0b, 0xa8 },
+ .public = { 0x0f, 0x83, 0xc3, 0x6f, 0xde, 0xd9, 0xd3, 0x2f, 0xad, 0xf4, 0xef, 0xa3, 0xae, 0x93, 0xa9, 0x0b, 0xb5, 0xcf, 0xa6, 0x68, 0x93, 0xbc, 0x41, 0x2c, 0x43, 0xfa, 0x72, 0x87, 0xdb, 0xb9, 0x97, 0x79 },
+ .result = { 0x67, 0xdd, 0x4a, 0x6e, 0x16, 0x55, 0x33, 0x53, 0x4c, 0x0e, 0x3f, 0x17, 0x2e, 0x4a, 0xb8, 0x57, 0x6b, 0xca, 0x92, 0x3a, 0x5f, 0x07, 0xb2, 0xc0, 0x69, 0xb4, 0xc3, 0x10, 0xff, 0x2e, 0x93, 0x5b },
+ .valid = true
+ },
+ /* wycheproof - public key on twist */
+ {
+ .private = { 0x70, 0xe3, 0x4b, 0xcb, 0xe1, 0xf4, 0x7f, 0xbc, 0x0f, 0xdd, 0xfd, 0x7c, 0x1e, 0x1a, 0xa5, 0x3d, 0x57, 0xbf, 0xe0, 0xf6, 0x6d, 0x24, 0x30, 0x67, 0xb4, 0x24, 0xbb, 0x62, 0x10, 0xbe, 0xd1, 0x9c },
+ .public = { 0x0b, 0x82, 0x11, 0xa2, 0xb6, 0x04, 0x90, 0x97, 0xf6, 0x87, 0x1c, 0x6c, 0x05, 0x2d, 0x3c, 0x5f, 0xc1, 0xba, 0x17, 0xda, 0x9e, 0x32, 0xae, 0x45, 0x84, 0x03, 0xb0, 0x5b, 0xb2, 0x83, 0x09, 0x2a },
+ .result = { 0x4a, 0x06, 0x38, 0xcf, 0xaa, 0x9e, 0xf1, 0x93, 0x3b, 0x47, 0xf8, 0x93, 0x92, 0x96, 0xa6, 0xb2, 0x5b, 0xe5, 0x41, 0xef, 0x7f, 0x70, 0xe8, 0x44, 0xc0, 0xbc, 0xc0, 0x0b, 0x13, 0x4d, 0xe6, 0x4a },
+ .valid = true
+ },
+ /* wycheproof - public key on twist */
+ {
+ .private = { 0x68, 0xc1, 0xf3, 0xa6, 0x53, 0xa4, 0xcd, 0xb1, 0xd3, 0x7b, 0xba, 0x94, 0x73, 0x8f, 0x8b, 0x95, 0x7a, 0x57, 0xbe, 0xb2, 0x4d, 0x64, 0x6e, 0x99, 0x4d, 0xc2, 0x9a, 0x27, 0x6a, 0xad, 0x45, 0x8d },
+ .public = { 0x34, 0x3a, 0xc2, 0x0a, 0x3b, 0x9c, 0x6a, 0x27, 0xb1, 0x00, 0x81, 0x76, 0x50, 0x9a, 0xd3, 0x07, 0x35, 0x85, 0x6e, 0xc1, 0xc8, 0xd8, 0xfc, 0xae, 0x13, 0x91, 0x2d, 0x08, 0xd1, 0x52, 0xf4, 0x6c },
+ .result = { 0x39, 0x94, 0x91, 0xfc, 0xe8, 0xdf, 0xab, 0x73, 0xb4, 0xf9, 0xf6, 0x11, 0xde, 0x8e, 0xa0, 0xb2, 0x7b, 0x28, 0xf8, 0x59, 0x94, 0x25, 0x0b, 0x0f, 0x47, 0x5d, 0x58, 0x5d, 0x04, 0x2a, 0xc2, 0x07 },
+ .valid = true
+ },
+ /* wycheproof - public key on twist */
+ {
+ .private = { 0xd8, 0x77, 0xb2, 0x6d, 0x06, 0xdf, 0xf9, 0xd9, 0xf7, 0xfd, 0x4c, 0x5b, 0x37, 0x69, 0xf8, 0xcd, 0xd5, 0xb3, 0x05, 0x16, 0xa5, 0xab, 0x80, 0x6b, 0xe3, 0x24, 0xff, 0x3e, 0xb6, 0x9e, 0xa0, 0xb2 },
+ .public = { 0xfa, 0x69, 0x5f, 0xc7, 0xbe, 0x8d, 0x1b, 0xe5, 0xbf, 0x70, 0x48, 0x98, 0xf3, 0x88, 0xc4, 0x52, 0xba, 0xfd, 0xd3, 0xb8, 0xea, 0xe8, 0x05, 0xf8, 0x68, 0x1a, 0x8d, 0x15, 0xc2, 0xd4, 0xe1, 0x42 },
+ .result = { 0x2c, 0x4f, 0xe1, 0x1d, 0x49, 0x0a, 0x53, 0x86, 0x17, 0x76, 0xb1, 0x3b, 0x43, 0x54, 0xab, 0xd4, 0xcf, 0x5a, 0x97, 0x69, 0x9d, 0xb6, 0xe6, 0xc6, 0x8c, 0x16, 0x26, 0xd0, 0x76, 0x62, 0xf7, 0x58 },
+ .valid = true
+ },
+ /* wycheproof - public key = 0 */
+ {
+ .private = { 0x20, 0x74, 0x94, 0x03, 0x8f, 0x2b, 0xb8, 0x11, 0xd4, 0x78, 0x05, 0xbc, 0xdf, 0x04, 0xa2, 0xac, 0x58, 0x5a, 0xda, 0x7f, 0x2f, 0x23, 0x38, 0x9b, 0xfd, 0x46, 0x58, 0xf9, 0xdd, 0xd4, 0xde, 0xbc },
+ .public = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key = 1 */
+ {
+ .private = { 0x20, 0x2e, 0x89, 0x72, 0xb6, 0x1c, 0x7e, 0x61, 0x93, 0x0e, 0xb9, 0x45, 0x0b, 0x50, 0x70, 0xea, 0xe1, 0xc6, 0x70, 0x47, 0x56, 0x85, 0x54, 0x1f, 0x04, 0x76, 0x21, 0x7e, 0x48, 0x18, 0xcf, 0xab },
+ .public = { 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - edge case on twist */
+ {
+ .private = { 0x38, 0xdd, 0xe9, 0xf3, 0xe7, 0xb7, 0x99, 0x04, 0x5f, 0x9a, 0xc3, 0x79, 0x3d, 0x4a, 0x92, 0x77, 0xda, 0xde, 0xad, 0xc4, 0x1b, 0xec, 0x02, 0x90, 0xf8, 0x1f, 0x74, 0x4f, 0x73, 0x77, 0x5f, 0x84 },
+ .public = { 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .result = { 0x9a, 0x2c, 0xfe, 0x84, 0xff, 0x9c, 0x4a, 0x97, 0x39, 0x62, 0x5c, 0xae, 0x4a, 0x3b, 0x82, 0xa9, 0x06, 0x87, 0x7a, 0x44, 0x19, 0x46, 0xf8, 0xd7, 0xb3, 0xd7, 0x95, 0xfe, 0x8f, 0x5d, 0x16, 0x39 },
+ .valid = true
+ },
+ /* wycheproof - edge case on twist */
+ {
+ .private = { 0x98, 0x57, 0xa9, 0x14, 0xe3, 0xc2, 0x90, 0x36, 0xfd, 0x9a, 0x44, 0x2b, 0xa5, 0x26, 0xb5, 0xcd, 0xcd, 0xf2, 0x82, 0x16, 0x15, 0x3e, 0x63, 0x6c, 0x10, 0x67, 0x7a, 0xca, 0xb6, 0xbd, 0x6a, 0xa5 },
+ .public = { 0x03, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .result = { 0x4d, 0xa4, 0xe0, 0xaa, 0x07, 0x2c, 0x23, 0x2e, 0xe2, 0xf0, 0xfa, 0x4e, 0x51, 0x9a, 0xe5, 0x0b, 0x52, 0xc1, 0xed, 0xd0, 0x8a, 0x53, 0x4d, 0x4e, 0xf3, 0x46, 0xc2, 0xe1, 0x06, 0xd2, 0x1d, 0x60 },
+ .valid = true
+ },
+ /* wycheproof - edge case on twist */
+ {
+ .private = { 0x48, 0xe2, 0x13, 0x0d, 0x72, 0x33, 0x05, 0xed, 0x05, 0xe6, 0xe5, 0x89, 0x4d, 0x39, 0x8a, 0x5e, 0x33, 0x36, 0x7a, 0x8c, 0x6a, 0xac, 0x8f, 0xcd, 0xf0, 0xa8, 0x8e, 0x4b, 0x42, 0x82, 0x0d, 0xb7 },
+ .public = { 0xff, 0xff, 0xff, 0x03, 0x00, 0x00, 0xf8, 0xff, 0xff, 0x1f, 0x00, 0x00, 0xc0, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0xfe, 0xff, 0xff, 0x07, 0x00, 0x00, 0xf0, 0xff, 0xff, 0x3f, 0x00, 0x00, 0x00 },
+ .result = { 0x9e, 0xd1, 0x0c, 0x53, 0x74, 0x7f, 0x64, 0x7f, 0x82, 0xf4, 0x51, 0x25, 0xd3, 0xde, 0x15, 0xa1, 0xe6, 0xb8, 0x24, 0x49, 0x6a, 0xb4, 0x04, 0x10, 0xff, 0xcc, 0x3c, 0xfe, 0x95, 0x76, 0x0f, 0x3b },
+ .valid = true
+ },
+ /* wycheproof - edge case on twist */
+ {
+ .private = { 0x28, 0xf4, 0x10, 0x11, 0x69, 0x18, 0x51, 0xb3, 0xa6, 0x2b, 0x64, 0x15, 0x53, 0xb3, 0x0d, 0x0d, 0xfd, 0xdc, 0xb8, 0xff, 0xfc, 0xf5, 0x37, 0x00, 0xa7, 0xbe, 0x2f, 0x6a, 0x87, 0x2e, 0x9f, 0xb0 },
+ .public = { 0x00, 0x00, 0x00, 0xfc, 0xff, 0xff, 0x07, 0x00, 0x00, 0xe0, 0xff, 0xff, 0x3f, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0x01, 0x00, 0x00, 0xf8, 0xff, 0xff, 0x0f, 0x00, 0x00, 0xc0, 0xff, 0xff, 0x7f },
+ .result = { 0xcf, 0x72, 0xb4, 0xaa, 0x6a, 0xa1, 0xc9, 0xf8, 0x94, 0xf4, 0x16, 0x5b, 0x86, 0x10, 0x9a, 0xa4, 0x68, 0x51, 0x76, 0x48, 0xe1, 0xf0, 0xcc, 0x70, 0xe1, 0xab, 0x08, 0x46, 0x01, 0x76, 0x50, 0x6b },
+ .valid = true
+ },
+ /* wycheproof - edge case on twist */
+ {
+ .private = { 0x18, 0xa9, 0x3b, 0x64, 0x99, 0xb9, 0xf6, 0xb3, 0x22, 0x5c, 0xa0, 0x2f, 0xef, 0x41, 0x0e, 0x0a, 0xde, 0xc2, 0x35, 0x32, 0x32, 0x1d, 0x2d, 0x8e, 0xf1, 0xa6, 0xd6, 0x02, 0xa8, 0xc6, 0x5b, 0x83 },
+ .public = { 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x5d, 0x50, 0xb6, 0x28, 0x36, 0xbb, 0x69, 0x57, 0x94, 0x10, 0x38, 0x6c, 0xf7, 0xbb, 0x81, 0x1c, 0x14, 0xbf, 0x85, 0xb1, 0xc7, 0xb1, 0x7e, 0x59, 0x24, 0xc7, 0xff, 0xea, 0x91, 0xef, 0x9e, 0x12 },
+ .valid = true
+ },
+ /* wycheproof - edge case on twist */
+ {
+ .private = { 0xc0, 0x1d, 0x13, 0x05, 0xa1, 0x33, 0x8a, 0x1f, 0xca, 0xc2, 0xba, 0x7e, 0x2e, 0x03, 0x2b, 0x42, 0x7e, 0x0b, 0x04, 0x90, 0x31, 0x65, 0xac, 0xa9, 0x57, 0xd8, 0xd0, 0x55, 0x3d, 0x87, 0x17, 0xb0 },
+ .public = { 0xea, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x19, 0x23, 0x0e, 0xb1, 0x48, 0xd5, 0xd6, 0x7c, 0x3c, 0x22, 0xab, 0x1d, 0xae, 0xff, 0x80, 0xa5, 0x7e, 0xae, 0x42, 0x65, 0xce, 0x28, 0x72, 0x65, 0x7b, 0x2c, 0x80, 0x99, 0xfc, 0x69, 0x8e, 0x50 },
+ .valid = true
+ },
+ /* wycheproof - edge case for public key */
+ {
+ .private = { 0x38, 0x6f, 0x7f, 0x16, 0xc5, 0x07, 0x31, 0xd6, 0x4f, 0x82, 0xe6, 0xa1, 0x70, 0xb1, 0x42, 0xa4, 0xe3, 0x4f, 0x31, 0xfd, 0x77, 0x68, 0xfc, 0xb8, 0x90, 0x29, 0x25, 0xe7, 0xd1, 0xe2, 0x1a, 0xbe },
+ .public = { 0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .result = { 0x0f, 0xca, 0xb5, 0xd8, 0x42, 0xa0, 0x78, 0xd7, 0xa7, 0x1f, 0xc5, 0x9b, 0x57, 0xbf, 0xb4, 0xca, 0x0b, 0xe6, 0x87, 0x3b, 0x49, 0xdc, 0xdb, 0x9f, 0x44, 0xe1, 0x4a, 0xe8, 0xfb, 0xdf, 0xa5, 0x42 },
+ .valid = true
+ },
+ /* wycheproof - edge case for public key */
+ {
+ .private = { 0xe0, 0x23, 0xa2, 0x89, 0xbd, 0x5e, 0x90, 0xfa, 0x28, 0x04, 0xdd, 0xc0, 0x19, 0xa0, 0x5e, 0xf3, 0xe7, 0x9d, 0x43, 0x4b, 0xb6, 0xea, 0x2f, 0x52, 0x2e, 0xcb, 0x64, 0x3a, 0x75, 0x29, 0x6e, 0x95 },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff, 0x00, 0x00, 0x00, 0x00 },
+ .result = { 0x54, 0xce, 0x8f, 0x22, 0x75, 0xc0, 0x77, 0xe3, 0xb1, 0x30, 0x6a, 0x39, 0x39, 0xc5, 0xe0, 0x3e, 0xef, 0x6b, 0xbb, 0x88, 0x06, 0x05, 0x44, 0x75, 0x8d, 0x9f, 0xef, 0x59, 0xb0, 0xbc, 0x3e, 0x4f },
+ .valid = true
+ },
+ /* wycheproof - edge case for public key */
+ {
+ .private = { 0x68, 0xf0, 0x10, 0xd6, 0x2e, 0xe8, 0xd9, 0x26, 0x05, 0x3a, 0x36, 0x1c, 0x3a, 0x75, 0xc6, 0xea, 0x4e, 0xbd, 0xc8, 0x60, 0x6a, 0xb2, 0x85, 0x00, 0x3a, 0x6f, 0x8f, 0x40, 0x76, 0xb0, 0x1e, 0x83 },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x03 },
+ .result = { 0xf1, 0x36, 0x77, 0x5c, 0x5b, 0xeb, 0x0a, 0xf8, 0x11, 0x0a, 0xf1, 0x0b, 0x20, 0x37, 0x23, 0x32, 0x04, 0x3c, 0xab, 0x75, 0x24, 0x19, 0x67, 0x87, 0x75, 0xa2, 0x23, 0xdf, 0x57, 0xc9, 0xd3, 0x0d },
+ .valid = true
+ },
+ /* wycheproof - edge case for public key */
+ {
+ .private = { 0x58, 0xeb, 0xcb, 0x35, 0xb0, 0xf8, 0x84, 0x5c, 0xaf, 0x1e, 0xc6, 0x30, 0xf9, 0x65, 0x76, 0xb6, 0x2c, 0x4b, 0x7b, 0x6c, 0x36, 0xb2, 0x9d, 0xeb, 0x2c, 0xb0, 0x08, 0x46, 0x51, 0x75, 0x5c, 0x96 },
+ .public = { 0xff, 0xff, 0xff, 0xfb, 0xff, 0xff, 0xfb, 0xff, 0xff, 0xdf, 0xff, 0xff, 0xdf, 0xff, 0xff, 0xff, 0xfe, 0xff, 0xff, 0xfe, 0xff, 0xff, 0xf7, 0xff, 0xff, 0xf7, 0xff, 0xff, 0xbf, 0xff, 0xff, 0x3f },
+ .result = { 0xbf, 0x9a, 0xff, 0xd0, 0x6b, 0x84, 0x40, 0x85, 0x58, 0x64, 0x60, 0x96, 0x2e, 0xf2, 0x14, 0x6f, 0xf3, 0xd4, 0x53, 0x3d, 0x94, 0x44, 0xaa, 0xb0, 0x06, 0xeb, 0x88, 0xcc, 0x30, 0x54, 0x40, 0x7d },
+ .valid = true
+ },
+ /* wycheproof - edge case for public key */
+ {
+ .private = { 0x18, 0x8c, 0x4b, 0xc5, 0xb9, 0xc4, 0x4b, 0x38, 0xbb, 0x65, 0x8b, 0x9b, 0x2a, 0xe8, 0x2d, 0x5b, 0x01, 0x01, 0x5e, 0x09, 0x31, 0x84, 0xb1, 0x7c, 0xb7, 0x86, 0x35, 0x03, 0xa7, 0x83, 0xe1, 0xbb },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x3f },
+ .result = { 0xd4, 0x80, 0xde, 0x04, 0xf6, 0x99, 0xcb, 0x3b, 0xe0, 0x68, 0x4a, 0x9c, 0xc2, 0xe3, 0x12, 0x81, 0xea, 0x0b, 0xc5, 0xa9, 0xdc, 0xc1, 0x57, 0xd3, 0xd2, 0x01, 0x58, 0xd4, 0x6c, 0xa5, 0x24, 0x6d },
+ .valid = true
+ },
+ /* wycheproof - edge case for public key */
+ {
+ .private = { 0xe0, 0x6c, 0x11, 0xbb, 0x2e, 0x13, 0xce, 0x3d, 0xc7, 0x67, 0x3f, 0x67, 0xf5, 0x48, 0x22, 0x42, 0x90, 0x94, 0x23, 0xa9, 0xae, 0x95, 0xee, 0x98, 0x6a, 0x98, 0x8d, 0x98, 0xfa, 0xee, 0x23, 0xa2 },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0xfe, 0xff, 0xff, 0x7f, 0xff, 0xff, 0xff, 0xff, 0xfe, 0xff, 0xff, 0x7f, 0xff, 0xff, 0xff, 0xff, 0xfe, 0xff, 0xff, 0x7f, 0xff, 0xff, 0xff, 0xff, 0xfe, 0xff, 0xff, 0x7f },
+ .result = { 0x4c, 0x44, 0x01, 0xcc, 0xe6, 0xb5, 0x1e, 0x4c, 0xb1, 0x8f, 0x27, 0x90, 0x24, 0x6c, 0x9b, 0xf9, 0x14, 0xdb, 0x66, 0x77, 0x50, 0xa1, 0xcb, 0x89, 0x06, 0x90, 0x92, 0xaf, 0x07, 0x29, 0x22, 0x76 },
+ .valid = true
+ },
+ /* wycheproof - edge case for public key */
+ {
+ .private = { 0xc0, 0x65, 0x8c, 0x46, 0xdd, 0xe1, 0x81, 0x29, 0x29, 0x38, 0x77, 0x53, 0x5b, 0x11, 0x62, 0xb6, 0xf9, 0xf5, 0x41, 0x4a, 0x23, 0xcf, 0x4d, 0x2c, 0xbc, 0x14, 0x0a, 0x4d, 0x99, 0xda, 0x2b, 0x8f },
+ .public = { 0xeb, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x57, 0x8b, 0xa8, 0xcc, 0x2d, 0xbd, 0xc5, 0x75, 0xaf, 0xcf, 0x9d, 0xf2, 0xb3, 0xee, 0x61, 0x89, 0xf5, 0x33, 0x7d, 0x68, 0x54, 0xc7, 0x9b, 0x4c, 0xe1, 0x65, 0xea, 0x12, 0x29, 0x3b, 0x3a, 0x0f },
+ .valid = true
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x10, 0x25, 0x5c, 0x92, 0x30, 0xa9, 0x7a, 0x30, 0xa4, 0x58, 0xca, 0x28, 0x4a, 0x62, 0x96, 0x69, 0x29, 0x3a, 0x31, 0x89, 0x0c, 0xda, 0x9d, 0x14, 0x7f, 0xeb, 0xc7, 0xd1, 0xe2, 0x2d, 0x6b, 0xb1 },
+ .public = { 0xe0, 0xeb, 0x7a, 0x7c, 0x3b, 0x41, 0xb8, 0xae, 0x16, 0x56, 0xe3, 0xfa, 0xf1, 0x9f, 0xc4, 0x6a, 0xda, 0x09, 0x8d, 0xeb, 0x9c, 0x32, 0xb1, 0xfd, 0x86, 0x62, 0x05, 0x16, 0x5f, 0x49, 0xb8, 0x00 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x78, 0xf1, 0xe8, 0xed, 0xf1, 0x44, 0x81, 0xb3, 0x89, 0x44, 0x8d, 0xac, 0x8f, 0x59, 0xc7, 0x0b, 0x03, 0x8e, 0x7c, 0xf9, 0x2e, 0xf2, 0xc7, 0xef, 0xf5, 0x7a, 0x72, 0x46, 0x6e, 0x11, 0x52, 0x96 },
+ .public = { 0x5f, 0x9c, 0x95, 0xbc, 0xa3, 0x50, 0x8c, 0x24, 0xb1, 0xd0, 0xb1, 0x55, 0x9c, 0x83, 0xef, 0x5b, 0x04, 0x44, 0x5c, 0xc4, 0x58, 0x1c, 0x8e, 0x86, 0xd8, 0x22, 0x4e, 0xdd, 0xd0, 0x9f, 0x11, 0x57 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0xa0, 0xa0, 0x5a, 0x3e, 0x8f, 0x9f, 0x44, 0x20, 0x4d, 0x5f, 0x80, 0x59, 0xa9, 0x4a, 0xc7, 0xdf, 0xc3, 0x9a, 0x49, 0xac, 0x01, 0x6d, 0xd7, 0x43, 0xdb, 0xfa, 0x43, 0xc5, 0xd6, 0x71, 0xfd, 0x88 },
+ .public = { 0xec, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0xd0, 0xdb, 0xb3, 0xed, 0x19, 0x06, 0x66, 0x3f, 0x15, 0x42, 0x0a, 0xf3, 0x1f, 0x4e, 0xaf, 0x65, 0x09, 0xd9, 0xa9, 0x94, 0x97, 0x23, 0x50, 0x06, 0x05, 0xad, 0x7c, 0x1c, 0x6e, 0x74, 0x50, 0xa9 },
+ .public = { 0xed, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0xc0, 0xb1, 0xd0, 0xeb, 0x22, 0xb2, 0x44, 0xfe, 0x32, 0x91, 0x14, 0x00, 0x72, 0xcd, 0xd9, 0xd9, 0x89, 0xb5, 0xf0, 0xec, 0xd9, 0x6c, 0x10, 0x0f, 0xeb, 0x5b, 0xca, 0x24, 0x1c, 0x1d, 0x9f, 0x8f },
+ .public = { 0xee, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x48, 0x0b, 0xf4, 0x5f, 0x59, 0x49, 0x42, 0xa8, 0xbc, 0x0f, 0x33, 0x53, 0xc6, 0xe8, 0xb8, 0x85, 0x3d, 0x77, 0xf3, 0x51, 0xf1, 0xc2, 0xca, 0x6c, 0x2d, 0x1a, 0xbf, 0x8a, 0x00, 0xb4, 0x22, 0x9c },
+ .public = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x80 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x30, 0xf9, 0x93, 0xfc, 0xf8, 0x51, 0x4f, 0xc8, 0x9b, 0xd8, 0xdb, 0x14, 0xcd, 0x43, 0xba, 0x0d, 0x4b, 0x25, 0x30, 0xe7, 0x3c, 0x42, 0x76, 0xa0, 0x5e, 0x1b, 0x14, 0x5d, 0x42, 0x0c, 0xed, 0xb4 },
+ .public = { 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x80 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0xc0, 0x49, 0x74, 0xb7, 0x58, 0x38, 0x0e, 0x2a, 0x5b, 0x5d, 0xf6, 0xeb, 0x09, 0xbb, 0x2f, 0x6b, 0x34, 0x34, 0xf9, 0x82, 0x72, 0x2a, 0x8e, 0x67, 0x6d, 0x3d, 0xa2, 0x51, 0xd1, 0xb3, 0xde, 0x83 },
+ .public = { 0xe0, 0xeb, 0x7a, 0x7c, 0x3b, 0x41, 0xb8, 0xae, 0x16, 0x56, 0xe3, 0xfa, 0xf1, 0x9f, 0xc4, 0x6a, 0xda, 0x09, 0x8d, 0xeb, 0x9c, 0x32, 0xb1, 0xfd, 0x86, 0x62, 0x05, 0x16, 0x5f, 0x49, 0xb8, 0x80 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x50, 0x2a, 0x31, 0x37, 0x3d, 0xb3, 0x24, 0x46, 0x84, 0x2f, 0xe5, 0xad, 0xd3, 0xe0, 0x24, 0x02, 0x2e, 0xa5, 0x4f, 0x27, 0x41, 0x82, 0xaf, 0xc3, 0xd9, 0xf1, 0xbb, 0x3d, 0x39, 0x53, 0x4e, 0xb5 },
+ .public = { 0x5f, 0x9c, 0x95, 0xbc, 0xa3, 0x50, 0x8c, 0x24, 0xb1, 0xd0, 0xb1, 0x55, 0x9c, 0x83, 0xef, 0x5b, 0x04, 0x44, 0x5c, 0xc4, 0x58, 0x1c, 0x8e, 0x86, 0xd8, 0x22, 0x4e, 0xdd, 0xd0, 0x9f, 0x11, 0xd7 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x90, 0xfa, 0x64, 0x17, 0xb0, 0xe3, 0x70, 0x30, 0xfd, 0x6e, 0x43, 0xef, 0xf2, 0xab, 0xae, 0xf1, 0x4c, 0x67, 0x93, 0x11, 0x7a, 0x03, 0x9c, 0xf6, 0x21, 0x31, 0x8b, 0xa9, 0x0f, 0x4e, 0x98, 0xbe },
+ .public = { 0xec, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x78, 0xad, 0x3f, 0x26, 0x02, 0x7f, 0x1c, 0x9f, 0xdd, 0x97, 0x5a, 0x16, 0x13, 0xb9, 0x47, 0x77, 0x9b, 0xad, 0x2c, 0xf2, 0xb7, 0x41, 0xad, 0xe0, 0x18, 0x40, 0x88, 0x5a, 0x30, 0xbb, 0x97, 0x9c },
+ .public = { 0xed, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key with low order */
+ {
+ .private = { 0x98, 0xe2, 0x3d, 0xe7, 0xb1, 0xe0, 0x92, 0x6e, 0xd9, 0xc8, 0x7e, 0x7b, 0x14, 0xba, 0xf5, 0x5f, 0x49, 0x7a, 0x1d, 0x70, 0x96, 0xf9, 0x39, 0x77, 0x68, 0x0e, 0x44, 0xdc, 0x1c, 0x7b, 0x7b, 0x8b },
+ .public = { 0xee, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = false
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0xf0, 0x1e, 0x48, 0xda, 0xfa, 0xc9, 0xd7, 0xbc, 0xf5, 0x89, 0xcb, 0xc3, 0x82, 0xc8, 0x78, 0xd1, 0x8b, 0xda, 0x35, 0x50, 0x58, 0x9f, 0xfb, 0x5d, 0x50, 0xb5, 0x23, 0xbe, 0xbe, 0x32, 0x9d, 0xae },
+ .public = { 0xef, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0xbd, 0x36, 0xa0, 0x79, 0x0e, 0xb8, 0x83, 0x09, 0x8c, 0x98, 0x8b, 0x21, 0x78, 0x67, 0x73, 0xde, 0x0b, 0x3a, 0x4d, 0xf1, 0x62, 0x28, 0x2c, 0xf1, 0x10, 0xde, 0x18, 0xdd, 0x48, 0x4c, 0xe7, 0x4b },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x28, 0x87, 0x96, 0xbc, 0x5a, 0xff, 0x4b, 0x81, 0xa3, 0x75, 0x01, 0x75, 0x7b, 0xc0, 0x75, 0x3a, 0x3c, 0x21, 0x96, 0x47, 0x90, 0xd3, 0x86, 0x99, 0x30, 0x8d, 0xeb, 0xc1, 0x7a, 0x6e, 0xaf, 0x8d },
+ .public = { 0xf0, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0xb4, 0xe0, 0xdd, 0x76, 0xda, 0x7b, 0x07, 0x17, 0x28, 0xb6, 0x1f, 0x85, 0x67, 0x71, 0xaa, 0x35, 0x6e, 0x57, 0xed, 0xa7, 0x8a, 0x5b, 0x16, 0x55, 0xcc, 0x38, 0x20, 0xfb, 0x5f, 0x85, 0x4c, 0x5c },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x98, 0xdf, 0x84, 0x5f, 0x66, 0x51, 0xbf, 0x11, 0x38, 0x22, 0x1f, 0x11, 0x90, 0x41, 0xf7, 0x2b, 0x6d, 0xbc, 0x3c, 0x4a, 0xce, 0x71, 0x43, 0xd9, 0x9f, 0xd5, 0x5a, 0xd8, 0x67, 0x48, 0x0d, 0xa8 },
+ .public = { 0xf1, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x6f, 0xdf, 0x6c, 0x37, 0x61, 0x1d, 0xbd, 0x53, 0x04, 0xdc, 0x0f, 0x2e, 0xb7, 0xc9, 0x51, 0x7e, 0xb3, 0xc5, 0x0e, 0x12, 0xfd, 0x05, 0x0a, 0xc6, 0xde, 0xc2, 0x70, 0x71, 0xd4, 0xbf, 0xc0, 0x34 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0xf0, 0x94, 0x98, 0xe4, 0x6f, 0x02, 0xf8, 0x78, 0x82, 0x9e, 0x78, 0xb8, 0x03, 0xd3, 0x16, 0xa2, 0xed, 0x69, 0x5d, 0x04, 0x98, 0xa0, 0x8a, 0xbd, 0xf8, 0x27, 0x69, 0x30, 0xe2, 0x4e, 0xdc, 0xb0 },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .result = { 0x4c, 0x8f, 0xc4, 0xb1, 0xc6, 0xab, 0x88, 0xfb, 0x21, 0xf1, 0x8f, 0x6d, 0x4c, 0x81, 0x02, 0x40, 0xd4, 0xe9, 0x46, 0x51, 0xba, 0x44, 0xf7, 0xa2, 0xc8, 0x63, 0xce, 0xc7, 0xdc, 0x56, 0x60, 0x2d },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x18, 0x13, 0xc1, 0x0a, 0x5c, 0x7f, 0x21, 0xf9, 0x6e, 0x17, 0xf2, 0x88, 0xc0, 0xcc, 0x37, 0x60, 0x7c, 0x04, 0xc5, 0xf5, 0xae, 0xa2, 0xdb, 0x13, 0x4f, 0x9e, 0x2f, 0xfc, 0x66, 0xbd, 0x9d, 0xb8 },
+ .public = { 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x80 },
+ .result = { 0x1c, 0xd0, 0xb2, 0x82, 0x67, 0xdc, 0x54, 0x1c, 0x64, 0x2d, 0x6d, 0x7d, 0xca, 0x44, 0xa8, 0xb3, 0x8a, 0x63, 0x73, 0x6e, 0xef, 0x5c, 0x4e, 0x65, 0x01, 0xff, 0xbb, 0xb1, 0x78, 0x0c, 0x03, 0x3c },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x78, 0x57, 0xfb, 0x80, 0x86, 0x53, 0x64, 0x5a, 0x0b, 0xeb, 0x13, 0x8a, 0x64, 0xf5, 0xf4, 0xd7, 0x33, 0xa4, 0x5e, 0xa8, 0x4c, 0x3c, 0xda, 0x11, 0xa9, 0xc0, 0x6f, 0x7e, 0x71, 0x39, 0x14, 0x9e },
+ .public = { 0x03, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x80 },
+ .result = { 0x87, 0x55, 0xbe, 0x01, 0xc6, 0x0a, 0x7e, 0x82, 0x5c, 0xff, 0x3e, 0x0e, 0x78, 0xcb, 0x3a, 0xa4, 0x33, 0x38, 0x61, 0x51, 0x6a, 0xa5, 0x9b, 0x1c, 0x51, 0xa8, 0xb2, 0xa5, 0x43, 0xdf, 0xa8, 0x22 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0xe0, 0x3a, 0xa8, 0x42, 0xe2, 0xab, 0xc5, 0x6e, 0x81, 0xe8, 0x7b, 0x8b, 0x9f, 0x41, 0x7b, 0x2a, 0x1e, 0x59, 0x13, 0xc7, 0x23, 0xee, 0xd2, 0x8d, 0x75, 0x2f, 0x8d, 0x47, 0xa5, 0x9f, 0x49, 0x8f },
+ .public = { 0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x80 },
+ .result = { 0x54, 0xc9, 0xa1, 0xed, 0x95, 0xe5, 0x46, 0xd2, 0x78, 0x22, 0xa3, 0x60, 0x93, 0x1d, 0xda, 0x60, 0xa1, 0xdf, 0x04, 0x9d, 0xa6, 0xf9, 0x04, 0x25, 0x3c, 0x06, 0x12, 0xbb, 0xdc, 0x08, 0x74, 0x76 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0xf8, 0xf7, 0x07, 0xb7, 0x99, 0x9b, 0x18, 0xcb, 0x0d, 0x6b, 0x96, 0x12, 0x4f, 0x20, 0x45, 0x97, 0x2c, 0xa2, 0x74, 0xbf, 0xc1, 0x54, 0xad, 0x0c, 0x87, 0x03, 0x8c, 0x24, 0xc6, 0xd0, 0xd4, 0xb2 },
+ .public = { 0xda, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0xcc, 0x1f, 0x40, 0xd7, 0x43, 0xcd, 0xc2, 0x23, 0x0e, 0x10, 0x43, 0xda, 0xba, 0x8b, 0x75, 0xe8, 0x10, 0xf1, 0xfb, 0xab, 0x7f, 0x25, 0x52, 0x69, 0xbd, 0x9e, 0xbb, 0x29, 0xe6, 0xbf, 0x49, 0x4f },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0xa0, 0x34, 0xf6, 0x84, 0xfa, 0x63, 0x1e, 0x1a, 0x34, 0x81, 0x18, 0xc1, 0xce, 0x4c, 0x98, 0x23, 0x1f, 0x2d, 0x9e, 0xec, 0x9b, 0xa5, 0x36, 0x5b, 0x4a, 0x05, 0xd6, 0x9a, 0x78, 0x5b, 0x07, 0x96 },
+ .public = { 0xdb, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x54, 0x99, 0x8e, 0xe4, 0x3a, 0x5b, 0x00, 0x7b, 0xf4, 0x99, 0xf0, 0x78, 0xe7, 0x36, 0x52, 0x44, 0x00, 0xa8, 0xb5, 0xc7, 0xe9, 0xb9, 0xb4, 0x37, 0x71, 0x74, 0x8c, 0x7c, 0xdf, 0x88, 0x04, 0x12 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x30, 0xb6, 0xc6, 0xa0, 0xf2, 0xff, 0xa6, 0x80, 0x76, 0x8f, 0x99, 0x2b, 0xa8, 0x9e, 0x15, 0x2d, 0x5b, 0xc9, 0x89, 0x3d, 0x38, 0xc9, 0x11, 0x9b, 0xe4, 0xf7, 0x67, 0xbf, 0xab, 0x6e, 0x0c, 0xa5 },
+ .public = { 0xdc, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0xea, 0xd9, 0xb3, 0x8e, 0xfd, 0xd7, 0x23, 0x63, 0x79, 0x34, 0xe5, 0x5a, 0xb7, 0x17, 0xa7, 0xae, 0x09, 0xeb, 0x86, 0xa2, 0x1d, 0xc3, 0x6a, 0x3f, 0xee, 0xb8, 0x8b, 0x75, 0x9e, 0x39, 0x1e, 0x09 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x90, 0x1b, 0x9d, 0xcf, 0x88, 0x1e, 0x01, 0xe0, 0x27, 0x57, 0x50, 0x35, 0xd4, 0x0b, 0x43, 0xbd, 0xc1, 0xc5, 0x24, 0x2e, 0x03, 0x08, 0x47, 0x49, 0x5b, 0x0c, 0x72, 0x86, 0x46, 0x9b, 0x65, 0x91 },
+ .public = { 0xea, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x60, 0x2f, 0xf4, 0x07, 0x89, 0xb5, 0x4b, 0x41, 0x80, 0x59, 0x15, 0xfe, 0x2a, 0x62, 0x21, 0xf0, 0x7a, 0x50, 0xff, 0xc2, 0xc3, 0xfc, 0x94, 0xcf, 0x61, 0xf1, 0x3d, 0x79, 0x04, 0xe8, 0x8e, 0x0e },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x80, 0x46, 0x67, 0x7c, 0x28, 0xfd, 0x82, 0xc9, 0xa1, 0xbd, 0xb7, 0x1a, 0x1a, 0x1a, 0x34, 0xfa, 0xba, 0x12, 0x25, 0xe2, 0x50, 0x7f, 0xe3, 0xf5, 0x4d, 0x10, 0xbd, 0x5b, 0x0d, 0x86, 0x5f, 0x8e },
+ .public = { 0xeb, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0xe0, 0x0a, 0xe8, 0xb1, 0x43, 0x47, 0x12, 0x47, 0xba, 0x24, 0xf1, 0x2c, 0x88, 0x55, 0x36, 0xc3, 0xcb, 0x98, 0x1b, 0x58, 0xe1, 0xe5, 0x6b, 0x2b, 0xaf, 0x35, 0xc1, 0x2a, 0xe1, 0xf7, 0x9c, 0x26 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x60, 0x2f, 0x7e, 0x2f, 0x68, 0xa8, 0x46, 0xb8, 0x2c, 0xc2, 0x69, 0xb1, 0xd4, 0x8e, 0x93, 0x98, 0x86, 0xae, 0x54, 0xfd, 0x63, 0x6c, 0x1f, 0xe0, 0x74, 0xd7, 0x10, 0x12, 0x7d, 0x47, 0x24, 0x91 },
+ .public = { 0xef, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x98, 0xcb, 0x9b, 0x50, 0xdd, 0x3f, 0xc2, 0xb0, 0xd4, 0xf2, 0xd2, 0xbf, 0x7c, 0x5c, 0xfd, 0xd1, 0x0c, 0x8f, 0xcd, 0x31, 0xfc, 0x40, 0xaf, 0x1a, 0xd4, 0x4f, 0x47, 0xc1, 0x31, 0x37, 0x63, 0x62 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x60, 0x88, 0x7b, 0x3d, 0xc7, 0x24, 0x43, 0x02, 0x6e, 0xbe, 0xdb, 0xbb, 0xb7, 0x06, 0x65, 0xf4, 0x2b, 0x87, 0xad, 0xd1, 0x44, 0x0e, 0x77, 0x68, 0xfb, 0xd7, 0xe8, 0xe2, 0xce, 0x5f, 0x63, 0x9d },
+ .public = { 0xf0, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x38, 0xd6, 0x30, 0x4c, 0x4a, 0x7e, 0x6d, 0x9f, 0x79, 0x59, 0x33, 0x4f, 0xb5, 0x24, 0x5b, 0xd2, 0xc7, 0x54, 0x52, 0x5d, 0x4c, 0x91, 0xdb, 0x95, 0x02, 0x06, 0x92, 0x62, 0x34, 0xc1, 0xf6, 0x33 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0x78, 0xd3, 0x1d, 0xfa, 0x85, 0x44, 0x97, 0xd7, 0x2d, 0x8d, 0xef, 0x8a, 0x1b, 0x7f, 0xb0, 0x06, 0xce, 0xc2, 0xd8, 0xc4, 0x92, 0x46, 0x47, 0xc9, 0x38, 0x14, 0xae, 0x56, 0xfa, 0xed, 0xa4, 0x95 },
+ .public = { 0xf1, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x78, 0x6c, 0xd5, 0x49, 0x96, 0xf0, 0x14, 0xa5, 0xa0, 0x31, 0xec, 0x14, 0xdb, 0x81, 0x2e, 0xd0, 0x83, 0x55, 0x06, 0x1f, 0xdb, 0x5d, 0xe6, 0x80, 0xa8, 0x00, 0xac, 0x52, 0x1f, 0x31, 0x8e, 0x23 },
+ .valid = true
+ },
+ /* wycheproof - public key >= p */
+ {
+ .private = { 0xc0, 0x4c, 0x5b, 0xae, 0xfa, 0x83, 0x02, 0xdd, 0xde, 0xd6, 0xa4, 0xbb, 0x95, 0x77, 0x61, 0xb4, 0xeb, 0x97, 0xae, 0xfa, 0x4f, 0xc3, 0xb8, 0x04, 0x30, 0x85, 0xf9, 0x6a, 0x56, 0x59, 0xb3, 0xa5 },
+ .public = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff },
+ .result = { 0x29, 0xae, 0x8b, 0xc7, 0x3e, 0x9b, 0x10, 0xa0, 0x8b, 0x4f, 0x68, 0x1c, 0x43, 0xc3, 0xe0, 0xac, 0x1a, 0x17, 0x1d, 0x31, 0xb3, 0x8f, 0x1a, 0x48, 0xef, 0xba, 0x29, 0xae, 0x63, 0x9e, 0xa1, 0x34 },
+ .valid = true
+ },
+ /* wycheproof - RFC 7748 */
+ {
+ .private = { 0xa0, 0x46, 0xe3, 0x6b, 0xf0, 0x52, 0x7c, 0x9d, 0x3b, 0x16, 0x15, 0x4b, 0x82, 0x46, 0x5e, 0xdd, 0x62, 0x14, 0x4c, 0x0a, 0xc1, 0xfc, 0x5a, 0x18, 0x50, 0x6a, 0x22, 0x44, 0xba, 0x44, 0x9a, 0x44 },
+ .public = { 0xe6, 0xdb, 0x68, 0x67, 0x58, 0x30, 0x30, 0xdb, 0x35, 0x94, 0xc1, 0xa4, 0x24, 0xb1, 0x5f, 0x7c, 0x72, 0x66, 0x24, 0xec, 0x26, 0xb3, 0x35, 0x3b, 0x10, 0xa9, 0x03, 0xa6, 0xd0, 0xab, 0x1c, 0x4c },
+ .result = { 0xc3, 0xda, 0x55, 0x37, 0x9d, 0xe9, 0xc6, 0x90, 0x8e, 0x94, 0xea, 0x4d, 0xf2, 0x8d, 0x08, 0x4f, 0x32, 0xec, 0xcf, 0x03, 0x49, 0x1c, 0x71, 0xf7, 0x54, 0xb4, 0x07, 0x55, 0x77, 0xa2, 0x85, 0x52 },
+ .valid = true
+ },
+ /* wycheproof - RFC 7748 */
+ {
+ .private = { 0x48, 0x66, 0xe9, 0xd4, 0xd1, 0xb4, 0x67, 0x3c, 0x5a, 0xd2, 0x26, 0x91, 0x95, 0x7d, 0x6a, 0xf5, 0xc1, 0x1b, 0x64, 0x21, 0xe0, 0xea, 0x01, 0xd4, 0x2c, 0xa4, 0x16, 0x9e, 0x79, 0x18, 0xba, 0x4d },
+ .public = { 0xe5, 0x21, 0x0f, 0x12, 0x78, 0x68, 0x11, 0xd3, 0xf4, 0xb7, 0x95, 0x9d, 0x05, 0x38, 0xae, 0x2c, 0x31, 0xdb, 0xe7, 0x10, 0x6f, 0xc0, 0x3c, 0x3e, 0xfc, 0x4c, 0xd5, 0x49, 0xc7, 0x15, 0xa4, 0x13 },
+ .result = { 0x95, 0xcb, 0xde, 0x94, 0x76, 0xe8, 0x90, 0x7d, 0x7a, 0xad, 0xe4, 0x5c, 0xb4, 0xb8, 0x73, 0xf8, 0x8b, 0x59, 0x5a, 0x68, 0x79, 0x9f, 0xa1, 0x52, 0xe6, 0xf8, 0xf7, 0x64, 0x7a, 0xac, 0x79, 0x57 },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x0a, 0xb4, 0xe7, 0x63, 0x80, 0xd8, 0x4d, 0xde, 0x4f, 0x68, 0x33, 0xc5, 0x8f, 0x2a, 0x9f, 0xb8, 0xf8, 0x3b, 0xb0, 0x16, 0x9b, 0x17, 0x2b, 0xe4, 0xb6, 0xe0, 0x59, 0x28, 0x87, 0x74, 0x1a, 0x36 },
+ .result = { 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x89, 0xe1, 0x0d, 0x57, 0x01, 0xb4, 0x33, 0x7d, 0x2d, 0x03, 0x21, 0x81, 0x53, 0x8b, 0x10, 0x64, 0xbd, 0x40, 0x84, 0x40, 0x1c, 0xec, 0xa1, 0xfd, 0x12, 0x66, 0x3a, 0x19, 0x59, 0x38, 0x80, 0x00 },
+ .result = { 0x09, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x2b, 0x55, 0xd3, 0xaa, 0x4a, 0x8f, 0x80, 0xc8, 0xc0, 0xb2, 0xae, 0x5f, 0x93, 0x3e, 0x85, 0xaf, 0x49, 0xbe, 0xac, 0x36, 0xc2, 0xfa, 0x73, 0x94, 0xba, 0xb7, 0x6c, 0x89, 0x33, 0xf8, 0xf8, 0x1d },
+ .result = { 0x10, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x63, 0xe5, 0xb1, 0xfe, 0x96, 0x01, 0xfe, 0x84, 0x38, 0x5d, 0x88, 0x66, 0xb0, 0x42, 0x12, 0x62, 0xf7, 0x8f, 0xbf, 0xa5, 0xaf, 0xf9, 0x58, 0x5e, 0x62, 0x66, 0x79, 0xb1, 0x85, 0x47, 0xd9, 0x59 },
+ .result = { 0xfe, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x3f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0xe4, 0x28, 0xf3, 0xda, 0xc1, 0x78, 0x09, 0xf8, 0x27, 0xa5, 0x22, 0xce, 0x32, 0x35, 0x50, 0x58, 0xd0, 0x73, 0x69, 0x36, 0x4a, 0xa7, 0x89, 0x02, 0xee, 0x10, 0x13, 0x9b, 0x9f, 0x9d, 0xd6, 0x53 },
+ .result = { 0xfc, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x3f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0xb3, 0xb5, 0x0e, 0x3e, 0xd3, 0xa4, 0x07, 0xb9, 0x5d, 0xe9, 0x42, 0xef, 0x74, 0x57, 0x5b, 0x5a, 0xb8, 0xa1, 0x0c, 0x09, 0xee, 0x10, 0x35, 0x44, 0xd6, 0x0b, 0xdf, 0xed, 0x81, 0x38, 0xab, 0x2b },
+ .result = { 0xf9, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x3f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x21, 0x3f, 0xff, 0xe9, 0x3d, 0x5e, 0xa8, 0xcd, 0x24, 0x2e, 0x46, 0x28, 0x44, 0x02, 0x99, 0x22, 0xc4, 0x3c, 0x77, 0xc9, 0xe3, 0xe4, 0x2f, 0x56, 0x2f, 0x48, 0x5d, 0x24, 0xc5, 0x01, 0xa2, 0x0b },
+ .result = { 0xf3, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x3f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x91, 0xb2, 0x32, 0xa1, 0x78, 0xb3, 0xcd, 0x53, 0x09, 0x32, 0x44, 0x1e, 0x61, 0x39, 0x41, 0x8f, 0x72, 0x17, 0x22, 0x92, 0xf1, 0xda, 0x4c, 0x18, 0x34, 0xfc, 0x5e, 0xbf, 0xef, 0xb5, 0x1e, 0x3f },
+ .result = { 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x03 },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x04, 0x5c, 0x6e, 0x11, 0xc5, 0xd3, 0x32, 0x55, 0x6c, 0x78, 0x22, 0xfe, 0x94, 0xeb, 0xf8, 0x9b, 0x56, 0xa3, 0x87, 0x8d, 0xc2, 0x7c, 0xa0, 0x79, 0x10, 0x30, 0x58, 0x84, 0x9f, 0xab, 0xcb, 0x4f },
+ .result = { 0xe5, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x1c, 0xa2, 0x19, 0x0b, 0x71, 0x16, 0x35, 0x39, 0x06, 0x3c, 0x35, 0x77, 0x3b, 0xda, 0x0c, 0x9c, 0x92, 0x8e, 0x91, 0x36, 0xf0, 0x62, 0x0a, 0xeb, 0x09, 0x3f, 0x09, 0x91, 0x97, 0xb7, 0xf7, 0x4e },
+ .result = { 0xe3, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0xf7, 0x6e, 0x90, 0x10, 0xac, 0x33, 0xc5, 0x04, 0x3b, 0x2d, 0x3b, 0x76, 0xa8, 0x42, 0x17, 0x10, 0x00, 0xc4, 0x91, 0x62, 0x22, 0xe9, 0xe8, 0x58, 0x97, 0xa0, 0xae, 0xc7, 0xf6, 0x35, 0x0b, 0x3c },
+ .result = { 0xdd, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0xbb, 0x72, 0x68, 0x8d, 0x8f, 0x8a, 0xa7, 0xa3, 0x9c, 0xd6, 0x06, 0x0c, 0xd5, 0xc8, 0x09, 0x3c, 0xde, 0xc6, 0xfe, 0x34, 0x19, 0x37, 0xc3, 0x88, 0x6a, 0x99, 0x34, 0x6c, 0xd0, 0x7f, 0xaa, 0x55 },
+ .result = { 0xdb, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x7f },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x88, 0xfd, 0xde, 0xa1, 0x93, 0x39, 0x1c, 0x6a, 0x59, 0x33, 0xef, 0x9b, 0x71, 0x90, 0x15, 0x49, 0x44, 0x72, 0x05, 0xaa, 0xe9, 0xda, 0x92, 0x8a, 0x6b, 0x91, 0xa3, 0x52, 0xba, 0x10, 0xf4, 0x1f },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02 },
+ .valid = true
+ },
+ /* wycheproof - edge case for shared secret */
+ {
+ .private = { 0xa0, 0xa4, 0xf1, 0x30, 0xb9, 0x8a, 0x5b, 0xe4, 0xb1, 0xce, 0xdb, 0x7c, 0xb8, 0x55, 0x84, 0xa3, 0x52, 0x0e, 0x14, 0x2d, 0x47, 0x4d, 0xc9, 0xcc, 0xb9, 0x09, 0xa0, 0x73, 0xa9, 0x76, 0xbf, 0x63 },
+ .public = { 0x30, 0x3b, 0x39, 0x2f, 0x15, 0x31, 0x16, 0xca, 0xd9, 0xcc, 0x68, 0x2a, 0x00, 0xcc, 0xc4, 0x4c, 0x95, 0xff, 0x0d, 0x3b, 0xbe, 0x56, 0x8b, 0xeb, 0x6c, 0x4e, 0x73, 0x9b, 0xaf, 0xdc, 0x2c, 0x68 },
+ .result = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x80, 0x00 },
+ .valid = true
+ },
+ /* wycheproof - checking for overflow */
+ {
+ .private = { 0xc8, 0x17, 0x24, 0x70, 0x40, 0x00, 0xb2, 0x6d, 0x31, 0x70, 0x3c, 0xc9, 0x7e, 0x3a, 0x37, 0x8d, 0x56, 0xfa, 0xd8, 0x21, 0x93, 0x61, 0xc8, 0x8c, 0xca, 0x8b, 0xd7, 0xc5, 0x71, 0x9b, 0x12, 0xb2 },
+ .public = { 0xfd, 0x30, 0x0a, 0xeb, 0x40, 0xe1, 0xfa, 0x58, 0x25, 0x18, 0x41, 0x2b, 0x49, 0xb2, 0x08, 0xa7, 0x84, 0x2b, 0x1e, 0x1f, 0x05, 0x6a, 0x04, 0x01, 0x78, 0xea, 0x41, 0x41, 0x53, 0x4f, 0x65, 0x2d },
+ .result = { 0xb7, 0x34, 0x10, 0x5d, 0xc2, 0x57, 0x58, 0x5d, 0x73, 0xb5, 0x66, 0xcc, 0xb7, 0x6f, 0x06, 0x27, 0x95, 0xcc, 0xbe, 0xc8, 0x91, 0x28, 0xe5, 0x2b, 0x02, 0xf3, 0xe5, 0x96, 0x39, 0xf1, 0x3c, 0x46 },
+ .valid = true
+ },
+ /* wycheproof - checking for overflow */
+ {
+ .private = { 0xc8, 0x17, 0x24, 0x70, 0x40, 0x00, 0xb2, 0x6d, 0x31, 0x70, 0x3c, 0xc9, 0x7e, 0x3a, 0x37, 0x8d, 0x56, 0xfa, 0xd8, 0x21, 0x93, 0x61, 0xc8, 0x8c, 0xca, 0x8b, 0xd7, 0xc5, 0x71, 0x9b, 0x12, 0xb2 },
+ .public = { 0xc8, 0xef, 0x79, 0xb5, 0x14, 0xd7, 0x68, 0x26, 0x77, 0xbc, 0x79, 0x31, 0xe0, 0x6e, 0xe5, 0xc2, 0x7c, 0x9b, 0x39, 0x2b, 0x4a, 0xe9, 0x48, 0x44, 0x73, 0xf5, 0x54, 0xe6, 0x67, 0x8e, 0xcc, 0x2e },
+ .result = { 0x64, 0x7a, 0x46, 0xb6, 0xfc, 0x3f, 0x40, 0xd6, 0x21, 0x41, 0xee, 0x3c, 0xee, 0x70, 0x6b, 0x4d, 0x7a, 0x92, 0x71, 0x59, 0x3a, 0x7b, 0x14, 0x3e, 0x8e, 0x2e, 0x22, 0x79, 0x88, 0x3e, 0x45, 0x50 },
+ .valid = true
+ },
+ /* wycheproof - checking for overflow */
+ {
+ .private = { 0xc8, 0x17, 0x24, 0x70, 0x40, 0x00, 0xb2, 0x6d, 0x31, 0x70, 0x3c, 0xc9, 0x7e, 0x3a, 0x37, 0x8d, 0x56, 0xfa, 0xd8, 0x21, 0x93, 0x61, 0xc8, 0x8c, 0xca, 0x8b, 0xd7, 0xc5, 0x71, 0x9b, 0x12, 0xb2 },
+ .public = { 0x64, 0xae, 0xac, 0x25, 0x04, 0x14, 0x48, 0x61, 0x53, 0x2b, 0x7b, 0xbc, 0xb6, 0xc8, 0x7d, 0x67, 0xdd, 0x4c, 0x1f, 0x07, 0xeb, 0xc2, 0xe0, 0x6e, 0xff, 0xb9, 0x5a, 0xec, 0xc6, 0x17, 0x0b, 0x2c },
+ .result = { 0x4f, 0xf0, 0x3d, 0x5f, 0xb4, 0x3c, 0xd8, 0x65, 0x7a, 0x3c, 0xf3, 0x7c, 0x13, 0x8c, 0xad, 0xce, 0xcc, 0xe5, 0x09, 0xe4, 0xeb, 0xa0, 0x89, 0xd0, 0xef, 0x40, 0xb4, 0xe4, 0xfb, 0x94, 0x61, 0x55 },
+ .valid = true
+ },
+ /* wycheproof - checking for overflow */
+ {
+ .private = { 0xc8, 0x17, 0x24, 0x70, 0x40, 0x00, 0xb2, 0x6d, 0x31, 0x70, 0x3c, 0xc9, 0x7e, 0x3a, 0x37, 0x8d, 0x56, 0xfa, 0xd8, 0x21, 0x93, 0x61, 0xc8, 0x8c, 0xca, 0x8b, 0xd7, 0xc5, 0x71, 0x9b, 0x12, 0xb2 },
+ .public = { 0xbf, 0x68, 0xe3, 0x5e, 0x9b, 0xdb, 0x7e, 0xee, 0x1b, 0x50, 0x57, 0x02, 0x21, 0x86, 0x0f, 0x5d, 0xcd, 0xad, 0x8a, 0xcb, 0xab, 0x03, 0x1b, 0x14, 0x97, 0x4c, 0xc4, 0x90, 0x13, 0xc4, 0x98, 0x31 },
+ .result = { 0x21, 0xce, 0xe5, 0x2e, 0xfd, 0xbc, 0x81, 0x2e, 0x1d, 0x02, 0x1a, 0x4a, 0xf1, 0xe1, 0xd8, 0xbc, 0x4d, 0xb3, 0xc4, 0x00, 0xe4, 0xd2, 0xa2, 0xc5, 0x6a, 0x39, 0x26, 0xdb, 0x4d, 0x99, 0xc6, 0x5b },
+ .valid = true
+ },
+ /* wycheproof - checking for overflow */
+ {
+ .private = { 0xc8, 0x17, 0x24, 0x70, 0x40, 0x00, 0xb2, 0x6d, 0x31, 0x70, 0x3c, 0xc9, 0x7e, 0x3a, 0x37, 0x8d, 0x56, 0xfa, 0xd8, 0x21, 0x93, 0x61, 0xc8, 0x8c, 0xca, 0x8b, 0xd7, 0xc5, 0x71, 0x9b, 0x12, 0xb2 },
+ .public = { 0x53, 0x47, 0xc4, 0x91, 0x33, 0x1a, 0x64, 0xb4, 0x3d, 0xdc, 0x68, 0x30, 0x34, 0xe6, 0x77, 0xf5, 0x3d, 0xc3, 0x2b, 0x52, 0xa5, 0x2a, 0x57, 0x7c, 0x15, 0xa8, 0x3b, 0xf2, 0x98, 0xe9, 0x9f, 0x19 },
+ .result = { 0x18, 0xcb, 0x89, 0xe4, 0xe2, 0x0c, 0x0c, 0x2b, 0xd3, 0x24, 0x30, 0x52, 0x45, 0x26, 0x6c, 0x93, 0x27, 0x69, 0x0b, 0xbe, 0x79, 0xac, 0xb8, 0x8f, 0x5b, 0x8f, 0xb3, 0xf7, 0x4e, 0xca, 0x3e, 0x52 },
+ .valid = true
+ },
+ /* wycheproof - private key == -1 (mod order) */
+ {
+ .private = { 0xa0, 0x23, 0xcd, 0xd0, 0x83, 0xef, 0x5b, 0xb8, 0x2f, 0x10, 0xd6, 0x2e, 0x59, 0xe1, 0x5a, 0x68, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x50 },
+ .public = { 0x25, 0x8e, 0x04, 0x52, 0x3b, 0x8d, 0x25, 0x3e, 0xe6, 0x57, 0x19, 0xfc, 0x69, 0x06, 0xc6, 0x57, 0x19, 0x2d, 0x80, 0x71, 0x7e, 0xdc, 0x82, 0x8f, 0xa0, 0xaf, 0x21, 0x68, 0x6e, 0x2f, 0xaa, 0x75 },
+ .result = { 0x25, 0x8e, 0x04, 0x52, 0x3b, 0x8d, 0x25, 0x3e, 0xe6, 0x57, 0x19, 0xfc, 0x69, 0x06, 0xc6, 0x57, 0x19, 0x2d, 0x80, 0x71, 0x7e, 0xdc, 0x82, 0x8f, 0xa0, 0xaf, 0x21, 0x68, 0x6e, 0x2f, 0xaa, 0x75 },
+ .valid = true
+ },
+ /* wycheproof - private key == 1 (mod order) on twist */
+ {
+ .private = { 0x58, 0x08, 0x3d, 0xd2, 0x61, 0xad, 0x91, 0xef, 0xf9, 0x52, 0x32, 0x2e, 0xc8, 0x24, 0xc6, 0x82, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0x5f },
+ .public = { 0x2e, 0xae, 0x5e, 0xc3, 0xdd, 0x49, 0x4e, 0x9f, 0x2d, 0x37, 0xd2, 0x58, 0xf8, 0x73, 0xa8, 0xe6, 0xe9, 0xd0, 0xdb, 0xd1, 0xe3, 0x83, 0xef, 0x64, 0xd9, 0x8b, 0xb9, 0x1b, 0x3e, 0x0b, 0xe0, 0x35 },
+ .result = { 0x2e, 0xae, 0x5e, 0xc3, 0xdd, 0x49, 0x4e, 0x9f, 0x2d, 0x37, 0xd2, 0x58, 0xf8, 0x73, 0xa8, 0xe6, 0xe9, 0xd0, 0xdb, 0xd1, 0xe3, 0x83, 0xef, 0x64, 0xd9, 0x8b, 0xb9, 0x1b, 0x3e, 0x0b, 0xe0, 0x35 },
+ .valid = true
+ }
+};
+
+bool __init curve25519_selftest(void)
+{
+ bool success = true, ret, ret2;
+ size_t i = 0, j;
+ u8 in[CURVE25519_POINT_SIZE], out[CURVE25519_POINT_SIZE], out2[CURVE25519_POINT_SIZE];
+
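+	/* Run each static vector, checking both the return value and the output. */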
+ for (i = 0; i < ARRAY_SIZE(curve25519_test_vectors); ++i) {
+ memset(out, 0, CURVE25519_POINT_SIZE);
+ ret = curve25519(out, curve25519_test_vectors[i].private, curve25519_test_vectors[i].public);
+ if (ret != curve25519_test_vectors[i].valid || memcmp(out, curve25519_test_vectors[i].result, CURVE25519_POINT_SIZE)) {
+ pr_info("curve25519 self-test %zu: FAIL\n", i + 1);
+ success = false;
+ break;
+ }
+ }
+
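+	/*
+	 * Check that curve25519_generate_public() agrees with plain scalar
+	 * multiplication by the canonical base point (u = 9) for a few
+	 * random private keys.
+	 */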
+ for (i = 0; i < 5; ++i) {
+ get_random_bytes(in, sizeof(in));
+ ret = curve25519_generate_public(out, in);
+ ret2 = curve25519(out2, in, (u8[CURVE25519_POINT_SIZE]){ 9 });
+ if (ret != ret2 || memcmp(out, out2, CURVE25519_POINT_SIZE)) {
+ pr_info("curve25519 basepoint self-test %zu: FAIL: input - 0x", i + 1);
+ for (j = CURVE25519_POINT_SIZE; j-- > 0;)
+ printk(KERN_CONT "%02x", in[j]);
+ printk(KERN_CONT "\n");
+ success = false;
+ break;
+ }
+ }
+
+ if (success)
+ pr_info("curve25519 self-tests: pass\n");
+ return success;
+}
+#endif
diff --git a/lib/zinc/selftest/poly1305.h b/lib/zinc/selftest/poly1305.h
new file mode 100644
index 000000000000..3088f8c248a0
--- /dev/null
+++ b/lib/zinc/selftest/poly1305.h
@@ -0,0 +1,1568 @@
+/* SPDX-License-Identifier: GPL-2.0
+ *
+ * Copyright (C) 2015-2018 Jason A. Donenfeld <Jason@...c4.com>. All Rights Reserved.
+ * Copyright (C) 2016-2017 The OpenSSL Project Authors. All Rights Reserved.
+ */
+
+#ifdef CONFIG_ZINC_DEBUG
+
+#include <zinc/simd.h>
+
+struct poly1305_testdata {
+ size_t size;
+ const u8 data[1024];
+};
+
+struct poly1305_testvec {
+ struct poly1305_testdata input, key, expected;
+};
+
+static const struct poly1305_testvec poly1305_testvecs[] = {
+ /*
+ * RFC 7539
+ */
+ {
+ {
+ 34,
+ {
+ 0x43, 0x72, 0x79, 0x70, 0x74, 0x6f, 0x67, 0x72,
+ 0x61, 0x70, 0x68, 0x69, 0x63, 0x20, 0x46, 0x6f,
+ 0x72, 0x75, 0x6d, 0x20, 0x52, 0x65, 0x73, 0x65,
+ 0x61, 0x72, 0x63, 0x68, 0x20, 0x47, 0x72, 0x6f,
+
+ 0x75, 0x70
+ }
+ },
+ {
+ 32,
+ {
+ 0x85, 0xd6, 0xbe, 0x78, 0x57, 0x55, 0x6d, 0x33,
+ 0x7f, 0x44, 0x52, 0xfe, 0x42, 0xd5, 0x06, 0xa8,
+ 0x01, 0x03, 0x80, 0x8a, 0xfb, 0x0d, 0xb2, 0xfd,
+ 0x4a, 0xbf, 0xf6, 0xaf, 0x41, 0x49, 0xf5, 0x1b
+ }
+ },
+ {
+ 16,
+ {
+ 0xa8, 0x06, 0x1d, 0xc1, 0x30, 0x51, 0x36, 0xc6,
+ 0xc2, 0x2b, 0x8b, 0xaf, 0x0c, 0x01, 0x27, 0xa9
+ }
+ }
+ },
+ /*
+ * test vectors from "The Poly1305-AES message-authentication code"
+ */
+ {
+ {
+ 2,
+ {
+ 0xf3, 0xf6
+ }
+ },
+ {
+ 32,
+ {
+ 0x85, 0x1f, 0xc4, 0x0c, 0x34, 0x67, 0xac, 0x0b,
+ 0xe0, 0x5c, 0xc2, 0x04, 0x04, 0xf3, 0xf7, 0x00,
+ 0x58, 0x0b, 0x3b, 0x0f, 0x94, 0x47, 0xbb, 0x1e,
+ 0x69, 0xd0, 0x95, 0xb5, 0x92, 0x8b, 0x6d, 0xbc
+ }
+ },
+ {
+ 16,
+ {
+ 0xf4, 0xc6, 0x33, 0xc3, 0x04, 0x4f, 0xc1, 0x45,
+ 0xf8, 0x4f, 0x33, 0x5c, 0xb8, 0x19, 0x53, 0xde
+ }
+ }
+ },
+ {
+ {
+ 0,
+ {
+ 0
+ }
+ },
+ {
+ 32,
+ {
+ 0xa0, 0xf3, 0x08, 0x00, 0x00, 0xf4, 0x64, 0x00,
+ 0xd0, 0xc7, 0xe9, 0x07, 0x6c, 0x83, 0x44, 0x03,
+ 0xdd, 0x3f, 0xab, 0x22, 0x51, 0xf1, 0x1a, 0xc7,
+ 0x59, 0xf0, 0x88, 0x71, 0x29, 0xcc, 0x2e, 0xe7
+ }
+ },
+ {
+ 16,
+ {
+ 0xdd, 0x3f, 0xab, 0x22, 0x51, 0xf1, 0x1a, 0xc7,
+ 0x59, 0xf0, 0x88, 0x71, 0x29, 0xcc, 0x2e, 0xe7
+ }
+ }
+ },
+ {
+ {
+ 32,
+ {
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36
+ }
+ },
+ {
+ 32,
+ {
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef
+ }
+ },
+ {
+ 16,
+ {
+ 0x0e, 0xe1, 0xc1, 0x6b, 0xb7, 0x3f, 0x0f, 0x4f,
+ 0xd1, 0x98, 0x81, 0x75, 0x3c, 0x01, 0xcd, 0xbe
+ }
+ }
+ },
+ {
+ {
+ 63,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0x51, 0x54, 0xad, 0x0d, 0x2c, 0xb2, 0x6e, 0x01,
+ 0x27, 0x4f, 0xc5, 0x11, 0x48, 0x49, 0x1f, 0x1b
+ }
+ },
+ },
+ /*
+ * self-generated vectors exercise "significant" lengths, i.e. those
+ * that are handled by different code paths
+ */
+ {
+ {
+ 64,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0x81, 0x20, 0x59, 0xa5, 0xda, 0x19, 0x86, 0x37,
+ 0xca, 0xc7, 0xc4, 0xa6, 0x31, 0xbe, 0xe4, 0x66
+ }
+ },
+ },
+ {
+ {
+ 48,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0x5b, 0x88, 0xd7, 0xf6, 0x22, 0x8b, 0x11, 0xe2,
+ 0xe2, 0x85, 0x79, 0xa5, 0xc0, 0xc1, 0xf7, 0x61
+ }
+ },
+ },
+ {
+ {
+ 96,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0xbb, 0xb6, 0x13, 0xb2, 0xb6, 0xd7, 0x53, 0xba,
+ 0x07, 0x39, 0x5b, 0x91, 0x6a, 0xae, 0xce, 0x15
+ }
+ },
+ },
+ {
+ {
+ 112,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0xc7, 0x94, 0xd7, 0x05, 0x7d, 0x17, 0x78, 0xc4,
+ 0xbb, 0xee, 0x0a, 0x39, 0xb3, 0xd9, 0x73, 0x42
+ }
+ },
+ },
+ {
+ {
+ 128,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0xff, 0xbc, 0xb9, 0xb3, 0x71, 0x42, 0x31, 0x52,
+ 0xd7, 0xfc, 0xa5, 0xad, 0x04, 0x2f, 0xba, 0xa9
+ }
+ },
+ },
+ {
+ {
+ 144,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36,
+
+ 0x81, 0x20, 0x59, 0xa5, 0xda, 0x19, 0x86, 0x37,
+ 0xca, 0xc7, 0xc4, 0xa6, 0x31, 0xbe, 0xe4, 0x66
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0x06, 0x9e, 0xd6, 0xb8, 0xef, 0x0f, 0x20, 0x7b,
+ 0x3e, 0x24, 0x3b, 0xb1, 0x01, 0x9f, 0xe6, 0x32
+ }
+ },
+ },
+ {
+ {
+ 160,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36,
+
+ 0x81, 0x20, 0x59, 0xa5, 0xda, 0x19, 0x86, 0x37,
+ 0xca, 0xc7, 0xc4, 0xa6, 0x31, 0xbe, 0xe4, 0x66,
+ 0x5b, 0x88, 0xd7, 0xf6, 0x22, 0x8b, 0x11, 0xe2,
+ 0xe2, 0x85, 0x79, 0xa5, 0xc0, 0xc1, 0xf7, 0x61
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0xcc, 0xa3, 0x39, 0xd9, 0xa4, 0x5f, 0xa2, 0x36,
+ 0x8c, 0x2c, 0x68, 0xb3, 0xa4, 0x17, 0x91, 0x33
+ }
+ },
+ },
+ {
+ {
+ 288,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36,
+
+ 0x81, 0x20, 0x59, 0xa5, 0xda, 0x19, 0x86, 0x37,
+ 0xca, 0xc7, 0xc4, 0xa6, 0x31, 0xbe, 0xe4, 0x66,
+ 0x5b, 0x88, 0xd7, 0xf6, 0x22, 0x8b, 0x11, 0xe2,
+ 0xe2, 0x85, 0x79, 0xa5, 0xc0, 0xc1, 0xf7, 0x61,
+
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0x53, 0xf6, 0xe8, 0x28, 0xa2, 0xf0, 0xfe, 0x0e,
+ 0xe8, 0x15, 0xbf, 0x0b, 0xd5, 0x84, 0x1a, 0x34
+ }
+ },
+ },
+ {
+ {
+ 320,
+ {
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36,
+
+ 0x81, 0x20, 0x59, 0xa5, 0xda, 0x19, 0x86, 0x37,
+ 0xca, 0xc7, 0xc4, 0xa6, 0x31, 0xbe, 0xe4, 0x66,
+ 0x5b, 0x88, 0xd7, 0xf6, 0x22, 0x8b, 0x11, 0xe2,
+ 0xe2, 0x85, 0x79, 0xa5, 0xc0, 0xc1, 0xf7, 0x61,
+
+ 0xab, 0x08, 0x12, 0x72, 0x4a, 0x7f, 0x1e, 0x34,
+ 0x27, 0x42, 0xcb, 0xed, 0x37, 0x4d, 0x94, 0xd1,
+ 0x36, 0xc6, 0xb8, 0x79, 0x5d, 0x45, 0xb3, 0x81,
+ 0x98, 0x30, 0xf2, 0xc0, 0x44, 0x91, 0xfa, 0xf0,
+
+ 0x99, 0x0c, 0x62, 0xe4, 0x8b, 0x80, 0x18, 0xb2,
+ 0xc3, 0xe4, 0xa0, 0xfa, 0x31, 0x34, 0xcb, 0x67,
+ 0xfa, 0x83, 0xe1, 0x58, 0xc9, 0x94, 0xd9, 0x61,
+ 0xc4, 0xcb, 0x21, 0x09, 0x5c, 0x1b, 0xf9, 0xaf,
+
+ 0x48, 0x44, 0x3d, 0x0b, 0xb0, 0xd2, 0x11, 0x09,
+ 0xc8, 0x9a, 0x10, 0x0b, 0x5c, 0xe2, 0xc2, 0x08,
+ 0x83, 0x14, 0x9c, 0x69, 0xb5, 0x61, 0xdd, 0x88,
+ 0x29, 0x8a, 0x17, 0x98, 0xb1, 0x07, 0x16, 0xef,
+
+ 0x66, 0x3c, 0xea, 0x19, 0x0f, 0xfb, 0x83, 0xd8,
+ 0x95, 0x93, 0xf3, 0xf4, 0x76, 0xb6, 0xbc, 0x24,
+ 0xd7, 0xe6, 0x79, 0x10, 0x7e, 0xa2, 0x6a, 0xdb,
+ 0x8c, 0xaf, 0x66, 0x52, 0xd0, 0x65, 0x61, 0x36,
+
+ 0x81, 0x20, 0x59, 0xa5, 0xda, 0x19, 0x86, 0x37,
+ 0xca, 0xc7, 0xc4, 0xa6, 0x31, 0xbe, 0xe4, 0x66,
+ 0x5b, 0x88, 0xd7, 0xf6, 0x22, 0x8b, 0x11, 0xe2,
+ 0xe2, 0x85, 0x79, 0xa5, 0xc0, 0xc1, 0xf7, 0x61
+ }
+ },
+ {
+ 32,
+ {
+ 0x12, 0x97, 0x6a, 0x08, 0xc4, 0x42, 0x6d, 0x0c,
+ 0xe8, 0xa8, 0x24, 0x07, 0xc4, 0xf4, 0x82, 0x07,
+ 0x80, 0xf8, 0xc2, 0x0a, 0xa7, 0x12, 0x02, 0xd1,
+ 0xe2, 0x91, 0x79, 0xcb, 0xcb, 0x55, 0x5a, 0x57
+ }
+ },
+ {
+ 16,
+ {
+ 0xb8, 0x46, 0xd4, 0x4e, 0x9b, 0xbd, 0x53, 0xce,
+ 0xdf, 0xfb, 0xfb, 0xb6, 0xb7, 0xfa, 0x49, 0x33
+ }
+ },
+ },
+ /*
+ * 4th power of the key spills to the 131st bit in SIMD key setup
+ */
+ {
+ {
+ 256,
+ {
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff
+ }
+ },
+ {
+ 32,
+ {
+ 0xad, 0x62, 0x81, 0x07, 0xe8, 0x35, 0x1d, 0x0f,
+ 0x2c, 0x23, 0x1a, 0x05, 0xdc, 0x4a, 0x41, 0x06,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0x07, 0x14, 0x5a, 0x4c, 0x02, 0xfe, 0x5f, 0xa3,
+ 0x20, 0x36, 0xde, 0x68, 0xfa, 0xbe, 0x90, 0x66
+ }
+ },
+ },
+ /*
+ * OpenSSL's poly1305_ieee754.c failed this in the final stage
+ */
+ {
+ {
+ 252,
+ {
+ 0x84, 0x23, 0x64, 0xe1, 0x56, 0x33, 0x6c, 0x09,
+ 0x98, 0xb9, 0x33, 0xa6, 0x23, 0x77, 0x26, 0x18,
+ 0x0d, 0x9e, 0x3f, 0xdc, 0xbd, 0xe4, 0xcd, 0x5d,
+ 0x17, 0x08, 0x0f, 0xc3, 0xbe, 0xb4, 0x96, 0x14,
+
+ 0xd7, 0x12, 0x2c, 0x03, 0x74, 0x63, 0xff, 0x10,
+ 0x4d, 0x73, 0xf1, 0x9c, 0x12, 0x70, 0x46, 0x28,
+ 0xd4, 0x17, 0xc4, 0xc5, 0x4a, 0x3f, 0xe3, 0x0d,
+ 0x3c, 0x3d, 0x77, 0x14, 0x38, 0x2d, 0x43, 0xb0,
+
+ 0x38, 0x2a, 0x50, 0xa5, 0xde, 0xe5, 0x4b, 0xe8,
+ 0x44, 0xb0, 0x76, 0xe8, 0xdf, 0x88, 0x20, 0x1a,
+ 0x1c, 0xd4, 0x3b, 0x90, 0xeb, 0x21, 0x64, 0x3f,
+ 0xa9, 0x6f, 0x39, 0xb5, 0x18, 0xaa, 0x83, 0x40,
+
+ 0xc9, 0x42, 0xff, 0x3c, 0x31, 0xba, 0xf7, 0xc9,
+ 0xbd, 0xbf, 0x0f, 0x31, 0xae, 0x3f, 0xa0, 0x96,
+ 0xbf, 0x8c, 0x63, 0x03, 0x06, 0x09, 0x82, 0x9f,
+ 0xe7, 0x2e, 0x17, 0x98, 0x24, 0x89, 0x0b, 0xc8,
+
+ 0xe0, 0x8c, 0x31, 0x5c, 0x1c, 0xce, 0x2a, 0x83,
+ 0x14, 0x4d, 0xbb, 0xff, 0x09, 0xf7, 0x4e, 0x3e,
+ 0xfc, 0x77, 0x0b, 0x54, 0xd0, 0x98, 0x4a, 0x8f,
+ 0x19, 0xb1, 0x47, 0x19, 0xe6, 0x36, 0x35, 0x64,
+
+ 0x1d, 0x6b, 0x1e, 0xed, 0xf6, 0x3e, 0xfb, 0xf0,
+ 0x80, 0xe1, 0x78, 0x3d, 0x32, 0x44, 0x54, 0x12,
+ 0x11, 0x4c, 0x20, 0xde, 0x0b, 0x83, 0x7a, 0x0d,
+ 0xfa, 0x33, 0xd6, 0xb8, 0x28, 0x25, 0xff, 0xf4,
+
+ 0x4c, 0x9a, 0x70, 0xea, 0x54, 0xce, 0x47, 0xf0,
+ 0x7d, 0xf6, 0x98, 0xe6, 0xb0, 0x33, 0x23, 0xb5,
+ 0x30, 0x79, 0x36, 0x4a, 0x5f, 0xc3, 0xe9, 0xdd,
+ 0x03, 0x43, 0x92, 0xbd, 0xde, 0x86, 0xdc, 0xcd,
+
+ 0xda, 0x94, 0x32, 0x1c, 0x5e, 0x44, 0x06, 0x04,
+ 0x89, 0x33, 0x6c, 0xb6, 0x5b, 0xf3, 0x98, 0x9c,
+ 0x36, 0xf7, 0x28, 0x2c, 0x2f, 0x5d, 0x2b, 0x88,
+ 0x2c, 0x17, 0x1e, 0x74
+ }
+ },
+ {
+ 32,
+ {
+ 0x95, 0xd5, 0xc0, 0x05, 0x50, 0x3e, 0x51, 0x0d,
+ 0x8c, 0xd0, 0xaa, 0x07, 0x2c, 0x4a, 0x4d, 0x06,
+ 0x6e, 0xab, 0xc5, 0x2d, 0x11, 0x65, 0x3d, 0xf4,
+ 0x7f, 0xbf, 0x63, 0xab, 0x19, 0x8b, 0xcc, 0x26
+ }
+ },
+ {
+ 16,
+ {
+ 0xf2, 0x48, 0x31, 0x2e, 0x57, 0x8d, 0x9d, 0x58,
+ 0xf8, 0xb7, 0xbb, 0x4d, 0x19, 0x10, 0x54, 0x31
+ }
+ },
+ },
+ /*
+ * AVX2 in OpenSSL's poly1305-x86.pl failed this with a 176+32 split
+ */
+ {
+ {
+ 208,
+ {
+ 0x24, 0x8a, 0xc3, 0x10, 0x85, 0xb6, 0xc2, 0xad,
+ 0xaa, 0xa3, 0x82, 0x59, 0xa0, 0xd7, 0x19, 0x2c,
+ 0x5c, 0x35, 0xd1, 0xbb, 0x4e, 0xf3, 0x9a, 0xd9,
+ 0x4c, 0x38, 0xd1, 0xc8, 0x24, 0x79, 0xe2, 0xdd,
+
+ 0x21, 0x59, 0xa0, 0x77, 0x02, 0x4b, 0x05, 0x89,
+ 0xbc, 0x8a, 0x20, 0x10, 0x1b, 0x50, 0x6f, 0x0a,
+ 0x1a, 0xd0, 0xbb, 0xab, 0x76, 0xe8, 0x3a, 0x83,
+ 0xf1, 0xb9, 0x4b, 0xe6, 0xbe, 0xae, 0x74, 0xe8,
+
+ 0x74, 0xca, 0xb6, 0x92, 0xc5, 0x96, 0x3a, 0x75,
+ 0x43, 0x6b, 0x77, 0x61, 0x21, 0xec, 0x9f, 0x62,
+ 0x39, 0x9a, 0x3e, 0x66, 0xb2, 0xd2, 0x27, 0x07,
+ 0xda, 0xe8, 0x19, 0x33, 0xb6, 0x27, 0x7f, 0x3c,
+
+ 0x85, 0x16, 0xbc, 0xbe, 0x26, 0xdb, 0xbd, 0x86,
+ 0xf3, 0x73, 0x10, 0x3d, 0x7c, 0xf4, 0xca, 0xd1,
+ 0x88, 0x8c, 0x95, 0x21, 0x18, 0xfb, 0xfb, 0xd0,
+ 0xd7, 0xb4, 0xbe, 0xdc, 0x4a, 0xe4, 0x93, 0x6a,
+
+ 0xff, 0x91, 0x15, 0x7e, 0x7a, 0xa4, 0x7c, 0x54,
+ 0x44, 0x2e, 0xa7, 0x8d, 0x6a, 0xc2, 0x51, 0xd3,
+ 0x24, 0xa0, 0xfb, 0xe4, 0x9d, 0x89, 0xcc, 0x35,
+ 0x21, 0xb6, 0x6d, 0x16, 0xe9, 0xc6, 0x6a, 0x37,
+
+ 0x09, 0x89, 0x4e, 0x4e, 0xb0, 0xa4, 0xee, 0xdc,
+ 0x4a, 0xe1, 0x94, 0x68, 0xe6, 0x6b, 0x81, 0xf2,
+
+ 0x71, 0x35, 0x1b, 0x1d, 0x92, 0x1e, 0xa5, 0x51,
+ 0x04, 0x7a, 0xbc, 0xc6, 0xb8, 0x7a, 0x90, 0x1f,
+ 0xde, 0x7d, 0xb7, 0x9f, 0xa1, 0x81, 0x8c, 0x11,
+ 0x33, 0x6d, 0xbc, 0x07, 0x24, 0x4a, 0x40, 0xeb
+ }
+ },
+ {
+ 32,
+ {
+ 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07,
+ 0x08, 0x09, 0x0a, 0x0b, 0x0c, 0x0d, 0x0e, 0x0f,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0xbc, 0x93, 0x9b, 0xc5, 0x28, 0x14, 0x80, 0xfa,
+ 0x99, 0xc6, 0xd6, 0x8c, 0x25, 0x8e, 0xc4, 0x2f
+ }
+ },
+ },
+ /*
+ * test vectors from Google
+ */
+ {
+ {
+ 0,
+ {
+ 0x00,
+ }
+ },
+ {
+ 32,
+ {
+ 0xc8, 0xaf, 0xaa, 0xc3, 0x31, 0xee, 0x37, 0x2c,
+ 0xd6, 0x08, 0x2d, 0xe1, 0x34, 0x94, 0x3b, 0x17,
+ 0x47, 0x10, 0x13, 0x0e, 0x9f, 0x6f, 0xea, 0x8d,
+ 0x72, 0x29, 0x38, 0x50, 0xa6, 0x67, 0xd8, 0x6c
+ }
+ },
+ {
+ 16,
+ {
+ 0x47, 0x10, 0x13, 0x0e, 0x9f, 0x6f, 0xea, 0x8d,
+ 0x72, 0x29, 0x38, 0x50, 0xa6, 0x67, 0xd8, 0x6c
+ }
+ },
+ },
+ {
+ {
+ 12,
+ {
+ 0x48, 0x65, 0x6c, 0x6c, 0x6f, 0x20, 0x77, 0x6f,
+ 0x72, 0x6c, 0x64, 0x21
+ }
+ },
+ {
+ 32,
+ {
+ 0x74, 0x68, 0x69, 0x73, 0x20, 0x69, 0x73, 0x20,
+ 0x33, 0x32, 0x2d, 0x62, 0x79, 0x74, 0x65, 0x20,
+ 0x6b, 0x65, 0x79, 0x20, 0x66, 0x6f, 0x72, 0x20,
+ 0x50, 0x6f, 0x6c, 0x79, 0x31, 0x33, 0x30, 0x35
+ }
+ },
+ {
+ 16,
+ {
+ 0xa6, 0xf7, 0x45, 0x00, 0x8f, 0x81, 0xc9, 0x16,
+ 0xa2, 0x0d, 0xcc, 0x74, 0xee, 0xf2, 0xb2, 0xf0
+ }
+ },
+ },
+ {
+ {
+ 32,
+ {
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 32,
+ {
+ 0x74, 0x68, 0x69, 0x73, 0x20, 0x69, 0x73, 0x20,
+ 0x33, 0x32, 0x2d, 0x62, 0x79, 0x74, 0x65, 0x20,
+ 0x6b, 0x65, 0x79, 0x20, 0x66, 0x6f, 0x72, 0x20,
+ 0x50, 0x6f, 0x6c, 0x79, 0x31, 0x33, 0x30, 0x35
+ }
+ },
+ {
+ 16,
+ {
+ 0x49, 0xec, 0x78, 0x09, 0x0e, 0x48, 0x1e, 0xc6,
+ 0xc2, 0x6b, 0x33, 0xb9, 0x1c, 0xcc, 0x03, 0x07
+ }
+ },
+ },
+ {
+ {
+ 128,
+ {
+ 0x89, 0xda, 0xb8, 0x0b, 0x77, 0x17, 0xc1, 0xdb,
+ 0x5d, 0xb4, 0x37, 0x86, 0x0a, 0x3f, 0x70, 0x21,
+ 0x8e, 0x93, 0xe1, 0xb8, 0xf4, 0x61, 0xfb, 0x67,
+ 0x7f, 0x16, 0xf3, 0x5f, 0x6f, 0x87, 0xe2, 0xa9,
+
+ 0x1c, 0x99, 0xbc, 0x3a, 0x47, 0xac, 0xe4, 0x76,
+ 0x40, 0xcc, 0x95, 0xc3, 0x45, 0xbe, 0x5e, 0xcc,
+ 0xa5, 0xa3, 0x52, 0x3c, 0x35, 0xcc, 0x01, 0x89,
+ 0x3a, 0xf0, 0xb6, 0x4a, 0x62, 0x03, 0x34, 0x27,
+
+ 0x03, 0x72, 0xec, 0x12, 0x48, 0x2d, 0x1b, 0x1e,
+ 0x36, 0x35, 0x61, 0x69, 0x8a, 0x57, 0x8b, 0x35,
+ 0x98, 0x03, 0x49, 0x5b, 0xb4, 0xe2, 0xef, 0x19,
+ 0x30, 0xb1, 0x7a, 0x51, 0x90, 0xb5, 0x80, 0xf1,
+
+ 0x41, 0x30, 0x0d, 0xf3, 0x0a, 0xdb, 0xec, 0xa2,
+ 0x8f, 0x64, 0x27, 0xa8, 0xbc, 0x1a, 0x99, 0x9f,
+ 0xd5, 0x1c, 0x55, 0x4a, 0x01, 0x7d, 0x09, 0x5d,
+ 0x8c, 0x3e, 0x31, 0x27, 0xda, 0xf9, 0xf5, 0x95
+ }
+ },
+ {
+ 32,
+ {
+ 0x2d, 0x77, 0x3b, 0xe3, 0x7a, 0xdb, 0x1e, 0x4d,
+ 0x68, 0x3b, 0xf0, 0x07, 0x5e, 0x79, 0xc4, 0xee,
+ 0x03, 0x79, 0x18, 0x53, 0x5a, 0x7f, 0x99, 0xcc,
+ 0xb7, 0x04, 0x0f, 0xb5, 0xf5, 0xf4, 0x3a, 0xea
+ }
+ },
+ {
+ 16,
+ {
+ 0xc8, 0x5d, 0x15, 0xed, 0x44, 0xc3, 0x78, 0xd6,
+ 0xb0, 0x0e, 0x23, 0x06, 0x4c, 0x7b, 0xcd, 0x51
+ }
+ },
+ },
+ {
+ {
+ 528,
+ {
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x0b,
+ 0x17, 0x03, 0x03, 0x02, 0x00, 0x00, 0x00, 0x00,
+
+ 0x06, 0xdb, 0x1f, 0x1f, 0x36, 0x8d, 0x69, 0x6a,
+ 0x81, 0x0a, 0x34, 0x9c, 0x0c, 0x71, 0x4c, 0x9a,
+ 0x5e, 0x78, 0x50, 0xc2, 0x40, 0x7d, 0x72, 0x1a,
+ 0xcd, 0xed, 0x95, 0xe0, 0x18, 0xd7, 0xa8, 0x52,
+
+ 0x66, 0xa6, 0xe1, 0x28, 0x9c, 0xdb, 0x4a, 0xeb,
+ 0x18, 0xda, 0x5a, 0xc8, 0xa2, 0xb0, 0x02, 0x6d,
+ 0x24, 0xa5, 0x9a, 0xd4, 0x85, 0x22, 0x7f, 0x3e,
+ 0xae, 0xdb, 0xb2, 0xe7, 0xe3, 0x5e, 0x1c, 0x66,
+
+ 0xcd, 0x60, 0xf9, 0xab, 0xf7, 0x16, 0xdc, 0xc9,
+ 0xac, 0x42, 0x68, 0x2d, 0xd7, 0xda, 0xb2, 0x87,
+ 0xa7, 0x02, 0x4c, 0x4e, 0xef, 0xc3, 0x21, 0xcc,
+ 0x05, 0x74, 0xe1, 0x67, 0x93, 0xe3, 0x7c, 0xec,
+
+ 0x03, 0xc5, 0xbd, 0xa4, 0x2b, 0x54, 0xc1, 0x14,
+ 0xa8, 0x0b, 0x57, 0xaf, 0x26, 0x41, 0x6c, 0x7b,
+ 0xe7, 0x42, 0x00, 0x5e, 0x20, 0x85, 0x5c, 0x73,
+ 0xe2, 0x1d, 0xc8, 0xe2, 0xed, 0xc9, 0xd4, 0x35,
+
+ 0xcb, 0x6f, 0x60, 0x59, 0x28, 0x00, 0x11, 0xc2,
+ 0x70, 0xb7, 0x15, 0x70, 0x05, 0x1c, 0x1c, 0x9b,
+ 0x30, 0x52, 0x12, 0x66, 0x20, 0xbc, 0x1e, 0x27,
+ 0x30, 0xfa, 0x06, 0x6c, 0x7a, 0x50, 0x9d, 0x53,
+
+ 0xc6, 0x0e, 0x5a, 0xe1, 0xb4, 0x0a, 0xa6, 0xe3,
+ 0x9e, 0x49, 0x66, 0x92, 0x28, 0xc9, 0x0e, 0xec,
+ 0xb4, 0xa5, 0x0d, 0xb3, 0x2a, 0x50, 0xbc, 0x49,
+ 0xe9, 0x0b, 0x4f, 0x4b, 0x35, 0x9a, 0x1d, 0xfd,
+
+ 0x11, 0x74, 0x9c, 0xd3, 0x86, 0x7f, 0xcf, 0x2f,
+ 0xb7, 0xbb, 0x6c, 0xd4, 0x73, 0x8f, 0x6a, 0x4a,
+ 0xd6, 0xf7, 0xca, 0x50, 0x58, 0xf7, 0x61, 0x88,
+ 0x45, 0xaf, 0x9f, 0x02, 0x0f, 0x6c, 0x3b, 0x96,
+
+ 0x7b, 0x8f, 0x4c, 0xd4, 0xa9, 0x1e, 0x28, 0x13,
+ 0xb5, 0x07, 0xae, 0x66, 0xf2, 0xd3, 0x5c, 0x18,
+ 0x28, 0x4f, 0x72, 0x92, 0x18, 0x60, 0x62, 0xe1,
+ 0x0f, 0xd5, 0x51, 0x0d, 0x18, 0x77, 0x53, 0x51,
+
+ 0xef, 0x33, 0x4e, 0x76, 0x34, 0xab, 0x47, 0x43,
+ 0xf5, 0xb6, 0x8f, 0x49, 0xad, 0xca, 0xb3, 0x84,
+ 0xd3, 0xfd, 0x75, 0xf7, 0x39, 0x0f, 0x40, 0x06,
+ 0xef, 0x2a, 0x29, 0x5c, 0x8c, 0x7a, 0x07, 0x6a,
+
+ 0xd5, 0x45, 0x46, 0xcd, 0x25, 0xd2, 0x10, 0x7f,
+ 0xbe, 0x14, 0x36, 0xc8, 0x40, 0x92, 0x4a, 0xae,
+ 0xbe, 0x5b, 0x37, 0x08, 0x93, 0xcd, 0x63, 0xd1,
+ 0x32, 0x5b, 0x86, 0x16, 0xfc, 0x48, 0x10, 0x88,
+
+ 0x6b, 0xc1, 0x52, 0xc5, 0x32, 0x21, 0xb6, 0xdf,
+ 0x37, 0x31, 0x19, 0x39, 0x32, 0x55, 0xee, 0x72,
+ 0xbc, 0xaa, 0x88, 0x01, 0x74, 0xf1, 0x71, 0x7f,
+ 0x91, 0x84, 0xfa, 0x91, 0x64, 0x6f, 0x17, 0xa2,
+
+ 0x4a, 0xc5, 0x5d, 0x16, 0xbf, 0xdd, 0xca, 0x95,
+ 0x81, 0xa9, 0x2e, 0xda, 0x47, 0x92, 0x01, 0xf0,
+ 0xed, 0xbf, 0x63, 0x36, 0x00, 0xd6, 0x06, 0x6d,
+ 0x1a, 0xb3, 0x6d, 0x5d, 0x24, 0x15, 0xd7, 0x13,
+
+ 0x51, 0xbb, 0xcd, 0x60, 0x8a, 0x25, 0x10, 0x8d,
+ 0x25, 0x64, 0x19, 0x92, 0xc1, 0xf2, 0x6c, 0x53,
+ 0x1c, 0xf9, 0xf9, 0x02, 0x03, 0xbc, 0x4c, 0xc1,
+ 0x9f, 0x59, 0x27, 0xd8, 0x34, 0xb0, 0xa4, 0x71,
+
+ 0x16, 0xd3, 0x88, 0x4b, 0xbb, 0x16, 0x4b, 0x8e,
+ 0xc8, 0x83, 0xd1, 0xac, 0x83, 0x2e, 0x56, 0xb3,
+ 0x91, 0x8a, 0x98, 0x60, 0x1a, 0x08, 0xd1, 0x71,
+ 0x88, 0x15, 0x41, 0xd5, 0x94, 0xdb, 0x39, 0x9c,
+
+ 0x6a, 0xe6, 0x15, 0x12, 0x21, 0x74, 0x5a, 0xec,
+ 0x81, 0x4c, 0x45, 0xb0, 0xb0, 0x5b, 0x56, 0x54,
+ 0x36, 0xfd, 0x6f, 0x13, 0x7a, 0xa1, 0x0a, 0x0c,
+ 0x0b, 0x64, 0x37, 0x61, 0xdb, 0xd6, 0xf9, 0xa9,
+
+ 0xdc, 0xb9, 0x9b, 0x1a, 0x6e, 0x69, 0x08, 0x54,
+ 0xce, 0x07, 0x69, 0xcd, 0xe3, 0x97, 0x61, 0xd8,
+ 0x2f, 0xcd, 0xec, 0x15, 0xf0, 0xd9, 0x2d, 0x7d,
+ 0x8e, 0x94, 0xad, 0xe8, 0xeb, 0x83, 0xfb, 0xe0
+ }
+ },
+ {
+ 32,
+ {
+ 0x99, 0xe5, 0x82, 0x2d, 0xd4, 0x17, 0x3c, 0x99,
+ 0x5e, 0x3d, 0xae, 0x0d, 0xde, 0xfb, 0x97, 0x74,
+ 0x3f, 0xde, 0x3b, 0x08, 0x01, 0x34, 0xb3, 0x9f,
+ 0x76, 0xe9, 0xbf, 0x8d, 0x0e, 0x88, 0xd5, 0x46
+ }
+ },
+ {
+ 16,
+ {
+ 0x26, 0x37, 0x40, 0x8f, 0xe1, 0x30, 0x86, 0xea,
+ 0x73, 0xf9, 0x71, 0xe3, 0x42, 0x5e, 0x28, 0x20
+ }
+ },
+ },
+ /*
+ * test vectors from Hanno Böck
+ */
+ {
+ {
+ 257,
+ {
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0x80, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xce, 0xcc, 0xcc, 0xcc,
+
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xc5,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xe3, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xac, 0xcc, 0xcc, 0xcc,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xe6,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0x00, 0x00, 0x00,
+ 0xaf, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc,
+
+ 0xcc, 0xcc, 0xff, 0xff, 0xff, 0xf5, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0x00, 0xff, 0xff, 0xff, 0xe7, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x71, 0x92, 0x05, 0xa8, 0x52, 0x1d,
+
+ 0xfc
+ }
+ },
+ {
+ 32,
+ {
+ 0x7f, 0x1b, 0x02, 0x64, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc, 0xcc
+ }
+ },
+ {
+ 16,
+ {
+ 0x85, 0x59, 0xb8, 0x76, 0xec, 0xee, 0xd6, 0x6e,
+ 0xb3, 0x77, 0x98, 0xc0, 0x45, 0x7b, 0xaf, 0xf9
+ }
+ },
+ },
+ {
+ {
+ 39,
+ {
+ 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa,
+ 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa,
+ 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa,
+ 0xaa, 0xaa, 0xaa, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0x00, 0x00, 0x00, 0x00, 0x80, 0x02, 0x64
+ }
+ },
+ {
+ 32,
+ {
+ 0xe0, 0x00, 0x16, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa,
+ 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa, 0xaa
+ }
+ },
+ {
+ 16,
+ {
+ 0x00, 0xbd, 0x12, 0x58, 0x97, 0x8e, 0x20, 0x54,
+ 0x44, 0xc9, 0xaa, 0xaa, 0x82, 0x00, 0x6f, 0xed
+ }
+ },
+ },
+ {
+ {
+ 2,
+ {
+ 0x02, 0xfc
+ }
+ },
+ {
+ 32,
+ {
+ 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c,
+ 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c,
+ 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c,
+ 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c
+ }
+ },
+ {
+ 16,
+ {
+ 0x06, 0x12, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c,
+ 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c, 0x0c
+ }
+ },
+ },
+ {
+ {
+ 415,
+ {
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7a, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+
+ 0x7b, 0x7b, 0x5c, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x6e, 0x7b, 0x00, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7a, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x5c,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+ 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b, 0x7b,
+
+ 0x7b, 0x6e, 0x7b, 0x00, 0x13, 0x00, 0x00, 0x00,
+ 0x00, 0xb3, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0xf2, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x20, 0x00, 0xef, 0xff, 0x00,
+ 0x09, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x10, 0x00, 0x00,
+ 0x00, 0x00, 0x09, 0x00, 0x00, 0x00, 0x64, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x13, 0x00, 0x00, 0x00, 0x00,
+
+ 0xb3, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xf2,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x20, 0x00, 0xef, 0xff, 0x00, 0x09,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x7a, 0x00, 0x00, 0x10, 0x00, 0x00, 0x00,
+
+ 0x00, 0x09, 0x00, 0x00, 0x00, 0x64, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xfc
+ }
+ },
+ {
+ 32,
+ {
+ 0x00, 0xff, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x1e, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x7b, 0x7b
+ }
+ },
+ {
+ 16,
+ {
+ 0x33, 0x20, 0x5b, 0xbf, 0x9e, 0x9f, 0x8f, 0x72,
+ 0x12, 0xab, 0x9e, 0x2a, 0xb9, 0xb7, 0xe4, 0xa5
+ }
+ },
+ },
+ {
+ {
+ 118,
+ {
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0xff, 0xff, 0xff, 0xe9,
+ 0xe9, 0xac, 0xac, 0xac, 0xac, 0xac, 0xac, 0xac,
+ 0xac, 0xac, 0xac, 0xac, 0x00, 0x00, 0xac, 0xac,
+
+ 0xec, 0x01, 0x00, 0xac, 0xac, 0xac, 0x2c, 0xac,
+ 0xa2, 0xac, 0xac, 0xac, 0xac, 0xac, 0xac, 0xac,
+ 0xac, 0xac, 0xac, 0xac, 0x64, 0xf2
+ }
+ },
+ {
+ 32,
+ {
+ 0x00, 0x00, 0x00, 0x7f, 0x00, 0x00, 0x00, 0x7f,
+ 0x01, 0x00, 0x00, 0x20, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0xcf, 0x77, 0x77, 0x77, 0x77, 0x77,
+ 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77, 0x77
+ }
+ },
+ {
+ 16,
+ {
+ 0x02, 0xee, 0x7c, 0x8c, 0x54, 0x6d, 0xde, 0xb1,
+ 0xa4, 0x67, 0xe4, 0xc3, 0x98, 0x11, 0x58, 0xb9
+ }
+ },
+ },
+ /*
+ * test vectors from Andrew Moon
+ */
+ { /* nacl */
+ {
+ 131,
+ {
+ 0x8e, 0x99, 0x3b, 0x9f, 0x48, 0x68, 0x12, 0x73,
+ 0xc2, 0x96, 0x50, 0xba, 0x32, 0xfc, 0x76, 0xce,
+ 0x48, 0x33, 0x2e, 0xa7, 0x16, 0x4d, 0x96, 0xa4,
+ 0x47, 0x6f, 0xb8, 0xc5, 0x31, 0xa1, 0x18, 0x6a,
+
+ 0xc0, 0xdf, 0xc1, 0x7c, 0x98, 0xdc, 0xe8, 0x7b,
+ 0x4d, 0xa7, 0xf0, 0x11, 0xec, 0x48, 0xc9, 0x72,
+ 0x71, 0xd2, 0xc2, 0x0f, 0x9b, 0x92, 0x8f, 0xe2,
+ 0x27, 0x0d, 0x6f, 0xb8, 0x63, 0xd5, 0x17, 0x38,
+
+ 0xb4, 0x8e, 0xee, 0xe3, 0x14, 0xa7, 0xcc, 0x8a,
+ 0xb9, 0x32, 0x16, 0x45, 0x48, 0xe5, 0x26, 0xae,
+ 0x90, 0x22, 0x43, 0x68, 0x51, 0x7a, 0xcf, 0xea,
+ 0xbd, 0x6b, 0xb3, 0x73, 0x2b, 0xc0, 0xe9, 0xda,
+
+ 0x99, 0x83, 0x2b, 0x61, 0xca, 0x01, 0xb6, 0xde,
+ 0x56, 0x24, 0x4a, 0x9e, 0x88, 0xd5, 0xf9, 0xb3,
+ 0x79, 0x73, 0xf6, 0x22, 0xa4, 0x3d, 0x14, 0xa6,
+ 0x59, 0x9b, 0x1f, 0x65, 0x4c, 0xb4, 0x5a, 0x74,
+
+ 0xe3, 0x55, 0xa5
+ }
+ },
+ {
+ 32,
+ {
+ 0xee, 0xa6, 0xa7, 0x25, 0x1c, 0x1e, 0x72, 0x91,
+ 0x6d, 0x11, 0xc2, 0xcb, 0x21, 0x4d, 0x3c, 0x25,
+ 0x25, 0x39, 0x12, 0x1d, 0x8e, 0x23, 0x4e, 0x65,
+ 0x2d, 0x65, 0x1f, 0xa4, 0xc8, 0xcf, 0xf8, 0x80
+ }
+ },
+ {
+ 16,
+ {
+ 0xf3, 0xff, 0xc7, 0x70, 0x3f, 0x94, 0x00, 0xe5,
+ 0x2a, 0x7d, 0xfb, 0x4b, 0x3d, 0x33, 0x05, 0xd9
+ }
+ },
+ },
+ { /* wrap 2^130-5 */
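+	/*
+	 * Worked check: the padded block is 2^129 - 1; with r = 2 (after
+	 * clamping) and s = 0, ((2^129 - 1) * 2) mod (2^130 - 5) = 3, so
+	 * the expected tag is 3.
+	 */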
+ {
+ 16,
+ {
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff
+ }
+ },
+ {
+ 32,
+ {
+ 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0x03, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ },
+ { /* wrap 2^128 */
+ {
+ 16,
+ {
+ 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 32,
+ {
+ 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff
+ }
+ },
+ {
+ 16,
+ {
+ 0x03, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ },
+ { /* limb carry */
+ {
+ 48,
+ {
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xf0, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+
+ 0x11, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 32,
+ {
+ 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0x05, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ },
+ { /* 2^130-5 */
+ {
+ 48,
+ {
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xfb, 0xfe, 0xfe, 0xfe, 0xfe, 0xfe, 0xfe, 0xfe,
+ 0xfe, 0xfe, 0xfe, 0xfe, 0xfe, 0xfe, 0xfe, 0xfe,
+
+ 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01,
+ 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01
+ }
+ },
+ {
+ 32,
+ {
+ 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ },
+ { /* 2^130-6 */
+ {
+ 16,
+ {
+ 0xfd, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff
+ }
+ },
+ {
+ 32,
+ {
+ 0x02, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0xfa, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff,
+ 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff
+ }
+ },
+ },
+ { /* 5*H+L reduction intermediate */
+ {
+ 64,
+ {
+ 0xe3, 0x35, 0x94, 0xd7, 0x50, 0x5e, 0x43, 0xb9,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x33, 0x94, 0xd7, 0x50, 0x5e, 0x43, 0x79, 0xcd,
+ 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 32,
+ {
+ 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0x14, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x55, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ },
+ { /* 5*H+L reduction final */
+ {
+ 48,
+ {
+ 0xe3, 0x35, 0x94, 0xd7, 0x50, 0x5e, 0x43, 0xb9,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x33, 0x94, 0xd7, 0x50, 0x5e, 0x43, 0x79, 0xcd,
+ 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 32,
+ {
+ 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x04, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ },
+ {
+ 16,
+ {
+ 0x13, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+ 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00
+ }
+ }
+ }
+};
+
+bool __init poly1305_selftest(void)
+{
+ bool have_simd = simd_get();
+ bool success = true;
+ size_t i;
+
+ for (i = 0; i < ARRAY_SIZE(poly1305_testvecs); ++i) {
+ struct poly1305_ctx poly1305;
+ const u8 *in = poly1305_testvecs[i].input.data;
+ size_t inlen = poly1305_testvecs[i].input.size;
+ const u8 *key = poly1305_testvecs[i].key.data;
+ const u8 *expected = poly1305_testvecs[i].expected.data;
+ size_t expectedlen = poly1305_testvecs[i].expected.size;
+ u8 out[POLY1305_MAC_SIZE];
+
+ if (expectedlen != sizeof(out)) {
+ pr_info("poly1305 self-test %zu logic: FAIL\n", i + 1);
+ success = false;
+ }
+
+ memset(out, 0, sizeof(out));
+ memset(&poly1305, 0, sizeof(poly1305));
+ poly1305_init(&poly1305, key, have_simd);
+ poly1305_update(&poly1305, in, inlen, have_simd);
+ poly1305_finish(&poly1305, out, have_simd);
+ if (memcmp(out, expected, expectedlen)) {
+ pr_info("poly1305 self-test %zu: FAIL\n", i + 1);
+ success = false;
+ }
+
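+		/*
+		 * Splitting the input as 1 + (N-1) bytes exercises the
+		 * partial-block buffering path in poly1305_update().
+		 */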
+ if (inlen > 16) {
+ memset(out, 0, sizeof(out));
+ memset(&poly1305, 0, sizeof(poly1305));
+ poly1305_init(&poly1305, key, have_simd);
+ poly1305_update(&poly1305, in, 1, have_simd);
+ poly1305_update(&poly1305, in + 1, inlen - 1, have_simd);
+ poly1305_finish(&poly1305, out, have_simd);
+ if (memcmp(out, expected, expectedlen)) {
+ pr_info("poly1305 self-test %zu/1+(N-1): FAIL\n", i + 1);
+ success = false;
+ }
+ }
+
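+		/* A midpoint split checks state carry-over between two update calls. */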
+ if (inlen > 32) {
+ size_t half = inlen / 2;
+
+ memset(out, 0, sizeof(out));
+ memset(&poly1305, 0, sizeof(poly1305));
+ poly1305_init(&poly1305, key, have_simd);
+ poly1305_update(&poly1305, in, half, have_simd);
+ poly1305_update(&poly1305, in + half, inlen - half, have_simd);
+ poly1305_finish(&poly1305, out, have_simd);
+ if (memcmp(out, expected, expectedlen)) {
+ pr_info("poly1305 self-test %zu/2: FAIL\n", i + 1);
+ success = false;
+ }
+
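+			/*
+			 * Sweep all 16-byte-aligned split points to hit every
+			 * block-boundary transition between the two update calls.
+			 */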
+ for (half = 16; half < inlen; half += 16) {
+ memset(out, 0, sizeof(out));
+ memset(&poly1305, 0, sizeof(poly1305));
+ poly1305_init(&poly1305, key, have_simd);
+ poly1305_update(&poly1305, in, half, have_simd);
+ poly1305_update(&poly1305, in + half, inlen - half, have_simd);
+ poly1305_finish(&poly1305, out, have_simd);
+ if (memcmp(out, expected, expectedlen)) {
+ pr_info("poly1305 self-test %zu/%zu+%zu: FAIL\n", i + 1, half, inlen - half);
+ success = false;
+ }
+ }
+ }
+ }
+ simd_put(have_simd);
+
+ if (success)
+ pr_info("poly1305 self-tests: pass\n");
+
+ return success;
+}
+#endif
--
2.18.0