Message-ID: <CAMj1kXGfwuY_uEGT83QpoUZwy9X=6k7zaxHs2kFrdsArKpVpOw@mail.gmail.com>
Date: Wed, 28 Oct 2020 10:06:58 +0100
From: Ard Biesheuvel <ardb@...nel.org>
To: Horia Geantă <horia.geanta@....com>
Cc: Herbert Xu <herbert@...dor.apana.org.au>,
"David S. Miller" <davem@...emloft.net>,
Russell King <linux@...linux.org.uk>,
Linux Crypto Mailing List <linux-crypto@...r.kernel.org>,
Linux ARM <linux-arm-kernel@...ts.infradead.org>,
NXP Linux Team <linux-imx@....com>,
Linux Kernel Mailing List <linux-kernel@...r.kernel.org>
Subject: Re: [PATCH] crypto: arm/aes-neonbs - fix usage of cbc(aes) fallback
On Wed, 28 Oct 2020 at 10:03, Horia Geantă <horia.geanta@....com> wrote:
>
> Loading the module deadlocks since:
> - the local cbc(aes) implementation needs a fallback, and
> - the crypto API tries to find one, but the request_module() resolves
>   back to the same module
>
> Fix this by changing the module alias for cbc(aes) and
> using the NEED_FALLBACK flag when requesting a fallback algorithm.
>
> Fixes: 00b99ad2bac2 ("crypto: arm/aes-neonbs - Use generic cbc encryption path")
> Signed-off-by: Horia Geantă <horia.geanta@....com>
Not sure what is happening here: IIRC the intention was to rely on the
fact that only the sync cbc(aes) implementation needs the fallback,
and that allocating a sync skcipher explicitly would therefore avoid
this recursion.
Herbert?
> ---
> arch/arm/crypto/aes-neonbs-glue.c | 8 +++++---
> 1 file changed, 5 insertions(+), 3 deletions(-)
>
> diff --git a/arch/arm/crypto/aes-neonbs-glue.c b/arch/arm/crypto/aes-neonbs-glue.c
> index bda8bf17631e..f70af1d0514b 100644
> --- a/arch/arm/crypto/aes-neonbs-glue.c
> +++ b/arch/arm/crypto/aes-neonbs-glue.c
> @@ -19,7 +19,7 @@ MODULE_AUTHOR("Ard Biesheuvel <ard.biesheuvel@...aro.org>");
> MODULE_LICENSE("GPL v2");
>
> MODULE_ALIAS_CRYPTO("ecb(aes)");
> -MODULE_ALIAS_CRYPTO("cbc(aes)");
> +MODULE_ALIAS_CRYPTO("cbc(aes)-all");
> MODULE_ALIAS_CRYPTO("ctr(aes)");
> MODULE_ALIAS_CRYPTO("xts(aes)");
>
> @@ -191,7 +191,8 @@ static int cbc_init(struct crypto_skcipher *tfm)
> struct aesbs_cbc_ctx *ctx = crypto_skcipher_ctx(tfm);
> unsigned int reqsize;
>
> - ctx->enc_tfm = crypto_alloc_skcipher("cbc(aes)", 0, CRYPTO_ALG_ASYNC);
> + ctx->enc_tfm = crypto_alloc_skcipher("cbc(aes)", 0, CRYPTO_ALG_ASYNC |
> + CRYPTO_ALG_NEED_FALLBACK);
> if (IS_ERR(ctx->enc_tfm))
> return PTR_ERR(ctx->enc_tfm);
>
> @@ -441,7 +442,8 @@ static struct skcipher_alg aes_algs[] = { {
> .base.cra_blocksize = AES_BLOCK_SIZE,
> .base.cra_ctxsize = sizeof(struct aesbs_cbc_ctx),
> .base.cra_module = THIS_MODULE,
> - .base.cra_flags = CRYPTO_ALG_INTERNAL,
> + .base.cra_flags = CRYPTO_ALG_INTERNAL |
> + CRYPTO_ALG_NEED_FALLBACK,
>
> .min_keysize = AES_MIN_KEY_SIZE,
> .max_keysize = AES_MAX_KEY_SIZE,
> --
> 2.17.1
>