From patchwork Sat Apr 5 18:26:01 2025
X-Patchwork-Submitter: Eric Biggers
X-Patchwork-Id: 14039215
From: Eric Biggers
To: linux-crypto@vger.kernel.org
Cc: linux-kernel@vger.kernel.org, linux-arm-kernel@lists.infradead.org,
    linux-mips@vger.kernel.org, linuxppc-dev@lists.ozlabs.org,
    linux-riscv@lists.infradead.org, linux-s390@vger.kernel.org,
    x86@kernel.org, Ard Biesheuvel, "Jason A.
Donenfeld " , Linus Torvalds Subject: [PATCH 1/9] crypto: riscv/chacha - implement library instead of skcipher Date: Sat, 5 Apr 2025 11:26:01 -0700 Message-ID: <20250405182609.404216-2-ebiggers@kernel.org> X-Mailer: git-send-email 2.49.0 In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org> References: <20250405182609.404216-1-ebiggers@kernel.org> MIME-Version: 1.0 X-CRM114-Version: 20100106-BlameMichelson ( TRE 0.8.0 (BSD) ) MR-646709E3 X-CRM114-CacheID: sfid-20250405_113054_357055_FBA6FE21 X-CRM114-Status: GOOD ( 19.12 ) X-BeenThere: linux-arm-kernel@lists.infradead.org X-Mailman-Version: 2.1.34 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Sender: "linux-arm-kernel" Errors-To: linux-arm-kernel-bounces+linux-arm-kernel=archiver.kernel.org@lists.infradead.org From: Eric Biggers Currently the RISC-V optimized ChaCha20 is only wired up to the crypto_skcipher API, which makes it unavailable to users of the library API. The crypto_skcipher API for ChaCha20 is going to change to be implemented on top of the library API, so the library API needs to be supported. And of course it's needed anyway to serve the library users. Therefore, change the RISC-V ChaCha20 code to implement the library API instead of the crypto_skcipher API. The library functions take the ChaCha state matrix directly (instead of key and IV) and support both ChaCha20 and ChaCha12. To make the RISC-V code work properly for that, change the assembly code to take the state matrix directly and add a nrounds parameter. Signed-off-by: Eric Biggers --- arch/riscv/crypto/Kconfig | 11 +-- arch/riscv/crypto/chacha-riscv64-glue.c | 106 ++++++++---------------- arch/riscv/crypto/chacha-riscv64-zvkb.S | 71 ++++++++-------- 3 files changed, 75 insertions(+), 113 deletions(-) diff --git a/arch/riscv/crypto/Kconfig b/arch/riscv/crypto/Kconfig index c67095a3d6690..6392e1e11bc96 100644 --- a/arch/riscv/crypto/Kconfig +++ b/arch/riscv/crypto/Kconfig @@ -17,18 +17,15 @@ config CRYPTO_AES_RISCV64 - Zvbb vector extension (XTS) - Zvkb vector crypto extension (CTR) - Zvkg vector crypto extension (XTS) config CRYPTO_CHACHA_RISCV64 - tristate "Ciphers: ChaCha" + tristate depends on 64BIT && RISCV_ISA_V && TOOLCHAIN_HAS_VECTOR_CRYPTO - select CRYPTO_SKCIPHER - help - Length-preserving ciphers: ChaCha20 stream cipher algorithm - - Architecture: riscv64 using: - - Zvkb vector crypto extension + select CRYPTO_ARCH_HAVE_LIB_CHACHA + select CRYPTO_LIB_CHACHA_GENERIC + default CRYPTO_LIB_CHACHA_INTERNAL config CRYPTO_GHASH_RISCV64 tristate "Hash functions: GHASH" depends on 64BIT && RISCV_ISA_V && TOOLCHAIN_HAS_VECTOR_CRYPTO select CRYPTO_GCM diff --git a/arch/riscv/crypto/chacha-riscv64-glue.c b/arch/riscv/crypto/chacha-riscv64-glue.c index 10b46f36375af..68caef7a3d50b 100644 --- a/arch/riscv/crypto/chacha-riscv64-glue.c +++ b/arch/riscv/crypto/chacha-riscv64-glue.c @@ -1,101 +1,63 @@ // SPDX-License-Identifier: GPL-2.0-only /* - * ChaCha20 using the RISC-V vector crypto extensions + * ChaCha stream cipher (RISC-V optimized) * * Copyright (C) 2023 SiFive, Inc. 
* Author: Jerry Shih */ #include #include -#include -#include +#include +#include #include #include -asmlinkage void chacha20_zvkb(const u32 key[8], const u8 *in, u8 *out, - size_t len, const u32 iv[4]); +static __ro_after_init DEFINE_STATIC_KEY_FALSE(use_zvkb); -static int riscv64_chacha20_crypt(struct skcipher_request *req) +asmlinkage void chacha_zvkb(u32 state[16], const u8 *in, u8 *out, + size_t nblocks, int nrounds); + +void hchacha_block_arch(const u32 *state, u32 *out, int nrounds) { - u32 iv[CHACHA_IV_SIZE / sizeof(u32)]; - u8 block_buffer[CHACHA_BLOCK_SIZE]; - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - const struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - struct skcipher_walk walk; - unsigned int nbytes; - unsigned int tail_bytes; - int err; + hchacha_block_generic(state, out, nrounds); +} +EXPORT_SYMBOL(hchacha_block_arch); - iv[0] = get_unaligned_le32(req->iv); - iv[1] = get_unaligned_le32(req->iv + 4); - iv[2] = get_unaligned_le32(req->iv + 8); - iv[3] = get_unaligned_le32(req->iv + 12); +void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, unsigned int bytes, + int nrounds) +{ + u8 block_buffer[CHACHA_BLOCK_SIZE]; + unsigned int full_blocks = bytes / CHACHA_BLOCK_SIZE; + unsigned int tail_bytes = bytes % CHACHA_BLOCK_SIZE; - err = skcipher_walk_virt(&walk, req, false); - while (walk.nbytes) { - nbytes = walk.nbytes & ~(CHACHA_BLOCK_SIZE - 1); - tail_bytes = walk.nbytes & (CHACHA_BLOCK_SIZE - 1); - kernel_vector_begin(); - if (nbytes) { - chacha20_zvkb(ctx->key, walk.src.virt.addr, - walk.dst.virt.addr, nbytes, iv); - iv[0] += nbytes / CHACHA_BLOCK_SIZE; - } - if (walk.nbytes == walk.total && tail_bytes > 0) { - memcpy(block_buffer, walk.src.virt.addr + nbytes, - tail_bytes); - chacha20_zvkb(ctx->key, block_buffer, block_buffer, - CHACHA_BLOCK_SIZE, iv); - memcpy(walk.dst.virt.addr + nbytes, block_buffer, - tail_bytes); - tail_bytes = 0; - } - kernel_vector_end(); + if (!static_branch_likely(&use_zvkb) || !crypto_simd_usable()) + return chacha_crypt_generic(state, dst, src, bytes, nrounds); - err = skcipher_walk_done(&walk, tail_bytes); + kernel_vector_begin(); + if (full_blocks) { + chacha_zvkb(state, src, dst, full_blocks, nrounds); + src += full_blocks * CHACHA_BLOCK_SIZE; + dst += full_blocks * CHACHA_BLOCK_SIZE; } - - return err; + if (tail_bytes) { + memcpy(block_buffer, src, tail_bytes); + chacha_zvkb(state, block_buffer, block_buffer, 1, nrounds); + memcpy(dst, block_buffer, tail_bytes); + } + kernel_vector_end(); } - -static struct skcipher_alg riscv64_chacha_alg = { - .setkey = chacha20_setkey, - .encrypt = riscv64_chacha20_crypt, - .decrypt = riscv64_chacha20_crypt, - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .walksize = 4 * CHACHA_BLOCK_SIZE, - .base = { - .cra_blocksize = 1, - .cra_ctxsize = sizeof(struct chacha_ctx), - .cra_priority = 300, - .cra_name = "chacha20", - .cra_driver_name = "chacha20-riscv64-zvkb", - .cra_module = THIS_MODULE, - }, -}; +EXPORT_SYMBOL(chacha_crypt_arch); static int __init riscv64_chacha_mod_init(void) { if (riscv_isa_extension_available(NULL, ZVKB) && riscv_vector_vlen() >= 128) - return crypto_register_skcipher(&riscv64_chacha_alg); - - return -ENODEV; -} - -static void __exit riscv64_chacha_mod_exit(void) -{ - crypto_unregister_skcipher(&riscv64_chacha_alg); + static_branch_enable(&use_zvkb); + return 0; } - module_init(riscv64_chacha_mod_init); -module_exit(riscv64_chacha_mod_exit); -MODULE_DESCRIPTION("ChaCha20 
(RISC-V accelerated)"); +MODULE_DESCRIPTION("ChaCha stream cipher (RISC-V optimized)"); MODULE_AUTHOR("Jerry Shih "); MODULE_LICENSE("GPL"); -MODULE_ALIAS_CRYPTO("chacha20"); diff --git a/arch/riscv/crypto/chacha-riscv64-zvkb.S b/arch/riscv/crypto/chacha-riscv64-zvkb.S index bf057737ac693..ab4423b3880ea 100644 --- a/arch/riscv/crypto/chacha-riscv64-zvkb.S +++ b/arch/riscv/crypto/chacha-riscv64-zvkb.S @@ -44,24 +44,24 @@ #include .text .option arch, +zvkb -#define KEYP a0 +#define STATEP a0 #define INP a1 #define OUTP a2 -#define LEN a3 -#define IVP a4 +#define NBLOCKS a3 +#define NROUNDS a4 #define CONSTS0 a5 #define CONSTS1 a6 #define CONSTS2 a7 #define CONSTS3 t0 #define TMP t1 #define VL t2 #define STRIDE t3 -#define NROUNDS t4 +#define ROUND_CTR t4 #define KEY0 s0 #define KEY1 s1 #define KEY2 s2 #define KEY3 s3 #define KEY4 s4 @@ -130,18 +130,20 @@ vror.vi \b1, \b1, 32 - 7 vror.vi \b2, \b2, 32 - 7 vror.vi \b3, \b3, 32 - 7 .endm -// void chacha20_zvkb(const u32 key[8], const u8 *in, u8 *out, size_t len, -// const u32 iv[4]); +// void chacha_zvkb(u32 state[16], const u8 *in, u8 *out, size_t nblocks, +// int nrounds); // -// |len| must be nonzero and a multiple of 64 (CHACHA_BLOCK_SIZE). -// The counter is treated as 32-bit, following the RFC7539 convention. -SYM_FUNC_START(chacha20_zvkb) - srli LEN, LEN, 6 // Bytes to blocks - +// |nblocks| is the number of 64-byte blocks to process, and must be nonzero. +// +// |state| gives the ChaCha state matrix, including the 32-bit counter in +// state[12] following the RFC7539 convention; note that this differs from the +// original Salsa20 paper which uses a 64-bit counter in state[12..13]. The +// updated 32-bit counter is written back to state[12] before returning. +SYM_FUNC_START(chacha_zvkb) addi sp, sp, -96 sd s0, 0(sp) sd s1, 8(sp) sd s2, 16(sp) sd s3, 24(sp) @@ -155,30 +157,30 @@ SYM_FUNC_START(chacha20_zvkb) sd s11, 88(sp) li STRIDE, 64 // Set up the initial state matrix in scalar registers. - li CONSTS0, 0x61707865 // "expa" little endian - li CONSTS1, 0x3320646e // "nd 3" little endian - li CONSTS2, 0x79622d32 // "2-by" little endian - li CONSTS3, 0x6b206574 // "te k" little endian - lw KEY0, 0(KEYP) - lw KEY1, 4(KEYP) - lw KEY2, 8(KEYP) - lw KEY3, 12(KEYP) - lw KEY4, 16(KEYP) - lw KEY5, 20(KEYP) - lw KEY6, 24(KEYP) - lw KEY7, 28(KEYP) - lw COUNTER, 0(IVP) - lw NONCE0, 4(IVP) - lw NONCE1, 8(IVP) - lw NONCE2, 12(IVP) + lw CONSTS0, 0(STATEP) + lw CONSTS1, 4(STATEP) + lw CONSTS2, 8(STATEP) + lw CONSTS3, 12(STATEP) + lw KEY0, 16(STATEP) + lw KEY1, 20(STATEP) + lw KEY2, 24(STATEP) + lw KEY3, 28(STATEP) + lw KEY4, 32(STATEP) + lw KEY5, 36(STATEP) + lw KEY6, 40(STATEP) + lw KEY7, 44(STATEP) + lw COUNTER, 48(STATEP) + lw NONCE0, 52(STATEP) + lw NONCE1, 56(STATEP) + lw NONCE2, 60(STATEP) .Lblock_loop: // Set vl to the number of blocks to process in this iteration. - vsetvli VL, LEN, e32, m1, ta, ma + vsetvli VL, NBLOCKS, e32, m1, ta, ma // Set up the initial state matrix for the next VL blocks in v0-v15. // v{i} holds the i'th 32-bit word of the state matrix for all blocks. // Note that only the counter word, at index 12, differs across blocks. vmv.v.x v0, CONSTS0 @@ -201,20 +203,20 @@ SYM_FUNC_START(chacha20_zvkb) // Load the first half of the input data for each block into v16-v23. // v{16+i} holds the i'th 32-bit word for all blocks. 
vlsseg8e32.v v16, (INP), STRIDE - li NROUNDS, 20 + mv ROUND_CTR, NROUNDS .Lnext_doubleround: - addi NROUNDS, NROUNDS, -2 + addi ROUND_CTR, ROUND_CTR, -2 // column round chacha_round v0, v4, v8, v12, v1, v5, v9, v13, \ v2, v6, v10, v14, v3, v7, v11, v15 // diagonal round chacha_round v0, v5, v10, v15, v1, v6, v11, v12, \ v2, v7, v8, v13, v3, v4, v9, v14 - bnez NROUNDS, .Lnext_doubleround + bnez ROUND_CTR, .Lnext_doubleround // Load the second half of the input data for each block into v24-v31. // v{24+i} holds the {8+i}'th 32-bit word for all blocks. addi TMP, INP, 32 vlsseg8e32.v v24, (TMP), STRIDE @@ -269,16 +271,17 @@ SYM_FUNC_START(chacha20_zvkb) vssseg8e32.v v24, (TMP), STRIDE // Update the counter, the remaining number of blocks, and the input and // output pointers according to the number of blocks processed (VL). add COUNTER, COUNTER, VL - sub LEN, LEN, VL + sub NBLOCKS, NBLOCKS, VL slli TMP, VL, 6 add OUTP, OUTP, TMP add INP, INP, TMP - bnez LEN, .Lblock_loop + bnez NBLOCKS, .Lblock_loop + sw COUNTER, 48(STATEP) ld s0, 0(sp) ld s1, 8(sp) ld s2, 16(sp) ld s3, 24(sp) ld s4, 32(sp) @@ -289,6 +292,6 @@ SYM_FUNC_START(chacha20_zvkb) ld s9, 72(sp) ld s10, 80(sp) ld s11, 88(sp) addi sp, sp, 96 ret -SYM_FUNC_END(chacha20_zvkb) +SYM_FUNC_END(chacha_zvkb) From patchwork Sat Apr 5 18:26:02 2025 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Eric Biggers X-Patchwork-Id: 14039214 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from bombadil.infradead.org (bombadil.infradead.org [198.137.202.133]) (using TLSv1.2 with cipher ECDHE-RSA-AES256-GCM-SHA384 (256/256 bits)) (No client certificate requested) by smtp.lore.kernel.org (Postfix) with ESMTPS id 26CDCC36010 for ; Sat, 5 Apr 2025 18:36:39 +0000 (UTC) DKIM-Signature: v=1; a=rsa-sha256; q=dns/txt; c=relaxed/relaxed; d=lists.infradead.org; s=bombadil.20210309; h=Sender:List-Subscribe:List-Help :List-Post:List-Archive:List-Unsubscribe:List-Id:Content-Transfer-Encoding: MIME-Version:References:In-Reply-To:Message-ID:Date:Subject:Cc:To:From: Reply-To:Content-Type:Content-ID:Content-Description:Resent-Date:Resent-From: Resent-Sender:Resent-To:Resent-Cc:Resent-Message-ID:List-Owner; bh=7t/OMNl9Xc0Nms/dEDcufS+fL8LNbYjemv37VgauOhQ=; b=yYW2U9d8n7QfMiQCpdI1bomk+F G0LSIf//6l16vqgAsEwJtnJ4NbHINFOWb7Q/yrTJWfgxqGo9gDIL8fWQB/hM/U8nCMopq0S8E9JnA Q3QYoynCwohc7axCHjigUxavOEHZNp+bUo8FhjMidsk35+Tf/zkqyyy44qgCVxeqW+kW0qDWycdgN lywYePrzZwmvIaKxaXBUt2aJZUly2m6BRgOwY2XssgoPSg0XTNE4HBci+1MxbmOXfTLdU3TDPBErJ 9LNWEAnn0NmIZj3EGYdOgD8j7PgjpEyYNmOGPOu5NVJSuvoj2f7q5MGyYl0mQEkQ7PVSPmlgS9lgw vm/k5Wbg==; Received: from localhost ([::1] helo=bombadil.infradead.org) by bombadil.infradead.org with esmtp (Exim 4.98.1 #2 (Red Hat Linux)) id 1u18NX-0000000EMqF-28ms; Sat, 05 Apr 2025 18:36:27 +0000 Received: from tor.source.kernel.org ([2600:3c04:e001:324:0:1991:8:25]) by bombadil.infradead.org with esmtps (Exim 4.98.1 #2 (Red Hat Linux)) id 1u18IB-0000000ELdg-2RCH; Sat, 05 Apr 2025 18:30:55 +0000 Received: from smtp.kernel.org (transwarp.subspace.kernel.org [100.75.92.58]) by tor.source.kernel.org (Postfix) with ESMTP id 0379261133; Sat, 5 Apr 2025 18:30:46 +0000 (UTC) Received: by smtp.kernel.org (Postfix) with ESMTPSA id 5882FC4CEED; Sat, 5 Apr 2025 18:30:53 +0000 (UTC) DKIM-Signature: v=1; a=rsa-sha256; c=relaxed/simple; d=kernel.org; s=k20201202; t=1743877853; bh=WBt36oHsk6qIAHDqLsgQ+hsvoCHXobQ95GH0vd6H3pA=; 
From: Eric Biggers
Subject: [PATCH 2/9] crypto: chacha - centralize the skcipher wrappers for arch code
Date: Sat, 5 Apr 2025 11:26:02 -0700
Message-ID: <20250405182609.404216-3-ebiggers@kernel.org>
In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org>
References: <20250405182609.404216-1-ebiggers@kernel.org>

Following the example of the crc32 and crc32c code, make the crypto
subsystem register both generic and architecture-optimized chacha20,
xchacha20, and xchacha12 skcipher algorithms, all implemented on top of
the appropriate library functions.  This eliminates the need for every
architecture to implement the same skcipher glue code.

To register the architecture-optimized skciphers only when
architecture-optimized code is actually being used, add a function
chacha_is_arch_optimized() and make each arch implement it.  Change each
architecture's ChaCha module_init function to arch_initcall so that the
CPU feature detection is guaranteed to run before
chacha_is_arch_optimized() gets called by crypto/chacha.c.  In the case
of s390, remove the CPU feature based module autoloading, which is no
longer needed since the module just gets pulled in via function linkage.

Signed-off-by: Eric Biggers
---
 arch/arm/crypto/chacha-glue.c           |   9 +-
 arch/arm64/crypto/chacha-neon-glue.c    |   8 +-
 arch/mips/crypto/chacha-glue.c          |   8 +-
 arch/powerpc/crypto/chacha-p10-glue.c   |   8 +-
 arch/riscv/crypto/chacha-riscv64-glue.c |   8 +-
 arch/s390/crypto/chacha-glue.c          |   8 +-
 arch/x86/crypto/chacha_glue.c           |   8 +-
 crypto/Makefile                         |   3 +-
 crypto/chacha.c                         | 227 ++++++++++++++++++++++++
 crypto/chacha_generic.c                 | 139 ---------------
 include/crypto/chacha.h                 |   9 +
 11 files changed, 288 insertions(+), 147 deletions(-)
 create mode 100644 crypto/chacha.c
 delete mode 100644 crypto/chacha_generic.c

diff --git a/arch/arm/crypto/chacha-glue.c b/arch/arm/crypto/chacha-glue.c
index 50e635512046e..e1cb34d317712 100644
--- a/arch/arm/crypto/chacha-glue.c
+++ b/arch/arm/crypto/chacha-glue.c
@@ -285,10 +285,17 @@ static struct skcipher_alg neon_algs[] = {
 		.encrypt	= xchacha_neon,
 		.decrypt	= xchacha_neon,
 	}
 };
 
+bool chacha_is_arch_optimized(void)
+{
+	/* We always can use at least the ARM scalar implementation.
*/ + return true; +} +EXPORT_SYMBOL(chacha_is_arch_optimized); + static int __init chacha_simd_mod_init(void) { int err = 0; if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) { @@ -331,11 +338,11 @@ static void __exit chacha_simd_mod_fini(void) if (IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && (elf_hwcap & HWCAP_NEON)) crypto_unregister_skciphers(neon_algs, ARRAY_SIZE(neon_algs)); } } -module_init(chacha_simd_mod_init); +arch_initcall(chacha_simd_mod_init); module_exit(chacha_simd_mod_fini); MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (scalar and NEON accelerated)"); MODULE_AUTHOR("Ard Biesheuvel "); MODULE_LICENSE("GPL v2"); diff --git a/arch/arm64/crypto/chacha-neon-glue.c b/arch/arm64/crypto/chacha-neon-glue.c index 229876acfc581..bb9b52321bda7 100644 --- a/arch/arm64/crypto/chacha-neon-glue.c +++ b/arch/arm64/crypto/chacha-neon-glue.c @@ -204,10 +204,16 @@ static struct skcipher_alg algs[] = { .encrypt = xchacha_neon, .decrypt = xchacha_neon, } }; +bool chacha_is_arch_optimized(void) +{ + return static_key_enabled(&have_neon); +} +EXPORT_SYMBOL(chacha_is_arch_optimized); + static int __init chacha_simd_mod_init(void) { if (!cpu_have_named_feature(ASIMD)) return 0; @@ -221,11 +227,11 @@ static void __exit chacha_simd_mod_fini(void) { if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) && cpu_have_named_feature(ASIMD)) crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); } -module_init(chacha_simd_mod_init); +arch_initcall(chacha_simd_mod_init); module_exit(chacha_simd_mod_fini); MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (NEON accelerated)"); MODULE_AUTHOR("Ard Biesheuvel "); MODULE_LICENSE("GPL v2"); diff --git a/arch/mips/crypto/chacha-glue.c b/arch/mips/crypto/chacha-glue.c index f6fc2e1079a19..64ccaeaeaa1e1 100644 --- a/arch/mips/crypto/chacha-glue.c +++ b/arch/mips/crypto/chacha-glue.c @@ -118,10 +118,16 @@ static struct skcipher_alg algs[] = { .encrypt = xchacha_mips, .decrypt = xchacha_mips, } }; +bool chacha_is_arch_optimized(void) +{ + return true; +} +EXPORT_SYMBOL(chacha_is_arch_optimized); + static int __init chacha_simd_mod_init(void) { return IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) ? 
crypto_register_skciphers(algs, ARRAY_SIZE(algs)) : 0; } @@ -130,11 +136,11 @@ static void __exit chacha_simd_mod_fini(void) { if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); } -module_init(chacha_simd_mod_init); +arch_initcall(chacha_simd_mod_init); module_exit(chacha_simd_mod_fini); MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (MIPS accelerated)"); MODULE_AUTHOR("Ard Biesheuvel "); MODULE_LICENSE("GPL v2"); diff --git a/arch/powerpc/crypto/chacha-p10-glue.c b/arch/powerpc/crypto/chacha-p10-glue.c index d8796decc1fb7..3355305b6c7f8 100644 --- a/arch/powerpc/crypto/chacha-p10-glue.c +++ b/arch/powerpc/crypto/chacha-p10-glue.c @@ -187,10 +187,16 @@ static struct skcipher_alg algs[] = { .encrypt = xchacha_p10, .decrypt = xchacha_p10, } }; +bool chacha_is_arch_optimized(void) +{ + return static_key_enabled(&have_p10); +} +EXPORT_SYMBOL(chacha_is_arch_optimized); + static int __init chacha_p10_init(void) { if (!cpu_has_feature(CPU_FTR_ARCH_31)) return 0; @@ -205,11 +211,11 @@ static void __exit chacha_p10_exit(void) return; crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); } -module_init(chacha_p10_init); +arch_initcall(chacha_p10_init); module_exit(chacha_p10_exit); MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (P10 accelerated)"); MODULE_AUTHOR("Danny Tsen "); MODULE_LICENSE("GPL v2"); diff --git a/arch/riscv/crypto/chacha-riscv64-glue.c b/arch/riscv/crypto/chacha-riscv64-glue.c index 68caef7a3d50b..ccaab0dea383f 100644 --- a/arch/riscv/crypto/chacha-riscv64-glue.c +++ b/arch/riscv/crypto/chacha-riscv64-glue.c @@ -47,17 +47,23 @@ void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, unsigned int bytes, } kernel_vector_end(); } EXPORT_SYMBOL(chacha_crypt_arch); +bool chacha_is_arch_optimized(void) +{ + return static_key_enabled(&use_zvkb); +} +EXPORT_SYMBOL(chacha_is_arch_optimized); + static int __init riscv64_chacha_mod_init(void) { if (riscv_isa_extension_available(NULL, ZVKB) && riscv_vector_vlen() >= 128) static_branch_enable(&use_zvkb); return 0; } -module_init(riscv64_chacha_mod_init); +arch_initcall(riscv64_chacha_mod_init); MODULE_DESCRIPTION("ChaCha stream cipher (RISC-V optimized)"); MODULE_AUTHOR("Jerry Shih "); MODULE_LICENSE("GPL"); diff --git a/arch/s390/crypto/chacha-glue.c b/arch/s390/crypto/chacha-glue.c index 920e9f0941e75..0c68191f2aa4c 100644 --- a/arch/s390/crypto/chacha-glue.c +++ b/arch/s390/crypto/chacha-glue.c @@ -101,10 +101,16 @@ static struct skcipher_alg chacha_algs[] = { .encrypt = chacha20_s390, .decrypt = chacha20_s390, } }; +bool chacha_is_arch_optimized(void) +{ + return cpu_has_vx(); +} +EXPORT_SYMBOL(chacha_is_arch_optimized); + static int __init chacha_mod_init(void) { return IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) ? 
crypto_register_skciphers(chacha_algs, ARRAY_SIZE(chacha_algs)) : 0; } @@ -113,11 +119,11 @@ static void __exit chacha_mod_fini(void) { if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) crypto_unregister_skciphers(chacha_algs, ARRAY_SIZE(chacha_algs)); } -module_cpu_feature_match(S390_CPU_FEATURE_VXRS, chacha_mod_init); +arch_initcall(chacha_mod_init); module_exit(chacha_mod_fini); MODULE_DESCRIPTION("ChaCha20 stream cipher"); MODULE_LICENSE("GPL v2"); diff --git a/arch/x86/crypto/chacha_glue.c b/arch/x86/crypto/chacha_glue.c index 8bb74a2728798..83a07b31cdd3a 100644 --- a/arch/x86/crypto/chacha_glue.c +++ b/arch/x86/crypto/chacha_glue.c @@ -268,10 +268,16 @@ static struct skcipher_alg algs[] = { .encrypt = xchacha_simd, .decrypt = xchacha_simd, }, }; +bool chacha_is_arch_optimized(void) +{ + return static_key_enabled(&chacha_use_simd); +} +EXPORT_SYMBOL(chacha_is_arch_optimized); + static int __init chacha_simd_mod_init(void) { if (!boot_cpu_has(X86_FEATURE_SSSE3)) return 0; @@ -295,11 +301,11 @@ static void __exit chacha_simd_mod_fini(void) { if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) && boot_cpu_has(X86_FEATURE_SSSE3)) crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); } -module_init(chacha_simd_mod_init); +arch_initcall(chacha_simd_mod_init); module_exit(chacha_simd_mod_fini); MODULE_LICENSE("GPL"); MODULE_AUTHOR("Martin Willi "); MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (x64 SIMD accelerated)"); diff --git a/crypto/Makefile b/crypto/Makefile index 0e6ab5ffd3f77..98510a2aa0b1e 100644 --- a/crypto/Makefile +++ b/crypto/Makefile @@ -146,11 +146,12 @@ obj-$(CONFIG_CRYPTO_ARC4) += arc4.o obj-$(CONFIG_CRYPTO_TEA) += tea.o obj-$(CONFIG_CRYPTO_KHAZAD) += khazad.o obj-$(CONFIG_CRYPTO_ANUBIS) += anubis.o obj-$(CONFIG_CRYPTO_SEED) += seed.o obj-$(CONFIG_CRYPTO_ARIA) += aria_generic.o -obj-$(CONFIG_CRYPTO_CHACHA20) += chacha_generic.o +obj-$(CONFIG_CRYPTO_CHACHA20) += chacha.o +CFLAGS_chacha.o += -DARCH=$(ARCH) obj-$(CONFIG_CRYPTO_POLY1305) += poly1305_generic.o obj-$(CONFIG_CRYPTO_DEFLATE) += deflate.o obj-$(CONFIG_CRYPTO_MICHAEL_MIC) += michael_mic.o obj-$(CONFIG_CRYPTO_CRC32C) += crc32c_generic.o obj-$(CONFIG_CRYPTO_CRC32) += crc32_generic.o diff --git a/crypto/chacha.c b/crypto/chacha.c new file mode 100644 index 0000000000000..2009038c5e56c --- /dev/null +++ b/crypto/chacha.c @@ -0,0 +1,227 @@ +// SPDX-License-Identifier: GPL-2.0-or-later +/* + * Crypto API wrappers for the ChaCha20, XChaCha20, and XChaCha12 stream ciphers + * + * Copyright (C) 2015 Martin Willi + * Copyright (C) 2018 Google LLC + */ + +#include +#include +#include +#include +#include + +static int chacha_stream_xor(struct skcipher_request *req, + const struct chacha_ctx *ctx, const u8 *iv, + bool arch) +{ + struct skcipher_walk walk; + u32 state[16]; + int err; + + err = skcipher_walk_virt(&walk, req, false); + + chacha_init(state, ctx->key, iv); + + while (walk.nbytes > 0) { + unsigned int nbytes = walk.nbytes; + + if (nbytes < walk.total) + nbytes = round_down(nbytes, CHACHA_BLOCK_SIZE); + + if (arch) + chacha_crypt(state, walk.dst.virt.addr, + walk.src.virt.addr, nbytes, ctx->nrounds); + else + chacha_crypt_generic(state, walk.dst.virt.addr, + walk.src.virt.addr, nbytes, + ctx->nrounds); + err = skcipher_walk_done(&walk, walk.nbytes - nbytes); + } + + return err; +} + +static int crypto_chacha_crypt_generic(struct skcipher_request *req) +{ + struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); + const struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); + + return chacha_stream_xor(req, ctx, req->iv, false); +} 
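/*
 * The *_arch() handlers below are identical to the *_generic() ones above
 * except that they pass arch=true, so chacha_stream_xor() and
 * crypto_xchacha_crypt() call chacha_crypt() and hchacha_block(), which
 * dispatch to the architecture-optimized library code when it is available.
 */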
+ +static int crypto_chacha_crypt_arch(struct skcipher_request *req) +{ + struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); + const struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); + + return chacha_stream_xor(req, ctx, req->iv, true); +} + +static int crypto_xchacha_crypt(struct skcipher_request *req, bool arch) +{ + struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); + const struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); + struct chacha_ctx subctx; + u32 state[16]; + u8 real_iv[16]; + + /* Compute the subkey given the original key and first 128 nonce bits */ + chacha_init(state, ctx->key, req->iv); + if (arch) + hchacha_block(state, subctx.key, ctx->nrounds); + else + hchacha_block_generic(state, subctx.key, ctx->nrounds); + subctx.nrounds = ctx->nrounds; + + /* Build the real IV */ + memcpy(&real_iv[0], req->iv + 24, 8); /* stream position */ + memcpy(&real_iv[8], req->iv + 16, 8); /* remaining 64 nonce bits */ + + /* Generate the stream and XOR it with the data */ + return chacha_stream_xor(req, &subctx, real_iv, arch); +} + +static int crypto_xchacha_crypt_generic(struct skcipher_request *req) +{ + return crypto_xchacha_crypt(req, false); +} + +static int crypto_xchacha_crypt_arch(struct skcipher_request *req) +{ + return crypto_xchacha_crypt(req, true); +} + +static struct skcipher_alg algs[] = { + { + .base.cra_name = "chacha20", + .base.cra_driver_name = "chacha20-generic", + .base.cra_priority = 100, + .base.cra_blocksize = 1, + .base.cra_ctxsize = sizeof(struct chacha_ctx), + .base.cra_module = THIS_MODULE, + + .min_keysize = CHACHA_KEY_SIZE, + .max_keysize = CHACHA_KEY_SIZE, + .ivsize = CHACHA_IV_SIZE, + .chunksize = CHACHA_BLOCK_SIZE, + .setkey = chacha20_setkey, + .encrypt = crypto_chacha_crypt_generic, + .decrypt = crypto_chacha_crypt_generic, + }, + { + .base.cra_name = "xchacha20", + .base.cra_driver_name = "xchacha20-generic", + .base.cra_priority = 100, + .base.cra_blocksize = 1, + .base.cra_ctxsize = sizeof(struct chacha_ctx), + .base.cra_module = THIS_MODULE, + + .min_keysize = CHACHA_KEY_SIZE, + .max_keysize = CHACHA_KEY_SIZE, + .ivsize = XCHACHA_IV_SIZE, + .chunksize = CHACHA_BLOCK_SIZE, + .setkey = chacha20_setkey, + .encrypt = crypto_xchacha_crypt_generic, + .decrypt = crypto_xchacha_crypt_generic, + }, + { + .base.cra_name = "xchacha12", + .base.cra_driver_name = "xchacha12-generic", + .base.cra_priority = 100, + .base.cra_blocksize = 1, + .base.cra_ctxsize = sizeof(struct chacha_ctx), + .base.cra_module = THIS_MODULE, + + .min_keysize = CHACHA_KEY_SIZE, + .max_keysize = CHACHA_KEY_SIZE, + .ivsize = XCHACHA_IV_SIZE, + .chunksize = CHACHA_BLOCK_SIZE, + .setkey = chacha12_setkey, + .encrypt = crypto_xchacha_crypt_generic, + .decrypt = crypto_xchacha_crypt_generic, + }, + { + .base.cra_name = "chacha20", + .base.cra_driver_name = "chacha20-" __stringify(ARCH), + .base.cra_priority = 300, + .base.cra_blocksize = 1, + .base.cra_ctxsize = sizeof(struct chacha_ctx), + .base.cra_module = THIS_MODULE, + + .min_keysize = CHACHA_KEY_SIZE, + .max_keysize = CHACHA_KEY_SIZE, + .ivsize = CHACHA_IV_SIZE, + .chunksize = CHACHA_BLOCK_SIZE, + .setkey = chacha20_setkey, + .encrypt = crypto_chacha_crypt_arch, + .decrypt = crypto_chacha_crypt_arch, + }, + { + .base.cra_name = "xchacha20", + .base.cra_driver_name = "xchacha20-" __stringify(ARCH), + .base.cra_priority = 300, + .base.cra_blocksize = 1, + .base.cra_ctxsize = sizeof(struct chacha_ctx), + .base.cra_module = THIS_MODULE, + + .min_keysize = CHACHA_KEY_SIZE, + .max_keysize = CHACHA_KEY_SIZE, + .ivsize = 
XCHACHA_IV_SIZE, + .chunksize = CHACHA_BLOCK_SIZE, + .setkey = chacha20_setkey, + .encrypt = crypto_xchacha_crypt_arch, + .decrypt = crypto_xchacha_crypt_arch, + }, + { + .base.cra_name = "xchacha12", + .base.cra_driver_name = "xchacha12-" __stringify(ARCH), + .base.cra_priority = 300, + .base.cra_blocksize = 1, + .base.cra_ctxsize = sizeof(struct chacha_ctx), + .base.cra_module = THIS_MODULE, + + .min_keysize = CHACHA_KEY_SIZE, + .max_keysize = CHACHA_KEY_SIZE, + .ivsize = XCHACHA_IV_SIZE, + .chunksize = CHACHA_BLOCK_SIZE, + .setkey = chacha12_setkey, + .encrypt = crypto_xchacha_crypt_arch, + .decrypt = crypto_xchacha_crypt_arch, + } +}; + +static unsigned int num_algs; + +static int __init crypto_chacha_mod_init(void) +{ + /* register the arch flavours only if they differ from generic */ + num_algs = ARRAY_SIZE(algs); + BUILD_BUG_ON(ARRAY_SIZE(algs) % 2 != 0); + if (!chacha_is_arch_optimized()) + num_algs /= 2; + + return crypto_register_skciphers(algs, num_algs); +} + +static void __exit crypto_chacha_mod_fini(void) +{ + crypto_unregister_skciphers(algs, num_algs); +} + +subsys_initcall(crypto_chacha_mod_init); +module_exit(crypto_chacha_mod_fini); + +MODULE_LICENSE("GPL"); +MODULE_AUTHOR("Martin Willi "); +MODULE_DESCRIPTION("Crypto API wrappers for the ChaCha20, XChaCha20, and XChaCha12 stream ciphers"); +MODULE_ALIAS_CRYPTO("chacha20"); +MODULE_ALIAS_CRYPTO("chacha20-generic"); +MODULE_ALIAS_CRYPTO("chacha20-" __stringify(ARCH)); +MODULE_ALIAS_CRYPTO("xchacha20"); +MODULE_ALIAS_CRYPTO("xchacha20-generic"); +MODULE_ALIAS_CRYPTO("xchacha20-" __stringify(ARCH)); +MODULE_ALIAS_CRYPTO("xchacha12"); +MODULE_ALIAS_CRYPTO("xchacha12-generic"); +MODULE_ALIAS_CRYPTO("xchacha12-" __stringify(ARCH)); diff --git a/crypto/chacha_generic.c b/crypto/chacha_generic.c deleted file mode 100644 index 1fb9fbd302c6f..0000000000000 --- a/crypto/chacha_generic.c +++ /dev/null @@ -1,139 +0,0 @@ -// SPDX-License-Identifier: GPL-2.0-or-later -/* - * ChaCha and XChaCha stream ciphers, including ChaCha20 (RFC7539) - * - * Copyright (C) 2015 Martin Willi - * Copyright (C) 2018 Google LLC - */ - -#include -#include -#include -#include -#include - -static int chacha_stream_xor(struct skcipher_request *req, - const struct chacha_ctx *ctx, const u8 *iv) -{ - struct skcipher_walk walk; - u32 state[16]; - int err; - - err = skcipher_walk_virt(&walk, req, false); - - chacha_init(state, ctx->key, iv); - - while (walk.nbytes > 0) { - unsigned int nbytes = walk.nbytes; - - if (nbytes < walk.total) - nbytes = round_down(nbytes, CHACHA_BLOCK_SIZE); - - chacha_crypt_generic(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, ctx->nrounds); - err = skcipher_walk_done(&walk, walk.nbytes - nbytes); - } - - return err; -} - -static int crypto_chacha_crypt(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - - return chacha_stream_xor(req, ctx, req->iv); -} - -static int crypto_xchacha_crypt(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - struct chacha_ctx subctx; - u32 state[16]; - u8 real_iv[16]; - - /* Compute the subkey given the original key and first 128 nonce bits */ - chacha_init(state, ctx->key, req->iv); - hchacha_block_generic(state, subctx.key, ctx->nrounds); - subctx.nrounds = ctx->nrounds; - - /* Build the real IV */ - memcpy(&real_iv[0], req->iv + 24, 8); /* stream position */ - memcpy(&real_iv[8], req->iv + 16, 8); 
/* remaining 64 nonce bits */ - - /* Generate the stream and XOR it with the data */ - return chacha_stream_xor(req, &subctx, real_iv); -} - -static struct skcipher_alg algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-generic", - .base.cra_priority = 100, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = crypto_chacha_crypt, - .decrypt = crypto_chacha_crypt, - }, { - .base.cra_name = "xchacha20", - .base.cra_driver_name = "xchacha20-generic", - .base.cra_priority = 100, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = crypto_xchacha_crypt, - .decrypt = crypto_xchacha_crypt, - }, { - .base.cra_name = "xchacha12", - .base.cra_driver_name = "xchacha12-generic", - .base.cra_priority = 100, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha12_setkey, - .encrypt = crypto_xchacha_crypt, - .decrypt = crypto_xchacha_crypt, - } -}; - -static int __init chacha_generic_mod_init(void) -{ - return crypto_register_skciphers(algs, ARRAY_SIZE(algs)); -} - -static void __exit chacha_generic_mod_fini(void) -{ - crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); -} - -subsys_initcall(chacha_generic_mod_init); -module_exit(chacha_generic_mod_fini); - -MODULE_LICENSE("GPL"); -MODULE_AUTHOR("Martin Willi "); -MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (generic)"); -MODULE_ALIAS_CRYPTO("chacha20"); -MODULE_ALIAS_CRYPTO("chacha20-generic"); -MODULE_ALIAS_CRYPTO("xchacha20"); -MODULE_ALIAS_CRYPTO("xchacha20-generic"); -MODULE_ALIAS_CRYPTO("xchacha12"); -MODULE_ALIAS_CRYPTO("xchacha12-generic"); diff --git a/include/crypto/chacha.h b/include/crypto/chacha.h index f8cc073bba41b..58129e18cc31d 100644 --- a/include/crypto/chacha.h +++ b/include/crypto/chacha.h @@ -97,6 +97,15 @@ static inline void chacha20_crypt(u32 *state, u8 *dst, const u8 *src, unsigned int bytes) { chacha_crypt(state, dst, src, bytes, 20); } +#if IS_ENABLED(CONFIG_CRYPTO_ARCH_HAVE_LIB_CHACHA) +bool chacha_is_arch_optimized(void); +#else +static inline bool chacha_is_arch_optimized(void) +{ + return false; +} +#endif + #endif /* _CRYPTO_CHACHA_H */ From patchwork Sat Apr 5 18:26:03 2025 Content-Type: text/plain; charset="utf-8" MIME-Version: 1.0 Content-Transfer-Encoding: 7bit X-Patchwork-Submitter: Eric Biggers X-Patchwork-Id: 14039219 Return-Path: X-Spam-Checker-Version: SpamAssassin 3.4.0 (2014-02-07) on aws-us-west-2-korg-lkml-1.web.codeaurora.org Received: from bombadil.infradead.org (bombadil.infradead.org [198.137.202.133]) (using TLSv1.2 with cipher ECDHE-RSA-AES256-GCM-SHA384 (256/256 bits)) (No client certificate requested) by smtp.lore.kernel.org (Postfix) with ESMTPS id D134BC36010 for ; Sat, 5 Apr 2025 18:40:20 +0000 (UTC) DKIM-Signature: v=1; a=rsa-sha256; q=dns/txt; c=relaxed/relaxed; d=lists.infradead.org; s=bombadil.20210309; h=Sender:List-Subscribe:List-Help 
From: Eric Biggers
Subject: [PATCH 3/9] crypto: arm/chacha - remove the redundant skcipher algorithms
Date: Sat, 5 Apr 2025 11:26:03 -0700
Message-ID: <20250405182609.404216-4-ebiggers@kernel.org>
In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org>
References: <20250405182609.404216-1-ebiggers@kernel.org>

Since crypto/chacha.c now registers chacha20-$(ARCH), xchacha20-$(ARCH),
and xchacha12-$(ARCH) skcipher algorithms that use the architecture's
ChaCha and HChaCha library functions, individual architectures no longer
need to do the same.  Therefore, remove the redundant skcipher algorithms
and leave just the library functions.
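A minimal usage sketch (illustrative only, not part of this patch) of the
library interface that these skcipher wrappers and the remaining arch code
are built on.  The helper name below is hypothetical; chacha_init(),
chacha_crypt(), CHACHA_KEY_SIZE, and CHACHA_IV_SIZE are the
<crypto/chacha.h> names used in the diffs in this series:

#include <crypto/chacha.h>

/* Hypothetical helper: XOR a buffer with the ChaCha20 keystream. */
static void example_chacha20_xor(const u32 key[CHACHA_KEY_SIZE / sizeof(u32)],
				 const u8 iv[CHACHA_IV_SIZE],
				 u8 *dst, const u8 *src, unsigned int len)
{
	u32 state[16];

	/* Constants, key, then 32-bit counter + nonce from the 16-byte IV. */
	chacha_init(state, key, iv);
	/* 20 rounds; uses the arch-optimized backend when it is available. */
	chacha_crypt(state, dst, src, len, 20);
}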
Signed-off-by: Eric Biggers --- arch/arm/crypto/Kconfig | 7 - arch/arm/crypto/chacha-glue.c | 242 +---------------------------- arch/arm/crypto/chacha-neon-core.S | 2 +- 3 files changed, 7 insertions(+), 244 deletions(-) diff --git a/arch/arm/crypto/Kconfig b/arch/arm/crypto/Kconfig index 23e4ea067ddbb..624c4c8c5296a 100644 --- a/arch/arm/crypto/Kconfig +++ b/arch/arm/crypto/Kconfig @@ -214,17 +214,10 @@ config CRYPTO_AES_ARM_CE Architecture: arm using: - ARMv8 Crypto Extensions config CRYPTO_CHACHA20_NEON tristate - select CRYPTO_SKCIPHER select CRYPTO_ARCH_HAVE_LIB_CHACHA default CRYPTO_LIB_CHACHA_INTERNAL - help - Length-preserving ciphers: ChaCha20, XChaCha20, and XChaCha12 - stream cipher algorithms - - Architecture: arm using: - - NEON (Advanced SIMD) extensions endmenu diff --git a/arch/arm/crypto/chacha-glue.c b/arch/arm/crypto/chacha-glue.c index e1cb34d317712..3a5c75c95d43b 100644 --- a/arch/arm/crypto/chacha-glue.c +++ b/arch/arm/crypto/chacha-glue.c @@ -1,18 +1,15 @@ // SPDX-License-Identifier: GPL-2.0 /* - * ARM NEON accelerated ChaCha and XChaCha stream ciphers, - * including ChaCha20 (RFC7539) + * ChaCha and HChaCha functions (ARM optimized) * * Copyright (C) 2016-2019 Linaro, Ltd. * Copyright (C) 2015 Martin Willi */ -#include -#include +#include #include -#include #include #include #include #include @@ -98,262 +95,35 @@ void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, unsigned int bytes, dst += todo; } while (bytes); } EXPORT_SYMBOL(chacha_crypt_arch); -static int chacha_stream_xor(struct skcipher_request *req, - const struct chacha_ctx *ctx, const u8 *iv, - bool neon) -{ - struct skcipher_walk walk; - u32 state[16]; - int err; - - err = skcipher_walk_virt(&walk, req, false); - - chacha_init(state, ctx->key, iv); - - while (walk.nbytes > 0) { - unsigned int nbytes = walk.nbytes; - - if (nbytes < walk.total) - nbytes = round_down(nbytes, walk.stride); - - if (!IS_ENABLED(CONFIG_KERNEL_MODE_NEON) || !neon) { - chacha_doarm(walk.dst.virt.addr, walk.src.virt.addr, - nbytes, state, ctx->nrounds); - state[12] += DIV_ROUND_UP(nbytes, CHACHA_BLOCK_SIZE); - } else { - kernel_neon_begin(); - chacha_doneon(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, ctx->nrounds); - kernel_neon_end(); - } - err = skcipher_walk_done(&walk, walk.nbytes - nbytes); - } - - return err; -} - -static int do_chacha(struct skcipher_request *req, bool neon) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - - return chacha_stream_xor(req, ctx, req->iv, neon); -} - -static int chacha_arm(struct skcipher_request *req) -{ - return do_chacha(req, false); -} - -static int chacha_neon(struct skcipher_request *req) -{ - return do_chacha(req, neon_usable()); -} - -static int do_xchacha(struct skcipher_request *req, bool neon) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - struct chacha_ctx subctx; - u32 state[16]; - u8 real_iv[16]; - - chacha_init(state, ctx->key, req->iv); - - if (!IS_ENABLED(CONFIG_KERNEL_MODE_NEON) || !neon) { - hchacha_block_arm(state, subctx.key, ctx->nrounds); - } else { - kernel_neon_begin(); - hchacha_block_neon(state, subctx.key, ctx->nrounds); - kernel_neon_end(); - } - subctx.nrounds = ctx->nrounds; - - memcpy(&real_iv[0], req->iv + 24, 8); - memcpy(&real_iv[8], req->iv + 16, 8); - return chacha_stream_xor(req, &subctx, real_iv, neon); -} - -static int xchacha_arm(struct skcipher_request *req) -{ - return do_xchacha(req, false); 
-} - -static int xchacha_neon(struct skcipher_request *req) -{ - return do_xchacha(req, neon_usable()); -} - -static struct skcipher_alg arm_algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-arm", - .base.cra_priority = 200, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = chacha_arm, - .decrypt = chacha_arm, - }, { - .base.cra_name = "xchacha20", - .base.cra_driver_name = "xchacha20-arm", - .base.cra_priority = 200, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = xchacha_arm, - .decrypt = xchacha_arm, - }, { - .base.cra_name = "xchacha12", - .base.cra_driver_name = "xchacha12-arm", - .base.cra_priority = 200, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha12_setkey, - .encrypt = xchacha_arm, - .decrypt = xchacha_arm, - }, -}; - -static struct skcipher_alg neon_algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-neon", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .walksize = 4 * CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = chacha_neon, - .decrypt = chacha_neon, - }, { - .base.cra_name = "xchacha20", - .base.cra_driver_name = "xchacha20-neon", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .walksize = 4 * CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = xchacha_neon, - .decrypt = xchacha_neon, - }, { - .base.cra_name = "xchacha12", - .base.cra_driver_name = "xchacha12-neon", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .walksize = 4 * CHACHA_BLOCK_SIZE, - .setkey = chacha12_setkey, - .encrypt = xchacha_neon, - .decrypt = xchacha_neon, - } -}; - bool chacha_is_arch_optimized(void) { /* We always can use at least the ARM scalar implementation. 
*/ return true; } EXPORT_SYMBOL(chacha_is_arch_optimized); -static int __init chacha_simd_mod_init(void) +static int __init chacha_arm_mod_init(void) { - int err = 0; - - if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) { - err = crypto_register_skciphers(arm_algs, ARRAY_SIZE(arm_algs)); - if (err) - return err; - } - if (IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && (elf_hwcap & HWCAP_NEON)) { - int i; - switch (read_cpuid_part()) { case ARM_CPU_PART_CORTEX_A7: case ARM_CPU_PART_CORTEX_A5: /* * The Cortex-A7 and Cortex-A5 do not perform well with * the NEON implementation but do incredibly with the * scalar one and use less power. */ - for (i = 0; i < ARRAY_SIZE(neon_algs); i++) - neon_algs[i].base.cra_priority = 0; break; default: static_branch_enable(&use_neon); } - - if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) { - err = crypto_register_skciphers(neon_algs, ARRAY_SIZE(neon_algs)); - if (err) - crypto_unregister_skciphers(arm_algs, ARRAY_SIZE(arm_algs)); - } } - return err; + return 0; } +arch_initcall(chacha_arm_mod_init); -static void __exit chacha_simd_mod_fini(void) -{ - if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) { - crypto_unregister_skciphers(arm_algs, ARRAY_SIZE(arm_algs)); - if (IS_ENABLED(CONFIG_KERNEL_MODE_NEON) && (elf_hwcap & HWCAP_NEON)) - crypto_unregister_skciphers(neon_algs, ARRAY_SIZE(neon_algs)); - } -} - -arch_initcall(chacha_simd_mod_init); -module_exit(chacha_simd_mod_fini); - -MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (scalar and NEON accelerated)"); +MODULE_DESCRIPTION("ChaCha and HChaCha functions (ARM optimized)"); MODULE_AUTHOR("Ard Biesheuvel "); MODULE_LICENSE("GPL v2"); -MODULE_ALIAS_CRYPTO("chacha20"); -MODULE_ALIAS_CRYPTO("chacha20-arm"); -MODULE_ALIAS_CRYPTO("xchacha20"); -MODULE_ALIAS_CRYPTO("xchacha20-arm"); -MODULE_ALIAS_CRYPTO("xchacha12"); -MODULE_ALIAS_CRYPTO("xchacha12-arm"); -#ifdef CONFIG_KERNEL_MODE_NEON -MODULE_ALIAS_CRYPTO("chacha20-neon"); -MODULE_ALIAS_CRYPTO("xchacha20-neon"); -MODULE_ALIAS_CRYPTO("xchacha12-neon"); -#endif diff --git a/arch/arm/crypto/chacha-neon-core.S b/arch/arm/crypto/chacha-neon-core.S index 13d12f672656b..ddd62b6294a57 100644 --- a/arch/arm/crypto/chacha-neon-core.S +++ b/arch/arm/crypto/chacha-neon-core.S @@ -1,7 +1,7 @@ /* - * ChaCha/XChaCha NEON helper functions + * ChaCha/HChaCha NEON helper functions * * Copyright (C) 2016 Linaro, Ltd. 
 *
 * This program is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License version 2 as

From patchwork Sat Apr 5 18:26:04 2025
X-Patchwork-Submitter: Eric Biggers
X-Patchwork-Id: 14039212
From: Eric Biggers
To: linux-crypto@vger.kernel.org
Cc: linux-kernel@vger.kernel.org, linux-arm-kernel@lists.infradead.org,
    linux-mips@vger.kernel.org, linuxppc-dev@lists.ozlabs.org,
    linux-riscv@lists.infradead.org, linux-s390@vger.kernel.org,
    x86@kernel.org, Ard Biesheuvel , "Jason A .
Donenfeld " , Linus Torvalds Subject: [PATCH 4/9] crypto: arm64/chacha - remove the skcipher algorithms Date: Sat, 5 Apr 2025 11:26:04 -0700 Message-ID: <20250405182609.404216-5-ebiggers@kernel.org> X-Mailer: git-send-email 2.49.0 In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org> References: <20250405182609.404216-1-ebiggers@kernel.org> MIME-Version: 1.0 X-BeenThere: linux-arm-kernel@lists.infradead.org X-Mailman-Version: 2.1.34 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Sender: "linux-arm-kernel" Errors-To: linux-arm-kernel-bounces+linux-arm-kernel=archiver.kernel.org@lists.infradead.org From: Eric Biggers Since crypto/chacha.c now registers chacha20-$(ARCH), xchacha20-$(ARCH), and xchacha12-$(ARCH) skcipher algorithms that use the architecture's ChaCha and HChaCha library functions, individual architectures no longer need to do the same. Therefore, remove the redundant skcipher algorithms and leave just the library functions. Signed-off-by: Eric Biggers --- arch/arm64/crypto/Kconfig | 7 -- arch/arm64/crypto/chacha-neon-core.S | 2 +- arch/arm64/crypto/chacha-neon-glue.c | 144 ++------------------------- 3 files changed, 7 insertions(+), 146 deletions(-) diff --git a/arch/arm64/crypto/Kconfig b/arch/arm64/crypto/Kconfig index 3418c8d3c78d4..ce655da0fbeea 100644 --- a/arch/arm64/crypto/Kconfig +++ b/arch/arm64/crypto/Kconfig @@ -187,20 +187,13 @@ config CRYPTO_AES_ARM64_NEON_BLK - NEON (Advanced SIMD) extensions config CRYPTO_CHACHA20_NEON tristate depends on KERNEL_MODE_NEON - select CRYPTO_SKCIPHER select CRYPTO_LIB_CHACHA_GENERIC select CRYPTO_ARCH_HAVE_LIB_CHACHA default CRYPTO_LIB_CHACHA_INTERNAL - help - Length-preserving ciphers: ChaCha20, XChaCha20, and XChaCha12 - stream cipher algorithms - - Architecture: arm64 using: - - NEON (Advanced SIMD) extensions config CRYPTO_AES_ARM64_BS tristate "Ciphers: AES, modes: ECB/CBC/CTR/XCTR/XTS modes (bit-sliced NEON)" depends on KERNEL_MODE_NEON select CRYPTO_SKCIPHER diff --git a/arch/arm64/crypto/chacha-neon-core.S b/arch/arm64/crypto/chacha-neon-core.S index b70ac76f2610c..80079586ecc7a 100644 --- a/arch/arm64/crypto/chacha-neon-core.S +++ b/arch/arm64/crypto/chacha-neon-core.S @@ -1,7 +1,7 @@ /* - * ChaCha/XChaCha NEON helper functions + * ChaCha/HChaCha NEON helper functions * * Copyright (C) 2016-2018 Linaro, Ltd. * * This program is free software; you can redistribute it and/or modify * it under the terms of the GNU General Public License version 2 as diff --git a/arch/arm64/crypto/chacha-neon-glue.c b/arch/arm64/crypto/chacha-neon-glue.c index bb9b52321bda7..a0c336b284027 100644 --- a/arch/arm64/crypto/chacha-neon-glue.c +++ b/arch/arm64/crypto/chacha-neon-glue.c @@ -1,8 +1,7 @@ /* - * ARM NEON and scalar accelerated ChaCha and XChaCha stream ciphers, - * including ChaCha20 (RFC7539) + * ChaCha and HChaCha functions (ARM64 optimized) * * Copyright (C) 2016 - 2017 Linaro, Ltd. * * This program is free software; you can redistribute it and/or modify * it under the terms of the GNU General Public License version 2 as @@ -17,14 +16,12 @@ * it under the terms of the GNU General Public License as published by * the Free Software Foundation; either version 2 of the License, or * (at your option) any later version. 
*/ -#include -#include +#include #include -#include #include #include #include #include @@ -93,151 +90,22 @@ void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, unsigned int bytes, dst += todo; } while (bytes); } EXPORT_SYMBOL(chacha_crypt_arch); -static int chacha_neon_stream_xor(struct skcipher_request *req, - const struct chacha_ctx *ctx, const u8 *iv) -{ - struct skcipher_walk walk; - u32 state[16]; - int err; - - err = skcipher_walk_virt(&walk, req, false); - - chacha_init(state, ctx->key, iv); - - while (walk.nbytes > 0) { - unsigned int nbytes = walk.nbytes; - - if (nbytes < walk.total) - nbytes = rounddown(nbytes, walk.stride); - - if (!static_branch_likely(&have_neon) || - !crypto_simd_usable()) { - chacha_crypt_generic(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, - ctx->nrounds); - } else { - kernel_neon_begin(); - chacha_doneon(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, ctx->nrounds); - kernel_neon_end(); - } - err = skcipher_walk_done(&walk, walk.nbytes - nbytes); - } - - return err; -} - -static int chacha_neon(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - - return chacha_neon_stream_xor(req, ctx, req->iv); -} - -static int xchacha_neon(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - struct chacha_ctx subctx; - u32 state[16]; - u8 real_iv[16]; - - chacha_init(state, ctx->key, req->iv); - hchacha_block_arch(state, subctx.key, ctx->nrounds); - subctx.nrounds = ctx->nrounds; - - memcpy(&real_iv[0], req->iv + 24, 8); - memcpy(&real_iv[8], req->iv + 16, 8); - return chacha_neon_stream_xor(req, &subctx, real_iv); -} - -static struct skcipher_alg algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-neon", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .walksize = 5 * CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = chacha_neon, - .decrypt = chacha_neon, - }, { - .base.cra_name = "xchacha20", - .base.cra_driver_name = "xchacha20-neon", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .walksize = 5 * CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = xchacha_neon, - .decrypt = xchacha_neon, - }, { - .base.cra_name = "xchacha12", - .base.cra_driver_name = "xchacha12-neon", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .walksize = 5 * CHACHA_BLOCK_SIZE, - .setkey = chacha12_setkey, - .encrypt = xchacha_neon, - .decrypt = xchacha_neon, - } -}; - bool chacha_is_arch_optimized(void) { return static_key_enabled(&have_neon); } EXPORT_SYMBOL(chacha_is_arch_optimized); static int __init chacha_simd_mod_init(void) { - if (!cpu_have_named_feature(ASIMD)) - return 0; - - static_branch_enable(&have_neon); - - return IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) 
? - crypto_register_skciphers(algs, ARRAY_SIZE(algs)) : 0; + if (cpu_have_named_feature(ASIMD)) + static_branch_enable(&have_neon); + return 0; } - -static void __exit chacha_simd_mod_fini(void) -{ - if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) && cpu_have_named_feature(ASIMD)) - crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); -} - arch_initcall(chacha_simd_mod_init); -module_exit(chacha_simd_mod_fini); -MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (NEON accelerated)"); +MODULE_DESCRIPTION("ChaCha and HChaCha functions (ARM64 optimized)"); MODULE_AUTHOR("Ard Biesheuvel "); MODULE_LICENSE("GPL v2"); -MODULE_ALIAS_CRYPTO("chacha20"); -MODULE_ALIAS_CRYPTO("chacha20-neon"); -MODULE_ALIAS_CRYPTO("xchacha20"); -MODULE_ALIAS_CRYPTO("xchacha20-neon"); -MODULE_ALIAS_CRYPTO("xchacha12"); -MODULE_ALIAS_CRYPTO("xchacha12-neon"); From patchwork Sat Apr 5 18:26:05 2025 From: Eric Biggers To: linux-crypto@vger.kernel.org Cc: 
linux-kernel@vger.kernel.org, linux-arm-kernel@lists.infradead.org, linux-mips@vger.kernel.org, linuxppc-dev@lists.ozlabs.org, linux-riscv@lists.infradead.org, linux-s390@vger.kernel.org, x86@kernel.org, Ard Biesheuvel , "Jason A . Donenfeld " , Linus Torvalds Subject: [PATCH 5/9] crypto: mips/chacha - remove the skcipher algorithms Date: Sat, 5 Apr 2025 11:26:05 -0700 Message-ID: <20250405182609.404216-6-ebiggers@kernel.org> X-Mailer: git-send-email 2.49.0 In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org> References: <20250405182609.404216-1-ebiggers@kernel.org> MIME-Version: 1.0 X-CRM114-Version: 20100106-BlameMichelson ( TRE 0.8.0 (BSD) ) MR-646709E3 X-CRM114-CacheID: sfid-20250405_113056_043393_F30A5D7F X-CRM114-Status: GOOD ( 11.42 ) X-BeenThere: linux-arm-kernel@lists.infradead.org X-Mailman-Version: 2.1.34 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Sender: "linux-arm-kernel" Errors-To: linux-arm-kernel-bounces+linux-arm-kernel=archiver.kernel.org@lists.infradead.org From: Eric Biggers Since crypto/chacha.c now registers chacha20-$(ARCH), xchacha20-$(ARCH), and xchacha12-$(ARCH) skcipher algorithms that use the architecture's ChaCha and HChaCha library functions, individual architectures no longer need to do the same. Therefore, remove the redundant skcipher algorithms and leave just the library functions. Signed-off-by: Eric Biggers --- arch/mips/crypto/Kconfig | 6 -- arch/mips/crypto/chacha-glue.c | 131 +-------------------------------- 2 files changed, 3 insertions(+), 134 deletions(-) diff --git a/arch/mips/crypto/Kconfig b/arch/mips/crypto/Kconfig index 545fc0e12422d..0189686de3a12 100644 --- a/arch/mips/crypto/Kconfig +++ b/arch/mips/crypto/Kconfig @@ -54,15 +54,9 @@ config CRYPTO_SHA512_OCTEON Architecture: mips OCTEON using crypto instructions, when available config CRYPTO_CHACHA_MIPS tristate depends on CPU_MIPS32_R2 - select CRYPTO_SKCIPHER select CRYPTO_ARCH_HAVE_LIB_CHACHA default CRYPTO_LIB_CHACHA_INTERNAL - help - Length-preserving ciphers: ChaCha20, XChaCha20, and XChaCha12 - stream cipher algorithms - - Architecture: MIPS32r2 endmenu diff --git a/arch/mips/crypto/chacha-glue.c b/arch/mips/crypto/chacha-glue.c index 64ccaeaeaa1e1..334ecb29fb8fa 100644 --- a/arch/mips/crypto/chacha-glue.c +++ b/arch/mips/crypto/chacha-glue.c @@ -1,152 +1,27 @@ // SPDX-License-Identifier: GPL-2.0 /* - * MIPS accelerated ChaCha and XChaCha stream ciphers, - * including ChaCha20 (RFC7539) + * ChaCha and HChaCha functions (MIPS optimized) * * Copyright (C) 2019 Linaro, Ltd. 
*/ -#include -#include -#include -#include +#include #include #include asmlinkage void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, unsigned int bytes, int nrounds); EXPORT_SYMBOL(chacha_crypt_arch); asmlinkage void hchacha_block_arch(const u32 *state, u32 *stream, int nrounds); EXPORT_SYMBOL(hchacha_block_arch); -static int chacha_mips_stream_xor(struct skcipher_request *req, - const struct chacha_ctx *ctx, const u8 *iv) -{ - struct skcipher_walk walk; - u32 state[16]; - int err; - - err = skcipher_walk_virt(&walk, req, false); - - chacha_init(state, ctx->key, iv); - - while (walk.nbytes > 0) { - unsigned int nbytes = walk.nbytes; - - if (nbytes < walk.total) - nbytes = round_down(nbytes, walk.stride); - - chacha_crypt(state, walk.dst.virt.addr, walk.src.virt.addr, - nbytes, ctx->nrounds); - err = skcipher_walk_done(&walk, walk.nbytes - nbytes); - } - - return err; -} - -static int chacha_mips(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - - return chacha_mips_stream_xor(req, ctx, req->iv); -} - -static int xchacha_mips(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - struct chacha_ctx subctx; - u32 state[16]; - u8 real_iv[16]; - - chacha_init(state, ctx->key, req->iv); - - hchacha_block(state, subctx.key, ctx->nrounds); - subctx.nrounds = ctx->nrounds; - - memcpy(&real_iv[0], req->iv + 24, 8); - memcpy(&real_iv[8], req->iv + 16, 8); - return chacha_mips_stream_xor(req, &subctx, real_iv); -} - -static struct skcipher_alg algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-mips", - .base.cra_priority = 200, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = chacha_mips, - .decrypt = chacha_mips, - }, { - .base.cra_name = "xchacha20", - .base.cra_driver_name = "xchacha20-mips", - .base.cra_priority = 200, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = xchacha_mips, - .decrypt = xchacha_mips, - }, { - .base.cra_name = "xchacha12", - .base.cra_driver_name = "xchacha12-mips", - .base.cra_priority = 200, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha12_setkey, - .encrypt = xchacha_mips, - .decrypt = xchacha_mips, - } -}; - bool chacha_is_arch_optimized(void) { return true; } EXPORT_SYMBOL(chacha_is_arch_optimized); -static int __init chacha_simd_mod_init(void) -{ - return IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) ? 
- crypto_register_skciphers(algs, ARRAY_SIZE(algs)) : 0; -} - -static void __exit chacha_simd_mod_fini(void) -{ - if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) - crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); -} - -arch_initcall(chacha_simd_mod_init); -module_exit(chacha_simd_mod_fini); - -MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (MIPS accelerated)"); +MODULE_DESCRIPTION("ChaCha and HChaCha functions (MIPS optimized)"); MODULE_AUTHOR("Ard Biesheuvel "); MODULE_LICENSE("GPL v2"); -MODULE_ALIAS_CRYPTO("chacha20"); -MODULE_ALIAS_CRYPTO("chacha20-mips"); -MODULE_ALIAS_CRYPTO("xchacha20"); -MODULE_ALIAS_CRYPTO("xchacha20-mips"); -MODULE_ALIAS_CRYPTO("xchacha12"); -MODULE_ALIAS_CRYPTO("xchacha12-mips"); From patchwork Sat Apr 5 18:26:06 2025 From: Eric Biggers To: linux-crypto@vger.kernel.org Cc: linux-kernel@vger.kernel.org, linux-arm-kernel@lists.infradead.org, linux-mips@vger.kernel.org, 
linuxppc-dev@lists.ozlabs.org, linux-riscv@lists.infradead.org, linux-s390@vger.kernel.org, x86@kernel.org, Ard Biesheuvel , "Jason A . Donenfeld " , Linus Torvalds Subject: [PATCH 6/9] crypto: powerpc/chacha - remove the skcipher algorithms Date: Sat, 5 Apr 2025 11:26:06 -0700 Message-ID: <20250405182609.404216-7-ebiggers@kernel.org> X-Mailer: git-send-email 2.49.0 In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org> References: <20250405182609.404216-1-ebiggers@kernel.org> MIME-Version: 1.0 X-CRM114-Version: 20100106-BlameMichelson ( TRE 0.8.0 (BSD) ) MR-646709E3 X-CRM114-CacheID: sfid-20250405_113056_615998_3144F5E2 X-CRM114-Status: GOOD ( 13.85 ) X-BeenThere: linux-arm-kernel@lists.infradead.org X-Mailman-Version: 2.1.34 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Sender: "linux-arm-kernel" Errors-To: linux-arm-kernel-bounces+linux-arm-kernel=archiver.kernel.org@lists.infradead.org From: Eric Biggers Since crypto/chacha.c now registers chacha20-$(ARCH), xchacha20-$(ARCH), and xchacha12-$(ARCH) skcipher algorithms that use the architecture's ChaCha and HChaCha library functions, individual architectures no longer need to do the same. Therefore, remove the redundant skcipher algorithms and leave just the library functions. Signed-off-by: Eric Biggers --- arch/powerpc/crypto/Kconfig | 8 -- arch/powerpc/crypto/chacha-p10-glue.c | 145 ++------------------------ 2 files changed, 6 insertions(+), 147 deletions(-) diff --git a/arch/powerpc/crypto/Kconfig b/arch/powerpc/crypto/Kconfig index 370db8192ce62..47dccdd496374 100644 --- a/arch/powerpc/crypto/Kconfig +++ b/arch/powerpc/crypto/Kconfig @@ -93,21 +93,13 @@ config CRYPTO_AES_GCM_P10 later CPU. This module supports stitched acceleration for AES/GCM. config CRYPTO_CHACHA20_P10 tristate depends on PPC64 && CPU_LITTLE_ENDIAN && VSX - select CRYPTO_SKCIPHER select CRYPTO_LIB_CHACHA_GENERIC select CRYPTO_ARCH_HAVE_LIB_CHACHA default CRYPTO_LIB_CHACHA_INTERNAL - help - Length-preserving ciphers: ChaCha20, XChaCha20, and XChaCha12 - stream cipher algorithms - - Architecture: PowerPC64 - - Power10 or later - - Little-endian config CRYPTO_POLY1305_P10 tristate "Hash functions: Poly1305 (P10 or later)" depends on PPC64 && CPU_LITTLE_ENDIAN && VSX select CRYPTO_HASH diff --git a/arch/powerpc/crypto/chacha-p10-glue.c b/arch/powerpc/crypto/chacha-p10-glue.c index 3355305b6c7f8..9982929573add 100644 --- a/arch/powerpc/crypto/chacha-p10-glue.c +++ b/arch/powerpc/crypto/chacha-p10-glue.c @@ -1,17 +1,14 @@ // SPDX-License-Identifier: GPL-2.0-or-later /* - * PowerPC P10 (ppc64le) accelerated ChaCha and XChaCha stream ciphers, - * including ChaCha20 (RFC7539) + * ChaCha stream cipher (P10 accelerated) * * Copyright 2023- IBM Corp. All rights reserved. 
*/ -#include -#include +#include #include -#include #include #include #include #include #include @@ -76,152 +73,22 @@ void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, unsigned int bytes, dst += todo; } while (bytes); } EXPORT_SYMBOL(chacha_crypt_arch); -static int chacha_p10_stream_xor(struct skcipher_request *req, - const struct chacha_ctx *ctx, const u8 *iv) -{ - struct skcipher_walk walk; - u32 state[16]; - int err; - - err = skcipher_walk_virt(&walk, req, false); - if (err) - return err; - - chacha_init(state, ctx->key, iv); - - while (walk.nbytes > 0) { - unsigned int nbytes = walk.nbytes; - - if (nbytes < walk.total) - nbytes = rounddown(nbytes, walk.stride); - - if (!crypto_simd_usable()) { - chacha_crypt_generic(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, - ctx->nrounds); - } else { - vsx_begin(); - chacha_p10_do_8x(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, ctx->nrounds); - vsx_end(); - } - err = skcipher_walk_done(&walk, walk.nbytes - nbytes); - if (err) - break; - } - - return err; -} - -static int chacha_p10(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - - return chacha_p10_stream_xor(req, ctx, req->iv); -} - -static int xchacha_p10(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - struct chacha_ctx subctx; - u32 state[16]; - u8 real_iv[16]; - - chacha_init(state, ctx->key, req->iv); - hchacha_block_arch(state, subctx.key, ctx->nrounds); - subctx.nrounds = ctx->nrounds; - - memcpy(&real_iv[0], req->iv + 24, 8); - memcpy(&real_iv[8], req->iv + 16, 8); - return chacha_p10_stream_xor(req, &subctx, real_iv); -} - -static struct skcipher_alg algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-p10", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = chacha_p10, - .decrypt = chacha_p10, - }, { - .base.cra_name = "xchacha20", - .base.cra_driver_name = "xchacha20-p10", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = xchacha_p10, - .decrypt = xchacha_p10, - }, { - .base.cra_name = "xchacha12", - .base.cra_driver_name = "xchacha12-p10", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha12_setkey, - .encrypt = xchacha_p10, - .decrypt = xchacha_p10, - } -}; - bool chacha_is_arch_optimized(void) { return static_key_enabled(&have_p10); } EXPORT_SYMBOL(chacha_is_arch_optimized); static int __init chacha_p10_init(void) { - if (!cpu_has_feature(CPU_FTR_ARCH_31)) - return 0; - - static_branch_enable(&have_p10); - - return crypto_register_skciphers(algs, ARRAY_SIZE(algs)); -} - -static void __exit chacha_p10_exit(void) -{ - if (!static_branch_likely(&have_p10)) - return; - - 
crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); + if (cpu_has_feature(CPU_FTR_ARCH_31)) + static_branch_enable(&have_p10); + return 0; } - arch_initcall(chacha_p10_init); -module_exit(chacha_p10_exit); -MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (P10 accelerated)"); +MODULE_DESCRIPTION("ChaCha stream cipher (P10 accelerated)"); MODULE_AUTHOR("Danny Tsen "); MODULE_LICENSE("GPL v2"); -MODULE_ALIAS_CRYPTO("chacha20"); -MODULE_ALIAS_CRYPTO("chacha20-p10"); -MODULE_ALIAS_CRYPTO("xchacha20"); -MODULE_ALIAS_CRYPTO("xchacha20-p10"); -MODULE_ALIAS_CRYPTO("xchacha12"); -MODULE_ALIAS_CRYPTO("xchacha12-p10"); From patchwork Sat Apr 5 18:26:07 2025 From: Eric Biggers To: linux-crypto@vger.kernel.org Cc: linux-kernel@vger.kernel.org, linux-arm-kernel@lists.infradead.org, linux-mips@vger.kernel.org, linuxppc-dev@lists.ozlabs.org, linux-riscv@lists.infradead.org, linux-s390@vger.kernel.org, x86@kernel.org, Ard Biesheuvel , "Jason A . 
Donenfeld " , Linus Torvalds Subject: [PATCH 7/9] crypto: s390/chacha - remove the skcipher algorithms Date: Sat, 5 Apr 2025 11:26:07 -0700 Message-ID: <20250405182609.404216-8-ebiggers@kernel.org> X-Mailer: git-send-email 2.49.0 In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org> References: <20250405182609.404216-1-ebiggers@kernel.org> MIME-Version: 1.0 X-BeenThere: linux-arm-kernel@lists.infradead.org X-Mailman-Version: 2.1.34 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Sender: "linux-arm-kernel" Errors-To: linux-arm-kernel-bounces+linux-arm-kernel=archiver.kernel.org@lists.infradead.org From: Eric Biggers Since crypto/chacha.c now registers chacha20-$(ARCH), xchacha20-$(ARCH), and xchacha12-$(ARCH) skcipher algorithms that use the architecture's ChaCha and HChaCha library functions, individual architectures no longer need to do the same. Therefore, remove the redundant skcipher algorithms and leave just the library functions. Signed-off-by: Eric Biggers --- arch/s390/crypto/Kconfig | 7 --- arch/s390/crypto/chacha-glue.c | 101 +++++---------------------------- 2 files changed, 13 insertions(+), 95 deletions(-) diff --git a/arch/s390/crypto/Kconfig b/arch/s390/crypto/Kconfig index 8c4db8b64fa21..055b08f259ab2 100644 --- a/arch/s390/crypto/Kconfig +++ b/arch/s390/crypto/Kconfig @@ -108,20 +108,13 @@ config CRYPTO_DES_S390 As of z196 the CTR mode is hardware accelerated. config CRYPTO_CHACHA_S390 tristate depends on S390 - select CRYPTO_SKCIPHER select CRYPTO_LIB_CHACHA_GENERIC select CRYPTO_ARCH_HAVE_LIB_CHACHA default CRYPTO_LIB_CHACHA_INTERNAL - help - Length-preserving cipher: ChaCha20 stream cipher (RFC 7539) - - Architecture: s390 - - It is available as of z13. config CRYPTO_HMAC_S390 tristate "Keyed-hash message authentication code: HMAC" depends on S390 select CRYPTO_HASH diff --git a/arch/s390/crypto/chacha-glue.c b/arch/s390/crypto/chacha-glue.c index 0c68191f2aa4c..b3ffaa5553855 100644 --- a/arch/s390/crypto/chacha-glue.c +++ b/arch/s390/crypto/chacha-glue.c @@ -1,69 +1,23 @@ // SPDX-License-Identifier: GPL-2.0 /* - * s390 ChaCha stream cipher. + * ChaCha stream cipher (s390 optimized) * * Copyright IBM Corp. 
2021 */ #define KMSG_COMPONENT "chacha_s390" #define pr_fmt(fmt) KMSG_COMPONENT ": " fmt -#include -#include -#include +#include #include #include #include #include #include #include "chacha-s390.h" -static void chacha20_crypt_s390(u32 *state, u8 *dst, const u8 *src, - unsigned int nbytes, const u32 *key, - u32 *counter) -{ - DECLARE_KERNEL_FPU_ONSTACK32(vxstate); - - kernel_fpu_begin(&vxstate, KERNEL_VXR); - chacha20_vx(dst, src, nbytes, key, counter); - kernel_fpu_end(&vxstate, KERNEL_VXR); - - *counter += round_up(nbytes, CHACHA_BLOCK_SIZE) / CHACHA_BLOCK_SIZE; -} - -static int chacha20_s390(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - u32 state[CHACHA_STATE_WORDS] __aligned(16); - struct skcipher_walk walk; - unsigned int nbytes; - int rc; - - rc = skcipher_walk_virt(&walk, req, false); - chacha_init(state, ctx->key, req->iv); - - while (walk.nbytes > 0) { - nbytes = walk.nbytes; - if (nbytes < walk.total) - nbytes = round_down(nbytes, walk.stride); - - if (nbytes <= CHACHA_BLOCK_SIZE) { - chacha_crypt_generic(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, - ctx->nrounds); - } else { - chacha20_crypt_s390(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, - &state[4], &state[12]); - } - rc = skcipher_walk_done(&walk, walk.nbytes - nbytes); - } - return rc; -} - void hchacha_block_arch(const u32 *state, u32 *stream, int nrounds) { /* TODO: implement hchacha_block_arch() in assembly */ hchacha_block_generic(state, stream, nrounds); } @@ -74,57 +28,28 @@ void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, { /* s390 chacha20 implementation has 20 rounds hard-coded, * it cannot handle a block of data or less, but otherwise * it can handle data of arbitrary size */ - if (bytes <= CHACHA_BLOCK_SIZE || nrounds != 20 || !cpu_has_vx()) + if (bytes <= CHACHA_BLOCK_SIZE || nrounds != 20 || !cpu_has_vx()) { chacha_crypt_generic(state, dst, src, bytes, nrounds); - else - chacha20_crypt_s390(state, dst, src, bytes, - &state[4], &state[12]); -} -EXPORT_SYMBOL(chacha_crypt_arch); + } else { + DECLARE_KERNEL_FPU_ONSTACK32(vxstate); -static struct skcipher_alg chacha_algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-s390", - .base.cra_priority = 900, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, + kernel_fpu_begin(&vxstate, KERNEL_VXR); + chacha20_vx(dst, src, bytes, &state[4], &state[12]); + kernel_fpu_end(&vxstate, KERNEL_VXR); - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = chacha20_s390, - .decrypt = chacha20_s390, + state[12] += round_up(bytes, CHACHA_BLOCK_SIZE) / + CHACHA_BLOCK_SIZE; } -}; +} +EXPORT_SYMBOL(chacha_crypt_arch); bool chacha_is_arch_optimized(void) { return cpu_has_vx(); } EXPORT_SYMBOL(chacha_is_arch_optimized); -static int __init chacha_mod_init(void) -{ - return IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) ? 
- crypto_register_skciphers(chacha_algs, ARRAY_SIZE(chacha_algs)) : 0; -} - -static void __exit chacha_mod_fini(void) -{ - if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER)) - crypto_unregister_skciphers(chacha_algs, ARRAY_SIZE(chacha_algs)); -} - -arch_initcall(chacha_mod_init); -module_exit(chacha_mod_fini); - -MODULE_DESCRIPTION("ChaCha20 stream cipher"); +MODULE_DESCRIPTION("ChaCha stream cipher (s390 optimized)"); MODULE_LICENSE("GPL v2"); - -MODULE_ALIAS_CRYPTO("chacha20"); From patchwork Sat Apr 5 18:26:08 2025 From: Eric Biggers To: linux-crypto@vger.kernel.org Cc: linux-kernel@vger.kernel.org, linux-arm-kernel@lists.infradead.org, linux-mips@vger.kernel.org, linuxppc-dev@lists.ozlabs.org, linux-riscv@lists.infradead.org, linux-s390@vger.kernel.org, x86@kernel.org, Ard Biesheuvel , "Jason A . 
Donenfeld " , Linus Torvalds Subject: [PATCH 8/9] crypto: x86/chacha - remove the skcipher algorithms Date: Sat, 5 Apr 2025 11:26:08 -0700 Message-ID: <20250405182609.404216-9-ebiggers@kernel.org> X-Mailer: git-send-email 2.49.0 In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org> References: <20250405182609.404216-1-ebiggers@kernel.org> MIME-Version: 1.0 X-BeenThere: linux-arm-kernel@lists.infradead.org X-Mailman-Version: 2.1.34 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Sender: "linux-arm-kernel" Errors-To: linux-arm-kernel-bounces+linux-arm-kernel=archiver.kernel.org@lists.infradead.org From: Eric Biggers Since crypto/chacha.c now registers chacha20-$(ARCH), xchacha20-$(ARCH), and xchacha12-$(ARCH) skcipher algorithms that use the architecture's ChaCha and HChaCha library functions, individual architectures no longer need to do the same. Therefore, remove the redundant skcipher algorithms and leave just the library functions. Signed-off-by: Eric Biggers --- arch/x86/crypto/Kconfig | 9 --- arch/x86/crypto/chacha_glue.c | 142 +--------------------------------- 2 files changed, 4 insertions(+), 147 deletions(-) diff --git a/arch/x86/crypto/Kconfig b/arch/x86/crypto/Kconfig index 3d948f10c94cd..8cf0a3f55576e 100644 --- a/arch/x86/crypto/Kconfig +++ b/arch/x86/crypto/Kconfig @@ -350,22 +350,13 @@ config CRYPTO_ARIA_GFNI_AVX512_X86_64 Processes 64 blocks in parallel. config CRYPTO_CHACHA20_X86_64 tristate depends on X86 && 64BIT - select CRYPTO_SKCIPHER select CRYPTO_LIB_CHACHA_GENERIC select CRYPTO_ARCH_HAVE_LIB_CHACHA default CRYPTO_LIB_CHACHA_INTERNAL - help - Length-preserving ciphers: ChaCha20, XChaCha20, and XChaCha12 - stream cipher algorithms - - Architecture: x86_64 using: - - SSSE3 (Supplemental SSE3) - - AVX2 (Advanced Vector Extensions 2) - - AVX-512VL (Advanced Vector Extensions-512VL) config CRYPTO_AEGIS128_AESNI_SSE2 tristate "AEAD ciphers: AEGIS-128 (AES-NI/SSE4.1)" depends on X86 && 64BIT select CRYPTO_AEAD diff --git a/arch/x86/crypto/chacha_glue.c b/arch/x86/crypto/chacha_glue.c index 83a07b31cdd3a..71ec959caadaa 100644 --- a/arch/x86/crypto/chacha_glue.c +++ b/arch/x86/crypto/chacha_glue.c @@ -1,17 +1,14 @@ // SPDX-License-Identifier: GPL-2.0-or-later /* - * x64 SIMD accelerated ChaCha and XChaCha stream ciphers, - * including ChaCha20 (RFC7539) + * ChaCha and HChaCha functions (x86_64 optimized) * * Copyright (C) 2015 Martin Willi */ -#include -#include +#include #include -#include #include #include #include #include @@ -152,126 +149,10 @@ void chacha_crypt_arch(u32 *state, u8 *dst, const u8 *src, unsigned int bytes, dst += todo; } while (bytes); } EXPORT_SYMBOL(chacha_crypt_arch); -static int chacha_simd_stream_xor(struct skcipher_request *req, - const struct chacha_ctx *ctx, const u8 *iv) -{ - u32 state[CHACHA_STATE_WORDS] __aligned(8); - struct skcipher_walk walk; - int err; - - err = skcipher_walk_virt(&walk, req, false); - - chacha_init(state, ctx->key, iv); - - while (walk.nbytes > 0) { - unsigned int nbytes = walk.nbytes; - - if (nbytes < walk.total) - nbytes = round_down(nbytes, walk.stride); - - if (!static_branch_likely(&chacha_use_simd) || - !crypto_simd_usable()) { - chacha_crypt_generic(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, - ctx->nrounds); - } else { - kernel_fpu_begin(); - chacha_dosimd(state, walk.dst.virt.addr, - walk.src.virt.addr, nbytes, - ctx->nrounds); - kernel_fpu_end(); - } - err = skcipher_walk_done(&walk, walk.nbytes - nbytes); - } - - return err; -} - -static int 
chacha_simd(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - - return chacha_simd_stream_xor(req, ctx, req->iv); -} - -static int xchacha_simd(struct skcipher_request *req) -{ - struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req); - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - u32 state[CHACHA_STATE_WORDS] __aligned(8); - struct chacha_ctx subctx; - u8 real_iv[16]; - - chacha_init(state, ctx->key, req->iv); - - if (req->cryptlen > CHACHA_BLOCK_SIZE && crypto_simd_usable()) { - kernel_fpu_begin(); - hchacha_block_ssse3(state, subctx.key, ctx->nrounds); - kernel_fpu_end(); - } else { - hchacha_block_generic(state, subctx.key, ctx->nrounds); - } - subctx.nrounds = ctx->nrounds; - - memcpy(&real_iv[0], req->iv + 24, 8); - memcpy(&real_iv[8], req->iv + 16, 8); - return chacha_simd_stream_xor(req, &subctx, real_iv); -} - -static struct skcipher_alg algs[] = { - { - .base.cra_name = "chacha20", - .base.cra_driver_name = "chacha20-simd", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = CHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = chacha_simd, - .decrypt = chacha_simd, - }, { - .base.cra_name = "xchacha20", - .base.cra_driver_name = "xchacha20-simd", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha20_setkey, - .encrypt = xchacha_simd, - .decrypt = xchacha_simd, - }, { - .base.cra_name = "xchacha12", - .base.cra_driver_name = "xchacha12-simd", - .base.cra_priority = 300, - .base.cra_blocksize = 1, - .base.cra_ctxsize = sizeof(struct chacha_ctx), - .base.cra_module = THIS_MODULE, - - .min_keysize = CHACHA_KEY_SIZE, - .max_keysize = CHACHA_KEY_SIZE, - .ivsize = XCHACHA_IV_SIZE, - .chunksize = CHACHA_BLOCK_SIZE, - .setkey = chacha12_setkey, - .encrypt = xchacha_simd, - .decrypt = xchacha_simd, - }, -}; - bool chacha_is_arch_optimized(void) { return static_key_enabled(&chacha_use_simd); } EXPORT_SYMBOL(chacha_is_arch_optimized); @@ -291,27 +172,12 @@ static int __init chacha_simd_mod_init(void) if (IS_ENABLED(CONFIG_AS_AVX512) && boot_cpu_has(X86_FEATURE_AVX512VL) && boot_cpu_has(X86_FEATURE_AVX512BW)) /* kmovq */ static_branch_enable(&chacha_use_avx512vl); } - return IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) ? 
- crypto_register_skciphers(algs, ARRAY_SIZE(algs)) : 0; + return 0; } - -static void __exit chacha_simd_mod_fini(void) -{ - if (IS_REACHABLE(CONFIG_CRYPTO_SKCIPHER) && boot_cpu_has(X86_FEATURE_SSSE3)) - crypto_unregister_skciphers(algs, ARRAY_SIZE(algs)); -} - arch_initcall(chacha_simd_mod_init); -module_exit(chacha_simd_mod_fini); MODULE_LICENSE("GPL"); MODULE_AUTHOR("Martin Willi "); -MODULE_DESCRIPTION("ChaCha and XChaCha stream ciphers (x64 SIMD accelerated)"); -MODULE_ALIAS_CRYPTO("chacha20"); -MODULE_ALIAS_CRYPTO("chacha20-simd"); -MODULE_ALIAS_CRYPTO("xchacha20"); -MODULE_ALIAS_CRYPTO("xchacha20-simd"); -MODULE_ALIAS_CRYPTO("xchacha12"); -MODULE_ALIAS_CRYPTO("xchacha12-simd"); +MODULE_DESCRIPTION("ChaCha and HChaCha functions (x86_64 optimized)"); From patchwork Sat Apr 5 18:26:09 2025 From: Eric Biggers To: linux-crypto@vger.kernel.org Cc: linux-kernel@vger.kernel.org, linux-arm-kernel@lists.infradead.org, 
linux-mips@vger.kernel.org, linuxppc-dev@lists.ozlabs.org, linux-riscv@lists.infradead.org, linux-s390@vger.kernel.org, x86@kernel.org, Ard Biesheuvel , "Jason A . Donenfeld " , Linus Torvalds Subject: [PATCH 9/9] crypto: chacha - remove Date: Sat, 5 Apr 2025 11:26:09 -0700 Message-ID: <20250405182609.404216-10-ebiggers@kernel.org> X-Mailer: git-send-email 2.49.0 In-Reply-To: <20250405182609.404216-1-ebiggers@kernel.org> References: <20250405182609.404216-1-ebiggers@kernel.org> MIME-Version: 1.0 X-BeenThere: linux-arm-kernel@lists.infradead.org X-Mailman-Version: 2.1.34 Precedence: list List-Id: List-Unsubscribe: , List-Archive: List-Post: List-Help: List-Subscribe: , Sender: "linux-arm-kernel" Errors-To: linux-arm-kernel-bounces+linux-arm-kernel=archiver.kernel.org@lists.infradead.org From: Eric Biggers is now included only by crypto/chacha.c, so fold it into there. Signed-off-by: Eric Biggers --- crypto/chacha.c | 35 +++++++++++++++++++++++++- include/crypto/internal/chacha.h | 43 -------------------------------- 2 files changed, 34 insertions(+), 44 deletions(-) delete mode 100644 include/crypto/internal/chacha.h diff --git a/crypto/chacha.c b/crypto/chacha.c index 2009038c5e56c..5103bc0b2881f 100644 --- a/crypto/chacha.c +++ b/crypto/chacha.c @@ -6,14 +6,47 @@ * Copyright (C) 2018 Google LLC */ #include #include -#include +#include #include #include +struct chacha_ctx { + u32 key[8]; + int nrounds; +}; + +static int chacha_setkey(struct crypto_skcipher *tfm, + const u8 *key, unsigned int keysize, int nrounds) +{ + struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); + int i; + + if (keysize != CHACHA_KEY_SIZE) + return -EINVAL; + + for (i = 0; i < ARRAY_SIZE(ctx->key); i++) + ctx->key[i] = get_unaligned_le32(key + i * sizeof(u32)); + + ctx->nrounds = nrounds; + return 0; +} + +static int chacha20_setkey(struct crypto_skcipher *tfm, + const u8 *key, unsigned int keysize) +{ + return chacha_setkey(tfm, key, keysize, 20); +} + +static int chacha12_setkey(struct crypto_skcipher *tfm, + const u8 *key, unsigned int keysize) +{ + return chacha_setkey(tfm, key, keysize, 12); +} + static int chacha_stream_xor(struct skcipher_request *req, const struct chacha_ctx *ctx, const u8 *iv, bool arch) { struct skcipher_walk walk; diff --git a/include/crypto/internal/chacha.h b/include/crypto/internal/chacha.h deleted file mode 100644 index b085dc1ac1516..0000000000000 --- a/include/crypto/internal/chacha.h +++ /dev/null @@ -1,43 +0,0 @@ -/* SPDX-License-Identifier: GPL-2.0 */ - -#ifndef _CRYPTO_INTERNAL_CHACHA_H -#define _CRYPTO_INTERNAL_CHACHA_H - -#include -#include -#include - -struct chacha_ctx { - u32 key[8]; - int nrounds; -}; - -static inline int chacha_setkey(struct crypto_skcipher *tfm, const u8 *key, - unsigned int keysize, int nrounds) -{ - struct chacha_ctx *ctx = crypto_skcipher_ctx(tfm); - int i; - - if (keysize != CHACHA_KEY_SIZE) - return -EINVAL; - - for (i = 0; i < ARRAY_SIZE(ctx->key); i++) - ctx->key[i] = get_unaligned_le32(key + i * sizeof(u32)); - - ctx->nrounds = nrounds; - return 0; -} - -static inline int chacha20_setkey(struct crypto_skcipher *tfm, const u8 *key, - unsigned int keysize) -{ - return chacha_setkey(tfm, key, keysize, 20); -} - -static inline int chacha12_setkey(struct crypto_skcipher *tfm, const u8 *key, - unsigned int keysize) -{ - return chacha_setkey(tfm, key, keysize, 12); -} - -#endif /* _CRYPTO_CHACHA_H */
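
Taken together, these patches leave each architecture exporting only the ChaCha library entry points, while the chacha20, xchacha20, and xchacha12 skcipher algorithms are registered once in crypto/chacha.c on top of them. The sketch below is not part of the series; it illustrates how an in-kernel caller is assumed to use the remaining library interface from the crypto chacha header. The function chacha_library_demo() and its buffers are hypothetical, and the sketch relies only on calls that appear in the diffs above (chacha_init(), chacha_crypt(), get_unaligned_le32()) plus common kernel helpers.

/*
 * Minimal sketch of the ChaCha library API that remains after this series.
 * Hypothetical example code, not part of any patch.
 */
#include <crypto/chacha.h>
#include <linux/kernel.h>
#include <linux/string.h>
#include <linux/unaligned.h>	/* assumed location of get_unaligned_le32() */

static void chacha_library_demo(const u8 raw_key[CHACHA_KEY_SIZE],
				const u8 iv[CHACHA_IV_SIZE],
				u8 *dst, const u8 *src, unsigned int bytes)
{
	u32 key[CHACHA_KEY_SIZE / sizeof(u32)];
	u32 state[CHACHA_STATE_WORDS];
	int i;

	/* The library takes the key as 32-bit words, as in chacha_setkey() above. */
	for (i = 0; i < ARRAY_SIZE(key); i++)
		key[i] = get_unaligned_le32(raw_key + i * sizeof(u32));

	/* Build the 16-word state matrix from the key and the IV (counter || nonce). */
	chacha_init(state, key, iv);

	/*
	 * Encrypt or decrypt (the operation is symmetric).  chacha_crypt()
	 * dispatches to chacha_crypt_arch() when architecture-optimized code
	 * is available and to chacha_crypt_generic() otherwise; the last
	 * argument selects ChaCha20 (20 rounds) or ChaCha12 (12 rounds).
	 */
	chacha_crypt(state, dst, src, bytes, 20);

	memzero_explicit(key, sizeof(key));
	memzero_explicit(state, sizeof(state));
}

Because the library functions take the state matrix and a round count directly, the per-architecture glue files no longer need their own skcipher registrations, which is what each removal above depends on.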