From patchwork Mon Nov 11 21:45:49 2019
X-Patchwork-Submitter: Kees Cook
X-Patchwork-Id: 11237903
X-Patchwork-Delegate: herbert@gondor.apana.org.au
From: Kees Cook
To: Herbert Xu
Cc: Kees Cook, João Moreira, Eric Biggers, Sami Tolvanen,
    "David S. Miller", Ard Biesheuvel, Stephan Mueller, x86@kernel.org,
    linux-crypto@vger.kernel.org, linux-kernel@vger.kernel.org,
    kernel-hardening@lists.openwall.com
Subject: [PATCH v4 5/8] crypto: x86/cast6: Use new glue function macros
Date: Mon, 11 Nov 2019 13:45:49 -0800
Message-Id: <20191111214552.36717-6-keescook@chromium.org>
In-Reply-To: <20191111214552.36717-1-keescook@chromium.org>
References: <20191111214552.36717-1-keescook@chromium.org>

Convert from function prototype casts to function declaration macros, to
avoid triggering Control-Flow Integrity checks during indirect function
calls.

Co-developed-by: João Moreira
Signed-off-by: Kees Cook
---
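Note (illustration only, not part of the change): Clang's CFI
(-fsanitize=cfi-icall, which requires LTO) checks every indirect call
against the static type of the function pointer, so calling a routine
through a pointer whose prototype was forced to fit with a cast is
flagged even when the ABI happens to line up. A minimal user-space
sketch of that failure mode follows; all names in it are invented for
the example and do not come from the kernel sources:

/*
 * Hypothetical stand-alone example of why prototype casts trip CFI.
 * Build with: clang -flto -fvisibility=hidden -fsanitize=cfi-icall
 */
#include <stdio.h>

struct cast6_like_ctx { int dummy; };

/* The prototype the dispatch table (function pointer) expects. */
typedef void (*ecb_func_t)(void *ctx, unsigned char *dst,
			   const unsigned char *src);

/* Old style: the callee's real prototype names the context type. */
static void enc_old(struct cast6_like_ctx *ctx, unsigned char *dst,
		    const unsigned char *src)
{
	(void)ctx;
	dst[0] = src[0] ^ 0xff;		/* stand-in for real cipher work */
}

/* New style: the callee is declared with the table's exact prototype. */
static void enc_new(void *ctx, unsigned char *dst, const unsigned char *src)
{
	(void)ctx;
	dst[0] = src[0] ^ 0xff;
}

int main(void)
{
	struct cast6_like_ctx c = { 0 };
	unsigned char in = 0x2a, out = 0;

	/*
	 * Old pattern: force the mismatched function into the pointer
	 * type with a cast.  Under -fsanitize=cfi-icall the indirect
	 * call below is diagnosed, because the callee's actual type
	 * does not match the call's function-pointer type.
	 */
	ecb_func_t fn = (ecb_func_t)enc_old;
	fn(&c, &out, &in);

	/* New pattern: prototypes match, no cast, CFI is satisfied. */
	fn = enc_new;
	fn(&c, &out, &in);

	printf("out = %#x\n", out);
	return 0;
}
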
 arch/x86/crypto/cast6_avx_glue.c | 62 ++++++++++++++------------------
 crypto/cast6_generic.c           |  6 ++--
 include/crypto/cast6.h           |  4 +--
 3 files changed, 32 insertions(+), 40 deletions(-)

diff --git a/arch/x86/crypto/cast6_avx_glue.c b/arch/x86/crypto/cast6_avx_glue.c
index a8a38fffb4a9..841724f335a5 100644
--- a/arch/x86/crypto/cast6_avx_glue.c
+++ b/arch/x86/crypto/cast6_avx_glue.c
@@ -20,20 +20,15 @@
 
 #define CAST6_PARALLEL_BLOCKS 8
 
-asmlinkage void cast6_ecb_enc_8way(struct cast6_ctx *ctx, u8 *dst,
-				   const u8 *src);
-asmlinkage void cast6_ecb_dec_8way(struct cast6_ctx *ctx, u8 *dst,
-				   const u8 *src);
-
-asmlinkage void cast6_cbc_dec_8way(struct cast6_ctx *ctx, u8 *dst,
-				   const u8 *src);
-asmlinkage void cast6_ctr_8way(struct cast6_ctx *ctx, u8 *dst, const u8 *src,
-			       le128 *iv);
-
-asmlinkage void cast6_xts_enc_8way(struct cast6_ctx *ctx, u8 *dst,
-				   const u8 *src, le128 *iv);
-asmlinkage void cast6_xts_dec_8way(struct cast6_ctx *ctx, u8 *dst,
-				   const u8 *src, le128 *iv);
+CRYPTO_FUNC(__cast6_encrypt);
+CRYPTO_FUNC(__cast6_decrypt);
+CRYPTO_FUNC(cast6_ecb_enc_8way);
+CRYPTO_FUNC(cast6_ecb_dec_8way);
+CRYPTO_FUNC_CBC(cast6_cbc_dec_8way);
+CRYPTO_FUNC_WRAP_CBC(__cast6_decrypt);
+CRYPTO_FUNC_CTR(cast6_ctr_8way);
+CRYPTO_FUNC_XTS(cast6_xts_enc_8way);
+CRYPTO_FUNC_XTS(cast6_xts_dec_8way);
 
 static int cast6_setkey_skcipher(struct crypto_skcipher *tfm, const u8 *key,
 				 unsigned int keylen)
@@ -43,14 +38,12 @@ static int cast6_setkey_skcipher(struct crypto_skcipher *tfm,
 
 static void cast6_xts_enc(void *ctx, u128 *dst, const u128 *src, le128 *iv)
 {
-	glue_xts_crypt_128bit_one(ctx, dst, src, iv,
-				  GLUE_FUNC_CAST(__cast6_encrypt));
+	glue_xts_crypt_128bit_one(ctx, dst, src, iv, __cast6_encrypt);
 }
 
 static void cast6_xts_dec(void *ctx, u128 *dst, const u128 *src, le128 *iv)
 {
-	glue_xts_crypt_128bit_one(ctx, dst, src, iv,
-				  GLUE_FUNC_CAST(__cast6_decrypt));
+	glue_xts_crypt_128bit_one(ctx, dst, src, iv, __cast6_decrypt);
 }
 
 static void cast6_crypt_ctr(void *ctx, u128 *dst, const u128 *src, le128 *iv)
@@ -70,10 +63,10 @@ static const struct common_glue_ctx cast6_enc = {
 
 	.funcs = { {
 		.num_blocks = CAST6_PARALLEL_BLOCKS,
-		.fn_u = { .ecb = GLUE_FUNC_CAST(cast6_ecb_enc_8way) }
+		.fn_u = { .ecb = cast6_ecb_enc_8way }
 	}, {
 		.num_blocks = 1,
-		.fn_u = { .ecb = GLUE_FUNC_CAST(__cast6_encrypt) }
+		.fn_u = { .ecb = __cast6_encrypt }
 	} }
 };
 
@@ -83,10 +76,10 @@ static const struct common_glue_ctx cast6_ctr = {
 
 	.funcs = { {
 		.num_blocks = CAST6_PARALLEL_BLOCKS,
-		.fn_u = { .ctr = GLUE_CTR_FUNC_CAST(cast6_ctr_8way) }
+		.fn_u = { .ctr = cast6_ctr_8way }
 	}, {
 		.num_blocks = 1,
-		.fn_u = { .ctr = GLUE_CTR_FUNC_CAST(cast6_crypt_ctr) }
+		.fn_u = { .ctr = cast6_crypt_ctr }
 	} }
 };
 
@@ -96,10 +89,10 @@ static const struct common_glue_ctx cast6_enc_xts = {
 
 	.funcs = { {
 		.num_blocks = CAST6_PARALLEL_BLOCKS,
-		.fn_u = { .xts = GLUE_XTS_FUNC_CAST(cast6_xts_enc_8way) }
+		.fn_u = { .xts = cast6_xts_enc_8way }
 	}, {
 		.num_blocks = 1,
-		.fn_u = { .xts = GLUE_XTS_FUNC_CAST(cast6_xts_enc) }
+		.fn_u = { .xts = cast6_xts_enc }
 	} }
 };
 
@@ -109,10 +102,10 @@ static const struct common_glue_ctx cast6_dec = {
 
 	.funcs = { {
 		.num_blocks = CAST6_PARALLEL_BLOCKS,
-		.fn_u = { .ecb = GLUE_FUNC_CAST(cast6_ecb_dec_8way) }
+		.fn_u = { .ecb = cast6_ecb_dec_8way }
 	}, {
 		.num_blocks = 1,
-		.fn_u = { .ecb = GLUE_FUNC_CAST(__cast6_decrypt) }
+		.fn_u = { .ecb = __cast6_decrypt }
 	} }
 };
 
@@ -122,10 +115,10 @@ static const struct common_glue_ctx cast6_dec_cbc = {
 
 	.funcs = { {
 		.num_blocks = CAST6_PARALLEL_BLOCKS,
-		.fn_u = { .cbc = GLUE_CBC_FUNC_CAST(cast6_cbc_dec_8way) }
+		.fn_u = { .cbc = cast6_cbc_dec_8way }
 	}, {
 		.num_blocks = 1,
-		.fn_u = { .cbc = GLUE_CBC_FUNC_CAST(__cast6_decrypt) }
+		.fn_u = { .cbc = __cast6_decrypt_cbc }
 	} }
 };
 
@@ -135,10 +128,10 @@ static const struct common_glue_ctx cast6_dec_xts = {
 
 	.funcs = { {
 		.num_blocks = CAST6_PARALLEL_BLOCKS,
-		.fn_u = { .xts = GLUE_XTS_FUNC_CAST(cast6_xts_dec_8way) }
+		.fn_u = { .xts = cast6_xts_dec_8way }
 	}, {
 		.num_blocks = 1,
-		.fn_u = { .xts = GLUE_XTS_FUNC_CAST(cast6_xts_dec) }
+		.fn_u = { .xts = cast6_xts_dec }
 	} }
 };
 
@@ -154,8 +147,7 @@ static int ecb_decrypt(struct skcipher_request *req)
 
 static int cbc_encrypt(struct skcipher_request *req)
 {
-	return glue_cbc_encrypt_req_128bit(GLUE_FUNC_CAST(__cast6_encrypt),
-					   req);
+	return glue_cbc_encrypt_req_128bit(__cast6_encrypt, req);
 }
 
 static int cbc_decrypt(struct skcipher_request *req)
@@ -199,8 +191,7 @@ static int xts_encrypt(struct skcipher_request *req)
 	struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req);
 	struct cast6_xts_ctx *ctx = crypto_skcipher_ctx(tfm);
 
-	return glue_xts_req_128bit(&cast6_enc_xts, req,
-				   XTS_TWEAK_CAST(__cast6_encrypt),
+	return glue_xts_req_128bit(&cast6_enc_xts, req, __cast6_encrypt,
 				   &ctx->tweak_ctx, &ctx->crypt_ctx, false);
 }
 
@@ -209,8 +200,7 @@ static int xts_decrypt(struct skcipher_request *req)
 	struct crypto_skcipher *tfm = crypto_skcipher_reqtfm(req);
 	struct cast6_xts_ctx *ctx = crypto_skcipher_ctx(tfm);
 
-	return glue_xts_req_128bit(&cast6_dec_xts, req,
-				   XTS_TWEAK_CAST(__cast6_encrypt),
+	return glue_xts_req_128bit(&cast6_dec_xts, req, __cast6_encrypt,
 				   &ctx->tweak_ctx, &ctx->crypt_ctx, true);
 }
 
diff --git a/crypto/cast6_generic.c b/crypto/cast6_generic.c
index a8248f8e2777..c51121bedf68 100644
--- a/crypto/cast6_generic.c
+++ b/crypto/cast6_generic.c
@@ -173,8 +173,9 @@ static inline void QBAR(u32 *block, u8 *Kr, u32 *Km)
 	block[2] ^= F1(block[3], Kr[0], Km[0]);
 }
 
-void __cast6_encrypt(struct cast6_ctx *c, u8 *outbuf, const u8 *inbuf)
+void __cast6_encrypt(void *ctx, u8 *outbuf, const u8 *inbuf)
 {
+	struct cast6_ctx *c = ctx;
 	const __be32 *src = (const __be32 *)inbuf;
 	__be32 *dst = (__be32 *)outbuf;
 	u32 block[4];
@@ -211,8 +212,9 @@ static void cast6_encrypt(struct crypto_tfm *tfm, u8 *outbuf, const u8 *inbuf)
 	__cast6_encrypt(crypto_tfm_ctx(tfm), outbuf, inbuf);
 }
 
-void __cast6_decrypt(struct cast6_ctx *c, u8 *outbuf, const u8 *inbuf)
+void __cast6_decrypt(void *ctx, u8 *outbuf, const u8 *inbuf)
 {
+	struct cast6_ctx *c = ctx;
 	const __be32 *src = (const __be32 *)inbuf;
 	__be32 *dst = (__be32 *)outbuf;
 	u32 block[4];
diff --git a/include/crypto/cast6.h b/include/crypto/cast6.h
index c71f6ef47f0f..b6c3a0324959 100644
--- a/include/crypto/cast6.h
+++ b/include/crypto/cast6.h
@@ -19,7 +19,7 @@ int __cast6_setkey(struct cast6_ctx *ctx, const u8 *key, unsigned int keylen,
 		   u32 *flags);
 int cast6_setkey(struct crypto_tfm *tfm, const u8 *key, unsigned int keylen);
 
-void __cast6_encrypt(struct cast6_ctx *ctx, u8 *dst, const u8 *src);
-void __cast6_decrypt(struct cast6_ctx *ctx, u8 *dst, const u8 *src);
+void __cast6_encrypt(void *ctx, u8 *dst, const u8 *src);
+void __cast6_decrypt(void *ctx, u8 *dst, const u8 *src);
 
 #endif
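
For reference, the CRYPTO_FUNC*() declarations used in the
cast6_avx_glue.c hunk above are introduced earlier in this series and
their definitions are not part of this patch. The following is only a
rough sketch of what such declaration macros could look like, written
against the existing common_glue_*_func_t callback prototypes (which
already take void *ctx); the CRYPTO_FUNC_WRAP_CBC() expansion in
particular is a guess inferred from the __cast6_decrypt_cbc reference
in the cast6_dec_cbc table, and details such as asmlinkage annotations
are omitted:

/*
 * Rough sketch only -- the real macros live in the glue_helper headers
 * added earlier in this series and may differ in detail.  Placeholder
 * typedefs stand in for the kernel's u8/u128/le128 types so the sketch
 * is self-contained.
 */
typedef unsigned char u8;
typedef struct { unsigned long long a, b; } u128;	/* placeholder */
typedef struct { unsigned long long a, b; } le128;	/* placeholder */

/* Declare func with the common_glue_func_t prototype (ECB-style). */
#define CRYPTO_FUNC(func) \
	void func(void *ctx, u8 *dst, const u8 *src)

/* Declare func with the common_glue_cbc_func_t prototype. */
#define CRYPTO_FUNC_CBC(func) \
	void func(void *ctx, u128 *dst, const u128 *src)

/* Declare func with the common_glue_ctr_func_t/xts_func_t prototypes. */
#define CRYPTO_FUNC_CTR(func) \
	void func(void *ctx, u128 *dst, const u128 *src, le128 *iv)
#define CRYPTO_FUNC_XTS(func)	CRYPTO_FUNC_CTR(func)

/*
 * Define a func##_cbc wrapper so an ECB-style routine can be placed in
 * a .cbc table slot without a prototype cast; this is what would let
 * the cast6_dec_cbc table reference __cast6_decrypt_cbc directly.
 */
#define CRYPTO_FUNC_WRAP_CBC(func) \
	static void func##_cbc(void *ctx, u128 *dst, const u128 *src) \
	{ func(ctx, (u8 *)dst, (const u8 *)src); }

The common thread is that every routine referenced from a
common_glue_ctx table carries exactly the prototype the table's union
expects, so the GLUE_*_CAST()/XTS_TWEAK_CAST() macros, and the
CFI-visible type mismatch they papered over, can be removed.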