From patchwork Wed Oct 9 18:50:33 2013
X-Patchwork-Submitter: Ard Biesheuvel
X-Patchwork-Id: 3011261
From: Ard Biesheuvel
To: linux-arm-kernel@lists.infradead.org
Cc: linux@arm.linux.org.uk, dave.martin@arm.com, nico@linaro.org
Subject: [RFC v2 PATCH 3/4] ARM64: add Crypto Extensions based synchronous core AES cipher
Date: Wed, 9 Oct 2013 20:50:33 +0200
Message-Id: <1381344634-14917-4-git-send-email-ard.biesheuvel@linaro.org>
In-Reply-To: <1381344634-14917-1-git-send-email-ard.biesheuvel@linaro.org>
References: <1381344634-14917-1-git-send-email-ard.biesheuvel@linaro.org>
List-Id: linux-arm-kernel.lists.infradead.org

This implements the core
AES cipher using the Crypto Extensions, using only NEON registers q0 and q1.

Signed-off-by: Ard Biesheuvel <ard.biesheuvel@linaro.org>
---
 arch/arm64/crypto/Makefile   |  12 +++++
 arch/arm64/crypto/aes-sync.c | 106 +++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 118 insertions(+)
 create mode 100644 arch/arm64/crypto/Makefile
 create mode 100644 arch/arm64/crypto/aes-sync.c

diff --git a/arch/arm64/crypto/Makefile b/arch/arm64/crypto/Makefile
new file mode 100644
index 0000000..7c636e9
--- /dev/null
+++ b/arch/arm64/crypto/Makefile
@@ -0,0 +1,12 @@
+#
+# linux/arch/arm64/crypto/Makefile
+#
+# Copyright (C) 2013 Linaro Ltd
+#
+# This program is free software; you can redistribute it and/or modify
+# it under the terms of the GNU General Public License version 2 as
+# published by the Free Software Foundation.
+#
+
+aesce-sync-y := aes-sync.o
+obj-m += aesce-sync.o
diff --git a/arch/arm64/crypto/aes-sync.c b/arch/arm64/crypto/aes-sync.c
new file mode 100644
index 0000000..d047d49
--- /dev/null
+++ b/arch/arm64/crypto/aes-sync.c
@@ -0,0 +1,106 @@
+/*
+ * linux/arch/arm64/crypto/aes-sync.c
+ *
+ * Copyright (C) 2013 Linaro Ltd
+ *
+ * This program is free software; you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License version 2 as
+ * published by the Free Software Foundation.
+ */
+
+#include <asm/neon.h>
+#include <crypto/aes.h>
+#include <linux/crypto.h>
+#include <linux/module.h>
+
+static void aes_cipher_encrypt(struct crypto_tfm *tfm, u8 dst[], u8 const src[])
+{
+	struct crypto_aes_ctx *ctx = crypto_tfm_ctx(tfm);
+	int rounds = 6 + ctx->key_length / 4;
+	u32 const *key = ctx->key_enc;
+	DEFINE_NEON_STACK_REGS(regs, 2);
+
+	kernel_neon_begin_atomic(regs);
+
+	__asm__("	.arch	armv8-a+crypto			\n\t"
+		"	ld1	{v0.16b}, [%[in]]		\n\t"
+		"	ld1	{v1.16b}, [%[key]], #16		\n\t"
+		"0:	aese	v0.16b, v1.16b			\n\t"
+		"	subs	%[rounds], %[rounds], #1	\n\t"
+		"	ld1	{v1.16b}, [%[key]], #16		\n\t"
+		"	beq	1f				\n\t"
+		"	aesmc	v0.16b, v0.16b			\n\t"
+		"	b	0b				\n\t"
+		"1:	eor	v0.16b, v0.16b, v1.16b		\n\t"
+		"	st1	{v0.16b}, [%[out]]		\n\t"
+	: [rounds] "+r"(rounds), [key] "+r"(key)
+	: [out] "r"(dst),
+	  [in] "r"(src)
+	: "v0", "v1", "cc", "memory");
+
+	kernel_neon_end_atomic(regs);
+}
+
+static void aes_cipher_decrypt(struct crypto_tfm *tfm, u8 dst[], u8 const src[])
+{
+	struct crypto_aes_ctx *ctx = crypto_tfm_ctx(tfm);
+	int rounds = 6 + ctx->key_length / 4;
+	u32 const *key = ctx->key_dec;
+	DEFINE_NEON_STACK_REGS(regs, 2);
+
+	kernel_neon_begin_atomic(regs);
+
+	__asm__("	.arch	armv8-a+crypto			\n\t"
+		"	ld1	{v0.16b}, [%[in]]		\n\t"
+		"	ld1	{v1.16b}, [%[key]], #16		\n\t"
+		"0:	aesd	v0.16b, v1.16b			\n\t"
+		"	ld1	{v1.16b}, [%[key]], #16		\n\t"
+		"	subs	%[rounds], %[rounds], #1	\n\t"
+		"	beq	1f				\n\t"
+		"	aesimc	v0.16b, v0.16b			\n\t"
+		"	b	0b				\n\t"
+		"1:	eor	v0.16b, v0.16b, v1.16b		\n\t"
+		"	st1	{v0.16b}, [%[out]]		\n\t"
+	: [rounds] "+r"(rounds), [key] "+r"(key)
+	: [out] "r"(dst),
+	  [in] "r"(src)
+	: "v0", "v1", "cc", "memory");
+
+	kernel_neon_end_atomic(regs);
+}
+
+static struct crypto_alg aes_alg = {
+	.cra_name		= "aes",
+	.cra_driver_name	= "aes-ce",
+	.cra_priority		= 300,
+	.cra_flags		= CRYPTO_ALG_TYPE_CIPHER,
+	.cra_blocksize		= AES_BLOCK_SIZE,
+	.cra_ctxsize		= sizeof(struct crypto_aes_ctx),
+	.cra_module		= THIS_MODULE,
+	.cra_cipher = {
+		.cia_min_keysize	= AES_MIN_KEY_SIZE,
+		.cia_max_keysize	= AES_MAX_KEY_SIZE,
+		.cia_setkey		= crypto_aes_set_key,
+		.cia_encrypt		= aes_cipher_encrypt,
+		.cia_decrypt		= aes_cipher_decrypt
+	}
+};
+
+static int __init aes_mod_init(void)
+{
+	if (0) /* TODO check for crypto extensions */
+		return -ENODEV;
+	return crypto_register_alg(&aes_alg);
+}
+
+static void __exit aes_mod_exit(void)
+{
+	crypto_unregister_alg(&aes_alg);
+}
+
+module_init(aes_mod_init);
+module_exit(aes_mod_exit);
+
+MODULE_DESCRIPTION("Synchronous AES using ARMv8 Crypto Extensions");
+MODULE_AUTHOR("Ard Biesheuvel <ard.biesheuvel@linaro.org>");
+MODULE_LICENSE("GPL");