From patchwork Fri Oct 30 06:21:43 2015
X-Patchwork-Submitter: Shannon Zhao
X-Patchwork-Id: 7524291
From: Shannon Zhao
Subject: [PATCH v4 01/21] ARM64: Move PMU register related defines to asm/pmu.h
Date: Fri, 30 Oct 2015 14:21:43 +0800
Message-ID: <1446186123-11548-2-git-send-email-zhaoshenglong@huawei.com>
In-Reply-To: <1446186123-11548-1-git-send-email-zhaoshenglong@huawei.com>
References: <1446186123-11548-1-git-send-email-zhaoshenglong@huawei.com>
X-Mailer: git-send-email 1.9.0.msysgit.0
X-Mailing-List: kvm@vger.kernel.org

From: Shannon Zhao

To use the ARMv8 PMU related register defines from the KVM code, move the
relevant definitions to the asm/pmu.h header file.
Signed-off-by: Anup Patel
Signed-off-by: Shannon Zhao
---
 arch/arm64/include/asm/pmu.h   | 45 ++++++++++++++++++++++++++++++++++++++++++
 arch/arm64/kernel/perf_event.c | 35 --------------------------------
 2 files changed, 45 insertions(+), 35 deletions(-)

diff --git a/arch/arm64/include/asm/pmu.h b/arch/arm64/include/asm/pmu.h
index b7710a5..b9f394a 100644
--- a/arch/arm64/include/asm/pmu.h
+++ b/arch/arm64/include/asm/pmu.h
@@ -19,6 +19,51 @@
 #ifndef __ASM_PMU_H
 #define __ASM_PMU_H
 
+#define ARMV8_MAX_COUNTERS      32
+#define ARMV8_COUNTER_MASK      (ARMV8_MAX_COUNTERS - 1)
+
+/*
+ * Per-CPU PMCR: config reg
+ */
+#define ARMV8_PMCR_E            (1 << 0) /* Enable all counters */
+#define ARMV8_PMCR_P            (1 << 1) /* Reset all counters */
+#define ARMV8_PMCR_C            (1 << 2) /* Cycle counter reset */
+#define ARMV8_PMCR_D            (1 << 3) /* CCNT counts every 64th cpu cycle */
+#define ARMV8_PMCR_X            (1 << 4) /* Export to ETM */
+#define ARMV8_PMCR_DP           (1 << 5) /* Disable CCNT if non-invasive debug*/
+#define ARMV8_PMCR_N_SHIFT      11       /* Number of counters supported */
+#define ARMV8_PMCR_N_MASK       0x1f
+#define ARMV8_PMCR_MASK         0x3f     /* Mask for writable bits */
+
+/*
+ * PMCNTEN: counters enable reg
+ */
+#define ARMV8_CNTEN_MASK        0xffffffff /* Mask for writable bits */
+
+/*
+ * PMINTEN: counters interrupt enable reg
+ */
+#define ARMV8_INTEN_MASK        0xffffffff /* Mask for writable bits */
+
+/*
+ * PMOVSR: counters overflow flag status reg
+ */
+#define ARMV8_OVSR_MASK         0xffffffff /* Mask for writable bits */
+#define ARMV8_OVERFLOWED_MASK   ARMV8_OVSR_MASK
+
+/*
+ * PMXEVTYPER: Event selection reg
+ */
+#define ARMV8_EVTYPE_MASK       0xc80003ff /* Mask for writable bits */
+#define ARMV8_EVTYPE_EVENT      0x3ff      /* Mask for EVENT bits */
+
+/*
+ * Event filters for PMUv3
+ */
+#define ARMV8_EXCLUDE_EL1       (1 << 31)
+#define ARMV8_EXCLUDE_EL0       (1 << 30)
+#define ARMV8_INCLUDE_EL2       (1 << 27)
+
 #ifdef CONFIG_HW_PERF_EVENTS
 
 /* The events for a given PMU register set. */
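[Editor's note, not part of the patch: a minimal sketch of how the PMCR
defines above are typically used. The read/write pair below is a simplified
version in the style of the accessors in arch/arm64/kernel/perf_event.c;
armv8pmu_num_counters() is a hypothetical helper shown only for illustration.]

static inline u32 armv8pmu_pmcr_read(void)
{
	u32 val;

	/* PMCR_EL0 is the AArch64 PMU control register */
	asm volatile("mrs %0, pmcr_el0" : "=r" (val));
	return val;
}

static inline void armv8pmu_pmcr_write(u32 val)
{
	/* Only touch the architecturally writable bits */
	asm volatile("msr pmcr_el0, %0" :: "r" (val & ARMV8_PMCR_MASK));
}

/* Hypothetical helper: number of event counters this PMU implements */
static inline u32 armv8pmu_num_counters(void)
{
	return (armv8pmu_pmcr_read() >> ARMV8_PMCR_N_SHIFT) & ARMV8_PMCR_N_MASK;
}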

diff --git a/arch/arm64/kernel/perf_event.c b/arch/arm64/kernel/perf_event.c
index f9a74d4..534e8ad 100644
--- a/arch/arm64/kernel/perf_event.c
+++ b/arch/arm64/kernel/perf_event.c
@@ -741,9 +741,6 @@ static const unsigned armv8_pmuv3_perf_cache_map[PERF_COUNT_HW_CACHE_MAX]
 #define ARMV8_IDX_COUNTER0      1
 #define ARMV8_IDX_COUNTER_LAST  (ARMV8_IDX_CYCLE_COUNTER + cpu_pmu->num_events - 1)
 
-#define ARMV8_MAX_COUNTERS      32
-#define ARMV8_COUNTER_MASK      (ARMV8_MAX_COUNTERS - 1)
-
 /*
  * ARMv8 low level PMU access
  */
@@ -754,38 +751,6 @@ static const unsigned armv8_pmuv3_perf_cache_map[PERF_COUNT_HW_CACHE_MAX]
 #define ARMV8_IDX_TO_COUNTER(x) \
 	(((x) - ARMV8_IDX_COUNTER0) & ARMV8_COUNTER_MASK)
 
-/*
- * Per-CPU PMCR: config reg
- */
-#define ARMV8_PMCR_E            (1 << 0) /* Enable all counters */
-#define ARMV8_PMCR_P            (1 << 1) /* Reset all counters */
-#define ARMV8_PMCR_C            (1 << 2) /* Cycle counter reset */
-#define ARMV8_PMCR_D            (1 << 3) /* CCNT counts every 64th cpu cycle */
-#define ARMV8_PMCR_X            (1 << 4) /* Export to ETM */
-#define ARMV8_PMCR_DP           (1 << 5) /* Disable CCNT if non-invasive debug*/
-#define ARMV8_PMCR_N_SHIFT      11       /* Number of counters supported */
-#define ARMV8_PMCR_N_MASK       0x1f
-#define ARMV8_PMCR_MASK         0x3f     /* Mask for writable bits */
-
-/*
- * PMOVSR: counters overflow flag status reg
- */
-#define ARMV8_OVSR_MASK         0xffffffff /* Mask for writable bits */
-#define ARMV8_OVERFLOWED_MASK   ARMV8_OVSR_MASK
-
-/*
- * PMXEVTYPER: Event selection reg
- */
-#define ARMV8_EVTYPE_MASK       0xc80003ff /* Mask for writable bits */
-#define ARMV8_EVTYPE_EVENT      0x3ff      /* Mask for EVENT bits */
-
-/*
- * Event filters for PMUv3
- */
-#define ARMV8_EXCLUDE_EL1       (1 << 31)
-#define ARMV8_EXCLUDE_EL0       (1 << 30)
-#define ARMV8_INCLUDE_EL2       (1 << 27)
-
 static inline u32 armv8pmu_pmcr_read(void)
 {
 	u32 val;
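[Editor's note, not part of the patch: a minimal sketch of how the
ARMV8_EVTYPE_* and ARMV8_EXCLUDE_EL0/EL1 filter defines might be combined
into a PMXEVTYPER value; armv8pmu_build_evtype() is a hypothetical name
used only for illustration.]

static inline u32 armv8pmu_build_evtype(u32 event, bool exclude_el0,
					bool exclude_el1)
{
	u32 val = event & ARMV8_EVTYPE_EVENT;	/* event number field */

	if (exclude_el0)
		val |= ARMV8_EXCLUDE_EL0;	/* do not count at EL0 */
	if (exclude_el1)
		val |= ARMV8_EXCLUDE_EL1;	/* do not count at EL1 */

	/* drop anything outside the architecturally writable bits */
	return val & ARMV8_EVTYPE_MASK;
}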