Series: accel/tcg: Convert victim tlb to IntervalTree

- [for-10.0,v2,00/54] accel/tcg: Convert victim tlb to IntervalTree
- [v2,01/54] util/interval-tree: Introduce interval_tree_free_nodes
- [v2,02/54] accel/tcg: Split out tlbfast_flush_locked
- [v2,03/54] accel/tcg: Split out tlbfast_{index,entry}
- [v2,04/54] accel/tcg: Split out tlbfast_flush_range_locked
- [v2,05/54] accel/tcg: Fix flags usage in mmu_lookup1, atomic_mmu_lookup
- [v2,06/54] accel/tcg: Assert non-zero length in tlb_flush_range_by_mmuidx*
- [v2,07/54] accel/tcg: Assert bits in range in tlb_flush_range_by_mmuidx*
- [v2,08/54] accel/tcg: Flush entire tlb when a masked range wraps
- [v2,09/54] accel/tcg: Add IntervalTreeRoot to CPUTLBDesc
- [v2,10/54] accel/tcg: Populate IntervalTree in tlb_set_page_full
- [v2,11/54] accel/tcg: Remove IntervalTree entry in tlb_flush_page_locked
- [v2,12/54] accel/tcg: Remove IntervalTree entries in tlb_flush_range_locked
- [v2,13/54] accel/tcg: Process IntervalTree entries in tlb_reset_dirty
- [v2,14/54] accel/tcg: Process IntervalTree entries in tlb_set_dirty
- [v2,15/54] accel/tcg: Use tlb_hit_page in victim_tlb_hit
- [v2,16/54] accel/tcg: Pass full addr to victim_tlb_hit
- [v2,17/54] accel/tcg: Replace victim_tlb_hit with tlbtree_hit
- [v2,18/54] accel/tcg: Remove the victim tlb
- [v2,19/54] accel/tcg: Remove tlb_n_used_entries_inc
- [v2,20/54] include/exec/tlb-common: Move CPUTLBEntryFull from hw/core/cpu.h
- [v2,21/54] accel/tcg: Delay plugin adjustment in probe_access_internal
- [v2,22/54] accel/tcg: Call cpu_ld*_code_mmu from cpu_ld*_code
- [v2,23/54] accel/tcg: Check original prot bits for read in atomic_mmu_lookup
- [v2,24/54] accel/tcg: Preserve tlb flags in tlb_set_compare
- [v2,25/54] accel/tcg: Return CPUTLBEntryFull not pointer in probe_access_full_mmu
- [v2,26/54] accel/tcg: Return CPUTLBEntryFull not pointer in probe_access_full
- [v2,27/54] accel/tcg: Return CPUTLBEntryFull not pointer in probe_access_internal
- [v2,28/54] accel/tcg: Introduce tlb_lookup
- [v2,29/54] accel/tcg: Partially unify MMULookupPageData and TLBLookupOutput
- [v2,30/54] accel/tcg: Merge mmu_lookup1 into mmu_lookup
- [v2,31/54] accel/tcg: Always use IntervalTree for code lookups
- [v2,32/54] accel/tcg: Link CPUTLBEntry to CPUTLBEntryTree
- [v2,33/54] accel/tcg: Remove CPUTLBDesc.fulltlb
- [v2,34/54] target/alpha: Convert to TCGCPUOps.tlb_fill_align
- [v2,35/54] target/avr: Convert to TCGCPUOps.tlb_fill_align
- [v2,36/54] target/i386: Convert to TCGCPUOps.tlb_fill_align
- [v2,37/54] target/loongarch: Convert to TCGCPUOps.tlb_fill_align
- [v2,38/54] target/m68k: Convert to TCGCPUOps.tlb_fill_align
- [v2,39/54] target/m68k: Do not call tlb_set_page in helper_ptest
- [v2,40/54] target/microblaze: Convert to TCGCPUOps.tlb_fill_align
- [v2,41/54] target/mips: Convert to TCGCPUOps.tlb_fill_align
- [v2,42/54] target/openrisc: Convert to TCGCPUOps.tlb_fill_align
- [v2,43/54] target/ppc: Convert to TCGCPUOps.tlb_fill_align
- [v2,44/54] target/riscv: Convert to TCGCPUOps.tlb_fill_align
- [v2,45/54] target/rx: Convert to TCGCPUOps.tlb_fill_align
- [v2,46/54] target/s390x: Convert to TCGCPUOps.tlb_fill_align
- [v2,47/54] target/sh4: Convert to TCGCPUOps.tlb_fill_align
- [v2,48/54] target/sparc: Convert to TCGCPUOps.tlb_fill_align
- [v2,49/54] target/tricore: Convert to TCGCPUOps.tlb_fill_align
- [v2,50/54] target/xtensa: Convert to TCGCPUOps.tlb_fill_align
- [v2,51/54] accel/tcg: Drop TCGCPUOps.tlb_fill
- [v2,52/54] accel/tcg: Unexport tlb_set_page*
- [v2,53/54] accel/tcg: Merge tlb_fill_align into callers
- [v2,54/54] accel/tcg: Return CPUTLBEntryTree from tlb_set_page_full