In order to reduce the cost of TLB invalidation, the ARMv8.4 TTL feature
allows TLBI instructions to carry the translation-table level of the entry
being invalidated, so the hardware can perform the invalidation more
efficiently. This series provides support for this feature.

Patches 1 and 2 were provided by Marc in his NV series [1]; they detect the
TTL feature and add the __tlbi_level interface.

Patches 4-7 pass struct mmu_gather to flush_tlb_range, so that the level of
the TLBI invalidation can be derived from it. Both arm64 and power9 can
benefit from this.

Patch 8 sets the TTL field on arm64 using the cleared_* values in struct
mmu_gather (a rough sketch of the idea follows the diffstat below).

See the patches for details. Thanks.

[1] https://lore.kernel.org/linux-arm-kernel/20200211174938.27809-1-maz@xxxxxxxxxx/
[2] https://lore.kernel.org/linux-arm-kernel/7859561b-78b4-4a12-2642-3741d7d3e7b8@xxxxxxxxxx/

--

ChangeList:
v1: add support for the TTL feature on arm64.
v2: build the patches on Marc's NV series [1].
v3: use vma->vm_flags instead of mm->context.flags.
v4: add Marc's patches into this series.
v5: pass struct mmu_gather to flush_tlb_range, then set the TTL field using
    the information in struct mmu_gather.

Marc Zyngier (2):
  arm64: Detect the ARMv8.4 TTL feature
  arm64: Add level-hinted TLB invalidation helper

Zhenyu Ye (6):
  arm64: Add tlbi_user_level TLB invalidation helper
  mm: tlb: Pass struct mmu_gather to flush_pmd_tlb_range
  mm: tlb: Pass struct mmu_gather to flush_pud_tlb_range
  mm: tlb: Pass struct mmu_gather to flush_hugetlb_tlb_range
  mm: tlb: Pass struct mmu_gather to flush_tlb_range
  arm64: tlb: Set the TTL field in flush_tlb_range

 Documentation/core-api/cachetlb.rst | 8 ++-
 arch/alpha/include/asm/tlbflush.h | 8 +--
 arch/alpha/kernel/smp.c | 3 +-
 arch/arc/include/asm/hugepage.h | 4 +-
 arch/arc/include/asm/tlbflush.h | 11 ++--
 arch/arc/mm/tlb.c | 8 +--
 arch/arm/include/asm/tlbflush.h | 12 ++--
 arch/arm/kernel/smp_tlb.c | 4 +-
 arch/arm/mach-rpc/ecard.c | 8 ++-
 arch/arm64/crypto/aes-glue.c | 1 -
 arch/arm64/include/asm/cpucaps.h | 3 +-
 arch/arm64/include/asm/sysreg.h | 1 +
 arch/arm64/include/asm/tlb.h | 39 +++++++++++-
 arch/arm64/include/asm/tlbflush.h | 63 +++++++++++++------
 arch/arm64/kernel/cpufeature.c | 11 ++++
 arch/arm64/mm/hugetlbpage.c | 10 ++-
 arch/csky/include/asm/tlb.h | 2 +-
 arch/csky/include/asm/tlbflush.h | 6 +-
 arch/csky/mm/tlb.c | 4 +-
 arch/hexagon/include/asm/tlbflush.h | 2 +-
 arch/hexagon/mm/vm_tlb.c | 4 +-
 arch/ia64/include/asm/tlbflush.h | 6 +-
 arch/ia64/mm/tlb.c | 5 +-
 arch/m68k/include/asm/tlbflush.h | 10 +--
 arch/microblaze/include/asm/tlbflush.h | 5 +-
 arch/mips/include/asm/hugetlb.h | 6 +-
 arch/mips/include/asm/tlbflush.h | 9 +--
 arch/mips/kernel/smp.c | 3 +-
 arch/nds32/include/asm/tlbflush.h | 3 +-
 arch/nios2/include/asm/tlbflush.h | 9 +--
 arch/nios2/mm/tlb.c | 8 ++-
 arch/openrisc/include/asm/tlbflush.h | 10 +--
 arch/openrisc/kernel/smp.c | 2 +-
 arch/parisc/include/asm/tlbflush.h | 2 +-
 arch/parisc/kernel/cache.c | 13 +++-
 arch/powerpc/include/asm/book3s/32/tlbflush.h | 4 +-
 arch/powerpc/include/asm/book3s/64/tlbflush.h | 9 ++-
 arch/powerpc/include/asm/nohash/tlbflush.h | 7 ++-
 arch/powerpc/mm/book3s32/tlb.c | 6 +-
 arch/powerpc/mm/book3s64/pgtable.c | 8 ++-
 arch/powerpc/mm/book3s64/radix_tlb.c | 2 +-
 arch/powerpc/mm/nohash/tlb.c | 6 +-
 arch/riscv/include/asm/tlbflush.h | 7 ++-
 arch/riscv/mm/tlbflush.c | 4 +-
 arch/s390/include/asm/tlbflush.h | 5 +-
 arch/sh/include/asm/tlbflush.h | 8 +--
 arch/sh/kernel/smp.c | 2 +-
 arch/sparc/include/asm/tlbflush_32.h | 2 +-
 arch/sparc/include/asm/tlbflush_64.h | 3 +-
 arch/sparc/mm/tlb.c | 5 +-
 arch/um/include/asm/tlbflush.h | 6 +-
 arch/um/kernel/tlb.c | 4 +-
 arch/unicore32/include/asm/tlbflush.h | 5 +-
 arch/x86/include/asm/tlbflush.h | 4 +-
 arch/x86/mm/pgtable.c | 10 ++-
 arch/xtensa/include/asm/tlbflush.h | 10 +--
 arch/xtensa/kernel/smp.c | 2 +-
 include/asm-generic/pgtable.h | 10 +--
 include/asm-generic/tlb.h | 2 +-
 mm/huge_memory.c | 19 +++++-
 mm/hugetlb.c | 17 +++--
 mm/mapping_dirty_helpers.c | 23 ++++---
 mm/migrate.c | 8 ++-
 mm/mprotect.c | 8 ++-
 mm/mremap.c | 17 ++++-
 mm/pgtable-generic.c | 51 ++++++++++++---
 mm/rmap.c | 6 +-
 67 files changed, 409 insertions(+), 174 deletions(-)

-- 
2.19.1
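
For anyone skimming the series, the sketch below shows the core idea of
patch 8 in a self-contained form: derive a level hint from the cleared_*
and freed_tables bookkeeping that struct mmu_gather already keeps, and only
apply the hint when exactly one level of leaf entry was touched. This is an
illustration, not the patch code: struct mmu_gather_sketch,
tlb_level_sketch(), tlbi_operand_sketch() and TTL_SHIFT_SKETCH are made-up
names, and the real TTL bit layout and the __tlbi_level helper are defined
in patches 2 and 8.

/*
 * Illustrative sketch only: a simplified, user-space model of the idea in
 * patch 8.  The real struct mmu_gather, the __tlbi_level helper and the
 * architectural TTL encoding live in the kernel patches and differ in
 * detail; names ending in _sketch are invented for this example.
 */
#include <stdint.h>
#include <stdio.h>

/* Stand-in for the cleared_* and freed_tables tracking in struct mmu_gather. */
struct mmu_gather_sketch {
	unsigned int cleared_ptes : 1;	/* level-3 (PTE) leaf entries cleared */
	unsigned int cleared_pmds : 1;	/* level-2 (PMD) leaf entries cleared */
	unsigned int cleared_puds : 1;	/* level-1 (PUD) leaf entries cleared */
	unsigned int freed_tables : 1;	/* table pages freed: no safe hint    */
};

/*
 * Pick a translation-table level for the TTL hint.  A non-zero level is
 * only safe when exactly one kind of leaf entry was cleared and no table
 * pages were freed; otherwise return 0, meaning "no hint".
 */
static int tlb_level_sketch(const struct mmu_gather_sketch *tlb)
{
	if (tlb->freed_tables)
		return 0;
	if (tlb->cleared_ptes && !tlb->cleared_pmds && !tlb->cleared_puds)
		return 3;
	if (tlb->cleared_pmds && !tlb->cleared_ptes && !tlb->cleared_puds)
		return 2;
	if (tlb->cleared_puds && !tlb->cleared_ptes && !tlb->cleared_pmds)
		return 1;
	return 0;
}

/*
 * Fold the level hint into the TLBI operand.  The bit position below is a
 * placeholder; the real TTL field layout is defined by the ARMv8.4
 * architecture and is handled by the __tlbi_level helper in the patches.
 */
#define TTL_SHIFT_SKETCH	44

static uint64_t tlbi_operand_sketch(uint64_t addr_op, int level)
{
	if (level)
		addr_op |= (uint64_t)level << TTL_SHIFT_SKETCH;
	return addr_op;
}

int main(void)
{
	/* A range flush that only cleared PTEs can hint "last level" (3). */
	struct mmu_gather_sketch tlb = { .cleared_ptes = 1 };
	int level = tlb_level_sketch(&tlb);

	printf("level hint: %d, operand: 0x%llx\n", level,
	       (unsigned long long)tlbi_operand_sketch(0x1234, level));
	return 0;
}

The reason patches 4-7 plumb struct mmu_gather through flush_tlb_range is
exactly so that this per-flush bookkeeping is available at the point where
the arm64 TLBI instructions are issued.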