This is a follow-up to [0], but given that the arm64 architectural
pieces have been merged, the only remaining changes are crypto
specific. Therefore, the audience has been reduced to those who are
somewhat more likely to care about these specifics.

The AEAD and skcipher APIs may only be called from task or softirq
context. This permits the arm64 AEAD and skcipher code to get rid of
all scalar fallbacks: on this architecture, softirqs are no longer
served while the SIMD unit is being used in kernel mode, so the scalar
fallbacks are never needed. This series removes them.

Changes since v5:
- add Eric's R-b to patches #1 to #3
- split CCM changes into 3 separate patches

Changes since v4:
- drop skcipher_walk layer change to deal with zero sized walks
- drop aead/skcipher layer sanity checks on invocations from hardirq
  context
- add patch to clean up CCM a bit more after removing the SIMD code path

Changes since v3:
- clarify the nature of the issue addressed by patch #1, and apply the
  same fix to the skcipher walker
- update patches #2 and #3 so that the failures can be observed by the
  crypto stats code

[0] https://lore.kernel.org/linux-arm-kernel/20210302090118.30666-1-ardb@xxxxxxxxxx/

Ard Biesheuvel (6):
  crypto: arm64/gcm-aes-ce - remove non-SIMD fallback path
  crypto: arm64/aes-neonbs - stop using SIMD helper for skciphers
  crypto: arm64/aes-ce - stop using SIMD helper for skciphers
  crypto: arm64/aes-ccm - remove non-SIMD fallback path
  crypto: arm64/aes-ccm - reduce NEON begin/end calls for common case
  crypto: arm64/aes-ccm - avoid by-ref argument for ce_aes_ccm_auth_data

 arch/arm64/crypto/Kconfig           |   6 -
 arch/arm64/crypto/aes-ce-ccm-core.S |  24 +--
 arch/arm64/crypto/aes-ce-ccm-glue.c | 196 ++++++------------
 arch/arm64/crypto/aes-glue.c        | 102 ++--------
 arch/arm64/crypto/aes-neonbs-glue.c | 122 +----------
 arch/arm64/crypto/ghash-ce-glue.c   | 209 +++++---------------
 6 files changed, 141 insertions(+), 518 deletions(-)

--
2.20.1
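
P.S. As a rough illustration of what the fallback removal amounts to in
the glue code (a minimal sketch only, not the actual code in the tree;
example_encrypt(), example_encrypt_scalar() and example_encrypt_neon()
are made-up names standing in for the per-algorithm helpers, while
crypto_simd_usable() and kernel_neon_begin()/kernel_neon_end() are the
real kernel APIs involved):

	#include <asm/neon.h>
	#include <crypto/internal/simd.h>
	#include <linux/types.h>

	/* before: scalar fallback for when the NEON unit is unavailable */
	static void example_encrypt(u8 dst[], u8 const src[], int len,
				    u32 const rk[], int rounds)
	{
		if (!crypto_simd_usable()) {
			example_encrypt_scalar(dst, src, len, rk, rounds);
			return;
		}
		kernel_neon_begin();
		example_encrypt_neon(dst, src, len, rk, rounds);
		kernel_neon_end();
	}

	/*
	 * after: the AEAD/skcipher entry points only run in task or softirq
	 * context, and softirqs are no longer served while the NEON unit is
	 * in use, so the crypto_simd_usable() check is always true here and
	 * the scalar path can be dropped entirely
	 */
	static void example_encrypt(u8 dst[], u8 const src[], int len,
				    u32 const rk[], int rounds)
	{
		kernel_neon_begin();
		example_encrypt_neon(dst, src, len, rk, rounds);
		kernel_neon_end();
	}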