On Wed, Nov 13, 2019 at 10:25:14AM -0800, Kees Cook wrote:
> In order to remove the callsite function casts, regularize the function
> prototypes for helpers to avoid triggering Control-Flow Integrity checks
> during indirect function calls. Where needed, to avoid changes to
> pointer math, u8 pointers are internally cast back to u128 pointers.
>
> Signed-off-by: Kees Cook <keescook@xxxxxxxxxxxx>
> ---
>  arch/x86/crypto/aesni-intel_glue.c | 45 +++++++++++++-----------------
>  1 file changed, 19 insertions(+), 26 deletions(-)
>
> diff --git a/arch/x86/crypto/aesni-intel_glue.c b/arch/x86/crypto/aesni-intel_glue.c
> index 3e707e81afdb..f47afa5ae8ca 100644
> --- a/arch/x86/crypto/aesni-intel_glue.c
> +++ b/arch/x86/crypto/aesni-intel_glue.c
> @@ -83,10 +83,8 @@ struct gcm_context_data {
>
>  asmlinkage int aesni_set_key(struct crypto_aes_ctx *ctx, const u8 *in_key,
>  			     unsigned int key_len);
> -asmlinkage void aesni_enc(struct crypto_aes_ctx *ctx, u8 *out,
> -			  const u8 *in);
> -asmlinkage void aesni_dec(struct crypto_aes_ctx *ctx, u8 *out,
> -			  const u8 *in);
> +asmlinkage void aesni_enc(void *ctx, u8 *out, const u8 *in);
> +asmlinkage void aesni_dec(void *ctx, u8 *out, const u8 *in);
>  asmlinkage void aesni_ecb_enc(struct crypto_aes_ctx *ctx, u8 *out,
>  			      const u8 *in, unsigned int len);
>  asmlinkage void aesni_ecb_dec(struct crypto_aes_ctx *ctx, u8 *out,
> @@ -107,7 +105,7 @@ asmlinkage void aesni_ctr_enc(struct crypto_aes_ctx *ctx, u8 *out,
>  			      const u8 *in, unsigned int len, u8 *iv);
>
>  asmlinkage void aesni_xts_crypt8(struct crypto_aes_ctx *ctx, u8 *out,
> -				 const u8 *in, bool enc, u8 *iv);
> +				 const u8 *in, bool enc, le128 *iv);

These functions in aesni-intel_asm.S have comments that show the function
prototypes. Can you please keep them in sync?

> -static void aesni_xts_tweak(void *ctx, u8 *out, const u8 *in)
> +static void aesni_xts_enc(void *ctx, u8 *dst, const u8 *src, le128 *iv)
>  {
> -	aesni_enc(ctx, out, in);
> +	glue_xts_crypt_128bit_one(ctx, (u128 *)dst, (const u128 *)src, iv,
> +				  aesni_enc);
>  }

For the src and dst, how about making glue_xts_crypt_128bit_one() take u8
instead of u128? That would avoid having to add these u8 => u128 casts to
all 10 callers of glue_xts_crypt_128bit_one().

- Eric
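
[Editor's note: a minimal sketch of the change Eric suggests, assuming the
helper keeps its current XTS per-block logic and the existing gf128mul_x_ble()
/ u128_xor() helpers; the body shown here is illustrative, not the actual
patch, and the caller is the aesni_xts_enc() from the hunk quoted above.]

/*
 * Hypothetical sketch: glue_xts_crypt_128bit_one() taking u8 pointers for
 * dst/src, so callers can pass their buffers straight through.  The
 * u8 <-> u128 conversion then happens once, inside the helper, instead of
 * as casts at every call site.
 */
static inline void glue_xts_crypt_128bit_one(void *ctx, u8 *dst, const u8 *src,
					     le128 *iv, common_glue_func_t fn)
{
	le128 ivblk = *iv;

	/* generate next IV */
	gf128mul_x_ble(iv, &ivblk);

	/* CC <- T xor C */
	u128_xor((u128 *)dst, (const u128 *)src, (const u128 *)&ivblk);

	/* PP <- D(Key2,CC) */
	fn(ctx, dst, dst);

	/* P <- T xor PP */
	u128_xor((u128 *)dst, (const u128 *)dst, (const u128 *)&ivblk);
}

/* A caller such as the one in the patch then needs no casts at all: */
static void aesni_xts_enc(void *ctx, u8 *dst, const u8 *src, le128 *iv)
{
	glue_xts_crypt_128bit_one(ctx, dst, src, iv, aesni_enc);
}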