Re: [PATCH] crypto: chacha20 - Fix chacha20_block() keystream alignment (again)

Hi,

On Tuesday, September 11, 2018 at 20:05 -0700, Eric Biggers wrote:
> From: Eric Biggers <ebiggers@xxxxxxxxxx>
> 
> In commit 9f480faec58c ("crypto: chacha20 - Fix keystream alignment for
> chacha20_block()"), I had missed that chacha20_block() can be called
> directly on the buffer passed to get_random_bytes(), which can have any
> alignment.  So, while my commit didn't break anything, it didn't fully
> solve the alignment problems.
> 
> Revert my solution and just update chacha20_block() to use
> put_unaligned_le32(), so the output buffer need not be aligned.
> This is simpler, and on many CPUs it's the same speed.
> 
> But, I kept the 'tmp' buffers in extract_crng_user() and
> _get_random_bytes() 4-byte aligned, since that alignment is actually
> needed for _crng_backtrack_protect() too.
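
For reference, the put_unaligned_le32() approach means the keystream
serialization at the end of chacha20_block() ends up looking roughly
like the following -- a sketch only, since the lib/chacha20.c hunk
isn't quoted here, and 'x' stands for the 16 working-state words after
the 20 rounds:

	/* Add the input state back in and store each word with an
	 * unaligned little-endian store, so 'stream' (u8 *) may have
	 * any alignment.  Needs <asm/unaligned.h>.
	 */
	for (i = 0; i < 16; i++)
		put_unaligned_le32(x[i] + state[i],
				   &stream[i * sizeof(u32)]);
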
> 
> Reported-by: Stephan Müller <smueller@xxxxxxxxxx>
> Cc: Theodore Ts'o <tytso@xxxxxxx>
> Signed-off-by: Eric Biggers <ebiggers@xxxxxxxxxx>
> ---
>  crypto/chacha20_generic.c |  7 ++++---
>  drivers/char/random.c     | 24 ++++++++++++------------
>  include/crypto/chacha20.h |  3 +--
>  lib/chacha20.c            |  6 +++---
>  4 files changed, 20 insertions(+), 20 deletions(-)
> 
> diff --git a/crypto/chacha20_generic.c b/crypto/chacha20_generic.c
> index e451c3cb6a56..3ae96587caf9 100644
> --- a/crypto/chacha20_generic.c
> +++ b/crypto/chacha20_generic.c
> @@ -18,20 +18,21 @@
>  static void chacha20_docrypt(u32 *state, u8 *dst, const u8 *src,
>  			     unsigned int bytes)
>  {
> -	u32 stream[CHACHA20_BLOCK_WORDS];
> +	/* aligned to potentially speed up crypto_xor() */
> +	u8 stream[CHACHA20_BLOCK_SIZE] __aligned(sizeof(long));
>  
>  	if (dst != src)
>  		memcpy(dst, src, bytes);
>  
>  	while (bytes >= CHACHA20_BLOCK_SIZE) {
>  		chacha20_block(state, stream);
> -		crypto_xor(dst, (const u8 *)stream, CHACHA20_BLOCK_SIZE);
> +		crypto_xor(dst, stream, CHACHA20_BLOCK_SIZE);
>  		bytes -= CHACHA20_BLOCK_SIZE;
>  		dst += CHACHA20_BLOCK_SIZE;
>  	}
>  	if (bytes) {
>  		chacha20_block(state, stream);
> -		crypto_xor(dst, (const u8 *)stream, bytes);
> +		crypto_xor(dst, stream, bytes);
>  	}
>  }
>  
> diff --git a/drivers/char/random.c b/drivers/char/random.c
> index bf5f99fc36f1..d22d967c50f0 100644
> --- a/drivers/char/random.c
> +++ b/drivers/char/random.c
> @@ -1003,7 +1003,7 @@ static void extract_crng(__u32 out[CHACHA20_BLOCK_WORDS])
>   * enough) to mutate the CRNG key to provide backtracking protection.
>   */
>  static void _crng_backtrack_protect(struct crng_state *crng,
> -				    __u32 tmp[CHACHA20_BLOCK_WORDS], int used)
> +				    __u8 tmp[CHACHA20_BLOCK_SIZE], int used)
>  {
>  	unsigned long	flags;
>  	__u32		*s, *d;
> @@ -1015,14 +1015,14 @@ static void _crng_backtrack_protect(struct crng_state *crng,
>  		used = 0;
>  	}
>  	spin_lock_irqsave(&crng->lock, flags);
> -	s = &tmp[used / sizeof(__u32)];
> +	s = (__u32 *) &tmp[used];

This introduces an alignment issue: tmp is not guaranteed to be aligned
for __u32, but it is dereferenced as such below.
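
One way to keep the byte-typed tmp[] while avoiding the unaligned
__u32 dereference would be to read the keystream words with
get_unaligned() instead of casting the pointer and dereferencing it
directly -- an untested sketch, assuming <asm/unaligned.h> is included
and the rest of the function stays as in the patch:

	d = &crng->state[4];
	for (i = 0; i < 8; i++)
		d[i] ^= get_unaligned((__u32 *)&tmp[used] + i);

(A memcpy() into a local __u32 would work just as well; the point is
only that the compiler must not be allowed to assume 4-byte alignment
of tmp.)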

>  	d = &crng->state[4];
>  	for (i=0; i < 8; i++)
>  		*d++ ^= *s++;
>  	spin_unlock_irqrestore(&crng->lock, flags);
>  }
>  

Regards.

-- 
Yann Droneaud
OPTEYA
