On Fri, Apr 1, 2022 at 11:16 PM Jason A. Donenfeld <Jason@xxxxxxxxx> wrote:

> Prior, the "input_pool_data" array needed no real initialization, and so
> it was easy to mark it with __latent_entropy to populate it during
> compile-time.

As I see it, that was the correct approach.

> In switching to using a hash function, this required us to
> specifically initialize it to some specific state,

Hash functions do not require that. Any such function must work correctly
with a new input block and a more-or-less random state left over from
hashing previous blocks.

In general, except perhaps at boot time, I do not think any of the
hopefully-random data structures -- the input pool, the hash context or
the ChaCha context -- should ever be set to any specific state. Update
them only with += or ^=, and preferably not with constants.

What requires a fixed initialisation is your decision to eliminate the
input pool & just collect entropy in a hash context. In effect you are
reducing the driver to a Yarrow-like design, which I think is an error.
Yarrow is a good design, but it has limitations; in particular, the
Yarrow paper says the cryptographic strength is limited to the size of
the hash context: 160 bits for their SHA-1 & 512 for our BLAKE2s.

512 bits is more than enough for nearly all use cases, but we may have
some where it is not. How many random bits are needed to generate a
4k-bit PGP key? Will some users try to generate one-time pads from
/dev/random? The OTP security proof requires truly random data as long
as the message; with anything short of that, the proof fails & you get a
stream cipher.

Patches will follow, but likely not soon; I'm busy with other things.