Re: [PATCH 4/5] tsc: wire up entropy generation function

On Tue, 2011-06-14 at 16:04 -0400, Kent Borg wrote:
> Matt Mackall wrote:
>  > [network adapters are] a great source of potential entropy, a bad
>  > source of guaranteed entropy. The current RNG tries to do
>  > accounting on the latter. Accounting on the former is extremely
>  > suspect.
> 
> So we need a patch that:
> 
>  - Deletes the IRQF_SAMPLE_RANDOM mention in feature-removal-schedule.txt,
> 
>  - Restores instances of IRQF_SAMPLE_RANDOM in drivers, and
> 
>  - Changes the credit_entropy_bits() to credit less entropy*.
> 
>    * The code seems to only handle integer values of entropy.  Maybe 
> when crediting, choose between 1 and 0 credits. 

No, that's not what I'm saying here. I'm saying we need to rethink
entropy accounting entirely.

An alternate plan is:

- add an add_network_randomness hook in the network stack that feeds
samples with 0 entropy credits (strengthening /dev/urandom)

- add similar sources elsewhere in the kernel

- remove the entropy accounting framework so that /dev/random is the
same as /dev/urandom


I currently don't think either of these is quite right for many
interesting systems; the answer is somewhere in the middle.


Let me describe one extremely common pathological case: Wi-Fi routers.
Many of these have effectively no writable storage, so they don't
preserve pool data across boots, and they lack a battery-backed RTC, so
they come up in a 'known' pool state. Many of them also don't have a
high-resolution time source or cycle counter available. So guessing the
timestamp on a network packet is just about trivial. State extension
attacks here are probably quite easy (though I don't know that anyone's
tried one).

So I think what we want to do instead is simply count samples until
we've accumulated enough unleaked internal state that state extension
becomes hopeless. If we assume an attacker can guess each sample with
99% accuracy (about .08 bits of entropy per sample), pool state
extension becomes intractable (> 2^64 guesses) at a batch size on the
order of 1000 samples. If we have inputs from a variety of sources
(network, keyboard, mouse, etc.), this number is probably quite a bit
smaller.

This is a lot like the existing catastrophic reseed logic, except that
the effective multiplier is much larger and probably implies another
pool in the chain.

-- 
Mathematics is the supreme nostalgia of our time.

