On Wed, 2006-05-31 at 10:20 -0400, Deskin Miller wrote:
> On 5/31/06, Marc Schwartz <MSchwartz@xxxxxxxxx> wrote:
> > Jan Reusch wrote:
> > >
> > > Marc Schwartz wrote:
> > >
> > >> (thing about filling the partition with random data before creating
> > >> the encrypted device)
> > >
> > > They say /dev/zero is much faster than /dev/urandom (which should be
> > > true ;) ). So they wipe the first few MB of the plain device with
> > > /dev/urandom (that's where the LUKS header will be written),
> > > luksFormat the device, set it up, and wipe the new encrypted device
> > > with zeros. So the zeros get encrypted and, voila, random data on
> > > the partition.
> > >
> > > My question is: somebody who knows this technique now knows the
> > > plain "data" (or at least huge parts of it, for a long time). Could
> > > they get information about the master key that they otherwise
> > > wouldn't have?
> >
> > This seems to be going after convenience at the expense of security.
>
> Right -- known-plaintext attacks, perhaps other weaknesses in the
> encryption with just zeros -- too messy to figure out.
>
> So how about this? We do as they suggest with /dev/zero, but we do it
> with a randomly generated key, which has nothing to do with the key
> used to actually encrypt data, and furthermore doesn't need to be
> remembered: we throw it away after writing the random (encrypted)
> data.
>
> When the disk is actually mounted, it uses the real key, generated
> from a passphrase or some other method, business as usual. The data
> written to the drive is still zeros, but encrypted with a key which
> we didn't keep and aren't using anyway, so under the real key the
> data is seemingly random.
>
> Known-plaintext attacks, or attacks against the algorithm, can't
> reveal the actual key, because it wasn't used to randomize the drive.
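[For reference: the throwaway-key wipe described above is typically done with a temporary plain dm-crypt mapping keyed from /dev/urandom (something along the lines of `cryptsetup -d /dev/urandom create wipe /dev/sdXn` followed by `dd if=/dev/zero of=/dev/mapper/wipe` -- check your cryptsetup version's syntax). As a toy model of why the disk ends up looking random even though the plaintext is all zeros, here is a Python sketch; the SHA-256 "keystream" is a stand-in for a real cipher, not what dm-crypt actually uses, and all names are illustrative:]

```python
import hashlib
import os

def keystream(key, nbytes):
    """Toy CTR-style keystream: SHA-256(key || counter).
    A stand-in for the block cipher dm-crypt would really use."""
    out = bytearray()
    counter = 0
    while len(out) < nbytes:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:nbytes])

def wipe_image(nbytes):
    """Simulate filling a device with zeros written *through* a cipher
    keyed with a throwaway random key."""
    throwaway_key = os.urandom(32)   # generated fresh, never stored
    zeros = bytes(nbytes)            # the plaintext everyone "knows"
    # XORing zeros with the keystream just yields the keystream: the
    # "disk" ends up holding cipher output, not zeros.
    on_disk = bytes(z ^ k for z, k in
                    zip(zeros, keystream(throwaway_key, nbytes)))
    del throwaway_key                # key discarded; output stays random-looking
    return on_disk

if __name__ == "__main__":
    blob = wipe_image(4096)
    print("first bytes on 'disk':", blob[:16].hex())
```

[Since the key is discarded, knowing the plaintext was all zeros tells an attacker nothing about the key later used for the real filesystem -- which is exactly the property being proposed.]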
The issue with this approach (if I correctly understand what you are
proposing) is that it presumes, in effect, that a double encryption
process using two different keys is better than the "simple", though
more time-consuming, process of using /dev/urandom. That presumption may
not be valid and would need to be exhaustively tested with various
algorithms.

For why such a process can be problematic, you might want to review this
entry on Triple DES, focusing on the comments on Double DES (2DES):

http://en.wikipedia.org/wiki/Triple_DES

It would take some searching for tests involving other algorithms to
assess possible weaknesses of double encryption, either with a single
key or with two different keys, though such evaluations likely exist.
You could consider two different algorithms, but this would be subject
to the same evaluation issues.

This is why "home grown" algorithms are classically weak: they need to
be tested against known attack types by experts. It is also why the
evaluation of algorithms can take years, as it did with AES as a
replacement for DES.

HTH,

Marc Schwartz

---------------------------------------------------------------------
- http://www.saout.de/misc/dm-crypt/
To unsubscribe, e-mail: dm-crypt-unsubscribe@xxxxxxxx
For additional commands, e-mail: dm-crypt-help@xxxxxxxx
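[The Double DES weakness referenced above is the meet-in-the-middle attack: given a known plaintext/ciphertext pair or two, an attacker can recover a working key pair in roughly 2 * 2^k operations instead of the 2^(2k) a doubled keyspace naively suggests. The 16-bit toy cipher below is invented purely for illustration -- it is not DES and not anything dm-crypt uses -- but the sketch shows the attack recovering a functionally equivalent key pair:]

```python
# Toy demonstration of the meet-in-the-middle attack that makes
# "double encryption" (as with Double DES) much weaker than its
# nominal keyspace suggests. Cipher and all names are invented.

MASK = 0xFFFF
MUL = 0xA5A5                      # odd, hence invertible mod 2^16
MUL_INV = pow(MUL, -1, 1 << 16)   # modular inverse of MUL

def rotl(x, n):
    return ((x << n) | (x >> (16 - n))) & MASK

def rotr(x, n):
    return ((x >> n) | (x << (16 - n))) & MASK

def enc(k, p):
    """Encrypt one 16-bit block under a 16-bit key (3 toy rounds)."""
    x = p
    for r in range(3):
        x = rotl(((x ^ k) * MUL + r) & MASK, 5)
    return x

def dec(k, c):
    """Invert enc(), undoing the rounds in reverse order."""
    x = c
    for r in (2, 1, 0):
        x = ((rotr(x, 5) - r) * MUL_INV) & MASK
        x ^= k
    return x

def double_enc(k1, k2, p):
    return enc(k2, enc(k1, p))

def meet_in_the_middle(p1, c1, p2, c2):
    """Recover a working (k1, k2) from two known pairs in about
    2 * 2^16 cipher operations instead of 2^32."""
    # Table of middle values after the first encryption -> candidate k1s
    forward = {}
    for k1 in range(1 << 16):
        forward.setdefault(enc(k1, p1), []).append(k1)
    # Meet from the ciphertext side with every possible k2
    for k2 in range(1 << 16):
        mid = dec(k2, c1)
        for k1 in forward.get(mid, ()):
            if double_enc(k1, k2, p2) == c2:   # confirm on 2nd pair
                return k1, k2
    return None

if __name__ == "__main__":
    import random
    k1, k2 = random.randrange(1 << 16), random.randrange(1 << 16)
    p1, p2 = 0x1234, 0xBEEF
    c1, c2 = double_enc(k1, k2, p1), double_enc(k1, k2, p2)
    found = meet_in_the_middle(p1, c1, p2, c2)
    print("recovered an equivalent key pair:", found)
```

[The recovered pair may differ from the original keys but encrypts the known pairs identically, which is all an attacker needs -- the reason 2DES was abandoned in favour of Triple DES.]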