Re: block ciphers & plaintext attacks

On Tue, Nov 28, 2000 at 06:08:22PM +0000, Marc Mutz wrote:
> John Kennedy wrote:
> <snip>
> >   As far as passphrase entropy, I'm a bit ignorant at the moment.  I
> > initially coded to be compatible with losetup, presuming that it would
> > be coded in a secure fashion.  Without a lot more reading, I can't say
> > if that is true or not though.
> > 
> >   I'm under the impression that the 1.3 bits/char problem is pretty common
> > and that one of the first things you do is run it through a one-way hash,
> > generating something that looks far more like random bits.  I see the
> > two calls to rmd160_hash_buffer(), but I haven't confirmed that what
> > they're actually doing is something like what I've been told.
> > 
> 
> losetup takes the passphrase string, calculates the RIPE-MD hash of
> that string, and uses the resulting bit string as the key to the
> cipher (Serpent, in your case).
> 
> <snip>
> >   Enough to spend my time productively on the passphrase, yes.  (:  Either
> > the current losetup code will be secure or not and, if not, I'll just
> > add another layer with a passphrase protecting an encrypted passphrase
> > to the real data on the disk.
> 
> The losetup code is as secure as it can get.

  Ok, I'm not wasting anybody's time up through here.  (:
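  For my own notes, the hash-then-key pattern Marc describes can be
sketched roughly as follows.  This is my own illustration, not the actual
losetup code; the function name is made up, and since not every Python
hashlib build ships RIPEMD-160, the sketch falls back to SHA-256 so it
stays self-contained (the pattern, not the particular hash, is the point):

```python
import hashlib

def losetup_style_key(passphrase: str, key_bytes: int = 16) -> bytes:
    """Hash the passphrase and truncate the digest to the cipher key length.

    losetup uses RIPEMD-160 here; hashlib's ripemd160 depends on the
    OpenSSL build, so fall back to SHA-256 if it is unavailable.
    """
    try:
        h = hashlib.new("ripemd160", passphrase.encode())
    except ValueError:  # ripemd160 not compiled into this OpenSSL
        h = hashlib.sha256(passphrase.encode())
    # Only the first key_bytes of the digest become the cipher key.
    return h.digest()[:key_bytes]

key = losetup_style_key("my secret passphrase")
```

Note that the key is deterministic: the same passphrase always yields
the same key, which is exactly why the passphrase's entropy matters.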

> >   (Yes, you could still try to brute-force the first passphrase, but it
> >    and the encrypted 2nd passphrase can easily be kept apart from the
> >    hard-drive encrypted with the 2nd passphrase.  If rmd160_hash_buffer()
> >    doesn't introduce enough entropy, that ought to help a lot.)
> 
> Hashing _never_ (!!!) introduces entropy. It might even _decrease_ your
> entropy by a few bits. Entropy can be roughly defined as the number of
> bits you need to store a certain data set. E.g. a random 32-bit value
> has 32 bits of entropy, because you can store 2^32 states with it. Yet a
> 4-character string containing only the 26 letters can have at most
> lb(26^4) ~= 18.8 bits of entropy (where lb is the logarithm w.r.t. base
> 2), since you can only encode 26^4 ~= 2^18.8 states with it. The famous
> 1.3 bits/char are valid for English running text.

  Here I was just using the wrong keyword, wasting your time, getting
you excited, and giving you the wrong impression.

  I thought, for whatever reason, that we were comparing plain ASCII
text used directly as the key to the cipher Serpent against the result
of the RIPE-MD hash; that is, the (non-)merits of a [A-Za-z0-9]*4
string vs. the bit string generated by the hash as the key contents.

  Of course, being ignorant got the following gem out of you:

> So, if you want to have a passphrase that is as secure as the cipher is
> (i.e. attacking the passphrase is not much faster than attacking the key
> directly), you have to enter approx. 100 characters of English text.
> This will give you a passphrase of ca. 130 bits of entropy, which in
> turn is fed into the hash function to produce a bit string with around
> 128 bits of entropy (hashes lose entropy because they are not bijective
> from GF(2^128) onto itself). Yet that string is 160 bits long and only
> the first 128 bits of it are taken as the key. Thus, your key's
> entropy can be approximated as 128*(128/160) ~= 102 bits. But that is
> good enough.

  So, to attempt a horrible generalization: if I use the letter `A'
as my passphrase, my true entropy is somewhere between 1.3 and 8 bits
(depending on whether you know I'm using English text or not).  Yes, it
may be hashed and effectively padded out to 128 bits, but the result
only looks far more random than it actually is.
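  The arithmetic behind those figures is easy enough to write down
(these are just the numbers from the discussion above, recomputed; the
1.3 bits/char figure for running English text is taken on faith):

```python
from math import log2

# 4 characters drawn from a 26-letter alphabet: lb(26^4) ~= 18.8 bits.
bits_alnum4 = log2(26 ** 4)

# ~100 characters of running English text at ~1.3 bits/char: ~130 bits.
bits_english_100 = 100 * 1.3

# One character of English text: the 1.3-bit passphrase case.
bits_single_char = 1 * 1.3

# Truncating a 160-bit digest to 128 bits, as approximated above.
key_entropy = 128 * (128 / 160)
```

So a single-letter passphrase carries ~1.3 bits no matter how random
its 128-bit hash looks.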

  I'm not going to touch the math without cracking the AC book some more.

  No, my longest passphrase is ~50 characters, which gives me about
half the entropy I could have if I'm lucky.


  In addition, my blabbing about using two keys really doesn't make
things more secure from a mathematical point of view, since we're talking
about the weakest link.  Break the first key (using it to decrypt the
2nd) and it doesn't really matter how much entropy the 2nd key has;
it's handed to you on a platter.

  As far as a real (but not mathematical) brute-force attempt goes,
it may do some good.  If all an attacker has is the data encrypted with
the 2nd key, then the key in use has more entropy (and presumably passes
the benefits of that along in a mathematically tangible way).  To exploit
the lack of entropy in the 1st key, they also need possession of the 2nd
key as encrypted with the 1st.  That encrypted 2nd key is more mobile
than the data it encrypts, which means it can be kept somewhere physically
much harder to get at (something that can't be factored into the
mathematical attacks, but may be irrelevant from their point of view
anyway).
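  The two-key layering I'm describing can be sketched like this.  All
the names are mine, and the "cipher" is a toy hash-derived XOR keystream
standing in for Serpent, purely to show the structure; it is NOT secure
and should never be used for real encryption:

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream (toy cipher, NOT secure).

    Because it is a plain XOR, the same call both encrypts and decrypts.
    """
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# The strong, random 2nd key is what actually encrypts the disk data.
disk_key = os.urandom(16)

# The weaker, passphrase-derived 1st key protects only the 2nd key.
pass_key = hashlib.sha256(b"my passphrase").digest()[:16]
wrapped_key = keystream_xor(pass_key, disk_key)

# Weakest link: brute-force the passphrase, unwrap the 2nd key, and its
# entropy no longer matters.
recovered = keystream_xor(pass_key, wrapped_key)
```

The only practical gain is that wrapped_key can be stored apart from
the encrypted disk, as described above.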

  Oh hell, I'll just shut up now.  Trying not to trip over the differences
between a mathematical/academic attack vs. real brute-force decryption
is just going to get me in trouble anyway.  (:

> You see, there's a _lot_ more to security than ciphers. You should
> really read (and understand) Applied Cryptography, if you take your
> thing seriously.

  I went out and bought it last night.  (:

Linux-crypto:  cryptography in and on the Linux system
Archive:       http://mail.nl.linux.org/linux-crypto/

