[21498] in cryptography@c2.net mail archive


Re: passphrases with more than 160 bits of entropy

daemon@ATHENA.MIT.EDU (Perry E. Metzger)
Wed Mar 22 13:59:23 2006

X-Original-To: cryptography@metzdowd.com
To: leichter_jerrold@emc.com
Cc: aramperez@mac.com, cryptography@metzdowd.com
From: "Perry E. Metzger" <perry@piermont.com>
Date: Wed, 22 Mar 2006 13:58:32 -0500
In-Reply-To: <Pine.SOL.4.61.0603221328310.619@mental> (leichter jerrold's
 message of "Wed, 22 Mar 2006 13:32:51 -0500")


leichter_jerrold@emc.com writes:
> | Let me rephrase my sequence. Create a sequence of 256 consecutive  
> | bytes, with the first byte having the value of 0, the second byte the  
> | value of 1, ... and the last byte the value of 255. If you measure  
> | the entropy (according to Shannon) of that sequence of 256 bytes, you  
> | have maximum entropy.
>
> Shannon entropy is a property of a *source*, not a particular sequence
> of values.  The entropy is derived from a sum of equivocations about
> successive outputs.
>
> If we read your "create a sequence...", then you've described a source -
> a source with exactly one possible output.  All the probabilities will
> be 1 for the actual value, 0 for all other values; the equivocations are
> all 0.  So the resulting Shannon entropy is precisely 0.

Shannon information certainly falls to zero as the probability with
which a message is expected approaches 1. Kolmogorov-Chaitin
information cannot fall to zero, though it can get exceedingly small.
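[The distinction can be made concrete with a short sketch. Below, a minimal Python illustration (my own, not from the thread): the per-symbol frequency "entropy" of the 0..255 sequence, which is presumably what Mr. Perez is measuring, comes out to the maximum of 8 bits per byte; yet the sequence is generated by a tiny program, so its Kolmogorov-Chaitin description length is far smaller than its 256-byte literal length, and as the output of a source with one possible message its Shannon entropy is 0.]

```python
import math
from collections import Counter

# The sequence in question: every byte value 0x00..0xFF exactly once.
seq = bytes(range(256))

# Per-byte frequency "entropy": each of the 256 values occurs once, so
# the empirical distribution is uniform and this is the maximum, 8 bits.
counts = Counter(seq)
freq_entropy = -sum((c / len(seq)) * math.log2(c / len(seq))
                    for c in counts.values())
print(freq_entropy)  # 8.0

# Shannon entropy of the *source* "always emit this sequence": one
# outcome with probability 1, hence zero bits of uncertainty.
source_entropy = -1.0 * math.log2(1.0)
print(source_entropy)  # -0.0

# Kolmogorov-Chaitin intuition: a 17-character program reproduces the
# 256-byte string, so its shortest description is much shorter than
# the string itself (small, but not zero).
program = "bytes(range(256))"
assert eval(program) == seq
print(len(program), len(seq))  # 17 256
```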

In either case, though, I suspect we're in agreement on what entropy
means, but Mr. Perez is not familiar with the same definitions that
the rest of us are using.

Perry

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo@metzdowd.com
