[21500] in cryptography@c2.net mail archive


Re: passphrases with more than 160 bits of entropy

daemon@ATHENA.MIT.EDU (Matt Crawford)
Wed Mar 22 16:30:58 2006

X-Original-To: cryptography@metzdowd.com
Date: Wed, 22 Mar 2006 13:58:26 -0600
From: Matt Crawford <crawdad@fnal.gov>
In-reply-to: <545120B2-83EE-4E68-AE15-B3BB97D6FF7A@mac.com>
To: Aram Perez <aramperez@mac.com>
Cc: Cryptography <cryptography@metzdowd.com>

> Let me rephrase my sequence. Create a sequence of 256 consecutive  
> bytes, with the first byte having the value of 0, the second byte  
> the value of 1, ... and the last byte the value of 255. If you  
> measure the entropy (according to Shannon) of that sequence of 256  
> bytes, you have maximum entropy.

I so often get irritated when non-physicists discuss entropy.  The
word is almost always misused.  I looked at Shannon's definition and
it is fine, from a physics point of view.  But if you apply it
thoughtfully to a single fixed sequence, you correctly get the answer
zero.

If your sequence is defined to be { 0, 1, 2, ..., 255 }, the  
probability of getting that sequence is 1 and of any other sequence,  
0.  Plug it in.
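Plugging in is a one-liner.  A minimal sketch (the function name is mine, not from the thread): Shannon's formula H = -sum p*log2(p) applied to the distribution in which the fixed sequence { 0, 1, ..., 255 } has probability 1 and every other sequence has probability 0.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2 p), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Distribution over sequences when the sequence is fixed:
# the one sequence {0, 1, ..., 255} has probability 1, all others 0.
print(shannon_entropy([1.0]))  # 0.0 -- a certain outcome carries no entropy
```

The zero-probability sequences contribute nothing (the p*log2(p) term vanishes as p goes to 0), so the certain outcome is the whole sum, and 1*log2(1) = 0.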

If you have a generator of 8-bit random numbers whose samples are
independent and uniformly distributed, and you run it for a gazillion
iterations and write to the list one day saying the special sequence
{ 0, 1, 2, ..., 255 } appeared in the output, that's a different
story.  But even then, we would talk about the entropy of your
generator, not of one particular sequence of outputs.
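To make the contrast concrete, a quick sketch of the generator's entropy (my arithmetic, not from the thread): each independent uniform 8-bit sample carries -log2(1/256) = 8 bits, so any 256-sample run from such a generator carries 2048 bits, regardless of which particular sequence happens to come out.

```python
import math

# Entropy of a uniform, independent 8-bit generator:
# each sample carries -log2(1/256) bits, and independent
# samples add, so an n-sample run carries 8*n bits.
per_sample = -math.log2(1 / 256)
print(per_sample)        # 8.0 bits per sample
print(256 * per_sample)  # 2048.0 bits for a 256-sample run
```

The entropy is a property of the source's distribution; the sequence { 0, 1, ..., 255 } is just one of 256**256 equally likely outputs.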


---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo@metzdowd.com
