[21620] in cryptography@c2.net mail archive
Re: Entropy Definition (was Re: passphrases with more than 160 bits of entropy)
daemon@ATHENA.MIT.EDU (David Malone)
Mon Mar 27 10:06:45 2006
X-Original-To: cryptography@metzdowd.com
Date: Mon, 27 Mar 2006 14:20:48 +0100
From: David Malone <dwmalone@maths.tcd.ie>
To: John Denker <jsd@av8n.com>
Cc: John Kelsey <kelsey.j@ix.netcom.com>, cryptography@metzdowd.com
In-Reply-To: <4425DFCB.4080003@av8n.com>
On Sat, Mar 25, 2006 at 07:26:51PM -0500, John Denker wrote:
> Executive summary: Small samples do not always exhibit "average" behavior.
That's not the whole problem - you have to be looking at the right
"average" too.
If you're interested in the long-run encodability of a set of IID
symbols produced with probabilities p_i, then that average is the
Shannon entropy, -\sum_i p_i \log_2(p_i). If you're interested in
the mean number of guesses (per symbol) required to guess a long
word formed from these symbols, then you should be looking at
(\sum_i \sqrt{p_i})^2 instead. Other metrics (min-entropy, work
factor, ...) require other "averages".
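As a rough illustration (my own sketch, not part of the original
thread, and the skewed distribution below is made up for the
example), here's a small Python calculation of three of these
"averages" for the same source:

    import math

    # A made-up skewed distribution: one common symbol, 99 rare ones.
    p = [0.9] + [0.1 / 99] * 99

    # Shannon entropy: governs long-run compressibility (bits/symbol).
    shannon = -sum(pi * math.log2(pi) for pi in p)

    # Guessing exponent: the mean number of guesses per symbol for a
    # long word grows like (\sum_i \sqrt{p_i})^2; take log2 for bits.
    guessing = 2 * math.log2(sum(math.sqrt(pi) for pi in p))

    # Min-entropy: determined by the single most likely symbol.
    min_ent = -math.log2(max(p))

    print("Shannon  : %.3f bits/symbol" % shannon)
    print("Guessing : %.3f bits/symbol" % guessing)
    print("Min      : %.3f bits/symbol" % min_ent)

For this distribution the three come out at roughly 1.13, 4.07 and
0.15 bits/symbol respectively, so which "average" you pick matters
a great deal.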
To see this behaviour, you need both a large sample and the right
type of average for your problem (and I've assumed the symbols are
IID).
David.
---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo@metzdowd.com