Re: [Cryptography] cheap sources of entropy
Date: Mon, 20 Jan 2014 14:38:19 -0700
From: John Denker <jsd@av8n.com>
To: cryptography@metzdowd.com
In-Reply-To: <808714277.20140120205050@gmail.com>
On 01/20/2014 01:55 PM, Jerry Leichter wrote:
> If I had a *choice* between a carefully implemented physical circuit
> based on shot noise or some similar well-understood source of "core
> randomness" or something fairly ad hoc based on sensors and human
> interaction, *of course* I would choose the former.
OK.
> But that may
> not be available. Still, the latter isn't bad, even if the actual
> randomness available can't be as easily quantified.
I still claim it is worth quantifying things as much as you
can. The calculation is not that hard to do, and it tells you
things you wouldn't otherwise know.
It tells you that if you are using an accelerometer to capture
the human interaction, the physics of the sensor is a better
source of entropy than the human is. More specifically, it
tells you that choosing a sensor with more gain and/or more
bandwidth buys you a lot more than any amount of wild arm-waving.
The arm-waving increases the amount of squish, but it does not
appreciably increase the amount of hard-core entropy.
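For concreteness, here is a back-of-the-envelope version of that
calculation, as a small Python sketch. The sample rate and noise
figure are illustrative assumptions, not the numbers for any
particular part:

  # Rough entropy budget for an accelerometer's own noise floor, assuming
  # the in-band noise is approximately Gaussian and the samples are
  # effectively independent at the output data rate.
  import math

  def bits_per_sample(noise_sigma_lsb):
      # Shannon entropy of a Gaussian quantized in 1-LSB steps,
      # valid when the noise spans several LSBs.
      return math.log2(noise_sigma_lsb) + 0.5 * math.log2(2 * math.pi * math.e)

  sample_rate_hz = 1000.0    # assumed output data rate
  noise_sigma_lsb = 4.0      # assumed RMS noise, in ADC counts

  h = bits_per_sample(noise_sigma_lsb)
  print(f"{h:.1f} bits/sample, {h * sample_rate_hz:.0f} bits/second")
  # Doubling the analog gain adds one bit per sample; doubling the
  # bandwidth (and hence the rate of independent samples) doubles the
  # bits per second.  A conservative design would budget min-entropy
  # instead, which for a Gaussian is about 0.7 bits/sample lower.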
> A sound field in a complex natural environment, if looked at to high
> precision, is extremely variable from place to place and from moment
> to moment.
Given a high-precision microphone preamp, it provides better
randomness if the input is open-circuited, rather than attached
to an actual microphone, no matter how "complex" the acoustic
environment is ... and it continues to work even in non-complex
environments.
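The number that matters there is the Johnson (thermal) noise of the
input resistance, v_rms = sqrt(4 k T R B). A minimal sketch, with
assumed values for R and B:

  import math

  k_B = 1.380649e-23   # Boltzmann constant, J/K
  T   = 300.0          # temperature, kelvin
  R   = 10e3           # assumed source resistance, ohms
  B   = 20e3           # assumed audio bandwidth, hertz

  v_rms = math.sqrt(4 * k_B * T * R * B)
  print(f"Johnson noise: {v_rms * 1e6:.2f} uV RMS")   # ~1.8 uV for these values
  # With enough preamp gain to put that noise well above one ADC step,
  # the entropy budget follows the same gain-and-bandwidth accounting
  # as the accelerometer sketch above.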
> firing of multiple nerve cells triggering multiple muscle cells
That suffers from the fact that the bandwidth is not large.
The mass of the arm produces a low-pass filter, in accordance
with Newton's second law of motion. If you did the calculation,
you would notice this immediately.
If you don't know how to do the calculation, that does not
give you a license to just give up. If you don't know how
to do the calculation, collaborate with somebody who does.
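To put a deliberately generous number on the arm-waving itself:
treat the motion as band-limited by the mass of the arm and count
independent samples via the sampling theorem. Both figures below
are optimistic assumptions, chosen to favor the arm:

  motion_bandwidth_hz = 20.0       # assumed upper limit on arm motion
  bits_per_indep_sample = 8.0      # assume 8 bits of surprise per sample (generous)
  indep_samples_per_s = 2 * motion_bandwidth_hz          # sampling theorem
  print(indep_samples_per_s * bits_per_indep_sample, "bits/second, at most")
  # ~320 bits/s, i.e. an order of magnitude below the sensor-noise budget
  # sketched earlier, and the real figure is lower still because arm
  # motion is nowhere near 8 bits unpredictable per sample.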
On 01/20/2014 12:50 PM, Krisztián Pintér wrote:
> theoretically and strictly speaking, there is no such thing as lower
> bound on entropy.
Nonsense.
> we can only put upper bounds
Speak for yourself, Kemosabe.
The fact is:
a) Statistical tests provide only upper bounds.
b) Security depends on establishing a lower bound.
c) Under reasonable conditions, physics can provide
the required lower bound.
> if the world is deterministic, entropy is pretty hard to even
> interpret.
Petitio principii is not an acceptable substitute for actual
evidence.
The world I live in is not deterministic. This is required
by the laws of thermodynamics, not to mention quantum mechanics.
Computers are deterministic, to a good-enough approximation,
because engineers go to a lot of trouble to make them so.
Everyone should please refrain from assuming that the real
world works the same way a computer does. Specifically:
Any Turing-computable model of the universe would be at best
a hidden-variable theory. It appears that any such theory
that is subject to relativistic causality would violate the
laws of quantum mechanics in general, and the Bell inequalities
in particular.
So what's the deal? Are you assuming everything we know
about QM is wrong, or everything we know about relativity
is wrong? Do you have any actual evidence?
> we always talk about the entropy as seen by some specific observer
That would be the /conditional entropy/. That is related
to conditional probability in the same way that entropy
is related to probability.
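For anyone unfamiliar with the term, the arithmetic is just
H(X|Y) = H(X,Y) - H(Y); a minimal sketch, with a made-up joint
distribution:

  from math import log2

  def H(probs):
      return -sum(p * log2(p) for p in probs if p > 0)

  # made-up joint distribution over (X, Y)
  p_xy = {('x0', 'y0'): 0.4, ('x1', 'y0'): 0.1,
          ('x0', 'y1'): 0.1, ('x1', 'y1'): 0.4}

  p_y = {}
  for (x, y), p in p_xy.items():
      p_y[y] = p_y.get(y, 0.0) + p

  h_cond = H(p_xy.values()) - H(p_y.values())
  print(f"H(X|Y) = {h_cond:.3f} bits")   # 0.722, versus H(X) = 1 bit here:
  # knowing Y reduces the entropy *as seen by that observer*, but it
  # says nothing about the unconditional entropy of the source.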
A good TRNG produces unconditional entropy. I could not
predict the output of my own TRNG even if I wanted to.
> you can hardly say anything looking at the data.
Statistical tests on the RNG output (or the raw sensor
data) can provide upper bounds on the entropy. This is
sometimes interesting, but it is nowhere near sufficient
for building a secure system.
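To see why an output-side test can only be an upper bound, consider
the sketch below: a purely deterministic generator (SHA-256 in
counter mode with a fixed, public seed) scored by a simplified
most-common-value estimate, i.e. the estimator from NIST SP 800-90B
minus its confidence-interval correction. It reports nearly 8
bits/byte even though the true unpredictability is zero:

  import hashlib
  from collections import Counter
  from math import log2

  def fake_trng(n_bytes, seed=b"public constant"):
      # deterministic: anyone who knows the seed can reproduce every byte
      out = bytearray()
      counter = 0
      while len(out) < n_bytes:
          out += hashlib.sha256(seed + counter.to_bytes(8, 'big')).digest()
          counter += 1
      return bytes(out[:n_bytes])

  data = fake_trng(1_000_000)
  p_max = Counter(data).most_common(1)[0][1] / len(data)
  print(f"most-common-value estimate: {-log2(p_max):.2f} bits/byte")   # close to 8
  # The test bounds the entropy from above; the lower bound that security
  # depends on (here, zero) has to come from somewhere else.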
> if a scientist comes up with a better model of handwaving or a better
> model of how noise on a video camera works, the effective entropy
> drops.
Not necessarily, unless you plan on violating basic, long-
established laws of physics.
As James Randi is fond of saying: Extraordinary claims require
extraordinary proof. I haven't seen any proof -- or indeed any
evidence -- that any "better model" can remove Johnson noise
from a signal.