1993-12-26 - Random numbers and “dirty” books

Header Data

From: jpinson@fcdarwin.org.ec
To: cypherpunks@toad.com
Message Hash: 442335e516493e26bba2207e4b577373febcacd91f3cba348e702f065393dc61
Message ID: <9312261726.AB14748@toad.com>
Reply To: N/A
UTC Datetime: 1993-12-26 17:27:20 UTC
Raw Date: Sun, 26 Dec 93 09:27:20 PST

Raw message

From: jpinson@fcdarwin.org.ec
Date: Sun, 26 Dec 93 09:27:20 PST
To: cypherpunks@toad.com
Subject: Random numbers and "dirty" books
Message-ID: <9312261726.AB14748@toad.com>
MIME-Version: 1.0
Content-Type: text/plain


I have been working off and on for some time on a one-time pad
application (which I will release next week).  Since the success
of an OTP depends on having a good source of random numbers, I
have kept my eyes open for any information related to random
numbers and their generation.
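
(For anyone who hasn't seen the mechanics: an OTP is just an XOR of
the message against a pad of truly random bytes, used once and then
destroyed.  A toy sketch in Python -- not my application, just an
illustration; the hard part is filling the pad with genuinely random
bytes.)

    import os

    def otp_crypt(data, pad):
        # XOR each message byte with the matching pad byte.  XOR is
        # its own inverse, so the same call encrypts and decrypts.
        # The pad must be as long as the message, truly random,
        # secret, and never reused.
        assert len(pad) >= len(data)
        return bytes(d ^ p for d, p in zip(data, pad))

    pad = os.urandom(14)      # stand-in; a real pad wants a physical source
    ct = otp_crypt(b"attack at dawn", pad)
    pt = otp_crypt(ct, pad)   # XOR-ing again recovers the plaintext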

Conventional wisdom has it that the best way to generate random 
numbers is to measure some physical phenomenon such as radioactive
decay, and use these measurements to form your "random" number 
sequence.
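
The usual recipe, roughly: time the intervals between events, keep
only the least significant bit of each interval, and then debias the
bit stream with von Neumann's trick -- take the bits in pairs, output
0 for the pair 01 and 1 for the pair 10, and discard 00 and 11.  A
rough sketch, with a clock-jitter stand-in since I don't have a
Geiger counter wired up:

    import time

    def von_neumann_debias(bits):
        # Pair up raw bits; emit 0 for (0,1), 1 for (1,0), and drop
        # (0,0) and (1,1).  This strips simple bias at the cost of
        # discarding most of the input.
        return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

    # Stand-in source: the low bit of a high-resolution clock read.
    # A real pad generator would time physical events (decays,
    # keystrokes, disk seeks) and keep the low bit of each interval.
    raw_bits = [time.perf_counter_ns() & 1 for _ in range(100000)]
    random_bits = von_neumann_debias(raw_bits)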

Recently, I came across an interesting article in the "New
Scientist" (May 8, 1993) entitled "The Half-Life of a Dirty
Book".  The article indicates that phenomena-based
measurements may have some interesting properties.

A number of years ago, when people used log tables (any of you
remember the CRC tables?), some astute observers noticed that
the front sections of log tables were more heavily used than
the back sections.  This seemed odd, since every portion of the
table should have an equal chance of being consulted.

In 1938, the physicist Frank Benford made a study of numerous
measurements based on natural phenomena.  He looked at surface
areas of lakes, molecular weights of compounds, and so on.

He discovered that the first digit of any such measurement was
most likely to be a one, and least likely to be a nine.  The
probability that this digit takes a given value decreases in a
regular manner as the value increases from one to nine.  This
is called Benford's Law.
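
Specifically, the law says the probability of a leading digit d is
log10(1 + 1/d), which puts a one in front about 30 per cent of the
time and a nine less than 5 per cent.  A couple of lines of Python
tabulate it:

    import math

    # Benford's law: P(leading digit = d) = log10(1 + 1/d)
    for d in range(1, 10):
        print(d, "%.1f%%" % (100 * math.log10(1 + 1.0 / d)))
    # prints: 1 30.1%, 2 17.6%, 3 12.5%, ... 9 4.6%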

Later, a report in the European Journal of Physics (vol. 14, page
59) showed that alpha decay half-lives, both predicted and
observed, follow Benford's law.

Also interesting: Benford's law is scale-invariant.  According
to the article: "The same law applies whether you measure the
areas of lakes in hectares or square yards, whether you multiply
house numbers by seven or 93, or whether you count the half-lives
of alpha particles in seconds or centuries".
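
That scale invariance is easy to check numerically.  The sketch
below draws numbers spanning several orders of magnitude (ten
raised to a uniform power -- a convenient model of Benford-ish
data), multiplies them by the article's example factors, and shows
that the leading-digit counts barely move:

    import random
    from collections import Counter

    def leading_digit(x):
        # First significant digit of a positive number, read off
        # its scientific-notation string, e.g. "3.14e+02" -> 3.
        return int(("%e" % x)[0])

    # Data spanning several orders of magnitude tends to follow
    # Benford's law; 10**uniform is a convenient model of it.
    data = [10 ** random.uniform(0, 6) for _ in range(100000)]

    for scale in (1, 7, 93):   # the article's example multipliers
        counts = Counter(leading_digit(scale * x) for x in data)
        print(scale, [counts[d] for d in range(1, 10)])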

The article concludes that Benford's law may actually be a part
of chaos theory.  Fractals, they also point out, are
scale-invariant.

All this may indicate that random numbers produced from
observations of physical phenomena have characteristics that
set them apart from software-derived random numbers.

Does this give us a tool for breaking random-number-based
encryption?  I can't see how, since you still can't deduce the
exact sequence.  But it may mean there are fractal patterns in
there, at least for numbers derived from physical phenomena.

Interesting.

Of course, with an 8-bit random number sequence (maximum value
255), it would be hard to generate a number sequence that did
not follow Benford's law.  You would have to throw in a lot of
9s and 99s to get as many numbers beginning with '9' (9, 90-99)
as begin with '1' (1, 10-19, 100-199).
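
The tally for a uniform 8-bit range is easy to do exactly: of the
values 1 through 255, 111 begin with a one but only 11 begin with a
nine.

    from collections import Counter

    # Leading-digit counts for the uniform range 1..255 (zero has
    # no leading digit).
    counts = Counter(int(str(n)[0]) for n in range(1, 256))
    print([counts[d] for d in range(1, 10)])
    # -> [111, 67, 11, 11, 11, 11, 11, 11, 11]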

So let's see..... first we had flat distributions vs. normal
distributions, now Benford vs. non-Benford.....

Jim Pinson     Confused in Galapagos.




