1995-10-06 - Re: Simple Hardware RNG Idea

Header Data

From: norm@netcom.com (Norman Hardy)
To: Simon Spero <ses@tipper.oit.unc.edu>
Message Hash: 442a3ad4a3498e615fb25fda27e1ba5b356df58db2d4a02a943798de08f251fa
Message ID: <ac9ab28c02021004ba9d@DialupEudora>
Reply To: N/A
UTC Datetime: 1995-10-06 10:28:38 UTC
Raw Date: Fri, 6 Oct 95 03:28:38 PDT

Raw message

From: norm@netcom.com (Norman Hardy)
Date: Fri, 6 Oct 95 03:28:38 PDT
To: Simon Spero <ses@tipper.oit.unc.edu>
Subject: Re: Simple Hardware RNG Idea
Message-ID: <ac9ab28c02021004ba9d@DialupEudora>
MIME-Version: 1.0
Content-Type: text/plain


At 8:20 PM 10/5/95, Simon Spero wrote:
>On Thu, 5 Oct 1995, Norman Hardy wrote:
....
>> You presumably use the oddness of the count for your random bit in some
>> predetermined time interval. External radiation can change, but not bias
>> the parity. If the counter saturates, the counter may be biased towards one
>
>Hmmm. But isn't this method slightly biased? If the probability of  N
>events < the probability of N+1 events, wouldn't you need a large number
>of events per bit to make the bias insignificant?
....
What you really need is entropy (information). I propose concatenating
several counts and sending them through MD5. The counts are distributed the
same way but are independent, so the entropy of the concatenation is the
sum of the entropies. Each count has a Poisson distribution. That tells
you how many bits of entropy there are in the input to the MD5. Take that
many bits of the digest, rounded down, as your random bits.
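
For concreteness, here is a sketch of that whitening step. It assumes
some procedure md5 that maps a list of byte values to the 128 digest
bits as a list; nothing of the sort is standard Scheme, so treat it as
a placeholder, and the packing of counts into bytes is simplified too:

;; first-n: the first n elements of a list.
(define (first-n lst n)
  (if (= n 0) '() (cons (car lst) (first-n (cdr lst) (- n 1)))))

;; random-bits: hash the counts together and keep only as many
;; digest bits as the counts carry entropy.  md5 is assumed, not
;; standard: a list of byte values in, a list of 128 bits out.
(define (random-bits counts bits-per-count)
  (first-n (md5 counts)
           (inexact->exact (floor (* (length counts) bits-per-count)))))

With four counts averaging 10 events each, (random-bits (list c1 c2 c3 c4) 3.7)
would keep floor(4 * 3.7) = 14 of the 128 digest bits.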

If there are an average of x counts in a time interval then the probability
that the count will be exactly K is (x^K/(K!))exp(-x). That is the Poisson
distribution. The entropy, in bits, is then:

- sum[K=0 to infinity] (x^K/(K!))exp(-x)log2( (x^K/(K!))exp(-x) )
= - sum[K=0 to infinity] (x^K/(K!))exp(-x)(log2(x^K/(K!)) - x*log2(e))
= - sum[K=0 to infinity] (x^K/(K!))exp(-x)(K*log2(x) - log2(K!) - x*log2(e))

Here is a klutzy Scheme program to evaluate these:
;; Sum g(0) + g(1) + ... + g(30); the tail past 30 is
;; negligible for the means used below.
(define (sum g)
  (let loop ((n 30))
    (if (= n 0) (g 0) (+ (g n) (loop (- n 1))))))
(define (log2 x) (/ (log x) (log 2)))
(define (fact n) (if (= n 0) 1 (* n (fact (- n 1)))))
;; Poisson probability of exactly k counts at mean x.
(define (p x k) (* (/ (expt x k) (fact k)) (exp (- x))))
;; Entropy, in bits, of a Poisson count with mean x.
(define (en x)
  (- (sum (lambda (k)
            (let ((c (p x k)))
              (if (= c 0) 0 (* c (log2 c))))))))

(en 1) => 1.88
(en 3) => 2.79
(en 10) => 3.70
(en 15) => 3.99


I.e. if 1 count is expected on average there are nearly two bits of entropy
in the count (surprising!) and if the count averages 10 then there
are 3.7 bits worth. It goes up only as the log of the mean.
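
A quick sanity check: for a large mean x the Poisson count tends to a
Gaussian of variance x, whose entropy is (1/2)log2(2*pi*e*x) bits, and
that matches the figures above:

;; Entropy, in bits, of a Gaussian with variance x; (acos -1) is pi.
(define (en-approx x) (* 0.5 (log2 (* 2 (acos -1) (exp 1) x))))

(en-approx 10) => 3.71
(en-approx 15) => 4.00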

Before you bet your enterprise on this scheme consider that the math
was done at 03:30 AM.

Thread