1993-10-21 - What is the maximum # of bits a key could ever need to be?

Header Data

From: Robert J Woodhead <trebor@foretune.co.jp>
To: cypherpunks@toad.com
Message Hash: 258e9dc0f01b2e6b808ddd56948a51a258c0af948e3cfd9672ac4aab31b04742
Message ID: <9310211444.AA21786@dink.foretune.co.jp>
Reply To: N/A
UTC Datetime: 1993-10-21 14:47:56 UTC
Raw Date: Thu, 21 Oct 93 07:47:56 PDT

Raw message

From: Robert J Woodhead <trebor@foretune.co.jp>
Date: Thu, 21 Oct 93 07:47:56 PDT
To: cypherpunks@toad.com
Subject: What is the maximum # of bits a key could ever need to be?
Message-ID: <9310211444.AA21786@dink.foretune.co.jp>
MIME-Version: 1.0
Content-Type: text/plain



This occurred to me this morning in the inner sanctum of inspired cogitation,
aka the shower.  "What is the maximum # of bits a public key would ever need
to be, given no breakthroughs in factoring?"

I came up with an answer, but it depends on some numbers that I don't have
handy; perhaps other people on the list can fill in the blanks.

First, we need an equation that tells us how difficult it is, in number of
operations, to factor a number of N bits; call it N_ops(N).

Then all we need to do is find the smallest N for which N_ops(N) is greater than

U_Duration * U_Particles * (1 / P_time)

Where U_Duration is the expected duration of the universe, U_Particles is the
number of particles in the universe (I am assuming that every particle can
be used as a processor; the programming I leave as an exercise to the
alert reader), and P_time is the Planck time (damned if I can remember it)
in seconds, whose reciprocal ought to be a good upper bound for the clock
speed of the Universal CPU.
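As a rough sketch of the arithmetic (my own, not from the post): the email
leaves N_ops unspecified, so the snippet below stands in the heuristic cost
of the general number field sieve, and all three physical constants are
assumed values — roughly 10^17 seconds for the universe's duration so far,
roughly 10^80 particles in the observable universe, and about 5.4e-44
seconds for the Planck time.

    import math

    # Assumed constants (the original post leaves these blank):
    U_DURATION  = 1e17      # seconds; roughly the age of the universe so far
    U_PARTICLES = 1e80      # common estimate, particles in observable universe
    P_TIME      = 5.39e-44  # Planck time, in seconds

    # Total operation budget of the "Universal CPU", in log space to
    # dodge overflow: U_Duration * U_Particles * (1 / P_time)
    log_budget = math.log(U_DURATION * U_PARTICLES / P_TIME)

    def log_gnfs_ops(bits):
        """Natural log of the heuristic GNFS cost to factor a bits-bit
        modulus: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)),
        dropping the o(1) term and constant factors.  This stands in
        for the email's unspecified N_ops(N)."""
        ln_n = bits * math.log(2.0)
        c = (64.0 / 9.0) ** (1.0 / 3.0)
        return c * ln_n ** (1.0 / 3.0) * math.log(ln_n) ** (2.0 / 3.0)

    # Smallest key size whose factoring cost exceeds the universal budget.
    bits = 2
    while log_gnfs_ops(bits) <= log_budget:
        bits += 1

    print(f"operation budget: ~10^{log_budget / math.log(10):.0f} ops")
    print(f"smallest sufficient key size: about {bits} bits")

With these assumed numbers the budget comes out near 10^140 operations and
the crossover lands in the tens of thousands of bits, though the answer is
quite sensitive to the chosen constants and to the neglected o(1) term.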

A most likely useless number, but it would be interesting to know what it
comes out to.

Best,
R




