From: “Perry E. Metzger” <perry@imsi.com>
To: schneier@chinet.chinet.com
Message Hash: 515b468ed0e2539df2259709c209b1c39921d6e37ba216424d4466d63914f211
Message ID: <9502130145.AA19584@snark.imsi.com>
Reply To: <m0rdnqt-000k5xC@mailbox.mcs.com>
UTC Datetime: 1995-02-13 01:45:56 UTC
Raw Date: Sun, 12 Feb 95 17:45:56 PST
Subject: Re: Factoring - State of the Art and Predictions
MIME-Version: 1.0
Content-Type: text/plain
schneier@chinet.chinet.com says:
> Making predictions beyond the near future is even more foolish.
> Who knows what kind of advances in computing, networking, and
> mathematics are going to happen by 2020? However, if you look at
> the broad picture, in every decade we can factor numbers twice as
> long as in the previous decade. This leads to Table 5.
I'm not sure I agree with this assumption. From what we currently know,
factoring is still exponential -- the progress so far has come from
better algorithms, and that kind of progress can't continue forever.
Even on your most optimistic estimate, if the algorithms stabilize
(which would itself require some further advances), factoring would
remain exponential: adding a constant number of bits to a number would
multiply the work required by a constant factor. Since computing speed
is also rising exponentially, but not superexponentially, the number of
bits we could factor would grow only linearly with time -- that is,
each decade would see numbers about another 60-80 digits longer
factored. That would mean every doubling in key length buys more than
just a constant increase in the safety factor.
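The arithmetic behind that claim can be sketched as a toy model. All the
constants below (the cost exponent `c`, the compute doubling period, the
starting operation budget) are illustrative assumptions, not real data;
the point is only that an exponential cost curve divided into an
exponentially growing budget yields linear growth in factorable key size:

```python
import math

# Toy model: assume factoring a k-bit number costs about 2**(k / c)
# operations (exponential in k), and that available compute doubles
# every `doubling_years` years. Both constants are made up for
# illustration.

def largest_factorable_bits(years, c=20.0, doubling_years=1.5,
                            base_ops=1e12):
    """Largest k whose factoring cost fits the compute budget at `years`.

    The budget grows as base_ops * 2**(years / doubling_years).
    Solving 2**(k / c) = budget for k gives k = c * log2(budget),
    which is linear in `years`.
    """
    budget = base_ops * 2 ** (years / doubling_years)
    return c * math.log2(budget)

# Under these assumptions, each decade adds the same fixed number of
# bits to the largest factorable key:
first_decade = largest_factorable_bits(10) - largest_factorable_bits(0)
second_decade = largest_factorable_bits(20) - largest_factorable_bits(10)
```

Doubling the key length therefore doubles the number of decades of
hardware progress an attacker needs, rather than adding a fixed margin.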
Perry