From: "Perry E. Metzger" <perry@piermont.com>
To: IPG Sales <ipgsales@cyberstation.net>
Message Hash: c2d1f955746d17a8ff55e80cbfc285d04206f6e37e3c5b639d03ab41ac5c30a9
Message ID: <199602212043.PAA10048@jekyll.piermont.com>
Reply To: <Pine.BSD/.3.91.960221132344.3814A-100000@citrine.cyberstation.net>
UTC Datetime: 1996-02-22 00:31:58 UTC
Raw Date: Thu, 22 Feb 1996 08:31:58 +0800
Subject: Re: Internet Privacy Guaranteed ad (POTP Jr.)
MIME-Version: 1.0
Content-Type: text/plain
IPG Sales writes:
> Stubborness and stupidity are twins - Sophocles
IPG is both stubborn and stupid. How appropriate.
Let me give you another quote. It is actually a long extract of a
document written by Phil Zimmermann. Read it.
Beware of Snake Oil
===================
When examining a cryptographic software package, the question always
remains, why should you trust this product? Even if you examined the
source code yourself, not everyone has the cryptographic experience
to judge the security. Even if you are an experienced cryptographer,
subtle weaknesses in the algorithms could still elude you.
When I was in college in the early seventies, I devised what I
believed was a brilliant encryption scheme. A simple pseudorandom
number stream was added to the plaintext stream to create
ciphertext. This would seemingly thwart any frequency analysis of
the ciphertext, and would be uncrackable even to the most resourceful
Government intelligence agencies. I felt so smug about my
achievement. So cock-sure.
Years later, I discovered this same scheme in several introductory
cryptography texts and tutorial papers. How nice. Other
cryptographers had thought of the same scheme. Unfortunately, the
scheme was presented as a simple homework assignment on how to use
elementary cryptanalytic techniques to trivially crack it. So much
for my brilliant scheme.
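The homework-assignment break is easy to see in a toy sketch of the scheme Zimmermann describes (the PRNG, seed value, and crib below are illustrative assumptions, not anything from the original message):

```python
import random

def naive_stream_cipher(data: bytes, seed: int) -> bytes:
    # The "brilliant" college scheme: XOR the plaintext with the
    # output of an ordinary, non-cryptographic PRNG.
    rng = random.Random(seed)
    keystream = bytes(rng.randrange(256) for _ in range(len(data)))
    return bytes(p ^ k for p, k in zip(data, keystream))

# XOR is its own inverse, so encryption and decryption are the same.
ct = naive_stream_cipher(b"attack at dawn", seed=1234)
assert naive_stream_cipher(ct, seed=1234) == b"attack at dawn"

def brute_force(ct: bytes, crib: bytes, max_seed: int = 100_000):
    # The elementary cryptanalysis: the keystream is fully determined
    # by the seed, so an attacker just tries seeds until a known or
    # guessed fragment of plaintext ("crib") appears.
    for seed in range(max_seed):
        if naive_stream_cipher(ct, seed).startswith(crib):
            return seed
    return None

assert brute_force(ct, b"attack") == 1234
```

The whole security of the scheme collapses to the size of the seed, which is tiny compared to a genuinely random keystream.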
From this humbling experience I learned how easy it is to fall into a
false sense of security when devising an encryption algorithm. Most
people don't realize how fiendishly difficult it is to devise an
encryption algorithm that can withstand a prolonged and determined
attack by a resourceful opponent. Many mainstream software engineers
have developed equally naive encryption schemes (often even the very
same encryption scheme), and some of them have been incorporated into
commercial encryption software packages and sold for good money to
thousands of unsuspecting users.
This is like selling automotive seat belts that look good and feel
good, but snap open in even the slowest crash test. Depending on
them may be worse than not wearing seat belts at all. No one
suspects they are bad until a real crash. Depending on weak
cryptographic software may cause you to unknowingly place sensitive
information at risk. You might not otherwise have done so if you had
no cryptographic software at all. Perhaps you may never even
discover your data has been compromised.
Sometimes commercial packages use the Federal Data Encryption
Standard (DES), a fairly good conventional algorithm recommended by
the Government for commercial use (but not for classified
information, oddly enough-- hmmm). There are several "modes of
operation" the DES can use, some of them better than others. The
Government specifically recommends not using the weakest simplest
mode for messages, the Electronic Codebook (ECB) mode. But they do
recommend the stronger and more complex Cipher Feedback (CFB) or
Cipher Block Chaining (CBC) modes.
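The weakness of ECB mode can be shown without any real cipher at all: any deterministic block transformation leaks repeated plaintext blocks in ECB, while CBC's chaining hides them. The sketch below uses a hash as a toy stand-in for a block cipher (it is not invertible and not usable for real encryption; it only illustrates the pattern leakage):

```python
import hashlib

BLOCK = 16

def toy_encrypt_block(key: bytes, block: bytes) -> bytes:
    # Toy stand-in for a block cipher, for illustration only.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ecb(key: bytes, blocks):
    # ECB: each block is encrypted independently.
    return [toy_encrypt_block(key, b) for b in blocks]

def cbc(key: bytes, iv: bytes, blocks):
    # CBC: each block is XORed with the previous ciphertext block
    # before encryption, so repeats in the plaintext are masked.
    out, prev = [], iv
    for b in blocks:
        c = toy_encrypt_block(key, bytes(x ^ y for x, y in zip(prev, b)))
        out.append(c)
        prev = c
    return out

key = b"k" * 16
iv = b"\x00" * BLOCK
blocks = [b"A" * BLOCK, b"A" * BLOCK, b"B" * BLOCK]

ecb_ct = ecb(key, blocks)
cbc_ct = cbc(key, iv, blocks)

# ECB betrays the repetition; CBC does not.
assert ecb_ct[0] == ecb_ct[1]
assert cbc_ct[0] != cbc_ct[1]
```

An eavesdropper looking at ECB ciphertext can see which message blocks are equal, which is exactly the kind of structure real cryptanalysis feeds on.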
Unfortunately, most of the commercial encryption packages I've looked
at use ECB mode. When I've talked to the authors of a number of
these implementations, they say they've never heard of CBC or CFB
modes, and didn't know anything about the weaknesses of ECB mode.
The very fact that they haven't even learned enough cryptography to
know these elementary concepts is not reassuring. And they sometimes
manage their DES keys in inappropriate or insecure ways. Also, these
same software packages often include a second faster encryption
algorithm that can be used instead of the slower DES. The author of
the package often thinks his proprietary faster algorithm is as
secure as the DES, but after questioning him I usually discover that
it's just a variation of my own brilliant scheme from college days.
Or maybe he won't even reveal how his proprietary encryption scheme
works, but assures me it's a brilliant scheme and I should trust it.
I'm sure he believes that his algorithm is brilliant, but how can I
know that without seeing it?
In all fairness I must point out that in most cases these terribly
weak products do not come from companies that specialize in
cryptographic technology.
Even the really good software packages, that use the DES in the
correct modes of operation, still have problems. Standard DES uses a
56-bit key, which is too small by today's standards, and may now be
easily broken by exhaustive key searches on special high-speed
machines. The DES has reached the end of its useful life, and so has
any software package that relies on it.
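A back-of-the-envelope calculation makes the point about the 56-bit key. The search rate below is an illustrative assumption, not a figure from the message:

```python
# Exhaustive search of the DES keyspace, assuming a (hypothetical)
# machine trying one billion keys per second.
keyspace = 2 ** 56
keys_per_second = 10 ** 9          # assumed rate, for illustration
seconds = keyspace / keys_per_second
years = seconds / (365 * 24 * 3600)
print(f"{keyspace} keys, roughly {years:.1f} years at 1e9 keys/s")
```

At that rate a single machine finishes in a couple of years, and the search parallelizes perfectly across many machines, which is why special-purpose hardware makes 56 bits untenable.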
There is a company called AccessData (87 East 600 South, Orem, Utah
84058, phone 1-800-658-5199) that sells a package for $185 that
cracks the built-in encryption schemes used by WordPerfect, Lotus
1-2-3, MS Excel, Symphony, Quattro Pro, Paradox, and MS Word 2.0. It
doesn't simply guess passwords-- it does real cryptanalysis. Some
people buy it when they forget their password for their own files.
Law enforcement agencies buy it too, so they can read files they
seize. I talked to Eric Thompson, the author, and he said his
program only takes a split second to crack them, but he put in some
delay loops to slow it down so it doesn't look so easy to the
customer. He also told me that the password encryption feature of
PKZIP files can often be easily broken, and that his law enforcement
customers already have that service regularly provided to them from
another vendor.
In some ways, cryptography is like pharmaceuticals. Its integrity
may be absolutely crucial. Bad penicillin looks the same as good
penicillin. You can tell if your spreadsheet software is wrong, but
how do you tell if your cryptography package is weak? The ciphertext
produced by a weak encryption algorithm looks as good as ciphertext
produced by a strong encryption algorithm. There's a lot of snake
oil out there. A lot of quack cures. Unlike the patent medicine
hucksters of old, these software implementors usually don't even know
their stuff is snake oil. They may be good software engineers, but
they usually haven't even read any of the academic literature in
cryptography. But they think they can write good cryptographic
software. And why not? After all, it seems intuitively easy to do
so. And their software seems to work okay.
Anyone who thinks they have devised an unbreakable encryption scheme
either is an incredibly rare genius or is naive and inexperienced.
Unfortunately, I sometimes have to deal with would-be cryptographers
who want to make "improvements" to PGP by adding encryption
algorithms of their own design.
I remember a conversation with Brian Snow, a highly placed senior
cryptographer with the NSA. He said he would never trust an
encryption algorithm designed by someone who had not "earned their
bones" by first spending a lot of time cracking codes. That did make
a lot of sense. I observed that practically no one in the commercial
world of cryptography qualified under this criterion. "Yes", he said
with a self assured smile, "And that makes our job at NSA so much
easier." A chilling thought. I didn't qualify either.
The Government has peddled snake oil too. After World War II, the US
sold German Enigma ciphering machines to third world governments.
But they didn't tell them that the Allies cracked the Enigma code
during the war, a fact that remained classified for many years. Even
today many Unix systems worldwide use the Enigma cipher for file
encryption, in part because the Government has created legal
obstacles against using better algorithms. They even tried to
prevent the initial publication of the RSA algorithm in 1977. And
they have squashed essentially all commercial efforts to develop
effective secure telephones for the general public.
The principal job of the US Government's National Security Agency is
to gather intelligence, principally by covertly tapping into people's
private communications (see James Bamford's book, "The Puzzle
Palace"). The NSA has amassed considerable skill and resources for
cracking codes. When people can't get good cryptography to protect
themselves, it makes NSA's job much easier. NSA also has the
responsibility of approving and recommending encryption algorithms.
Some critics charge that this is a conflict of interest, like putting
the fox in charge of guarding the hen house. NSA has been pushing a
conventional encryption algorithm that they designed, and they won't
tell anybody how it works because that's classified. They want
others to trust it and use it. But any cryptographer can tell you
that a well-designed encryption algorithm does not have to be
classified to remain secure. Only the keys should need protection.
How does anyone else really know if NSA's classified algorithm is
secure? It's not that hard for NSA to design an encryption algorithm
that only they can crack, if no one else can review the algorithm.
Are they deliberately selling snake oil?
There are three main factors that have undermined the quality of
commercial cryptographic software in the US. The first is the
virtually universal lack of competence of implementors of commercial
encryption software (although this is starting to change since the
publication of PGP). Every software engineer fancies himself a
cryptographer, which has led to the proliferation of really bad
crypto software. The second is the NSA deliberately and
systematically suppressing all the good commercial encryption
technology, by legal intimidation and economic pressure. Part of
this pressure is brought to bear by stringent export controls on
encryption software which, by the economics of software marketing,
has the net effect of suppressing domestic encryption software. The
other principal method of suppression is the granting of all the
software patents for all the public key encryption algorithms to a
single company, affording a single choke point to suppress the spread
of this technology. The net effect of all this is that before PGP
was published, there was almost no highly secure general purpose
encryption software available in the US.
I'm not as certain about the security of PGP as I once was about my
brilliant encryption software from college. If I were, that would be
a bad sign. But I'm pretty sure that PGP does not contain any
glaring weaknesses (although it may contain bugs). The crypto
algorithms were developed by people at high levels of civilian
cryptographic academia, and have been individually subject to
extensive peer review. Source code is available to facilitate peer
review of PGP and to help dispel the fears of some users. It's
reasonably well researched, and has been years in the making. And I
don't work for the NSA. I hope it doesn't require too large a "leap
of faith" to trust the security of PGP.
-- Phil Zimmermann, in the PGP manual.
IPG Sales writes:
> 1. The OTPs are generated on a standalone diskette only system,
They are not One Time Pads. They are keys for a random number
generator. Your continued assertion that they are One Time Pads is
fraudulent. They are not one time pads by any definition ever
previously used.
Furthermore, as has been stated, it is completely unacceptable for
keys to be generated by third parties.
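The distinction is not pedantic, and a short sketch makes it concrete (the names and key value here are illustrative, not taken from IPG's product): if the "pad" comes out of a deterministic generator seeded by a short key, the entire pad is a function of that key. That is a stream cipher, and its security is bounded by the key size, not the pad length.

```python
import random

def pad_from_key(key: int, length: int) -> bytes:
    # Deterministic "pad" expanded from a short key -- this is a
    # stream-cipher keystream, not one-time-pad key material.
    rng = random.Random(key)
    return bytes(rng.randrange(256) for _ in range(length))

message_len = 1_000_000
pad = pad_from_key(key=0xC0FFEE, length=message_len)

# A true one-time pad would require a million truly random key bytes.
# Here the whole million-byte "pad" is reproducible from a few bytes
# of key, so an attacker only has to search the key, not the pad.
assert pad == pad_from_key(key=0xC0FFEE, length=message_len)
assert len(pad) == message_len
```

A one-time pad, by definition, uses truly random key material as long as the message and never reused; anything expanded from a seed forfeits the information-theoretic guarantee that the name promises.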
> 2. From there, they go to QC - we perform:
The QC you perform is irrelevant.
The system you sell is insecure in a practical sense, likely uses an
insecure PRNG, and uses names and makes claims that come very close to
being fraudulent. It is harder, not easier, to manage keys with your
system that supposedly "eliminates key management", and you don't
even have any shame about the fact that you are ignorant of the field
you work in.
> Ralph
Perry