1996-12-20 - Re: Executing Encrypted Code

Header Data

From: ph@netcom.com (Peter Hendrickson)
To: Hal Finney <cypherpunks@toad.com>
Message Hash: a18bd973ada256cc788059cf1b8327ccb42862b004d872aad41c149bdfcc1362
Message ID: <v02140b04aee09fe91fdf@[192.0.2.1]>
Reply To: N/A
UTC Datetime: 1996-12-20 20:26:26 UTC
Raw Date: Fri, 20 Dec 1996 12:26:26 -0800 (PST)

Raw message

From: ph@netcom.com (Peter Hendrickson)
Date: Fri, 20 Dec 1996 12:26:26 -0800 (PST)
To: Hal Finney <cypherpunks@toad.com>
Subject: Re: Executing Encrypted Code
Message-ID: <v02140b04aee09fe91fdf@[192.0.2.1]>
MIME-Version: 1.0
Content-Type: text/plain


At 7:43 AM 12/20/1996, Hal Finney wrote:
>> The manufacturer is going to publish a list of ALL of the public keys?
>> We're talking one key per chip, right?  Isn't that an AWFUL lot of
>> keys, like, in the millions range?

> Probably an easier way would be for the chip manufacturer to issue a
> key certificate (signature) on the chip keys.  Then it is a trivial
> matter for any software manufacturer to verify that a proffered chip
> key is legit; just check the cert.

Now why didn't I think of that!
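
To make the cert idea concrete, here is a minimal sketch in Python,
assuming Ed25519 keys and the "cryptography" package; every name in it
is made up for illustration, not taken from any real chip scheme.

  from cryptography.exceptions import InvalidSignature
  from cryptography.hazmat.primitives.asymmetric.ed25519 import (
      Ed25519PrivateKey, Ed25519PublicKey,
  )
  from cryptography.hazmat.primitives.serialization import (
      Encoding, PublicFormat,
  )

  def raw(pub: Ed25519PublicKey) -> bytes:
      # Raw 32-byte encoding of a public key.
      return pub.public_bytes(Encoding.Raw, PublicFormat.Raw)

  # At fab time, the manufacturer signs each chip's public key.
  mfr_key = Ed25519PrivateKey.generate()
  chip_key = Ed25519PrivateKey.generate()   # lives inside one CPU
  chip_cert = mfr_key.sign(raw(chip_key.public_key()))

  # Later, any software vendor checks a proffered chip key against
  # the manufacturer's well-known public key: just check the cert.
  def chip_key_is_legit(chip_pub: bytes, cert: bytes) -> bool:
      try:
          mfr_key.public_key().verify(cert, chip_pub)
          return True
      except InvalidSignature:
          return False

  assert chip_key_is_legit(raw(chip_key.public_key()), chip_cert)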

>>> One approach is for the manufacturer to authenticate software submitted
>>> by approved vendors.  The vendors are then tasked with encrypting it
>>> for the correct processor.

>> I'm not sure the "approved" bit would go over too well... one idea
>> would be to license the compiler writers, who would build the
>> encryption into compilers.  It's still not horribly great, but
>> better.

> Hey, it's a free world, right?  Some people only run authenticated
> code from big companies; other people turn off the authentication
> bit in the CPU and can run any old thing they stumble across on the net.
> Everybody's happy.

Maybe not everybody.  My scheme would not let you turn off the
authentication bit.  That means that if somebody does find a way to get
at the secret key, they still can't run the code without doing
something expensive to each particular processor.  Basically, it won't
be worth the trouble.

> The first group doesn't have to worry about viruses,
> or at least they have somebody to sue if they see one, and the second
> group gets to run all the freeware and PD code they can today.

The second group would have to buy a different processor altogether,
the way I proposed it.  But, that does not seem unreasonable.  One
could even imagine systems which have a "free" processor and a
decrypting processor.  While they would have to have completely different
instruction sets, they could be pin compatible.

>> Right; the only reason I could see people using this would be for
>> economic reasons.

> Yes, I think this is a point often missed in these discussions.  People
> say, why would I want a CPU which will limit the software I can run,
> something which will let a software maker give me a version of his
> program which will only run on my CPU and which I have no ability to
> share with others?  What's in it for me?

> The answer presumably is that the software manufacturer will sell software
> with such limits for much less than he will sell unlimited software.  That's
> because software piracy is such a major problem, and this way he can be
> protected against piracy of this copy of his program.  So people with
> these CPU's can buy their software a lot cheaper.

I agree with all of this.  Some people might be happier if they think
of this as just another kind of agreement that people can make.  Right
now the agreement between software developer and customer is weak
because piracy is so easy.  We rely a little bit on the law and a whole
lot on the integrity of customers.  Really, that isn't ideal.  It would
be nice if people could make strong agreements to write software.

My judgment is that software piracy is less of a problem in the U.S.
than it is elsewhere.  My impression is that there are many other
countries in the world where the whole "I should help pay for the
development of this software because I am using it" idea just doesn't
really show up on the screen.

This processor would make it possible to sell software to the whole
world without really worrying too much about how well each government
enforces its laws or how ethical the people in foreign countries are.

> Of course the big downside is that the track record of tamper resistant
> hardware has not been too strong lately!  If a system like this gets into
> widespread use and somebody finds out that shooting X-rays at the chip
> exposes its secret key, you've got a big problem.

Processors are really only good for three years or so, so the vendor
is only betting that attacks won't become widespread within that
period.  That is not a perfectly safe bet, but it isn't bad.  Attacks
may exist, but so long as they are expensive or hard to generalize to
all processors, the software vendors are safe.

Peter Hendrickson
ph@netcom.com
