1996-12-20 - Re: Executing Encrypted Code

Header Data

From: Hal Finney <hal@rain.org>
To: cypherpunks@toad.com
Message Hash: ee5b9a0c5b2c3ff356b871072df78aa78f6573b57e7d8f91fb43ca5c23f3e91f
Message ID: <199612201543.HAA02076@crypt.hfinney.com>
Reply To: N/A
UTC Datetime: 1996-12-20 19:39:24 UTC
Raw Date: Fri, 20 Dec 1996 11:39:24 -0800 (PST)

Raw message

From: Hal Finney <hal@rain.org>
Date: Fri, 20 Dec 1996 11:39:24 -0800 (PST)
To: cypherpunks@toad.com
Subject: Re: Executing Encrypted Code
Message-ID: <199612201543.HAA02076@crypt.hfinney.com>
MIME-Version: 1.0
Content-Type: text/plain


From: Ben Byer <root@bushing.plastic.crosslink.net>
> [Quoting Peter Hendrickson:]
> > The software vendor would be wise to check that the public key was
> > legal.  It would be a simple matter for the manufacturer to publicize
> > all public keys that had been installed on chips.
>
> The manufacturer is going to publish a list of ALL of the public keys?
> We're talking one key per chip, right?  Isn't that an AWFUL lot of
> keys, like, in the millions range?

Probably an easier way would be for the chip manufacturer to issue a
key certificate (signature) on the chip keys.  Then it is a trivial
matter for any software manufacturer to verify that a proffered chip
key is legit; just check the cert.
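To see how cheap that check is, here is a minimal sketch of the vendor
side in Python.  Everything in it is hypothetical: the key names are
made up, and Ed25519 via the "cryptography" library just stands in for
whatever signature scheme the factory would really use.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # At the factory: one long-term manufacturer signing key, published once.
    factory_key = Ed25519PrivateKey.generate()
    factory_pub = factory_key.public_key()

    # Per chip: generate a keypair and have the factory sign the chip's
    # public key.  That signature is the "certificate" shipped with the chip.
    chip_key = Ed25519PrivateKey.generate()
    chip_pub_bytes = chip_key.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    chip_cert = factory_key.sign(chip_pub_bytes)

    # At the software vendor: no list of millions of chip keys, just one
    # signature verification against the manufacturer's published key.
    def chip_key_is_legit(pub_bytes, cert, factory_pub):
        try:
            factory_pub.verify(cert, pub_bytes)
            return True
        except InvalidSignature:
            return False

    assert chip_key_is_legit(chip_pub_bytes, chip_cert, factory_pub)

A forged chip key fails the verify, and the vendor never needs to see
the manufacturer's key list at all.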

> > One approach is for the manufacturer to authenticate software submitted
> > by approved vendors.  The vendors are then tasked with encrypting it
> > for the correct processor.
>
> I'm not sure the "approved" bit would go over too well... one idea
> would be to license the compiler writers, who would build the
> encryption into compilers.  It's still not horribly great, but
> better.

Hey, it's a free world, right?  Some people only run authenticated
code from big companies; other people turn off the authentication
bit in the CPU and can run any old thing they stumble across on the net.
Everybody's happy.  The first group doesn't have to worry about viruses,
or at least they have somebody to sue if they see one, and the second
group gets to run all the freeware and PD code they can today.
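At bottom the whole thing is one check at load time.  A toy sketch in
Python (hypothetical names throughout; the auth_bit_on flag plays the
role of the bit you would flip off):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def cpu_will_run(code, signature, vendor_pub, auth_bit_on):
        # Bit off: run any old thing you stumble across on the net.
        if not auth_bit_on:
            return True
        # Bit on: only code carrying a valid signature from the vendor key.
        try:
            vendor_pub.verify(signature, code)
            return True
        except InvalidSignature:
            return False

    vendor_key = Ed25519PrivateKey.generate()
    blessed = b"\x90" * 16                  # code signed by the vendor
    freeware = b"\xcc" * 16                 # unsigned code off the net
    sig = vendor_key.sign(blessed)

    assert cpu_will_run(blessed, sig, vendor_key.public_key(), True)
    assert not cpu_will_run(freeware, b"\x00" * 64, vendor_key.public_key(), True)
    assert cpu_will_run(freeware, b"", vendor_key.public_key(), False)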

> Right; the only reason I could see people using this would be for
> economical reasons.

Yes, I think this is a point often missed in these discussions.  People
say, why would I want a CPU which will limit the software I can run,
something which will let a software maker give me a version of his
program which will only run on my CPU and which I have no ability to
share with others?  What's in it for me?

The answer presumably is that the software manufacturer will sell software
with such limits for much less than he will sell unlimited software.  That's
because software piracy is such a major problem, and this way he is
protected against piracy of that particular copy of his program.  So
people with these CPUs can buy their software a lot cheaper.

Now if you only use pirated software anyway, which you get for free, then
obviously this is not much of an incentive.  It is only for people who
pay for their software.  But that is a significant market.
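For the curious, the binding of one copy to one CPU could work roughly
like the sketch below (Python again, and hypothetical all the way down:
an X25519 chip keypair, a fresh symmetric key per copy; this is not any
particular vendor's scheme).

    import os
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric.x25519 import (
        X25519PrivateKey, X25519PublicKey)
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def _program_key(shared_secret):
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"per-chip program key").derive(shared_secret)

    def encrypt_for_chip(program, chip_pub):
        # Vendor side: ephemeral key agreement against the chip's
        # certified public key, then encrypt the program under the result.
        eph = X25519PrivateKey.generate()
        key = _program_key(eph.exchange(chip_pub))
        nonce = os.urandom(12)
        eph_pub = eph.public_key().public_bytes(
            serialization.Encoding.Raw, serialization.PublicFormat.Raw)
        return eph_pub, nonce, AESGCM(key).encrypt(nonce, program, None)

    def decrypt_on_chip(eph_pub, nonce, blob, chip_priv):
        # Chip side: only the private key buried in the CPU can rederive
        # the program key, so this copy is useless on any other chip.
        key = _program_key(chip_priv.exchange(
            X25519PublicKey.from_public_bytes(eph_pub)))
        return AESGCM(key).decrypt(nonce, blob, None)

    chip_priv = X25519PrivateKey.generate()      # lives inside the CPU
    sealed = encrypt_for_chip(b"program image", chip_priv.public_key())
    assert decrypt_on_chip(*sealed, chip_priv) == b"program image"

Which also makes the worry below concrete: everything hangs on that
chip private key never leaving the die.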

Of course the big downside is that the track record of tamper-resistant
hardware has not been too strong lately!  If a system like this gets into
widespread use and somebody finds out that shooting X-rays at the chip
exposes its secret key, you've got a big problem.

Hal