1992-11-26 - Re: chip verification ( Was: Tollhouse Cookies :-)

Header Data

From: gnu
To: Richard Childers <rchilder@us.oracle.com>
Message Hash: 64e086f828eb224341a4c48ef72502166d2abe172820d00bb9b1862f3f409bc3
Message ID: <9211262140.AA12038@toad.com>
Reply To: <9211260206.AA08221@rchilder.us.oracle.com>
UTC Datetime: 1992-11-26 21:40:36 UTC
Raw Date: Thu, 26 Nov 92 13:40:36 PST

Raw message

From: gnu
Date: Thu, 26 Nov 92 13:40:36 PST
To: Richard Childers <rchilder@us.oracle.com>
Subject: Re: chip verification ( Was: Tollhouse Cookies :-)
In-Reply-To: <9211260206.AA08221@rchilder.us.oracle.com>
Message-ID: <9211262140.AA12038@toad.com>
MIME-Version: 1.0
Content-Type: text/plain


> It seems to me that an ostensibly digital device with a fixed number
> of pins could be regarded as a finite state system, and systematically
> analyzed accordingly, i.e., traverse the set of possible combinations of
> pins and signal levels and verify that it behaves in accord with
> publicly available specifications.

This is not useful, because devices can have internal state.  For example,
a device could respond to an input the same way 1000 times in a row,
and then do something different when an internal counter reaches 1001.
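
To make that concrete, here is a minimal sketch in C.  The device()
function, its trigger count, and the test are invented for illustration;
this doesn't describe any real part.

/* Hypothetical sketch only: invented names and a made-up trigger count. */
#include <stdio.h>

static unsigned long counter = 0;   /* hidden internal state, invisible at the pins */

/* Documented behaviour: a plain inverter.  The hidden counter makes it
 * do something else after 1000 operations. */
static int device(int input)
{
    counter++;
    if (counter > 1000)
        return input;        /* deviant behaviour after the trigger */
    return !input;           /* documented behaviour */
}

int main(void)
{
    unsigned long i;
    int ok = 1;

    /* "Exhaustive" pin-level test: try both input values once each.
     * It sees only the documented behaviour and passes the part. */
    if (device(0) != 1 || device(1) != 0)
        ok = 0;
    printf("pin-level test: %s\n", ok ? "PASS" : "FAIL");

    /* Keep the part in service and the deviation eventually appears. */
    for (i = 0; i < 1200; i++)
        if (device(0) != 1) {
            printf("first deviation at operation %lu\n", counter);
            break;
        }
    return 0;
}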

I've been employed writing test vectors for chips.  It *is* possible
to write a set of tests that will verify that a piece of hardware
matches its design (which, BTW, is software, stored on a disk in a
database).  Every chip vendor does this on every chip they make, to
detect manufacturing defects.  These tests will not detect hardware
that has been deliberately modified to pass the tests, but to do
something different in actual operation.  This is an obvious risk when
the tests are necessarily publicly available.
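
For illustration (every name and value below is invented), here is a
sketch of a part that matches its published test vectors exactly but
misbehaves on one input those vectors never exercise:

/* Hypothetical illustration only: the reference function, the trigger
 * input 0x42, and the vector set are all invented for this sketch. */
#include <stdio.h>

/* What the design -- the software on disk -- says the part should compute. */
static int reference(int in) { return in ^ 0x5a; }

/* A deliberately modified part: identical, except on one input that the
 * published vectors never exercise. */
static int modified(int in)  { return (in == 0x42) ? 0x00 : (in ^ 0x5a); }

int main(void)
{
    /* The publicly available test vectors.  The trigger value is, of
     * course, not among them. */
    static const int vectors[] = { 0x00, 0x01, 0x0f, 0x3c, 0xff };
    int i, pass = 1;

    for (i = 0; i < (int)(sizeof vectors / sizeof vectors[0]); i++)
        if (modified(vectors[i]) != reference(vectors[i]))
            pass = 0;

    printf("published vectors: %s\n", pass ? "PASS" : "FAIL");
    printf("input 0x42: reference = 0x%02x, modified = 0x%02x\n",
           reference(0x42), modified(0x42));
    return 0;
}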

Of course, if a particular form of modification is suspected, people
can write new tests that look for it.  But it becomes like the virus
protection game:  looking for the signatures of known viruses doesn't
protect you from unknown ones.  There's no proven way to detect all
modifications.  

	John Gilmore

PS: See also Ken Thompson's paper, "Reflections on Trusting Trust",
which was published as an ACM Turing Award lecture among other places.
Your CAD software could have been modified so that it inserts a mod
into your chip that doesn't appear in the chip's source code.  The
compiler used to build the CAD software could've been modified to
insert the above mod into your CAD software binaries as they are
compiled.  A compiler binary used to build the *compiler* sources into
binaries could have been modified to insert the compiler mod described
above.  Even if you have full sources, you can't trust the system --
you have to trust the people who bootstrapped it, and all the people
who wrote the tools *they* used.  "Proving" anything becomes very slippery
here.
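
Here is a compressed, purely illustrative caricature of the mechanism
Thompson describes; the helper names and the crude matching on file
names are invented, since no real compiler works this way.

/* Illustrative caricature of the trusting-trust trick; all names invented. */
#include <stdio.h>
#include <string.h>

static void translate(const char *src)      { printf("compiling %s\n", src); }
static void insert_backdoor(void)           { printf("  ...plus a login backdoor\n"); }
static void insert_this_very_trick(void)    { printf("  ...plus a copy of this trick\n"); }

static void compile(const char *source_name)
{
    translate(source_name);
    if (strstr(source_name, "login.c"))
        insert_backdoor();           /* mod appears nowhere in login's source */
    if (strstr(source_name, "cc.c"))
        insert_this_very_trick();    /* mod appears nowhere in the compiler's source */
}

int main(void)
{
    compile("login.c");   /* gets the backdoor */
    compile("cc.c");      /* gets the machinery that keeps inserting it */
    compile("ls.c");      /* compiled honestly */
    return 0;
}

The point, as Thompson makes it, is that the trick survives even after
the malicious lines are removed from the compiler's source, because the
already-compiled binary keeps re-inserting them.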




