From: dat@ebt.com (David Taffs)
To: tcmay@netcom.com
Message Hash: 7a0d80e98aa6affe101228b4f638a887076f17a530fe40276fb88d084c5bb823
Message ID: <9404240025.AA01558@helpmann.ebt.com>
Reply To: <199404232246.PAA28690@mail.netcom.com>
UTC Datetime: 1994-04-24 00:26:42 UTC
Raw Date: Sat, 23 Apr 94 17:26:42 PDT
From: dat@ebt.com (David Taffs)
Date: Sat, 23 Apr 94 17:26:42 PDT
To: tcmay@netcom.com
Subject: Re: T-Shirts, Neil Young, Asilomar, and Smalltalk
In-Reply-To: <199404232246.PAA28690@mail.netcom.com>
Message-ID: <9404240025.AA01558@helpmann.ebt.com>
MIME-Version: 1.0
Content-Type: text/plain
Pardon me for getting on a soapbox (again)...
T. C. May, for whom I have the utmost respect (and whose messages
are always enlightening and enjoyable), says (in part):
And "Byzantine
Agreement" (is this the same thing as Byzantine Generals?) shows up, I
recall, in some crypto papers.
Yes, they are the same. You have N mutually suspicious individuals
trying to reach consensus about something -- what protocol do you use?
I believe the seminal paper (or at least some really good, polished,
early work) was by Leslie Lamport, Robert Shostak, and Marshall Pease
at SRI International, but I may be wrong.
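For the curious, here is a rough Python sketch of the "oral messages"
algorithm OM(m) from that line of work. The four-general setup, the
choice of which general is the traitor, and the attack/retreat values
are just illustrative assumptions for the demo, not anything taken
from a particular paper's code:

  from collections import Counter

  N = 4          # total generals: 3m + 1, with m = 1 traitor tolerated
  TRAITOR = 2    # which general is disloyal (an assumption for this demo)

  def send(src, dst, value):
      # What dst actually hears when src "sends" value.  A traitor may
      # tell different generals different things.
      if src == TRAITOR:
          return 'attack' if dst % 2 == 0 else 'retreat'
      return value

  def om(m, commander, lieutenants, value):
      # Oral-messages recursion OM(m); returns {lieutenant: decided value}.
      received = {lt: send(commander, lt, value) for lt in lieutenants}
      if m == 0:
          return received
      decided = {}
      for lt in lieutenants:
          votes = [received[lt]]
          for other in lieutenants:
              if other == lt:
                  continue
              # 'other' relays what it heard, acting as commander in OM(m-1).
              sub = om(m - 1, other,
                       [x for x in lieutenants if x != other],
                       received[other])
              votes.append(sub[lt])
          decided[lt] = Counter(votes).most_common(1)[0][0]
      return decided

  lieutenants = [g for g in range(N) if g != 0]
  print(om(1, 0, lieutenants, 'attack'))

With one traitor among four generals, the loyal lieutenants (1 and 3)
still settle on the loyal commander's order, which is the point of the
3m+1 bound.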
> Objects are Great; C++ (using objects, in I believe the way you mean)
> is clearly the language of choice for virtually the entire
Yes, of course this is what I meant. That's why I mentioned the
Smalltalk approach. (I won't get into issues of performance of C++
over Smalltalk and Lisp systems...my contention is that there's a vast
amount of computer power out there and a (relative) shortage of good
programmers and their time, and that this implies that only truly
time-critical things or many-times-replicated programs warrant writing
in lower-level languages. A religious point, no doubt.)
Also a practical (== economic) point. When I worked at Mentor Graphics
(MGC), I was amazed at the enormous percentage of effort devoted to
optimization of our products (MGC builds the software to help design
circuits that go in workstations that run MGC software that helps
design circuits...). The _entire company_ (many hundreds of engineers)
spent what amounted to _years_ making a recent release small enough and fast
enough to be commercially viable (luckily for me and them they
succeeded -- of course, there were bug fixes and some enhancements
added during the same time period).
At MGC and now at EBT, efficiency (= responsiveness, = salability) of the
delivered product is a virtually paramount goal, right up there with enough
functionality. If functionality cannot be delivered with adequate efficiency,
then nobody will buy it (except a few leading edge weirdos), and you go broke
(MGC lost big bucks during this time period, and experienced two or
three waves of layoffs).
If anybody can afford large, expensive workstations to improve the
productivity of their superachievers, it is computer manufacturers and
their circuit designers (one of the highest paid engineering fields I
know of). Their whole company depends (you may have guessed what I'm
about to say) on the efficiency (production efficiency and efficiency
in their target application) of the chips they are producing, for which
MGC tools were (at least) the primary design vehicle.
And yet it was cost effective to have me and many other engineers
(also comparatively highly paid, but not compared to circuit designers
I'm sure) spend several years trying to reduce the size of the object
code (and working data structure size) for the tools.
Earlier, when MGC was in the desktop publishing business for a while
(which is where I was most of the time), efficiency was a major,
major concern. Keeping the size of data structures and code to a
minimum was well worth the effort it took to design more complex
systems. Every customer seemed to really care how fast our product
ran, which essentially translated into how much physical memory it
took to run the product. One of the major competitive advantages of
our (now discontinued) product was that it handled extremely large
documents relatively efficiently. But customers were always asking
to make certain operations more efficient, and this was often on
their top N list of enhancements.
So, even using a "lower level language" like C++, even for a high
end programming shop like MGC, even for not-many-times-replicated
programs (I don't know how many seats MGC has installed, but it is
somewhere in the tens of thousands), memory space was at a premium.
I still _cannot believe_ that, after all the progress semiconductor
manufacturers have made in the past 30 years, they cannot
manufacture enough RAM cheaply enough to hold our software. This
is truly INCREDIBLE! RAMs are still (at least as of a year or two
ago) sufficiently expensive that a significant fraction (maybe 1/3)
of programming effort must be wasted merely trying to keep memory
utilization as small as possible.
Ask how much time DBMS vendors spend on optimizations; it is huge!
(Arguably, it is their entire business.) Compiler writers -- same
thing (I did this in a previous job too). GUIs have to be speedy
too, and people I know spend a lot of time adding performance
hacks to speed them up. For real tools used in real applications,
apparently customer expectations have increased _significantly
faster_ than our ability to manufacture semiconductor components.
People have always said that "sufficient" computing capacity (or
network capacity, or what have you) will be Here Real Soon Now(tm),
but it hasn't happened yet, and I'm not sure it ever will in the really
critical applications where the rubber meets the road (and computer
circuit design is one of them -- data retrieval, publishing, and
networking are also).
Of course, this is all relative, and Internet clearly has the
bandwidth to support the CP list. My point is that in the real
world, efficiency (however measured) is still a major concern
for economic survival.
I predict that efficiency of cryptography will be important,
and it will be a long while before enough computer power is
widely available to encrypt all data, sensitive or not (i.e.
before cryptography is cheap enough that nobody has to worry
about whether or not to use it).
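As a back-of-the-envelope illustration (a sketch only: the modulus
below is just a random odd number standing in for a real key, and the
byte-wise XOR pass is a stand-in for a cheap symmetric cipher), here
is a Python comparison of one RSA-style modular exponentiation per
128-byte block against a trivial pass over the same data:

  import random
  import time

  BITS = 1024
  n = random.getrandbits(BITS) | (1 << (BITS - 1)) | 1   # toy odd modulus
  d = random.getrandbits(BITS)                           # full-width exponent
  m = random.getrandbits(BITS - 1)                       # toy "message" block

  t0 = time.perf_counter()
  for _ in range(200):
      pow(m, d, n)                     # the expensive public-key-style step
  t1 = time.perf_counter()

  data = bytes(range(128))             # the same 128 bytes of "plaintext"
  t2 = time.perf_counter()
  for _ in range(200):
      bytes(b ^ 0x5A for b in data)    # toy stand-in for a cheap cipher pass
  t3 = time.perf_counter()

  print("modexp: %.3f s   xor pass: %.3f s   (200 blocks each)"
        % (t1 - t0, t3 - t2))

The sizable per-block gap is the kind of thing that makes "just
encrypt everything" an economic question and not only a policy one.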
> Food for thought. I'm wondering if a project to implement a kind of
> "Digital Money World," perhaps in SmalltalkAgents, wouldn't be an
> interesting project. (Many will probably tell me that a collection of
> Perl scripts would be more "portable" and more useful to the current
> Unixcentric community....something I'd like to see more discussion
> of.)
>
> I suspect the framework of choice would be some sort of MOO or MUD. Of
> course, once it hit production status, then transliteration into Perl
> install scripts would be appropriate.
I would agree, except the history of "develop it in an
ultra-high-level language/environment and then port it later" has not
been too encouraging: for whatever and various reasons, the ports
rarely take place.
Right. Remember, Fred Brooks (in his classic on software engineering
_The Mythical Man Month_) says to plan to throw one away. So you build
the first one, and instead of porting it you redesign it from scratch.
(Of course, then you might want to worry about his "second-system
effect".)
> Of course, we'd better get strong crypto distributed before the Second
> Coming -- you think the current US government is involved in a power
> grab, you just wait!!! This new government will really know how to
> take care of non-conformists -- Waco is nothing compared to what they
> are planning (read: fiery brimstone)...
You'll find many on this list who agree with every point here.
I hope my implied smiley was apparent here, and the McElwaine-like addendum
(deleted by Tim) was hopefully enough to convey my true attitude...
> I wonder if Jesus can create a number so large he can't factor it?
>
I haven't found one yet.
What haven't you found -- a number you can't factor? Or a number
that Jesus can't factor? (I bet at this moment there are a lot of
them, for example "12".) Or a number that your deity (if any) can't
factor? Or is this an implied-smiley-bearing reference to a potential
delusion of grandeur on your part? Or are you and he really working
on this problem collaboratively, in some metaphysical domain?
If you are saying that you can't find a "Jesus" who can create a
number so large he can't factor it, I would tend to strongly agree
with you. On the other hand, virtually every person who ever lived
can (with a little coaching, perhaps) create a number they can't
factor, and there are plenty of living people named Jesus.
Maybe it is just because you aren't looking in the right places... :-)
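In case anyone wants to try the "create" half, a quick Python sketch:
multiply two probable primes found with a standard Miller-Rabin test.
The bit sizes here are arbitrary, and this is illustration, not key
generation:

  import random

  def is_probable_prime(n, rounds=20):
      # Standard Miller-Rabin probabilistic primality test.
      if n < 2:
          return False
      for p in (2, 3, 5, 7, 11, 13):
          if n % p == 0:
              return n == p
      d, r = n - 1, 0
      while d % 2 == 0:
          d //= 2
          r += 1
      for _ in range(rounds):
          a = random.randrange(2, n - 1)
          x = pow(a, d, n)
          if x in (1, n - 1):
              continue
          for _ in range(r - 1):
              x = pow(x, 2, n)
              if x == n - 1:
                  break
          else:
              return False
      return True

  def random_prime(bits):
      # Draw random odd numbers of the requested width until one passes.
      while True:
          candidate = random.getrandbits(bits) | (1 << (bits - 1)) | 1
          if is_probable_prime(candidate):
              return candidate

  n = random_prime(256) * random_prime(256)   # trivial to create...
  print(n)                                    # ...not so trivial to factor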
> Pardon my excursion into various religious topics -- arguably this
> list is also about religion ("religion is what you do" -- "cypherpunks
> write code" -- belief that strong crypto should be widely distributed
> is certainly a religious tenet for some on this list). I hope I
> haven't offended anybody important...
I enjoyed your comments, for one.
Thanks -- I always enjoy yours.
--Tim May
--
dat@ebt.com (David Taffs)