From: adwestro@ouray.cudenver.edu (Alan Westrope)
Date: Sun, 24 Sep 95 13:09:19 PDT
To: cypherpunks@toad.com
Subject: Entropy VI: "Shannon's Choice" (with apologies to Wm. Styron)
Message-ID: <JhbZwkkAsOII085yn@ouray.cudenver.edu>
MIME-Version: 1.0
Content-Type: text/plain
I'm always glad to see stuff here about entropy: it's a topic that
comes up in PGP 101 when locals ask, "How long should my passphrase be?"
and, as we've seen recently, failure to generate adequate entropy for
pseudo-random numbers is often the Achilles' Heel of otherwise solid
cryptosystems.
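N.B. -- a back-of-the-envelope answer to the passphrase question: if
each element of a passphrase is picked independently and uniformly at
random, the phrase carries length * log2(pool size) bits of entropy.
A quick Python sketch (the 7776 figure is the usual Diceware-style
word-list size; substitute the size of whatever dictionary you
actually use):

    # Upper-bound entropy of a passphrase whose elements are chosen
    # independently and uniformly at random.  Human-invented phrases
    # fall far short of this ceiling.
    from math import log2

    def passphrase_bits(n_elements, pool_size):
        # Each uniform, independent choice adds log2(pool_size) bits.
        return n_elements * log2(pool_size)

    print(passphrase_bits(8, 7776))  # 8 Diceware-style words: ~103.4 bits
    print(passphrase_bits(8, 95))    # 8 printable-ASCII chars: ~52.6 bits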
I found some info about Shannon's choice of the word "entropy" in an
unexpected source lately. I had some free time this week due to the
surprise snowstorm here, and used it to type in a few paragraphs.
This is intended less for immediate discussion than for the list's
archives, so that I can point folks to it in the future. And I'm
not going to cite the source, to avoid copyright infringement
nastygrams...if you know who the author is, keep it to yourself:
you'll only suffer public ridicule and loss of "reputation capital"
for Not Watching Enough TV!!! :-)
N.B. -- some of this impresses me as pompous bullshit, and I'm not in
agreement with it. But the story about von Neumann made it a worthwhile
read, at least to me.
==================================================================
I want now to turn to another juncture in the cascading bifurcations
that mark interpretations of Maxwell's Demon. The juncture occurs when
Leon Brillouin and Claude Shannon diverge in their opinions about what
the relationship between information and entropy should be. In Brillouin's
analysis of Maxwell's Demon, the Demon's information allowed him to sort
molecules, thus decreasing the system's entropy; but this information had
to be paid for by an even greater increase in entropy elsewhere in the
system. For Brillouin, then, information and entropy are opposites and
should have opposite signs. He emphasized the inverse connection
between information and entropy by coining "negentropy" (from negative
entropy) as a synonym for information.
To Shannon, an engineer at Bell Laboratories who published a two-part
paper that was to form the basis of modern information theory (1948),
information and entropy were not opposites. They were identical. When
Shannon devised a probability function that he identified with information,
he chose to call the quantity calculated by the function the "entropy"
of the message. Why he made this choice is unclear. Rumor has it that
von Neumann told Shannon to use the word because "no one knows what entropy
is, so in a debate you will always have the advantage." One could argue
that von Neumann's comment was only one element and that the choice of
"entropy" was overdetermined, with multiple factors leading to its
conflation with "information." On a conceptual level, an important
consideration was the similarity between Shannon's equation for
information and Boltzmann's equation for entropy. Because the two equations
had similar forms, it was tempting to regard the entities they defined
as the same. On the level of language, entropy was compelling because
it lent the legitimacy of a recognized scientific term to the concept
of information.
On a cultural level, Shannon's choice anticipated the contemporary
insight that proliferating information is associated with the
production of entropy. [...]
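[For concreteness -- this gloss is mine, not the excerpt's -- the two
equations being compared, in standard modern notation:

    H = -\sum_i p_i \log_2 p_i        (Shannon, bits per symbol)
    S = -k_B \sum_i p_i \ln p_i       (Gibbs form of Boltzmann's entropy)

Up to the constant k_B and the base of the logarithm, they are the
same function of the probabilities p_i, which is precisely the formal
similarity at issue.]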
Whatever the reasons for Shannon's choice, it is regarded by many
commentators within our scientific tradition as a scandal, for it
led to the (metaphoric) knotting together of concepts that are partly
similar and partly dissimilar. Typical is K. G. and J. S. Denbigh's
reaction in their careful study of the way the quantity defined by
Shannon's equation differs from thermodynamic entropy. Recounting the
story about von Neumann's advice, they write that thus, "confusion
entered in and von Neumann had done science a disservice!" Jeffrey S.
Wicken is even more explicit, calling Shannon's choice "loose language"
that served "the dark god of obfuscation." "As a result of its
independent lines of development in thermodynamics and information
theory, there are in science today two 'entropies,'" Wicken writes.
"This is one too many. It is not science's habit to affix the same
name to different concepts. Shared names suggest shared meanings, and
the connotative field of the old tends inevitably to intrude on the
denotative terrain of the new."
Clearly Wicken's concern is to restore scientific univocality by closing
off the ability of the information-entropy connection to act as a
metaphor rather than a congruence. Yet at the same time he admits that
shared language creates an inevitable "intrusion" into the "denotative
terrain" of one term by the "connotative field" of another. The problem
is more scandalous than he recognizes, for whenever a heuristic is
proposed, it necessarily uses "shared names" that cause scientific
denotation to be interpenetrated by cultural connotations. For what
else is language but "shared names"? As Wittgenstein has observed,
there are no private languages. Moreover, the distinction between
denotative and connotative language is itself part of the distinction
between language-as-vehicle and language-as-concept which metaphors,
and particularly self-reflexive metaphors, bring into question. To
turn Wicken's argument on its head, we might say he recognizes that
metaphors in general, and the information-entropy connection in
particular, directly threaten science's ability to separate ideas from
the language it uses to express them.
In his anxiety to suppress the metaphoric potential of Shannon's choice,
Wicken misses the richly complex and suggestive connections that were
instrumental in enabling a new view of chaos to emerge. By the simple
device of using "information" and "entropy" as if they were
interchangeable terms, Shannon's choice gave rise to decades of interpretative
commentary that sought to explain why information should be identified
with disorder rather than order. For the alliance between entropy and
information to be effective, information first had to be divorced from
meaning (a premise made explicit in Shannon's 1948 papers) and had to
be associated instead with novelty. Recall the random number generator,
mentioned earlier, that produces a tape we can read. No matter how long
we watch the tape, numbers keep appearing in unpredictable sequence.
From one point of view this situation represents chaos; from another,
maximum information.
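[An illustration of that last point, mine rather than the author's:
the per-symbol Shannon entropy of a random tape approaches the
log2(alphabet size) maximum, while a perfectly orderly tape scores
zero. A Python sketch, using frequency counts alone:

    # Empirical Shannon entropy in bits per symbol, computed from
    # symbol frequencies only -- sequential structure is ignored.
    from collections import Counter
    from math import log2
    import random

    def entropy(seq):
        n = len(seq)
        return sum(-(c / n) * log2(c / n) for c in Counter(seq).values())

    random.seed(1)
    tape_random  = [random.randrange(10) for _ in range(100_000)]  # unpredictable digits
    tape_ordered = [7] * 100_000                                   # perfectly predictable

    print(entropy(tape_random))   # ~3.32, near the log2(10) maximum
    print(entropy(tape_ordered))  # 0.0: total order, zero information

Note this is the crudest possible estimator -- frequency counts alone
cannot see higher-order structure in a sequence.]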
Once randomness was understood as maximum information, it was possible
to envision chaos (as Robert Shaw does) as the source of all that is
new in the world. Wicken is correct in noting that denotative and
connotative fields overlap; in the case of information, the connotation
that "intruded" upon the denotative field of chaos was complexity.
Whereas chaos had traditionally meant simply disorder, complexity
implied a mingling of symmetry with asymmetry, predictable periodicity
with unpredictable variation. As we have seen, chaotic or complex
systems are disordered in the sense that they are unpredictable, but
they are ordered in the sense that they possess recursive symmetries
that almost, but not quite, replicate themselves over time. The
metaphoric joining of entropy and information was instrumental in
bringing about these developments, for it allowed complexity to be
seen as rich in information rather than deficient in order.
Sources cited:
K. G. Denbigh and J. S. Denbigh, _Entropy in Relation to Incomplete
Knowledge_ (Cambridge University Press, 1985)
Jeffrey S. Wicken, "Entropy and Information: Suggestions for a
Common Language," _Philosophy of Science_ 54 (1987): 176-193
==================================================================
Alan Westrope <awestrop@nyx10.cs.du.edu>
__________/|-, <adwestro@ouray.cudenver.edu>
(_) \|-' 2.6.2 public key: finger / servers
PGP 0xB8359639: D6 89 74 03 77 C8 2D 43 7C CA 6D 57 29 25 69 23