1995-09-26 - RE: More on “Entropy”

Header Data

From: tomw@orac.engr.sgi.com (Tom Weinstein)
To: cypherpunks@toad.com
Message Hash: 320b7a76827a858c5af07f5652f4c37c2f9e3f46a030c1ca91691ad8ad6f4eb8
Message ID: <199509261622.JAA02167@orac.engr.sgi.com>
Reply To: N/A
UTC Datetime: 1995-09-26 16:24:33 UTC
Raw Date: Tue, 26 Sep 95 09:24:33 PDT

Raw message

From: tomw@orac.engr.sgi.com (Tom Weinstein)
Date: Tue, 26 Sep 95 09:24:33 PDT
To: cypherpunks@toad.com
Subject: RE: More on "Entropy"
Message-ID: <199509261622.JAA02167@orac.engr.sgi.com>
MIME-Version: 1.0
Content-Type: text/plain


In article <DFHJDD.Ap@sgi.sgi.com>, David Van Wie <dvw@hamachi.epr.com> writes:

> David Van Wie wrote:

>>> The entropy E is defined by the sum across n states of -P_i log_2(P_i),

> Timothy C. May wrote:

>> Hah! Another physicist converted to the information-theoretic view of
>> entropy!

> Indeed.  I was able to track down the literature, and it is most 
> interesting.  I am still a little bit skeptical of the "superset including 
> thermodynamic entropy" school of thought, but I haven't finished reading all 
> of the materials yet!  Clearly, the IT "version" of entropy is a well 
> defined and useful thing....

We used this formulation of entropy in Statistical Mechanics.  It's
especially useful in Quantum Thermo where you can actually enumerate all
of the states instead of relying on probabilistic arguments.
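[Archive note: the entropy definition quoted above can be sketched directly. This is a minimal illustration of the formula E = -sum(P_i * log2(P_i)) as given in the thread, not code from the original message; the function name and examples are the editor's.]

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum(p_i * log2(p_i)) over states with p_i > 0.

    Terms with p_i == 0 are skipped, since lim p->0 of p*log2(p) is 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two equiprobable states (a fair coin) carry exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased source carries less entropy than a uniform one over the same states.
print(shannon_entropy([0.9, 0.1]))
```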

-- 
Sure we spend a lot of money, but that doesn't mean    |  Tom Weinstein
we *do* anything.  --  Washington DC motto             |  tomw@engr.sgi.com
