Message Hash: d0df7403143101cb00065970777ed3cd5db1217c21dbc174fb0659a39a8b0882
UTC Datetime: 1996-04-20 10:06:24 UTC
From: daw@cs.berkeley.edu (David Wagner)
Date: Sat, 20 Apr 1996 18:06:24 +0800
To: cypherpunks@toad.com
Subject: Re: why compression doesn't perfectly even out entropy
In-Reply-To: <199604182255.PAA13373@jobe.shell.portal.com>
Message-ID: <4la5b6$1sb@joseph.cs.berkeley.edu>
MIME-Version: 1.0
Content-Type: text/plain
In article <199604182255.PAA13373@jobe.shell.portal.com>,
Hal <hfinney@shell.portal.com> wrote:
> So I think the lesson is that there is only one way to estimate entropy,
> and that is to study your source. I have to agree with Perry that this
> filtering concept is not the way to go. It is a red herring that lures
> you in the direction of automatic entropy estimation, and that is really
> not safe.
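
[Illustrative sketch, not part of the original exchange: the quoted argument can be seen concretely with a naive compression-based entropy estimator. A byte stream generated from a fixed, publicly known PRNG seed contains essentially no entropy from the attacker's point of view, yet it is nearly incompressible, so a compressibility-based estimate reports close to 8 bits per byte. The seed and lengths below are arbitrary choices; the sketch is in Python.]

  import random
  import zlib

  # Bytes from a PRNG seeded with a constant: anyone who knows the seed can
  # reproduce the stream exactly, so it carries ~0 bits of entropy for them,
  # even though it passes blind statistical tests.
  random.seed(42)
  predictable = bytes(random.getrandbits(8) for _ in range(10000))

  # Obviously structured data, for comparison.
  constant = b"A" * 10000

  def naive_entropy_estimate(data):
      # "Bits of entropy per byte" guessed from compressibility alone.
      return 8.0 * len(zlib.compress(data, 9)) / len(data)

  print("seeded PRNG output:", naive_entropy_estimate(predictable))  # ~8 bits/byte
  print("repeated byte:     ", naive_entropy_estimate(constant))     # ~0 bits/byte

  # The estimator cannot distinguish "unpredictable to everyone" from
  # "merely incompressible": measuring entropy requires a model of the
  # source, which is why studying the source is the only safe approach.
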
Excellent point! Very nicely put.
You've convinced me: I was looking at the problem the wrong way.
Thanks for correcting & educating me...
Appreciative of the "signal",
-- Dave Wagner