From: "Perry E. Metzger" <perry@piermont.com>
Date: Fri, 19 Apr 1996 15:31:18 +0800
To: Hal <hfinney@shell.portal.com>
Subject: Re: why compression doesn't perfectly even out entropy
In-Reply-To: <199604182255.PAA13373@jobe.shell.portal.com>
Message-ID: <199604190041.UAA08754@jekyll.piermont.com>
MIME-Version: 1.0
Content-Type: text/plain
Hal writes:
> The first is whether this mysterious black box, the entropy estimator,
> is really possible. In practice the only way to know how much entropy
> you've gotten is to have a model for how the data is being generated,
> and to deduce from that an estimate of the entropy rate. So the entropy
> estimator can't be a general-purpose calculation, but it must be one
> which is specifically chosen, developed and tuned for the specific source
> of entropy you are dealing with.
I couldn't possibly say that better. It's the central point.
> So I think the lesson is that there is only one way to estimate entropy,
> and that is to study your source. I have to agree with Perry that this
> filtering concept is not the way to go. It is a red herring that lures
> you in the direction of automatic entropy estimation, and that is really
> not safe.
Thank you; you are making the point far better than I did.
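To make that concrete, here is a toy sketch in Python. The i.i.d.
Bernoulli bit source and the use of zlib as a stand-in for a
"general-purpose" estimator are purely illustrative assumptions on my
part: if you know the model, the entropy rate falls straight out of
it, while a generic compressor can only hand you an upper bound it
cannot certify.

import math
import random
import zlib

def bernoulli_entropy(p):
    """Entropy rate (bits per symbol) of an i.i.d. Bernoulli(p) bit source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Model-based estimate: we *know* the source is Bernoulli(p), so the
# entropy rate follows directly from the model.
p = 0.1
n = 100000  # multiple of 8 so the bits pack cleanly into bytes
print("model entropy rate: %.4f bits/bit" % bernoulli_entropy(p))

# "General-purpose" estimate: pack a sample into bytes and see how far
# a generic compressor squeezes it.  The compressed size is an upper
# bound on the entropy of this sample -- it cannot certify that the
# source really contains that much unpredictability.
bits = [1 if random.random() < p else 0 for _ in range(n)]
packed = bytes(
    sum(bit << k for k, bit in enumerate(bits[i:i + 8]))
    for i in range(0, n, 8)
)
compressed = zlib.compress(packed, 9)
print("zlib estimate:      %.4f bits/bit" % (8.0 * len(compressed) / n))

On a run of this sketch the zlib figure should come out above the
model's roughly 0.47 bits/bit for p = 0.1, which is the direction the
argument predicts: without a model of the source, the "estimator" has
no way to know how much of the residual size is real entropy.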
.pm