
Re: Entropy vs Random Bits




>Your use of the word random is incorrect:  The throw of a die is
>random, but only contains about 2.6 bits of entropy.

The throw isn't random; the data read from the die after it is thrown is 
random.  The use of the term in many of the postings I have read indicates 
the need for an "unpredictable" quantity in most cases.  This quantity may 
be drawn from a source that has entropy, but the quantity itself is random.

>> why invent new terms?  Why use them to mean at least two different
>> things?

>This is an old term of the art, a term of information theory:  We use
>the same word because entropy in information theory has the same
>measure as entropy in thermodynamics.
>
>In both cases the entropy, measured in bits, of an ensemble of
>possible states is the sum of -P(i) * lg[P(i)] over all the possible states.

In thermodynamics, counting states in this fashion is a dicey proposition, 
but I appreciate the clarification.  Still, it seems to me that the property 
"bits of entropy" is often substituted for the actual "bits of random data", 
and that is as puzzling as gathering the "entropy of cool steam"!  One can't 
_do_ anything with a dimensionless measurement.  By which I mean, the 
measure of a property of data is not the data itself, so the usage still 
seems odd at times.  However, your explanation does address some of the 
phrases I have seen.
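
For concreteness, here is how I read that sum (a small Python sketch of my 
own, not anything from your post); it reproduces the ~2.6 bits quoted above 
for a fair die:

    from math import log2

    def entropy_bits(probs):
        # sum of -P(i) * lg[P(i)] over all possible states, in bits
        return -sum(p * log2(p) for p in probs if p > 0)

    die = [1.0 / 6] * 6        # six equally likely faces
    print(entropy_bits(die))   # about 2.585 bits

The measure tells me how unpredictable the source is, but it is still not 
the random data itself, which was my point above.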

Does this mean that entropy is conserved in information theory?

dvw