RE: More on "Entropy"
At 5:29 PM 9/25/95, David Van Wie wrote:
>David Van Wie wrote:
>
>>>The entropy E is defined by the sum across n states of -P_i log_2(P_i),
>
>Timothy C. May wrote:
>
>>Hah! Another physicist converted to the information-theoretic view of
>>entropy!
>
>Indeed. I was able to track down the literature, and it is most
>interesting. I am still a little bit skeptical of the "superset including
>thermodynamic entropy" school of thought, but I haven't finished reading all
>of the materials yet! Clearly, the IT "version" of entropy is a well
>defined and useful thing....
Well, the more you adapt to the information theory point of view, the more
the Shannon-Kolmogoroff-Chaitin definitions become the natural ones, and the
more the whole "thermodynamic" definition of entropy will seem the odd
one out.
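To make the quoted definition concrete, here is a minimal sketch in Python
(mine, not from either of the original posts) of the sum across n states of
-P_i log_2(P_i):

    import math

    def shannon_entropy(probs):
        # E = sum over the n states of -P_i * log2(P_i), in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bits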
One is left with the conclusion that Gibbs-style entropy has _something_
fundamental to do with information theory, and can then consider what that
relationship may be.
But, perforce, one is left with the most basic interpretation of
algorithmic complexity: the complexity of a system is the length of the
shortest algorithm describing it. A "random" system is one which has no
shorter algorithmic description than itself.
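As a rough illustration (again my own sketch, using a general-purpose
compressor as a crude stand-in for the shortest description), a patterned
string compresses to almost nothing while a random one does not:

    import os
    import zlib

    def compressed_len(data):
        # zlib's output length is only an upper bound on the true
        # (uncomputable) shortest-description length
        return len(zlib.compress(data, 9))

    patterned = b"01" * 5000        # describable as "repeat '01' 5000 times"
    random_ish = os.urandom(10000)  # no description much shorter than itself

    print(compressed_len(patterned))   # a few dozen bytes
    print(compressed_len(random_ish))  # close to (or above) 10000 bytes

Compression only bounds the description length from above, of course; the
true algorithmic complexity is uncomputable.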
(The connection of this statement to IQ test questions about describing a
sequence is left as an IQ test question for the reader.)
--Tim May
---------:---------:---------:---------:---------:---------:---------:----
Timothy C. May | Crypto Anarchy: encryption, digital money,
[email protected] 408-728-0152 | anonymous networks, digital pseudonyms, zero
Corralitos, CA | knowledge, reputations, information markets,
Higher Power: 2^756839 | black markets, collapse of governments.
"National borders are just speed bumps on the information superhighway."