Physical Information and Entropy

An easy way to understand the underlying unity between physical (thermodynamic) entropy and information-theoretic entropy is as follows: entropy is simply that portion of the (classical) physical information contained in a system of interest (whether the entire physical system or just a subsystem delineated by a set of possible messages) whose identity, as opposed to amount, is unknown from the point of view of a particular knower. This informal characterization corresponds both to von Neumann's formal definition of the entropy of a mixed quantum state (a statistical mixture of pure states; see von Neumann entropy) and to Claude Shannon's definition of the entropy of a probability distribution over classical signal states or messages (see information entropy). Incidentally, the credit for Shannon's entropy formula (though not for its use in an information-theoretic context) really belongs to Boltzmann, who derived it much earlier for use in his H-theorem of statistical mechanics. (Shannon himself acknowledges Boltzmann in his monograph.)
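
For reference, the two formal definitions mentioned above take the following standard forms (writing the logarithm base 2 so that information is measured in bits):

    H(p) = - \sum_i p_i \log_2 p_i            (Shannon entropy of a probability distribution p)
    S(\rho) = - \mathrm{Tr}(\rho \log_2 \rho)  (von Neumann entropy of a density operator \rho)

Boltzmann's H-functional has the same "sum of p log p" form, up to the choice of logarithm base and a factor of Boltzmann's constant k_B, which converts between bits and thermodynamic units such as joules per kelvin.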

Furthermore, even when the state of a system is known, we can say that the information in the system is still effectively entropy if that information is incompressible, that is, if there are no known or feasibly determinable correlations or redundancies between different pieces of information within the system. This definition can even be viewed as equivalent to the previous one (unknown information) if we take a meta-perspective and say that for observer A to "know" the state of system B means simply that there is a definite correlation between the state of A and the state of B. A meta-observer (that is, whoever is discussing the overall situation regarding A's state of knowledge about B) could then exploit this correlation to compress their own description of the joint system AB.
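
To make the meta-observer picture concrete, here is a minimal Python sketch that uses a general-purpose compressor as a rough stand-in for a shortest description (the sizes and variable names are arbitrary choices for illustration, not anything from the article): when A is perfectly correlated with B, a joint description of AB compresses to roughly half the length of two separate descriptions, even though each system viewed alone looks like pure entropy.

    import os
    import zlib

    # Illustrative stand-in: system B is a random (high-entropy) byte string, and
    # "observer" A is perfectly correlated with B; here, simply an identical copy.
    b_state = os.urandom(1 << 14)   # 16 KiB of effectively incompressible data
    a_state = b_state               # A "knows" B: a definite correlation

    # Described separately, A and B each look like pure entropy to the compressor.
    separate = len(zlib.compress(a_state)) + len(zlib.compress(b_state))

    # A meta-observer describing the joint system AB can exploit the correlation:
    # the second half of the joint description is redundant given the first half.
    joint = len(zlib.compress(a_state + b_state))

    print("separate descriptions:", separate, "bytes")   # roughly 32 KiB
    print("joint description:    ", joint, "bytes")      # roughly 16 KiB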

Because of this connection with algorithmic information theory, entropy can be described as the portion of a system's information capacity that is "used up", that is, unavailable for storing new information even after the existing information content has been compressed. The rest of a system's information capacity (aside from its entropy) might be called extropy, and it represents the part of the system's information capacity that is potentially still available for storing newly derived information. The fact that physical entropy is basically "used-up storage capacity" is a direct concern in the engineering of computing systems; for example, a computer must first remove the entropy from a given physical subsystem (eventually expelling it to the environment and emitting heat) before that subsystem can be used to store newly computed information.
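
The article does not quantify this engineering constraint, but its standard quantitative form is Landauer's principle: expelling one bit of entropy into an environment at temperature T dissipates at least k_B T ln 2 of heat. The following back-of-the-envelope calculation in Python illustrates the bound (the 1 GiB figure is just an arbitrary example):

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

    def landauer_bound_joules(bits_erased, temperature_kelvin=300.0):
        """Minimum heat dissipated to expel `bits_erased` bits of entropy
        into an environment at the given temperature (Landauer's bound)."""
        return bits_erased * K_B * temperature_kelvin * math.log(2)

    # Erasing 1 GiB of fully entropic storage at roughly room temperature:
    bits = 8 * 2**30
    print(f"{landauer_bound_joules(bits):.2e} J")   # on the order of 2.5e-11 J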
