Missing information

Under the probabilistic logic, an inference has a unique measure. In work published in 1948, Claude Shannon showed how to interpret this measure: it is the missing information in the inference, for a deductive conclusion. This result is of great significance for the philosophy of science, for it lays a path toward a solution to the problem of induction. Understanding what is meant by “missing information” requires absorbing a few ideas from set theory and measure theory.

Shannon’s result comes from an application of measure theory in which the mathematical function Sh(.) designates the measure called “Shannon’s measure.” The collection of sets measurable by Shannon’s measure contains the observed state-space X and the unobserved state-space Y. In the derivation, each state-space is taken to be a set. The set difference Y - X is the set of all elements of Y that do not belong to X. The intersection Y∩X is the set of elements common to the two state-spaces.
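As a minimal sketch of these set operations, assuming toy state-spaces with invented state names (the derivation requires only that X and Y be sets), in Python:

```python
# Hypothetical state names; only the set structure matters here.
X = {"s1", "s2", "s3"}        # observed state-space
Y = {"s2", "s3", "s4", "s5"}  # unobserved state-space

print(Y - X)  # set difference Y - X: {'s4', 's5'}
print(Y & X)  # intersection Y ∩ X: {'s2', 's3'}

# Y is the disjoint union of Y - X and Y ∩ X, which is the fact
# that additivity is applied to below.
assert (Y - X) | (Y & X) == Y
assert (Y - X) & (Y & X) == set()
```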

Let the mathematical function Sh(.) signify Shannon’s measure of each set in the collection, with X designating the observed state-space and Y the unobserved state-space. A measure is additive over disjoint sets, and Y is the disjoint union of Y - X and Y∩X, so Sh(Y) = Sh(Y - X) + Sh(Y∩X). Rearranging, it follows from the precept of measure theory called “additivity” that

Sh(Y - X) = Sh(Y) - Sh(Y∩X)

Sh(Y - X) is the conditional entropy function. From the form of this function (not shown here), Sh(Y - X) is the measure of an inference; thus the set difference Y - X must itself be an inference. Sh(Y) is called the “entropy” of the state-space Y. Under the semantics that Shannon imparted to the word “information,” Sh(Y∩X) is the information about the state in the unobserved state-space Y, given the state in the observed state-space X. By inspection of the equation above, for a fixed Sh(Y), Sh(Y - X) decreases as Sh(Y∩X) increases. It follows from Shannon’s semantics that the conditional entropy Sh(Y - X) is the missing information in the inference Y - X, for a deductive conclusion.
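To make the identity concrete, here is a hedged numeric sketch in Python. It reads Sh(Y) as the entropy H(Y), Sh(Y - X) as the conditional entropy H(Y|X), and Sh(Y∩X) as the mutual information I(X;Y), and checks the identity on an invented joint distribution (the probabilities are arbitrary illustrative assumptions):

```python
from math import log2

# Arbitrary illustrative joint distribution over (x, y) pairs.
p_xy = {("x1", "y1"): 0.4, ("x1", "y2"): 0.1,
        ("x2", "y1"): 0.1, ("x2", "y2"): 0.4}

# Marginal distributions of X and Y.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

def entropy(dist):
    """Shannon entropy in bits: -sum of p*log2(p) over nonzero p."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

H_X, H_Y, H_XY = entropy(p_x), entropy(p_y), entropy(p_xy)
I_XY = H_X + H_Y - H_XY        # Sh(Y∩X): mutual information
H_Y_given_X = H_XY - H_X       # Sh(Y - X): conditional entropy

# The additivity identity from the text: H(Y|X) = H(Y) - I(X;Y).
assert abs(H_Y_given_X - (H_Y - I_XY)) < 1e-12
print(f"H(Y) = {H_Y:.3f} bits, I(X;Y) = {I_XY:.3f} bits, "
      f"H(Y|X) = {H_Y_given_X:.3f} bits")
```

Run as written, this prints H(Y) = 1.000 bits, I(X;Y) = 0.278 bits, and H(Y|X) = 0.722 bits: the missing information plus the information supplied by the observation exhausts the entropy.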

The missing information varies in the interval between 0 at the bottom end and the entropy Sh(Y) at the top end. At the bottom end, the state in the observed state-space X completely determines the state in Y, and the conclusion is deductive. At the top end, Y - X reduces to Y, and the conditional entropy Sh(Y - X) reduces to the entropy Sh(Y).
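The two extremes can be sketched with the same conditional-entropy computation; the joint distributions below are illustrative assumptions (Y a function of X at one end, Y independent of X at the other):

```python
from math import log2

def entropy(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def cond_entropy(p_xy):
    """H(Y|X) = H(X,Y) - H(X), computed from a joint distribution."""
    p_x = {}
    for (x, _), p in p_xy.items():
        p_x[x] = p_x.get(x, 0.0) + p
    return entropy(p_xy) - entropy(p_x)

# Bottom end: the state in X determines the state in Y, so
# no information is missing.
determined = {("x1", "y1"): 0.5, ("x2", "y2"): 0.5}
print(cond_entropy(determined))  # 0.0

# Top end: Y is independent of X, so all of Sh(Y) is missing.
independent = {("x1", "y1"): 0.25, ("x1", "y2"): 0.25,
               ("x2", "y1"): 0.25, ("x2", "y2"): 0.25}
print(cond_entropy(independent))  # 1.0, which equals Sh(Y) here
```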

The word “entropy” was coined by Rudolf Clausius, one of the founders of thermodynamics; he derived it from the ancient Greek word for “transformation.” Students of engineering and the physical sciences are often taught that the word means “disorder.” Under Shannon’s more apt description, the word signifies the missing information in an inference. In thermodynamics, the inference is to an unobserved state-space whose states describe a physical body in microscopic detail.

 
