Limited role of entropy in information economics
- Author
- Jacob Marschak
- Subjects
- Conditional entropy, Principle of maximum entropy, General Social Sciences, General Decision Sciences, Min entropy, Joint entropy, Computer Science Applications, Differential entropy, Arts and Humanities (miscellaneous), Maximum entropy probability distribution, Statistics, Developmental and Educational Psychology, Entropy (information theory), General Economics, Econometrics and Finance, Applied Psychology, Entropy rate, Mathematics
- Abstract
‘Information transmitted’ is defined as the amount by which added evidence (or ‘message received’) diminishes ‘uncertainty’. The latter is characterized by some properties intuitively suggested by this word and possessed by conditional entropy, a parameter of the posterior probability distribution. However, conditional entropy shares these properties with some other concave symmetric functions on the probability space. Moreover, a given transmission channel (or, in the context of statistical inference, a given experiment) yields a higher maximum expected benefit than another to any user if and only if all concave functions of the posterior probability vector have higher values for the former channel (or experiment). Hence one information system (channel, experiment) may be preferable to another for a given user although its transmission rate, in entropy terms, is lower. But only entropy has the economically relevant property of measuring, in the limit, the expected length of efficiently coded messages sent in long sequences. Thus, while irrelevant to the value (maximum expected benefit) of an information system and to the costs of observing, estimating, and deciding, entropy formulas are indeed relevant to the cost of communicating, i.e., of storing, coding and transmitting messages.
- Published
- 1974
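
The abstract's central claim, that one information system can transmit less entropy yet be worth more to a particular user, is easy to check numerically. The following is a minimal sketch, not taken from the paper: the two channels, the uniform prior, and the 0-1 "guess the state" payoff are illustrative assumptions. Channel A is a binary symmetric channel with a 10% flip probability; channel B reveals the state perfectly 60% of the time and sends an uninformative blank otherwise.

```python
import numpy as np

def mutual_information(prior, channel):
    """I(state; signal) in bits; channel[s, y] = P(signal y | state s)."""
    joint = prior[:, None] * channel               # P(state, signal)
    p_y = joint.sum(axis=0)                        # signal marginal
    mask = joint > 0
    ratio = np.ones_like(joint)
    ratio[mask] = joint[mask] / (prior[:, None] * p_y[None, :])[mask]
    return float((joint * np.log2(ratio)).sum())

def decision_value(prior, channel):
    """Maximum expected payoff for a user with 0-1 payoff (guess the state):
    sum over signals y of max_s P(state = s, signal = y)."""
    joint = prior[:, None] * channel
    return float(joint.max(axis=0).sum())

prior = np.array([0.5, 0.5])                       # two equally likely states

eps = 0.1                                          # Channel A: binary symmetric, 10% flips
A = np.array([[1 - eps, eps],
              [eps, 1 - eps]])

q = 0.6                                            # Channel B: perfect signal w.p. 0.6,
B = np.array([[q, 0.0, 1 - q],                     # else an uninformative 'blank'
              [0.0, q, 1 - q]])

for name, ch in [("A", A), ("B", B)]:
    print(f"{name}: transmitted entropy = {mutual_information(prior, ch):.3f} bits, "
          f"value to user = {decision_value(prior, ch):.3f}")
# A: transmitted entropy = 0.531 bits, value to user = 0.900
# B: transmitted entropy = 0.600 bits, value to user = 0.800
```

Here B's transmission rate is higher (0.600 vs. 0.531 bits), yet A is strictly more valuable to this user (expected payoff 0.900 vs. 0.800), in line with the abstract: the user's expected payoff is just one concave function of the posterior among many, so entropy's ordering need not agree with the value ordering. Entropy is decisive only for communication costs, where it bounds the expected per-message length of efficiently coded long message sequences.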