Author(s)
Amos Golan
Duncan Foley

The principle of maximum entropy, developed more than six decades ago, provides a systematic approach to modeling, inference, and data analysis, grounded in the principles of information theory, Bayesian probability, and constrained optimization. Since its formulation, criticisms have been raised about the consistency of the method and the role of constraints. Chief among these is that maximum entropy does not satisfy the principle of causation, or, similarly, that maximum entropy updating is inconsistent because it inadequately represents causal information. We show that these criticisms rest on a misunderstanding and misapplication of the way constraints must be specified within the maximum entropy method. Correcting these problems eliminates the seeming paradoxes and inconsistencies that critics claim to have detected. We demonstrate that properly formulated maximum entropy models satisfy the principle of causation.
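To illustrate the constrained-optimization machinery the abstract refers to, the following is a minimal sketch (not the authors' formulation) of a maximum entropy problem: choose the probability distribution of highest entropy on a discrete support subject to a single moment (mean) constraint. The support values and target mean are illustrative assumptions; the exponential-family form of the solution follows from the Lagrangian of the entropy maximization.

```python
# Minimal maximum entropy sketch: discrete support, one mean constraint.
# Support values and target mean are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import brentq

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])  # hypothetical support (e.g., a die)
target_mean = 4.5                              # hypothetical moment constraint

def mean_given_lambda(lam):
    # Exponential-family solution p_i proportional to exp(-lam * x_i),
    # where lam is the Lagrange multiplier on the mean constraint.
    w = np.exp(-lam * x)
    p = w / w.sum()
    return p @ x

# Solve for the multiplier that makes the implied mean match the constraint.
lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10.0, 10.0)
w = np.exp(-lam * x)
p = w / w.sum()

print("MaxEnt probabilities:", np.round(p, 4))
print("Entropy:", -(p * np.log(p)).sum())
```

With no constraint beyond normalization, the solution reduces to the uniform distribution; each additional, properly specified constraint tilts the distribution away from uniformity only as far as the information in that constraint requires.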

Publication Type
Article
Journal
IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume
Early Access
JEL Codes
D80: Information, Knowledge, and Uncertainty: General
C11: Bayesian Analysis: General
Keywords
causation
information theory