Geographical Analysis

Information Entropy of Non-Probabilistic Processes


Abstract

We derive an expression for the entropy of non-probabilistic distributions encountered in spatial and mathematical mappings. The entropy of non-probabilistic distributions can be formulated using probabilistic notions of the hypothetical random redistribution of finite information. We show that the discrete approximation to the information content of spatial maps can be based on the discrete hypergeometric distribution. The resultant "associative" entropy is distinct from the Shannon entropy for probability distributions and addresses several shortcomings of the current entropy paradigm as applied to spatial analysis. The associative entropy statistic is distributed approximately as a chi-squared random variable under limitations of variation. We formulate a univariate logical equivalent of the associative entropy statistic, freeing the paradigm from the degrees of freedom constraint to which it has traditionally been shackled. This entropy has applications in spatial analysis and fuzzy set theory. The associative entropy is based on the concept of proportional information and is related to the Getis G-statistics of spatial association and the chi-squared statistics of sample means. We explore the utility of the theory when applied to the spatial distribution of vegetation in New Brunswick, Canada. The limitations and implications of the entropy expression are discussed, and suggestions are made for future applications of the theory. This work is part of the development of an information-theoretic framework for the analysis of landscape patterns of animal habitat.

1. INTRODUCTION

Entropy is an elusive notion, changing in conceptual nature according to the perspective of the viewer. It has been variously considered a measure of order, uncertainty, pattern, and randomness. At its mathematical foundation, it is a measure of probability, specifically a log-likelihood of a given state of nature, most frequently a multistate ensemble of microstates. It is a human propensity to consider the subjective significance of patterns of distributed information as inversely related to the likelihood of their random occurrence; see Kosko (1992), however, for a counter-opinion. This probabilistic essence of entropy presents a semantic difficulty when discussing the entropy of distributed information, because information itself can be substantively probabilistic or non-probabilistic.

Applications of the entropy notion to non-probabilistic distributions in spatial analysis and fuzzy systems have generally relied on the Shannon expression of the entropy of a probability distribution (Shannon and Weaver 1949) as a foundation for the analysis. The well-known Shannon expression has the form

(1) $H = -\sum_{i=1}^{N} p_i \log p_i$

where $p_i$ is the probability of discrete class $i \in \{1, 2, 3, \ldots, N\}$ and $N$ is the total number of classes. The Shannon entropy has been applied to problems in fields from communication to ecology (Kapur and Kesavan 1992). Kapur and Kesavan have also noted the linkages between Shannon entropy optimization, characterizing moments, and the principle of maximum likelihood of R. A. Fisher. We show below that the Shannon expression does not accurately reflect the entropy of non-probabilistic distributions often encountered in possibilistic, associative, and physical mappings. The Shannon entropy suffers from several shortcomings that handicap its universal applicability. The expression is essentially limited to discrete probability distributions because it does not extend well to the continuous probability density case. It is also difficult to compare entropies of ensembles with different degrees of freedom. The Shannon entropy evolves from consideration of the discrete multinomial distribution of information. However, not all distributions of information follow a multinomial distribution when approximated in the discrete case. Einstein, Bose, Fermi, and Dirac formulated alternative expressions of entropy for real-world phenomena in quantum physics, where the distributional problem is under quantum constraints and distinctly non-multinomial. …
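For concreteness, equation (1) is straightforward to evaluate numerically. The short Python sketch below is ours, not part of the original analysis; the function name and example distributions are illustrative only, and the computation uses the standard convention that $0 \log 0 = 0$ for empty classes.

    import numpy as np

    def shannon_entropy(p):
        """Shannon entropy H = -sum_i p_i log p_i (in nats) of a
        discrete probability distribution p over N classes."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # empty classes contribute nothing: 0 log 0 := 0
        return -np.sum(p * np.log(p))

    # A uniform distribution over N classes attains the maximum,
    # H = log N; any departure from uniformity lowers H.
    N = 4
    print(shannon_entropy(np.ones(N) / N))        # log 4, about 1.386 nats
    print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))  # about 0.940 nats

Note that the maximum value $\log N$ depends on the number of classes, which illustrates one of the shortcomings noted above: entropies of ensembles with different degrees of freedom are not directly comparable.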
