Original Article

Year: 2014 | Month: December | Volume 59 | Issue 4

An economic analysis of input structure in context to information inaccuracy, improvement and predictions



During the last six decades, information theory has attracted researchers worldwide, and its literature has grown by leaps and bounds. Some of its terms have even become part of everyday language. Every probability distribution has some uncertainty associated with it; the concept of ‘entropy’ is introduced here to provide a quantitative measure of this uncertainty. Different approaches to the measurement of entropy and its development have been made, viz: (1) an axiomatic approach, (2) measures of entropy via measures of inaccuracy and directed divergence, and (3) information measures and coding theorems. Hypothetical data for the agricultural, fisheries and forestry sectors were framed for each of nine years. All inputs bought by the fisheries and forestry sectors were supplied by other firms of the same sector. It was worked out that the smaller the distance of the probability distribution P from Q, the smaller the directed divergence; this measure is always non-negative and vanishes if and only if P = Q.
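The entropy and directed-divergence measures discussed above can be illustrated with a minimal sketch. The code below is not from the article; the sector shares are hypothetical stand-ins for the kind of input-structure proportions the study describes, and the functions implement the standard Shannon entropy and Kullback–Leibler directed divergence.

```python
import math

def entropy(p):
    """Shannon entropy H(P) = -sum p_i * log(p_i), a measure of uncertainty."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def directed_divergence(p, q):
    """Kullback-Leibler directed divergence D(P||Q) = sum p_i * log(p_i / q_i).

    Non-negative, and zero if and only if P = Q.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical input shares for three sectors (illustrative only):
# agriculture, fisheries, forestry
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(entropy(p))                  # uncertainty associated with P
print(directed_divergence(p, q))   # positive, since P differs from Q
print(directed_divergence(p, p))   # vanishes when P = Q
```

The last line demonstrates the abstract's closing claim: the divergence is always non-negative and collapses to zero exactly when the two distributions coincide.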

© This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Economic Affairs, Quarterly Journal of Economics | In Association with AESSRA
