The redundancy of experimental data is the basic statistic from which the complexity of a natural phenomenon and the number of experiments needed to explore it can be estimated. Redundancy is expressed in terms of the information entropy of the probability density function of the experimental variables. Because computing this entropy directly is inconvenient, since it requires integration over the range of the variables, an approximate expression for the redundancy is derived that involves only a sum over the set of experimental data. This approximation makes efficient estimation of the data redundancy feasible, together with the related experimental information and the information cost function. The complexity of the phenomenon can then be estimated directly from the experimental information, while the proper number of experiments is determined by the minimum of the cost function. The performance of these approximate estimators is demonstrated on two-dimensional, normally distributed random data.
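The abstract does not state the approximate expression itself, but the core idea, replacing the integral that defines the entropy with a sum over the measured samples, can be illustrated with a standard kernel-based, leave-one-out entropy estimate. The Python sketch below is an assumption-laden illustration rather than the paper's derived formula: the Gaussian kernel, its width `sigma`, and the leave-one-out form are choices made here for demonstration, applied to two-dimensional normally distributed data as in the paper's example.

```python
import numpy as np

# A minimal sketch, assuming a Gaussian-kernel, leave-one-out entropy
# estimate: the integral over the variables is replaced by a sum over
# the experimental samples. The kernel width sigma and the kernel form
# are illustrative choices, not the paper's derived expression.

def entropy_estimate(data, sigma):
    """Leave-one-out kernel estimate of the differential entropy.

    data  : (N, d) array of experimental samples
    sigma : scalar width of the Gaussian smoothing kernel
    """
    n, d = data.shape
    # Pairwise squared distances between all samples.
    diff = data[:, None, :] - data[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)
    # Gaussian kernel values, excluding the self-term (leave-one-out).
    kernel = np.exp(-sq_dist / (2.0 * sigma ** 2))
    np.fill_diagonal(kernel, 0.0)
    norm = (n - 1) * (2.0 * np.pi * sigma ** 2) ** (d / 2.0)
    p_hat = kernel.sum(axis=1) / norm
    # The sample average of -log p replaces the entropy integral.
    return -np.mean(np.log(p_hat))

rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.5], [0.5, 1.0]])
# Two-dimensional normally distributed data, as in the paper's demonstration.
samples = rng.multivariate_normal(np.zeros(2), cov, size=500)

h_approx = entropy_estimate(samples, sigma=0.3)
# Closed-form entropy of a 2-D Gaussian: 0.5 * ln((2*pi*e)^d * det(C)).
h_exact = 0.5 * np.log((2.0 * np.pi * np.e) ** 2 * np.linalg.det(cov))
print(f"sum-based estimate: {h_approx:.3f}   closed form: {h_exact:.3f}")
```

Because the test distribution is Gaussian, the sum-based estimate can be checked against the closed-form entropy 0.5 ln((2πe)^d det C), which the script prints alongside the sample-based value.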