Rényi information measure for a used item

Recently, a measure of tail heaviness based on Rényi entropy has been proposed in the literature. This measure is particularly useful because it quantifies tail heaviness even for distributions for which the kurtosis measure β₂ = μ₄/μ₂² does not exist. Nadarajah and Zografos [Nadarajah, Zografos, Information Sciences 153 (2003), 119-138] derived expressions for this measure for various univariate continuous distributions. However, their measure applies only to the lifetime of a new item; for a used item it requires modification. In this paper, we modify the measure so that it applies to a used item as well as to a new one. We also derive expressions for the modified measure for sixteen univariate distributions and for ten further standard distributions derived from general families used in reliability and survival analysis. Most of the results obtained in the literature in this direction emerge as particular cases of our general results.
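For context, here is a minimal sketch of the quantities involved, written in our own notation (the paper's exact formulation is not shown on this page, so the residual form below is an assumption based on the abstract): for a nonnegative lifetime X with density f and survival function \bar{F}(t) = \Pr(X > t), the Rényi entropy of order \alpha is

    H_\alpha(X) = \frac{1}{1-\alpha} \log \int_0^\infty f^\alpha(x) \, dx, \qquad \alpha > 0, \ \alpha \neq 1,

and conditioning on survival past age t (a used item) replaces f by its residual density f(x)/\bar{F}(t) on (t, \infty):

    H_\alpha(X; t) = \frac{1}{1-\alpha} \log \int_t^\infty \left( \frac{f(x)}{\bar{F}(t)} \right)^{\alpha} dx.

At t = 0 this reduces to H_\alpha(X), so the new-item measure is recovered as a special case.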

[1] C. R. Rao, et al., On the convexity of some divergence measures based on entropy functions, 1982, IEEE Trans. Inf. Theory.

[2] I. Vajda, et al., Some New Statistics for Testing Hypotheses in Parametric Models, 1997.

[3] R. Hartley, Transmission of Information, 1928.

[4] K. Zografos, et al., Formulas for Rényi information and related measures for univariate distributions, 2003, Inf. Sci.

[5] H. L. Le Roy, et al., Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, Vol. IV, 1969.

[6] C. E. Shannon, et al., A Mathematical Theory of Communication, 1948, Bell Syst. Tech. J.

[7] H. Nyquist, et al., Certain factors affecting telegraph speed, 1924, Journal of the A.I.E.E.

[8] T. M. Cover, et al., Elements of Information Theory, 2005.

[9] N. L. Johnson, et al., Encyclopedia of Statistical Sciences, Vol. 2, 1984.

[10] J. Kurths, et al., Quantitative analysis of heart rate variability, 1995, Chaos.

[11] P. E. Hart, Moment Distributions in Economics: An Exposition, 1975.

[12] M. Menéndez, et al., (h, Φ)-entropy differential metric, 1997.

[13] K. Song, Rényi information, loglikelihood and an intrinsic distribution measure, 2001.

[14] P. Bickel, et al., Descriptive Statistics for Nonparametric Models IV. Spread, 1979.

[15] J. Navarro, et al., Some results on residual entropy function, 2004.

[16] P. N. Rathie, et al., On the entropy of continuous probability distributions (Corresp.), 1978, IEEE Trans. Inf. Theory.

[17] F. Flückiger, et al., Mathematical Foundation of Information Theory: A Set Theoretical Approach, 2005.

[18] A. K. Nanda, et al., Some results on generalized residual entropy, 2006, Inf. Sci.

[19] A. Rényi, On Measures of Entropy and Information, 1961.

[20] E. L. Lehmann, et al., Descriptive Statistics for Nonparametric Models I. Introduction, 1975.

[21] T. K. Nayak, On diversity measures based on entropy functions, 1985.

[22] H. Nyquist, et al., Certain Topics in Telegraph Transmission Theory, 1928, Transactions of the American Institute of Electrical Engineers.

[23] E. Soofi, Principal Information Theoretic Approaches, 2000.