The Fisher information matrix (FIM) has long been of interest in statistics and related fields. It measures the amount of information a sample carries about an unknown parameter and yields the Cramér–Rao lower bound on the variance of the maximum likelihood estimate (MLE). In practice, the exact FIM is often unavailable, either because the first- or second-order derivatives of the log-likelihood function are difficult to obtain, or simply because computing the FIM directly is intractable. In such cases, we must work with an approximation of the FIM. In general, there are two ways to estimate it: one uses the outer product of the gradient of the log-likelihood (the score) with itself, and the other uses the negative of the Hessian of the log-likelihood. The latter is more commonly used in practice, but it is not necessarily the more accurate choice; deciding between the two calls for a theoretical comparison of their efficiency. In this paper we focus on the case where the unknown parameter estimated by MLE is a scalar and the observed random variables are independent. In this scenario, the Fisher information matrix reduces to the Fisher information number (FIN). Using the Central Limit Theorem (CLT), we derive the asymptotic variances of the two estimators and compare their accuracy, with Taylor expansions used to approximate the two asymptotic variances. A numerical study is provided as an illustration of the conclusion. We close with a summary of the limitations of this paper and enumerate several directions of interest for future study.
Keywords: Fisher information matrix, Fisher information number, Central Limit Theorem, Taylor expansion, asymptotic variance.
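The two estimators described above can be sketched in a few lines. The following is a minimal illustration, not code from the paper: for i.i.d. Poisson(θ) data, the score is x/θ − 1 and the second derivative of the log-likelihood is −x/θ², so the squared-score average and the negative-Hessian average both estimate FIN = 1/θ. For simplicity the estimators are evaluated at the true θ rather than at the MLE.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0            # true Poisson mean; FIN = 1/theta = 0.5
n = 100_000
x = rng.poisson(theta, size=n)

# Method 1: average of the squared score, score = x/theta - 1
score = x / theta - 1.0
fin_grad = np.mean(score ** 2)

# Method 2: average of the negative Hessian, -d^2/dtheta^2 log f = x/theta^2
fin_hess = np.mean(x / theta ** 2)

print(fin_grad, fin_hess)  # both close to 0.5
```

Both averages converge to the same FIN, but their sampling fluctuations differ, which is precisely the asymptotic-variance comparison the paper carries out.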