
Fisher information and statistical inference for phase-type distributions

Published online by Cambridge University Press:  14 July 2016

Mogens Bladt
Affiliation:
Instituto de Investigaciones en Matemáticas Aplicadas y en Sistemas, Universidad Nacional Autónoma de México, A.P. 20-726, 01000 México DF, Mexico. Email address: bladt@sigma.iimas.unam.mx
Luz Judith R. Esparza
Affiliation:
Department of Informatics and Mathematical Modeling, Technical University of Denmark, Richard Petersens Plads, Building 305, DK-2800 Kgs. Lyngby, Denmark
Bo Friis Nielsen
Affiliation:
Department of Informatics and Mathematical Modeling, Technical University of Denmark, Richard Petersens Plads, Building 305, DK-2800 Kgs. Lyngby, Denmark

Abstract


This paper is concerned with statistical inference for both continuous and discrete phase-type distributions. We consider maximum likelihood estimation, where traditionally the expectation-maximization (EM) algorithm has been employed. We revisit certain numerical aspects of this method and provide an alternative way of handling the E-step. We also compare the EM algorithm to a direct Newton–Raphson optimization of the likelihood function. As one of the main contributions of the paper, we provide formulae for calculating the Fisher information matrix for both the EM algorithm and the Newton–Raphson approach. The inverse of the Fisher information matrix provides the variances and covariances of the estimated parameters.
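The abstract's final point — that inverting the Fisher information matrix yields the (asymptotic) variances and covariances of the estimates — can be illustrated with a minimal numerical sketch. The example below is not from the paper: it uses the simplest phase-type distribution, an exponential (a single phase), and approximates the observed information by a finite-difference second derivative of the log-likelihood; the function names and step size are illustrative assumptions.

```python
import numpy as np

def loglik(rate, x):
    # Log-likelihood of an i.i.d. exponential sample (a one-phase
    # continuous phase-type distribution) with rate parameter `rate`.
    return len(x) * np.log(rate) - rate * x.sum()

def observed_information(rate, x, h=1e-4):
    # Central finite-difference approximation to the observed Fisher
    # information: -d^2/d(rate)^2 log L, evaluated at `rate`.
    return -(loglik(rate + h, x) - 2 * loglik(rate, x)
             + loglik(rate - h, x)) / h**2

rng = np.random.default_rng(0)
x = rng.exponential(scale=0.5, size=5000)  # scale 0.5, i.e. true rate = 2

rate_hat = 1.0 / x.mean()                  # MLE of the exponential rate
info = observed_information(rate_hat, x)   # here info is a scalar; in
                                           # general it is a matrix
se = 1.0 / np.sqrt(info)                   # asymptotic standard error
```

For a full phase-type fit the information is a matrix over the initial-distribution and sub-intensity parameters, and the covariance estimate is its matrix inverse; the scalar case above shows the principle only.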

Type
Part 6. Statistics
Copyright
Copyright © Applied Probability Trust 2011 
