
Number of successes in Markov trials

Published online by Cambridge University Press:  01 July 2016

U. Narayan Bhat*
Affiliation:
Southern Methodist University
Ram Lal*
Affiliation:
Southern Methodist University
Postal address for both authors: Department of Statistical Science, Southern Methodist University, Dallas, TX 75275, USA.

Abstract


Markov trials are a sequence of dependent trials with two outcomes, success and failure, that form the states of a Markov chain. The distribution of the number of successes in n Markov trials, and the first-passage time to a specified number of successes, are obtained using an augmented Markov chain model.
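The augmented-chain idea can be sketched numerically: track the pair (current outcome, number of successes so far) and propagate its distribution one trial at a time. The transition matrix and initial distribution below are illustrative assumptions, not values from the paper.

```python
def success_distribution(n, P, init):
    """Distribution of the number of successes in n Markov trials.

    State 0 = failure, state 1 = success.
    P[i][j]  = P(next outcome j | current outcome i).
    init[i]  = P(first trial has outcome i).
    Returns d with d[k] = P(exactly k successes in n trials).
    """
    # dist[(s, k)] = P(outcome of current trial is s, with k successes so far)
    dist = {(s, 1 if s == 1 else 0): init[s] for s in (0, 1)}
    for _ in range(n - 1):
        new = {}
        for (s, k), p in dist.items():
            for s2 in (0, 1):
                key = (s2, k + (1 if s2 == 1 else 0))
                new[key] = new.get(key, 0.0) + p * P[s][s2]
        dist = new
    d = [0.0] * (n + 1)
    for (s, k), p in dist.items():
        d[k] += p
    return d

# Assumed two-state transition matrix and uniform start (for illustration only)
P = [[0.7, 0.3],
     [0.4, 0.6]]
init = [0.5, 0.5]
d = success_distribution(3, P, init)
# d[0] = 0.5 * 0.7 * 0.7 = 0.245 (all three trials fail)
# d[3] = 0.5 * 0.6 * 0.6 = 0.18  (all three trials succeed)
```

The augmented state space here has 2(n + 1) states, so the computation is O(n) per trial; the paper's augmented Markov chain model formalizes the same bookkeeping.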

Type
Letters to the Editor
Copyright
Copyright © Applied Probability Trust 1988 
