
The problem of random intervals on a line

Published online by Cambridge University Press:  24 October 2008

C. Domb
Affiliation:
Pembroke CollegeCambridge

Extract

Suppose that events occur at random points on a line from t = −∞ to +∞, the probability of an event occurring between t and t + dt being λdt. If we select any interval of the line, say the interval [0, y], there will be a finite probability that it contains 0, 1, 2, …, r, … events; in fact, it is not difficult to show that these probabilities form a Poisson distribution, the probability that the interval contains r events being $e^{-\lambda y}(\lambda y)^r/r!$ (see e.g. (1)). Consider the case when each event consists of an interval of length α (an event being characterized by its first point). What is the probability that the covered portion of the interval [0, y] lies between x and x + dx?
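The covered-length distribution posed above can be explored numerically. The following sketch (not from the paper; the function name and parameters are illustrative) draws a Poisson number of event starting points at rate λ on [−α, y] — the only starts whose intervals can overlap [0, y] — and measures the length of [0, y] covered by the union of the resulting intervals of length α:

```python
import math
import random

def covered_length(lam, alpha, y, rng):
    """One Monte Carlo sample of the covered portion of [0, y]."""
    # Only events starting in [-alpha, y] can overlap [0, y]; their
    # count is Poisson with mean lam * (y + alpha). Sample it by
    # counting exponential inter-event gaps.
    span = y + alpha
    n, t = 0, rng.expovariate(lam)
    while t < span:
        n += 1
        t += rng.expovariate(lam)
    # Given the count, starting points are uniform on [-alpha, y].
    starts = sorted(-alpha + span * rng.random() for _ in range(n))
    # Merge the intervals [s, s + alpha], clipped to [0, y]. Since all
    # intervals share length alpha and starts are sorted, right ends
    # are nondecreasing, so a single sweep suffices.
    covered, cur_end = 0.0, 0.0
    for s in starts:
        a = max(s, cur_end, 0.0)
        b = min(s + alpha, y)
        if b > a:
            covered += b - a
            cur_end = b
    return covered
```

As a sanity check, a point x in [0, y] is uncovered exactly when no event starts in [x − α, x], which has probability e^{−λα}; so the sample mean of `covered_length` should approach y(1 − e^{−λα}).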

Type
Research Article
Copyright
Copyright © Cambridge Philosophical Society 1947


References

(1) Bateman, H. Phil. Mag. 20 (1910), 698.
(2) Stevens, W. L. Ann. Eugen., London, 9 (1939), 315–20.
(3) Robbins, H. E. Ann. Math. Statist. 15 (1944), 70.
(4) Votaw, D. F. Ann. Math. Statist. 17 (1946), 240.
(5) Bromwich, T. J. I. A. Infinite Series (London, 1926), § 36, p. 95; § 55, p. 156.
(6) van der Pol, B. Phil. Mag. 8 (1929), 861.