In this paper we introduce the transparent dead leaves (TDL) random field, a new germ-grain model in which the grains are combined according to a transparency principle. Informally, this model may be seen as the superposition of infinitely many semitransparent objects, which makes it of interest for the modeling of natural images. Properties of this new model are established and a simulation algorithm is proposed. The main contribution of the paper is to establish a central limit theorem, showing that, when varying the transparency of the grain from opacity to total transparency, the TDL model ranges from the dead leaves model to a Gaussian random field.
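A discretized sketch of such a transparency-based superposition may help fix ideas. The disk-shaped grains, the uniform gray levels, and the blending parameter alpha below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def tdl_sketch(size=128, n_grains=500, alpha=0.5, r=10, rng=None):
    """Discretized sketch of a transparent-dead-leaves-style field.

    Grains are random disks with i.i.d. uniform gray levels, blended
    back-to-front with transparency alpha.  alpha = 1 gives opaque
    occlusion (dead-leaves-like); small alpha averages many grains,
    the regime where a Gaussian limit is expected.
    """
    rng = np.random.default_rng(rng)
    img = np.full((size, size), 0.5)
    yy, xx = np.mgrid[0:size, 0:size]
    for _ in range(n_grains):
        cx, cy = rng.uniform(0, size, 2)
        color = rng.uniform()
        mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= r ** 2
        # transparency principle: blend the new grain over the image
        img[mask] = (1 - alpha) * img[mask] + alpha * color
    return img

img = tdl_sketch(rng=0)
```

Since each pixel value is a convex combination of values in [0, 1], the field stays in [0, 1] for any alpha.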
Given two independent Poisson point processes Φ(1), Φ(2) in ℝ^d, the AB Poisson Boolean model is the graph with the points of Φ(1) as vertices and with edges between any pair of points for which the intersection of balls of radius 2r centered at these points contains at least one point of Φ(2). This is a generalization of the AB percolation model on discrete lattices. We show the existence of percolation for all d ≥ 2 and derive bounds for a critical intensity. We also provide a characterization for this critical intensity when d = 2. To study the connectivity problem, we consider independent Poisson point processes of intensities n and τn in the unit cube. The AB random geometric graph is defined as above but with balls of radius r. We derive a weak law result for the largest nearest-neighbor distance and almost-sure asymptotic bounds for the connectivity threshold.
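The edge rule can be sketched directly in the unit square. The intensities and radius below are illustrative choices, not values from the paper:

```python
import numpy as np

def ab_graph(lam1=100, lam2=100, r=0.05, rng=None):
    """Sketch of the AB Poisson Boolean edge rule in the unit square.

    X (from Phi1) and Y (from Phi2) are homogeneous Poisson samples;
    two points of X are joined iff some point of Y lies within 2r of
    BOTH, i.e. lies in the intersection of their balls of radius 2r.
    """
    rng = np.random.default_rng(rng)
    X = rng.uniform(size=(rng.poisson(lam1), 2))
    Y = rng.uniform(size=(rng.poisson(lam2), 2))
    # d[i, k] = distance from X[i] to Y[k]
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    near = d <= 2 * r                      # ball-membership indicators
    # edge iff the two balls share at least one Phi2 point
    adj = (near[:, None, :] & near[None, :, :]).any(axis=2)
    np.fill_diagonal(adj, False)
    return X, Y, adj

X, Y, adj = ab_graph(rng=1)
```

By construction the adjacency matrix is symmetric, reflecting that the witness condition is symmetric in the two Φ(1) points.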
Most finite spatial point process models specified by a density are locally stable, implying that the Papangelou intensity is bounded by some integrable function β defined on the space for the points of the process. It is possible to superpose a locally stable spatial point process X with a complementary spatial point process Y to obtain a Poisson process X ⋃ Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt, Yt) which converges towards the distribution of (X, Y). We study the joint distribution of X and Y, and their marginal and conditional distributions. In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well-known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking in the case of a Strauss process.
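The bivariate birth-death construction of Y is beyond a short sketch, but the final step — examining whether a point pattern is Poisson with a given intensity — can be illustrated for a homogeneous β with a standard quadrat-count dispersion check (the grid size and intensity below are illustrative assumptions):

```python
import numpy as np

def quadrat_dispersion(points, k=8):
    """Index of dispersion of quadrat counts on a k-by-k grid over [0,1]^2.

    For a homogeneous Poisson pattern the quadrat counts are i.i.d.
    Poisson, so the sample variance-to-mean ratio should be close to 1;
    values far from 1 suggest the candidate intensity model was wrong.
    """
    ix = np.minimum((points[:, 0] * k).astype(int), k - 1)
    iy = np.minimum((points[:, 1] * k).astype(int), k - 1)
    counts = np.bincount(ix * k + iy, minlength=k * k)
    return counts.var(ddof=1) / counts.mean()

rng = np.random.default_rng(2)
pts = rng.uniform(size=(rng.poisson(500), 2))  # a genuinely Poisson pattern
disp = quadrat_dispersion(pts)
```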
We consider a stochastic SIR (susceptible → infective → removed) epidemic model with several types of individuals. Infectious individuals can make infectious contacts on two levels, within their own ‘household’ and with their neighbours in a random graph representing additional social contacts. This random graph is an extension of the well-known configuration model to allow for several types of individuals. We give a strong approximation theorem which leads to a threshold theorem for the epidemic model and a method for calculating the probability of a major outbreak given a few initial infectives. A multitype analogue of a theorem of Ball, Sirl and Trapman (2009) heuristically motivates a method for calculating the expected size of such a major outbreak. We also consider vaccination and give some short numerical illustrations of our results.
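The pairing mechanism underlying the configuration model can be sketched as follows; this untyped version only illustrates the stub-matching step, whereas the paper's extension additionally assigns types to vertices:

```python
import numpy as np

def configuration_model(degrees, rng=None):
    """Stub-matching sketch of the (untyped) configuration model.

    Each vertex i gets degrees[i] half-edges ("stubs"); a uniform
    random pairing of all stubs produces the edge list.  Self-loops
    and multi-edges are possible and kept, as in the standard model.
    """
    rng = np.random.default_rng(rng)
    stubs = np.repeat(np.arange(len(degrees)), degrees)
    if len(stubs) % 2:                  # total degree must be even
        raise ValueError("degree sum must be even")
    rng.shuffle(stubs)
    return list(zip(stubs[0::2], stubs[1::2]))

edges = configuration_model([3, 2, 2, 1], rng=3)
```

The pairing always realises the prescribed degree sequence exactly, which is what makes the model convenient for threshold calculations.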
In this work we analyze a class of 2 × 2 Pólya-Eggenberger urn models with ball replacement matrix and c = pa with . We determine limiting distributions by obtaining a precise recursive description of the moments of the considered random variables, which allows us to deduce asymptotic expansions of the moments. In particular, we obtain limiting distributions for the pills problem a = c = d = 1, originally proposed by Knuth and McCarthy. Furthermore, we also obtain limiting distributions for the well-known sampling without replacement urn, a = d = 1 and c = 0, and generalizations of it to arbitrary and c = 0. Moreover, we obtain a recursive description of the moment sequence for a generalized problem.
We consider regular variation of a Lévy process X := (Xt)t≥0 in with Lévy measure Π, emphasizing the dependence between jumps of its components. By transforming the one-dimensional marginal Lévy measures to those of a standard 1-stable Lévy process, we decouple the marginal Lévy measures from the dependence structure. The dependence between the jumps is modeled by a so-called Pareto Lévy measure, which is a natural standardization in the context of regular variation. We characterize multivariate regular variation of X by its one-dimensional marginal Lévy measures and the Pareto Lévy measure. Moreover, we define upper and lower tail dependence coefficients for the Lévy measure, which also apply to the multivariate distributions of the process. Finally, we present graphical tools to visualize the dependence structure in terms of the spectral density and the tail integral for homogeneous and nonhomogeneous Pareto Lévy measures.
In a random graph, counts for the number of vertices with given degrees will typically be dependent. We show, via a multivariate normal approximation and a Poisson process approximation, that for graphs with independent but possibly inhomogeneously distributed edges, the joint counts can reasonably be approximated as independent only when the degrees are large. The proofs are based on Stein's method and the Stein-Chen method with a new size-biased coupling for such inhomogeneous random graphs, and, hence, bounds on the distributional distance are obtained. Finally, we illustrate that apparent (pseudo-)power-law-type behaviour can arise in such inhomogeneous networks despite not actually following a power-law degree distribution.
We develop techniques for computing the asymptotics of the first and second moments of the number TN of coupons that a collector has to buy in order to find all N existing different coupons as N → ∞. The probabilities (occurring frequencies) of the coupons can be quite arbitrary. From these asymptotics we obtain the leading behavior of the variance V[TN] of TN (see Theorems 3.1 and 4.4). Then, we combine our results with the general limit theorems of Neal in order to derive the limit distribution of TN (appropriately normalized), which, for a large class of probabilities, turns out to be the standard Gumbel distribution. We also give various illustrative examples.
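In the special case of equal frequencies, the mean of TN has the classical closed form N·H_N (the N-th harmonic number), which the general asymptotics specialize to. A small Monte Carlo sketch under that assumption:

```python
import numpy as np

def collect_all(probs, rng):
    """Number of purchases until every coupon type has been seen."""
    n, seen, t = len(probs), set(), 0
    while len(seen) < n:
        seen.add(rng.choice(n, p=probs))
        t += 1
    return t

rng = np.random.default_rng(4)
N = 50
probs = np.full(N, 1 / N)                     # equal frequencies
mean_T = np.mean([collect_all(probs, rng) for _ in range(400)])
harmonic = N * sum(1 / k for k in range(1, N + 1))   # classical E[T_N]
```

For unequal frequencies the same simulator applies with a different `probs` vector, but the closed-form comparison no longer holds.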
We present a numerical method to compute the survival function and the moments of the exit time for a piecewise-deterministic Markov process (PDMP). Our approach is based on the quantization of an underlying discrete-time Markov chain related to the PDMP. The approximation we propose is easily computable and is even flexible with respect to the exit time we consider. We prove the convergence of the algorithm and obtain bounds for the rate of convergence in the case of the moments. We give an academic example and a model from the reliability field to illustrate the results of the paper.
We consider the uniqueness and extinction properties of the interacting branching collision process (IBCP), which consists of two strongly interacting components: an ordinary Markov branching process and a collision branching process. We establish that there is a unique IBCP, and derive necessary and sufficient conditions for it to be nonexplosive that are easily checked. Explicit expressions are obtained for the extinction probabilities for both regular and irregular cases. The associated expected hitting times are also considered. Examples are provided to illustrate our results.
Foschini gave a lower bound for the channel capacity of an N-transmit M-receive antenna system in a Rayleigh fading environment with independence at both transmitters and receivers. We show that this bound is approximately normal.
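The capacity in question is the familiar log-det quantity for an i.i.d. complex-Gaussian (Rayleigh) channel matrix; a crude Monte Carlo check of approximate normality is that the sample skewness of capacity draws is near zero. The antenna counts and SNR below are illustrative assumptions:

```python
import numpy as np

def mimo_capacity_samples(n_tx=4, n_rx=4, snr=10.0, reps=2000, rng=None):
    """Monte Carlo samples of log2 det(I + (snr/n_tx) H H*) for an
    i.i.d. complex-Gaussian (Rayleigh) channel matrix H."""
    rng = np.random.default_rng(rng)
    caps = np.empty(reps)
    for i in range(reps):
        H = (rng.standard_normal((n_rx, n_tx))
             + 1j * rng.standard_normal((n_rx, n_tx))) / np.sqrt(2)
        G = np.eye(n_rx) + (snr / n_tx) * H @ H.conj().T
        caps[i] = np.log2(np.linalg.det(G).real)
    return caps

caps = mimo_capacity_samples(rng=5)
# sample skewness; near 0 is consistent with approximate normality
skew = np.mean((caps - caps.mean()) ** 3) / caps.std() ** 3
```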
We investigate the asymptotic distribution of the number of exceedances among d identically distributed but not necessarily independent random variables (RVs) above a sequence of increasing thresholds, conditional on the assumption that there is at least one exceedance. Our results enable the computation of the fragility index, which represents the expected number of exceedances, given that there is at least one exceedance. When the fragility index is computed from the first d RVs of a strictly stationary sequence, we show that, under appropriate conditions, its reciprocal converges to the extremal index of the stationary sequence as d increases.
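In the special case of d independent variables the fragility index has a closed form, since the exceedance count is Binomial(d, p): conditioning on at least one exceedance gives FI = dp / (1 − (1 − p)^d), which tends to 1 as the threshold grows (p → 0), consistent with an extremal index of 1 for independent sequences. A sketch under that independence assumption:

```python
import numpy as np

def fragility_index_iid(d, p):
    """Expected number of exceedances given at least one, for d i.i.d.
    variables that each exceed the threshold with probability p.

    The count is Binomial(d, p), so E[N | N >= 1] = d*p / (1-(1-p)**d).
    As p -> 0 this tends to 1 (extremal index 1 for independence).
    """
    return d * p / (1 - (1 - p) ** d)

fi = fragility_index_iid(d=10, p=1e-4)   # far in the tail: close to 1
```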
Multistate monotone systems are used to describe technological or biological systems when the system itself and its components can perform at different operationally meaningful levels. This generalizes the binary monotone systems used in standard reliability theory. In this paper we consider the availabilities and unavailabilities of the system in an interval, i.e. the probabilities that the system performs above or below the different levels throughout the whole interval. In complex systems it is often impossible to calculate these availabilities and unavailabilities exactly, but it is possible to construct lower and upper bounds based on the minimal path and cut vectors to the different levels. In this paper we consider systems which allow a modular decomposition. We analyse in depth the relationship between the minimal path and cut vectors for the system, the modules, and the organizing structure. We analyse the extent to which the availability bounds are improved by taking advantage of the modular decomposition. This problem was also treated in Butler (1982) and Funnemark and Natvig (1985), but the treatment was based on an inadequate analysis of the relationship between the different minimal path and cut vectors involved, and as a result was somewhat inaccurate. We also extend to intervals bounds that have previously been given only for availabilities at a fixed point of time.