A general multi-type population model is considered, where individuals live and reproduce according to their age and type, but also under the influence of the size and composition of the entire population. We describe the dynamics of the population as a measure-valued process and obtain its asymptotics as the population grows with the environmental carrying capacity. Thus, a deterministic approximation is given, in the form of a law of large numbers, as well as a central limit theorem. This general framework is then adapted to model sexual reproduction, with a special section on serial monogamic mating systems.
This paper considers ergodic, continuous-time Markov chains $\{X(t)\}_{t \in (-\infty,\infty)}$ on $\mathbb{Z}^+=\{0,1,\ldots\}$. For an arbitrarily fixed $N \in \mathbb{Z}^+$, we study the conditional stationary distribution $\boldsymbol{\pi}(N)$ given that the Markov chain is in $\{0,1,\ldots,N\}$. We first characterize $\boldsymbol{\pi}(N)$ via systems of linear inequalities and identify simplices that contain $\boldsymbol{\pi}(N)$, by examining the $(N+1) \times (N+1)$ northwest corner block of the infinitesimal generator $\textbf{\textit{Q}}$ and the subset of the first $N+1$ states whose members are directly reachable from at least one state in $\{N+1,N+2,\ldots\}$. These results are closely related to the augmented truncation approximation (ATA), and we provide some practical implications for the ATA. Next we consider an extension of the above results, using the $(K+1) \times (K+1)$ ($K > N$) northwest corner block of $\textbf{\textit{Q}}$ and the subset of the first $K+1$ states whose members are directly reachable from at least one state in $\{K+1,K+2,\ldots\}$. Furthermore, we introduce new state transition structures called $(K,N)$-skip-free sets, using which we obtain the minimum convex polytope that contains $\boldsymbol{\pi}(N)$.
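As a point of reference for the augmented truncation approximation mentioned above, here is a minimal numerical sketch (not the construction studied in the paper): the $(N+1)\times(N+1)$ northwest corner block of a birth–death generator is augmented in its last column so that each row sums to zero, and the resulting stationary vector is compared with the conditional stationary distribution it approximates. All rates and truncation levels are arbitrary choices for the illustration.

```python
# A minimal sketch (not the paper's method) of the augmented truncation
# approximation (ATA): take the (N+1)x(N+1) northwest corner of a birth-death
# generator Q, add the missing rate mass back to the last column so each row
# sums to zero, and solve for the stationary vector.
import numpy as np

lam, mu, N, M = 0.6, 1.0, 10, 200   # birth rate, death rate, truncation level, "large" level

def bd_generator(n, lam, mu):
    """Generator of a birth-death chain truncated (reflected) at level n."""
    Q = np.zeros((n + 1, n + 1))
    for i in range(n + 1):
        if i < n:
            Q[i, i + 1] = lam
        if i > 0:
            Q[i, i - 1] = mu
        Q[i, i] = -(lam * (i < n) + mu * (i > 0))
    return Q

def stationary(Q):
    """Solve pi Q = 0, pi 1 = 1 for a finite generator Q."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

# "Exact" conditional stationary distribution pi(N), obtained from a very
# large truncation M >> N of the same chain.
pi_big = stationary(bd_generator(M, lam, mu))
pi_cond = pi_big[: N + 1] / pi_big[: N + 1].sum()

# Last-column ATA: the NW corner block with the lost outflow redirected to state N.
C = bd_generator(M, lam, mu)[: N + 1, : N + 1]
C[:, -1] -= C.sum(axis=1)           # rows of the augmented block now sum to zero
pi_ata = stationary(C)

print(np.max(np.abs(pi_ata - pi_cond)))
```

For a birth–death chain the last-column augmentation happens to reproduce $\boldsymbol{\pi}(N)$ exactly, so the printed error is at the level of numerical round-off; for general chains the ATA is only an approximation, which is what motivates the bounding simplices and polytopes described in the abstract.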
Draw-down time for a stochastic process is the first passage time of a draw-down level that depends on the previous maximum of the process. In this paper we study the draw-down-related Parisian ruin problem for spectrally negative Lévy risk processes. Intuitively, a draw-down Parisian ruin occurs when the surplus process has continuously stayed below the dynamic draw-down level for a fixed amount of time. We introduce the draw-down Parisian ruin time and solve the corresponding two-sided exit problems via excursion theory. We also find an expression for the potential measure for the process killed at the draw-down Parisian time. As applications, we obtain new results for spectrally negative Lévy risk processes with dividend barrier and with Parisian ruin.
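To make the notion concrete, the following Monte Carlo sketch (a crude time discretisation, not the excursion-theoretic analysis of the paper) estimates a draw-down Parisian ruin probability for a Cramér–Lundberg surplus process, with the draw-down level taken to be the running maximum minus a fixed constant $a$ and with Parisian delay $r$; all parameters are illustrative choices.

```python
# Monte Carlo sketch of draw-down Parisian ruin for a Cramer-Lundberg surplus:
# premium drift c minus compound Poisson exponential claims. Ruin is declared
# once the surplus has stayed below (running maximum - a) continuously for
# time r. Discretised time; all parameters are arbitrary for the illustration.
import numpy as np

rng = np.random.default_rng(1)
x0, c, lam, mean_claim = 2.0, 1.5, 1.0, 1.0   # initial surplus, premium rate, claim rate, mean claim
a, r, horizon, dt = 1.0, 0.5, 30.0, 0.01

def drawdown_parisian_ruin():
    x, running_max, below_since = x0, x0, None
    for _ in range(int(horizon / dt)):
        x += c * dt
        if rng.random() < lam * dt:           # a claim arrives in this step
            x -= rng.exponential(mean_claim)
        running_max = max(running_max, x)
        if x < running_max - a:               # below the dynamic draw-down level
            below_since = 0.0 if below_since is None else below_since + dt
            if below_since >= r:
                return True
        else:
            below_since = None
    return False

print(np.mean([drawdown_parisian_ruin() for _ in range(1000)]))
```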
We introduce a multivariate class of distributions with support I, a k-orthotope in $[0,\infty)^{k}$, which is dense in the set of all k-dimensional distributions with support I. We call this new class ‘multivariate finite-support phase-type distributions’ (MFSPH). Though we generally define MFSPH distributions on any finite k-orthotope in $[0,\infty)^{k}$, here we mainly deal with MFSPH distributions with support $[0,1)^{k}$. The distribution function of an MFSPH variate is computed by using that of a variate in the MPH$^{*}$ class, the multivariate class of distributions introduced by Kulkarni (1989). The marginal distributions of MFSPH variates are found to be FSPH distributions, the class studied by Ramaswami and Viswanath (2014). Some properties, including the mixture property, of MFSPH distributions are established. Estimates of the parameters of a particular class of bivariate finite-support phase-type distributions are found by using the expectation-maximization algorithm. Simulated samples are used to demonstrate how this class can be used to approximate bivariate finite-support distributions.
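For readers less familiar with the phase-type machinery that these classes build on, the following sketch evaluates the standard univariate phase-type distribution function $F(x)=1-\alpha e^{Tx}\mathbf{1}$; the two-phase parameters are an arbitrary example and are not taken from the paper.

```python
# Background sketch: the standard (univariate) phase-type CDF
# F(x) = 1 - alpha * expm(T x) * 1, which underlies the PH/MPH*-based
# constructions referred to above. The two-phase parameters below are an
# arbitrary example, not taken from the paper.
import numpy as np
from scipy.linalg import expm

alpha = np.array([0.7, 0.3])              # initial distribution over transient phases
T = np.array([[-3.0, 1.0],                # sub-generator of the transient phases
              [ 0.5, -2.0]])

def ph_cdf(x):
    return 1.0 - alpha @ expm(T * x) @ np.ones(2)

print(ph_cdf(0.5), ph_cdf(2.0))           # increases towards 1
```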
Consider a Pólya urn with balls of several colours, where balls are drawn sequentially and each drawn ball is immediately replaced together with a fixed number of balls of the same colour. It is well known that the proportions of balls of the different colours converge in distribution to a Dirichlet distribution. We show that the rate of convergence is $\Theta(1/n)$ in the minimal $L_p$ metric for any $p\in[1,\infty]$, extending a result by Goldstein and Reinert; we further show the same rate for the Lévy distance, while the rate for the Kolmogorov distance depends on the parameters, i.e. on the initial composition of the urn. The method used here differs from the one used by Goldstein and Reinert, and uses direct calculations based on the known exact distributions.
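The urn scheme itself is easy to simulate; the following sketch (illustrative parameters only) generates one realisation of the limiting colour proportions, whose distribution is the Dirichlet law referred to above.

```python
# Minimal simulation sketch of the Polya urn described above. Each draw
# returns the ball plus `a` extra balls of the same colour; the colour
# proportions converge in distribution to a Dirichlet law with parameters
# (initial counts)/a. Parameter choices are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def polya_proportions(counts, a=1, n_draws=10_000):
    counts = np.array(counts, dtype=float)
    for _ in range(n_draws):
        colour = rng.choice(len(counts), p=counts / counts.sum())
        counts[colour] += a      # drawn ball replaced together with `a` like-coloured balls
    return counts / counts.sum()

# One realisation of the limiting proportions for a three-colour urn
# started with (2, 1, 1) balls and replacement number a = 1.
print(polya_proportions([2, 1, 1], a=1))
```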
We define a new family of multivariate stochastic processes over a finite time horizon that we call generalised Liouville processes (GLPs). GLPs are Markov processes constructed by splitting Lévy random bridges into non-overlapping subprocesses via time changes. We show that the terminal values and the increments of GLPs have generalised multivariate Liouville distributions, justifying their name. We provide various other properties of GLPs and some examples.
Block-structured Markov chains model a large variety of queueing problems and have many important applications in various areas. Stability properties have been well investigated for these Markov chains. In this paper we present transient properties for two specific types of block-structured Markov chains, namely chains of M/G/1 type and of GI/M/1 type. Necessary and sufficient conditions in terms of system parameters are obtained for geometric transience and algebraic transience. Possible extensions of the results to continuous-time Markov chains are also included.
We consider a supercritical branching Lévy process on the real line. Under mild moment assumptions on the number of offspring and their displacements, we prove a second-order limit theorem on the empirical mean position.
It is well understood that a supercritical superprocess is equal in law to a discrete Markov branching process whose genealogy is dressed in a Poissonian way with immigration which initiates subcritical superprocesses. The Markov branching process corresponds to the genealogical description of prolific individuals, that is, individuals who produce eternal genealogical lines of descent, and is often referred to as the skeleton or backbone of the original superprocess. The Poissonian dressing along the skeleton may be considered to be the remaining non-prolific genealogical mass in the superprocess. Such skeletal decompositions are equally well understood for continuous-state branching processes (CSBP).
In a previous article [16] we developed an SDE approach to study the skeletal representation of CSBPs, which provided a common framework for the skeletal decompositions of supercritical and (sub)critical CSBPs. It also helped us to understand how the skeleton thins down onto one infinite line of descent when conditioning on survival until larger and larger times, and eventually forever.
Here our main motivation is to show the robustness of the SDE approach by expanding it to the spatial setting of superprocesses. The current article only considers supercritical superprocesses, leaving the subcritical case open.
A measure on a locally compact group is said to be spread out if one of its convolution powers is not singular with respect to Haar measure. Using Markov chain theory, we conduct a detailed analysis of random walks on homogeneous spaces with spread out increment distribution. For finite volume spaces, we arrive at a complete picture of the asymptotics of the n-step distributions: they equidistribute towards Haar measure, often exponentially fast and locally uniformly in the starting position. In addition, many classical limit theorems are shown to hold. In the infinite volume case, we prove recurrence and a ratio limit theorem for symmetric spread out random walks on homogeneous spaces of at most quadratic growth. This settles one direction in a long-standing conjecture.
Consider an ergodic Markov chain on a countable state space for which the return times have exponential tails. We show that the stationary version of any such chain is a finitary factor of an independent and identically distributed (i.i.d.) process. A key step is to show that any stationary renewal process whose jump distribution has exponential tails and is not supported on a proper subgroup of ℤ is a finitary factor of an i.i.d. process.
In the present paper we deal with the asymptotic stability of Markov operators acting on abstract state spaces (i.e. ordered Banach spaces where the norm has an additivity property on the cone of positive elements). We are mainly interested in the rate of convergence when a Markov operator T is uniformly P-ergodic, i.e. $\|T^n-P\|\to 0$, where P is a projection. We have shown that T is uniformly P-ergodic if and only if $\|T^n-P\|\leq C\beta^n$ for some $0<\beta<1$. In this paper we prove that such a β is characterized by the spectral radius of $T-P$. Moreover, we give Doeblin-type conditions for the uniform P-ergodicity of Markov operators.
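A finite-dimensional illustration of the spectral-radius characterisation (a toy stochastic matrix rather than an abstract state space): with $P=\mathbf{1}\pi^{\top}$ the projection onto the stationary distribution, $T^n-P=(T-P)^n$, so $\|T^n-P\|^{1/n}$ converges to the spectral radius of $T-P$.

```python
# Toy illustration of the characterisation above on a finite ergodic
# stochastic matrix: with P = 1 pi^T one has T^n - P = (T - P)^n, so
# ||T^n - P||^(1/n) tends to the spectral radius of T - P.
import numpy as np

T = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])           # an arbitrary ergodic stochastic matrix

# Stationary distribution pi (left eigenvector of T for eigenvalue 1).
w, V = np.linalg.eig(T.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()

P = np.outer(np.ones(3), pi)              # rank-one projection P = 1 pi^T
beta = np.max(np.abs(np.linalg.eigvals(T - P)))   # spectral radius of T - P

for n in (5, 20, 80):
    print(n, np.linalg.norm(np.linalg.matrix_power(T, n) - P, 1) ** (1 / n), beta)
```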
The propagation of gradient flow structures from microscopic to macroscopic models is a topic of high current interest. In this paper, we discuss this propagation in a model for the diffusion of particles interacting via hard-core exclusion or short-range repulsive potentials. We formulate the microscopic model as a high-dimensional gradient flow in the Wasserstein metric for an appropriate free-energy functional. Then we use the JKO approach to identify the asymptotics of the metric and the free-energy functional beyond the lowest order for single particle densities in the limit of small particle volumes by matched asymptotic expansions. While we use a propagation of chaos assumption at far distances, we consider correlations at small distance in the expansion. In this way, we obtain a clear picture of the emergence of a macroscopic gradient structure incorporating corrections in the free-energy functional due to the volume exclusion.
In this article we consider the ergodic risk-sensitive control problem for a large class of multidimensional controlled diffusions on the whole space. We study the minimization and maximization problems under either a blanket stability hypothesis, or a near-monotone assumption on the running cost. We establish the convergence of the policy improvement algorithm for these models. We also present a more general result concerning the region of attraction of the equilibrium of the algorithm.
We consider a class of multitype Galton–Watson branching processes with a countably infinite type set $\mathcal{X}_d$ whose mean progeny matrices have a block lower Hessenberg form. For these processes, we study the probabilities $\textbf{\textit{q}}(A)$ of extinction in sets of types $A\subseteq \mathcal{X}_d$. We compare $\textbf{\textit{q}}(A)$ with the global extinction probability $\textbf{\textit{q}} = \textbf{\textit{q}}(\mathcal{X}_d)$, that is, the probability that the population eventually becomes empty, and with the partial extinction probability $\tilde{\textbf{\textit{q}}}$, that is, the probability that all types eventually disappear from the population. After deriving partial and global extinction criteria, we develop conditions for $\textbf{\textit{q}} < \textbf{\textit{q}}(A) < \tilde{\textbf{\textit{q}}}$. We then present an iterative method to compute the vector $\textbf{\textit{q}}(A)$ for any set A. Finally, we investigate the location of the vectors $\textbf{\textit{q}}(A)$ in the set of fixed points of the progeny generating vector.
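The iterative method of the paper is tailored to extinction in sets of types $A$; as background, the following sketch shows the textbook fixed-point iteration $\textbf{\textit{q}}_{k+1}=\textbf{\textit{f}}(\textbf{\textit{q}}_k)$ for the global extinction probability of a finite two-type Galton–Watson process, started from the zero vector so that the iterates increase to the minimal fixed point. The offspring law is an arbitrary example, not one from the paper.

```python
# Textbook fixed-point iteration q_{k+1} = f(q_k) for the extinction
# probability of a finite two-type Galton-Watson process (the template that
# set-specific iterative schemes build on; the offspring law is arbitrary).
import numpy as np

def f(s):
    """Progeny generating vector of a toy two-type process:
    a type-1 parent has no children w.p. 0.3, one child of each type w.p. 0.7;
    a type-2 parent has no children w.p. 0.2, two type-2 children w.p. 0.8."""
    s1, s2 = s
    return np.array([0.3 + 0.7 * s1 * s2,
                     0.2 + 0.8 * s2 ** 2])

q = np.zeros(2)          # start from 0; iterates increase to the minimal fixed point
for _ in range(200):
    q = f(q)
print(q)                 # global extinction probability vector, approx. (0.364, 0.25)
```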
We discuss a continuous-time Markov branching model in which each individual can trigger an alarm according to a Poisson process. The model is stopped when a given number of alarms is triggered or when there are no more individuals present. Our goal is to determine the distribution of the state of the population at this stopping time. In addition, the state distribution at any fixed time is also obtained. The model is then modified to take into account the possible influence of death cases. All distributions are derived using probability-generating functions, and the approach followed is based on the construction of families of martingales.
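A Gillespie-type simulation sketch of a model of this kind is given below; the binary-splitting offspring law and the specific rates are illustrative assumptions, not taken from the paper.

```python
# Simulation sketch: each individual independently splits in two at rate b,
# dies at rate d, and triggers an alarm at rate theta; the process stops at
# the k-th alarm or at extinction, and we record the population size then.
# Rates and the binary-splitting law are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

def population_at_stopping_time(n0=5, b=1.0, d=0.5, theta=0.2, k=3):
    n, alarms = n0, 0
    while n > 0 and alarms < k:
        # per-individual rates are identical, so the event type is chosen
        # proportionally to (b, d, theta) regardless of the population size
        u = rng.random() * (b + d + theta)
        if u < b:
            n += 1               # a split: one individual becomes two
        elif u < b + d:
            n -= 1               # a death
        else:
            alarms += 1          # an alarm (the individual survives)
    return n

# Empirical distribution of the population size when the process stops.
samples = [population_at_stopping_time() for _ in range(5000)]
print(np.mean(samples), np.bincount(samples, minlength=5)[:5] / len(samples))
```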
We study the long-term behaviour of a random walker embedded in a growing sequence of graphs. We define a (generally non-Markovian) real-valued stochastic process, called the knowledge process, that represents the ratio between the number of vertices already visited by the walker and the current size of the graph. We mainly focus on the case where the underlying graph sequence is the growing sequence of complete graphs.
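As an illustration, the following simulation tracks a knowledge process for a walker on growing complete graphs; the growth schedule assumed here, one new vertex added after every step of the walk, is a hypothetical choice made only for the example.

```python
# Simulation sketch of a knowledge process: the walker moves on the complete
# graph K_n and we track (distinct vertices visited) / (current graph size).
# The growth schedule (one new vertex per step) is an assumption for the example.
import numpy as np

rng = np.random.default_rng(3)

def knowledge_path(steps=10_000, n0=2):
    n, position, visited = n0, 0, {0}
    ratios = []
    for _ in range(steps):
        position = (position + rng.integers(1, n)) % n   # uniform over the other n-1 vertices
        visited.add(position)
        n += 1                                           # the graph grows by one vertex
        ratios.append(len(visited) / n)
    return ratios

print(knowledge_path()[-1])   # value of the knowledge process at the end of this realisation
```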
It has been known for nearly a decade that deterministically modeled reaction networks that are weakly reversible and consist of a single linkage class have trajectories that are bounded from both above and below by positive constants (so long as the initial condition has strictly positive components). It is conjectured that the stochastically modeled analogs of these systems are positive recurrent. We prove this conjecture in the affirmative under the following additional assumptions: (i) the system is binary, and (ii) for each species, there is a complex (vertex in the associated reaction diagram) that is a multiple of that species. To show this result, a new proof technique is developed in which we study the recurrence properties of the n-step embedded discrete-time Markov chain.
This article investigates the long-time behavior of conservative affine processes on the cone of symmetric positive semidefinite $d\times d$ matrices. In particular, for conservative and subcritical affine processes we show that a finite $\log$-moment of the state-independent jump measure is sufficient for the existence of a unique limit distribution. Moreover, we study the convergence rate of the underlying transition kernel to the limit distribution: first, in a specific metric induced by the Laplace transform, and second, in the Wasserstein distance under a first moment assumption imposed on the state-independent jump measure and an additional condition on the diffusion parameter.