
Exactly solvable urn models under random replacement schemes and their applications

Published online by Cambridge University Press:  14 February 2023

Kiyoshi Inoue*
Affiliation:
Seikei University
*Postal address: Faculty of Economics, Seikei University, 3-3-1 Kichijoji-Kitamachi, Musasino-shi, Tokyo, 180-8633, Japan. Email: kinoue@econ.seikei.ac.jp

Abstract

We examine urn models under random replacement schemes, and the related distributions, by using generating functions. A fundamental isomorphism between urn models and a certain system of differential equations has previously been established. We study the joint distribution of the numbers of balls in the urn, and determine recurrence relations for the probability generating functions. The associated partial differential equation satisfied by the generating function is derived. We develop analytical methods for the study of urn models that can lead to perspectives on urn-related problems from analytic combinatorics. The results presented here provide a broader framework for the study of exactly solvable urns than the existing one. Finally, we examine several applications and their numerical results in order to demonstrate how our theoretical results can be employed in the study of urn models.

Type
Original Article
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of Applied Probability Trust

1. Introduction

Urn models have been a popular research subject for many years due to their applicability in a wide range of fields, including computer science, economics, physics, business, genetics, and ecology. We refer interested readers to the textbook [15]; [18] provides a wealth of information regarding urn models and presents some significant developments. Various urn models and their corresponding distributions, together with statistical inference problems, were studied in [16, 17] (see also [13, 20]).

A fundamental isomorphism between urn models with two types of balls and certain systems of ordinary differential equations was established in [9]. This highly innovative method makes it possible to obtain probability distributions from the solutions of the differential systems. [8] introduced the partial differential equation approach to urn models with two types of balls, and derived the probability generating functions as solutions to the partial differential equation.

Since the pioneering works [8, 9], the theory and applications of urn models based on analytical methods have received increased attention from probabilists, statisticians, and applied scientists [2, 6]. Such urn models are recognized as exactly solvable urns, and the methodology, known as analytic combinatorics, has been studied in various situations [10]. A chapter of [20] is devoted to analytic urns, and provides a comprehensive list of references. [11] considered several diminishing urns with two colors, and obtained the exact distributions by solving partial differential equations, while [21] developed an analytical theory for the study of Pólya urns with random rules. Urn models under random replacement schemes have been studied by many researchers; [1, 14, 22] obtained asymptotic results (see also references therein). We are concerned with exact probability distributions via the methods of analytic combinatorics.

In this study we treat urn models with $(m+1)$ types of balls under random replacement schemes, and develop the exact theory of analytic urns with their applications. The results presented here serve as a bridge between discrete-time urn processes and continuous analysis (in particular, the theory of differential equations). In Section 2, urn models containing $(m+1)$ types of balls under random replacement rules are explained. We also describe the history of urn evolution, and the enumerative approach to all histories using generating functions. In Section 3, the fundamental isomorphism between urn models with random addition rules and differential equations, established in [21] for urn models with two types of balls, is generalized to urns with random addition rules and $(m+1)$ types of balls. In Section 4, we consider the joint distribution of the numbers of balls in the urn after n draws, and provide recurrence relations for the probability generating functions. Furthermore, we derive a partial differential equation satisfied by the generating function. Finally, in Section 5, we discuss several applications related to urn models with random addition rules for illustrative purposes. Two partition numbers used in Section 5 are examined further in the appendix.

2. Urn models

2.1. Motivating example

We provide a simple example that illustrates urn histories and their enumeration. From an urn initially containing two white balls and one blue ball, a ball is chosen at random, its color is observed, and the ball is returned to the urn with additional balls: if the color of the chosen ball is white, one white ball and two blue balls are added to the urn; if the color of the chosen ball is blue, two white balls and one blue ball are added to the urn. The trial then repeats. It is common to represent the replacement scheme with the addition matrix $A=\left(\begin{smallmatrix}1 & 2 \\[5pt] 2 & 1\end{smallmatrix}\right)$ . We denote the starting state by WWB. Note that there are initially two white balls. At the first drawing, one possible history of the urn evolution corresponds to the drawing of a white ball, in which case we return it to the urn together with one new white ball and two new blue balls. If a blue ball is chosen in the second drawing, it is returned to the urn together with two new white balls and one new blue ball. We have one possible history of urn evolution, described by the following sequences:

\begin{equation*} W\underline{W}B \to WWWBB\underline{B} \to WWWWWBBBB,\end{equation*}

where the ball chosen in each step is underlined. Another urn evolution history is as follows:

\begin{equation*} \underline{W}WB \to WWWBB\underline{B} \to WWWWWBBBB.\end{equation*}

We regard the two sequences as different histories: although they share the same urn composition after two draws, they arrived at it through different stochastic paths. From this perspective, all histories are equally likely.

We will enumerate urn histories using exponential generating functions.
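As an illustration (ours, not part of the original text), the following Python sketch enumerates the histories of the motivating urn by brute force; drawing any one of the like-colored balls counts as a distinct history.

```python
def urn_histories(n, start=(2, 1)):
    """Count histories of the motivating urn: drawing a white ball adds
    (1 white, 2 blue); drawing a blue ball adds (2 white, 1 blue).
    Returns {composition (w, b) after n draws: number of histories}."""
    add = {"W": (1, 2), "B": (2, 1)}
    layer = {start: 1}
    for _ in range(n):
        nxt = {}
        for (w, b), ways in layer.items():
            # drawing any one of the `balls` like-colored balls is a
            # distinct history, hence the multiplicative factor
            for color, balls in (("W", w), ("B", b)):
                dw, db = add[color]
                key = (w + dw, b + db)
                nxt[key] = nxt.get(key, 0) + ways * balls
        layer = nxt
    return layer
```

After two draws the urn holds nine balls and there are $3 \cdot 6 = 18$ equally likely histories, 10 of which end at the composition WWWWWBBBB displayed above.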

2.2. Urn scheme

Consider a Pólya urn model containing balls with $(m+1)$ different labels, which is characterized by a random replacement scheme. From an urn containing $b_i$ balls labeled i ( $i=0,1,\ldots,m$ ), a ball is drawn, its label is noted, and the ball is returned to the urn along with additional balls depending on the label of the drawn ball; if a ball labeled i ( $i=0,1,\ldots,m$ ) is drawn, $A_{ij}$ balls labeled j ( $j=0,1,\ldots,m$ ) are added, where $A_{ij}$ ( $i,j=0,1,\ldots,m$ ) are discrete random variables. This scheme is represented by the $(m+1) \times (m+1)$ replacement matrix A of discrete random entries as

(2.1) \begin{equation}A=\left(\begin{array}{c@{\quad}c@{\quad}c@{\quad}c}A_{00} & A_{01}& \cdots & A_{0m} \\[5pt] A_{10} & A_{11}& \cdots & A_{1m} \\[5pt] \vdots & \vdots & \ddots & \vdots \\[5pt] A_{m0} & A_{m1}& \cdots & A_{mm}\end{array}\right). \end{equation}

The rows are indexed by the label of the drawn ball, and the columns are indexed by the label of the added ball. We assume that $A_{i0}+A_{i1}+ \cdots +A_{im}=\theta\ (\!\geq 0)$ , $A_{ii} \geq -1$ , and $A_{ij}\geq 0$ , $i \neq j$ , for $i,j=0,1,\ldots,m$ . The row sums of the replacement matrix are always constant with the value $\theta$ , which implies that a constant number of balls is added at each draw. An urn satisfying this condition of steady linear growth of the urn size is said to be ‘balanced’, and the parameter $\theta$ is called the balance of the urn. The urn is said to be ‘tenable’ if we can perpetually sample from the urn according to the replacement scheme. Throughout this article, we limit ourselves to considering tenable balanced urns.

For $i=0,1,\ldots,m$ , let $B_n^{i}$ denote the number of balls labeled i after n draws from the urn. The total number of balls after n draws, denoted by $\tau_n=\sum_{i=0}^{m} B_n^{i}$ , is a deterministic quantity due to the balance condition. We then have $\tau_n = \tau_0 + n \theta = b_0 + b_1 + \cdots + b_m + n \theta,$ where $\big(B_0^{0},B_0^{1},\ldots,B_0^{m}\big)=(b_0,b_1,\ldots,b_m)$ and $\tau_0=b_0+b_1 + \cdots + b_m$ . Assume that the joint distribution of $( A_{i0},A_{i1},\ldots,A_{im} )$ has the probability mass function $\pi_{a_{i0},a_{i1},\ldots,a_{im}}=\mathbb{P} ( A_{i0}=a_{i0},A_{i1}=a_{i1},\ldots,A_{im}=a_{im} )$ , $i=0,1,\ldots,m$ .

2.3. Enumerating histories

Let $h_n(i_0,i_1,\ldots,i_m)$ be the number of histories that corresponds to an urn with $\big(B_n^{0},B_n^{1},\ldots,B_n^{m}\big)=(i_0,i_1,\ldots,i_m)$ after n draws, and let

\begin{align*} H(x_0,\ldots,x_m,z)=\sum_{n=0}^{\infty} \, \sum_{i_0,i_1,\ldots,i_m \geq 0}h_n(i_0,\ldots,i_m) x_0^{i_0} \cdots x_m^{i_m} \frac{z^n}{n!}\end{align*}

be the exponential generating function. The generating function of the total number of histories is expressed as $H(1,1,\ldots,1,z)$ .

Proposition 2.1. For a tenable balanced urn under scheme (2.1), the total number of possible histories after n draws is given by $\tau_0 \tau_1 \cdots \tau_{n-1}=\tau_0(\tau_0+\theta) \cdots (\tau_0+(n-1)\theta)$ , and the generating function of the total number of histories can be expressed as

\begin{equation*} H(1,1,\ldots,1,z) = \begin{cases} \mathrm{e}^{\tau_0 z} , & \theta=0, \\[5pt] \frac{1}{(1-\theta z)^{{\tau_0}/{\theta}}} , & \theta > 0. \end{cases}\end{equation*}

The joint probability mass function and the probability generating function of $\big(B_n^{0},\ldots,B_n^{m}\big)$ can be captured through the generating function $H(x_0,\ldots,x_m,z)$ as

(2.2) \begin{align} \mathbb{P}\big( B_n^{0}=i_0,\ldots,B_n^{m}=i_m \big) = \frac{h_n(i_0,\ldots,i_m)}{\tau_0 \cdots \tau_{n-1}} = \frac{\left[x_0^{i_0} \cdots x_m^{i_m} z^n\right]H(x_0,\ldots,x_m,z)}{[z^n]H(1,\ldots,1,z)}, \end{align}
(2.3) \begin{align} \mathbb{E} \Big[x_0^{B_n^{0}} x_1^{B_n^{1}} \cdots x_m^{B_n^{m}}\Big] & = \frac{[z^n]H(x_0,\ldots,x_m,z)}{[z^n] H(1,\ldots,1,z)}, \end{align}

respectively, where we use the notation $\big[x_1^{i_1} x_2^{i_2} \cdots x_k^{i_k}\big]g(x_1,x_2,\ldots,x_k)$ to extract the coefficient of $x_1^{i_1} x_2^{i_2} \cdots x_k^{i_k}$ in a multivariate generating function $g(x_1,x_2,\ldots,x_k)$ .
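As a quick numerical check of Proposition 2.1 (our illustration, not from the original text), exact rational arithmetic confirms that $n!\,[z^n](1-\theta z)^{-\tau_0/\theta}$ reproduces the product $\tau_0(\tau_0+\theta)\cdots(\tau_0+(n-1)\theta)$; for the urn of Section 2.1, $\tau_0=3$ and $\theta=3$.

```python
from fractions import Fraction

def total_histories(n, tau0, theta):
    # tau_0 (tau_0 + theta) ... (tau_0 + (n - 1) theta)
    out = 1
    for j in range(n):
        out *= tau0 + j * theta
    return out

def coeff_from_gf(n, tau0, theta):
    # n! [z^n] (1 - theta z)^(-tau0/theta) for theta > 0, via the
    # generalized binomial series: theta^n * a(a+1)...(a+n-1), a = tau0/theta
    a = Fraction(tau0, theta)
    rising = Fraction(1)
    for j in range(n):
        rising *= a + j
    return rising * theta**n

# motivating example: 3 * 6 * 9 * 12 = 1944 histories after four draws
assert coeff_from_gf(4, 3, 3) == total_histories(4, 3, 3) == 1944
```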

It is interesting to note that we can write the generating function $H(x_0,\ldots,x_m,z)$ through the joint probability mass function and probability generating function of $\big(B_n^{0},\ldots,B_n^{m}\big)$ :

\begin{align*} H(x_0,\ldots,x_m,z) & = \sum_{n=0}^{\infty} \, \sum_{i_0,\ldots,i_m \geq 0} \tau_0 \cdots \tau_{n-1} \mathbb{P} \big( B_n^{0}=i_0,\ldots,B_n^{m}=i_m \big) x_0^{i_0} \cdots x_m^{i_m} \frac{z^n}{n!} \\[5pt] & = \sum_{n=0}^{\infty} \tau_0 \cdots \tau_{n-1} \mathbb{E} \Big[x_0^{B_n^{0}} x_1^{B_n^{1}} \cdots x_m^{B_n^{m}}\Big]\frac{z^n}{n!}.\end{align*}

We define the operator $\mathcal{D}$ by

\begin{align*} \mathcal{D} = \sum_{i=0}^{m} \sum_{a_{i0}+a_{i1} + \cdots + a_{im}=\theta} \pi_{a_{i0},a_{i1},\ldots,a_{im}} x_{0}^{a_{i0}} \cdots x_{i}^{a_{ii}+1} \cdots x_{m}^{a_{im}} \frac{\partial}{\partial x_i},\end{align*}

which is suited for describing the urn histories.

Proposition 2.2. Under the initial urn condition $\big(B_0^{0},B_0^{1},\ldots,B_0^{m}\big)=(b_0,b_1,\ldots,b_m)$ and the replacement scheme as in (2.1), the generating function of the urn histories is expressed as

(2.4) \begin{equation} H(x_0,x_1,\ldots,x_m,z) = \sum_{n=0}^{\infty} \mathcal{D}^n \big(x_0^{b_0} x_1^{b_1} \cdots x_m^{b_m}\big) \frac{z^n}{n!}. \end{equation}

Proof. For $n \geq 0$ , we define the function $H_n ( x_0,x_1,\ldots,x_m )$ by

\begin{align*} H_n ( x_0,x_1,\ldots,x_m ) & = \sum_{i_0,i_1,\ldots,i_m \geq 0} h_n(i_0,i_1,\ldots,i_m) x_0^{i_0} \cdots x_m^{i_m}, \qquad n \geq 1, \\[5pt] H_0 ( x_0,x_1,\ldots,x_m ) & = x_0^{b_0} x_1^{b_1} \cdots x_m^{b_m}. \end{align*}

Then, the generating function $H(x_0,x_1,\ldots,x_m,z)$ of the urn histories can be rewritten as

\begin{align*} H(x_0,x_1,\ldots,x_m,z) = \sum_{n=0}^{\infty}H_{n} ( x_0,x_1,\ldots,x_m )\frac{z^n}{n!}. \end{align*}

The property of the operator $\mathcal{D}$ yields $H_{n+1}( x_0,x_1,\ldots,x_m) = \mathcal{D}(H_{n}( x_0,x_1,\ldots,x_m))$ . Hence, we have

\begin{equation*} H_{n} ( x_0,x_1,\ldots,x_m ) = \mathcal{D}^n(H_{0}( x_0,x_1,\ldots,x_m)) = \mathcal{D}^n\Big(x_0^{b_0} x_1^{b_1} \cdots x_m^{b_m}\Big). \end{equation*}
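The operator $\mathcal{D}$ and Proposition 2.2 are easy to implement symbolically. The following SymPy sketch (ours) applies $\mathcal{D}$ twice to $x_0^{2} x_1$ for the deterministic urn of Section 2.1 and matches the brute-force history counts.

```python
import sympy as sp

x0, x1 = sp.symbols("x0 x1")

def D(poly, rules):
    """Apply the operator D of Section 2.3 for a two-color urn.
    rules[i] lists (probability, (a_i0, a_i1)) pairs for the additions
    made after a ball labeled i is drawn."""
    xs = (x0, x1)
    out = 0
    for i, mixture in enumerate(rules):
        for prob, adds in mixture:
            mono = xs[i]                    # the extra factor x_i
            for j, a in enumerate(adds):
                mono *= xs[j]**a
            out += prob * mono * sp.diff(poly, xs[i])
    return sp.expand(out)

# Deterministic scheme of Section 2.1; start from x0^2 x1 (two white, one blue).
rules = [[(1, (1, 2))], [(1, (2, 1))]]
H2 = D(D(x0**2 * x1, rules), rules)   # H_2 = D^2(x0^2 x1)
```

Here `H2` equals $6x_0^4x_1^5 + 10x_0^5x_1^4 + 2x_0^6x_1^3$, so $H_2(1,1)=18$, in agreement with Proposition 2.1.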

3. Urn models and systems of differential equations

We study urn models with random replacement schemes by employing analytical methods. Based on an isomorphism theorem, we associate the urn model with a system of differential equations. More specifically, we obtain the following theorem.

Theorem 3.1. Under the initial urn condition $\big(B_0^{0},B_0^{1},\ldots,B_0^{m}\big)=(b_0,b_1,\ldots,b_m)$ and the replacement scheme as in (2.1), the generating function of the urn histories is given by $H(u_0,u_1,\ldots,u_m,z) = x_{0}^{b_0}(z)x_{1}^{b_{1}}(z) \cdots x_{m}^{b_{m}}(z)$ , where $u_0=x_0(0)$ , $u_1=x_1(0)$ , …, $u_m=x_m(0)$ such that $u_0 u_1 \cdots u_m \neq 0$ , and $(x_{0}(t),x_{1}(t),\ldots,x_{m}(t))$ is the solution to the system of differential equations

(3.1) \begin{equation} \left\{ \begin{array}{r@{\ }c@{\ }l} \dot{x}_{0}(t) & = & x_{0}(t) \mathbb{E} \big[ x_{0}(t)^{A_{00}}x_{1}(t)^{A_{01}} \cdots x_{m}(t)^{A_{0m}} \big], \\[5pt] \dot{x}_{1}(t) & = & x_{1}(t) \mathbb{E} \big[x_{0}(t)^{A_{10}} x_{1}(t)^{A_{11}} \cdots x_{m}(t)^{A_{1m}} \big], \\[5pt] & \vdots & \\[5pt] \dot{x}_{m}(t) & = & x_{m}(t)\mathbb{E} \big[ x_{0}(t)^{A_{m0}} x_{1}(t)^{A_{m1}} \cdots x_{m}(t)^{A_{mm}} \big]. \end{array} \right.\end{equation}

Proof. From the Taylor series expansion (around $t=0$ ) of the function $x_i^{b_i}(t+z)$ , $i=0,1,\ldots,m$ , we have

\begin{align*} x_i^{b_i}(t+z) = x_i^{b_i}(t) + \frac{\partial}{\partial t} x_i^{b_i}(t) \frac{z}{1!} + \frac{\partial^2}{\partial t^2} x_i^{b_i}(t) \frac{z^2}{2!} + \cdots . \end{align*}

The product of the functions $x_0^{b_0}(t+z)x_1^{b_1}(t+z) \cdots x_m^{b_m}(t+z)$ is

\begin{align*} x_0^{b_0}(t+z)x_1^{b_1}(t+z) \cdots x_m^{b_m}(t+z) & = \sum_{n=0}^{\infty} \, \sum_{i_0 + i_1 + \cdots + i_m =n}\left(\begin{array}{c} n \\[3pt] i_0,i_1,\ldots,i_m \end{array}\right) \bigg( \frac{\partial^{i_0}}{\partial t^{i_0}} x_0^{b_0}(t) \bigg) \\[5pt] & \quad \times \bigg( \frac{\partial^{i_{1}}}{\partial t^{i_{1}}} x_1^{b_{1}}(t) \bigg) \cdots \bigg( \frac{\partial^{i_{m}}}{\partial t^{i_m}} x_m^{b_{m}}(t) \bigg)\frac{z^{n}}{n!} \\[5pt] & = \sum_{n=0}^{\infty}\frac{\partial^n}{\partial t^n}\Big( x_0^{b_0}(t) x_1^{b_1}(t) \cdots x_m^{b_{m}}(t)\Big) \frac{z^{n}}{n!}. \end{align*}

We denote the operator ${\partial}/{\partial t}$ as $\tilde{\mathcal{D}}$ . Then, we have

(3.2) \begin{equation} x_0^{b_0}(t+z)x_1^{b_1}(t+z) \cdots x_m^{b_m}(t+z) = \sum_{n=0}^{\infty}\tilde{\mathcal{D}}^n\Big( x_0^{b_0}(t) x_1^{b_1}(t) \cdots x_m^{b_m}(t) \Big) \frac{z^n}{n!}. \end{equation}

Comparing the two expansions (2.4) and (3.2), if we let $\mathcal{D}=\tilde{\mathcal{D}}$ , or

\begin{align*} \mathcal{D} \Big( x_0^{b_0}(t) x_1^{b_1}(t) \cdots x_m^{b_m}(t)\Big) = \frac{\partial}{\partial t}\Big( x_0^{b_0}(t) x_1^{b_1}(t) \cdots x_m^{b_m}(t)\Big), \end{align*}

we have $H ( x_0(t),x_1(t),\ldots,x_m(t),z) = x_0(t+z)^{b_0}x_1(t+z)^{b_1} \cdots x_m(t+z)^{b_m}$ , which is the same as letting

\begin{align*} \sum_{i=0}^{m}\sum_{a_{i0}+a_{i1} + \cdots + a_{im}=\theta}b_i\pi_{a_{i0},a_{i1},\ldots,a_{im}} x_{0}^{b_0 + a_{i0}} \cdots& \,x_{i}^{b_i + a_{ii}} \cdots x_{m}^{b_m + a_{im}} \\[5pt] & = \sum_{i=0}^{m}b_ix_{0}^{b_0} \cdots x_{i}^{b_i-1} \cdots x_{m}^{b_m } \frac{\partial}{\partial t} x_i(t). \end{align*}

This is possible if

\begin{align*} \left\{ \begin{array}{r@{\ }c@{\ }l} \dot{x}_{0}(t) & = & x_{0}(t){\sum_{a_{00}+ \cdots + a_{0m}=\theta}} \pi_{a_{00},\ldots,a_{0m}} x_{0}(t)^{a_{00}} x_{1}(t)^{a_{01}} \cdots x_{m}(t)^{a_{0m}}, \\[5pt] \dot{x}_{1}(t) & = & x_{1}(t){\sum_{a_{10} + \cdots + a_{1m}=\theta}} \pi_{a_{10},\ldots,a_{1m}} x_{0}(t)^{a_{10}} x_{1}(t)^{a_{11}} \cdots x_{m}(t)^{a_{1m}}, \\[5pt] & \vdots & \\[5pt] \dot{x}_{m}(t) & = & x_{m}(t){\sum_{a_{m0} + \cdots + a_{mm}=\theta}} \pi_{a_{m0},\ldots,a_{mm}} x_{0}(t)^{a_{m0}} x_{1}(t)^{a_{m1}} \cdots x_{m}(t)^{a_{mm}}. \end{array} \right. \end{align*}

By setting $t=0$ , we get

\begin{align*} H ( x_0(0),x_1(0),\ldots,x_m(0),z ) & = x_0(t+z)^{b_0}x_1(t+z)^{b_1} \cdots x_m(t+z)^{b_m}\big|_{t=0} \\[5pt] & = x_0(z)^{b_0}x_1(z)^{b_1} \cdots x_m(z)^{b_m}. \end{align*}
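To illustrate Theorem 3.1 (our check, using the classical identity-matrix Pólya urn as the assumed example), system (3.1) reduces to $\dot{x}_0=x_0^2$, $\dot{x}_1=x_1^2$, solved by $x_i(z)=u_i/(1-u_iz)$; extracting coefficients as in (2.2) recovers the well-known uniform law of the two-color Pólya urn started from one ball of each color.

```python
import sympy as sp

u0, u1, z = sp.symbols("u0 u1 z")
n = 4

# x_i(z) = u_i / (1 - u_i z) solves dx/dz = x^2 with x(0) = u_i
x0 = u0 / (1 - u0 * z)
x1 = u1 / (1 - u1 * z)
assert sp.simplify(sp.diff(x0, z) - x0**2) == 0   # x0 solves its ODE

H = x0 * x1                       # Theorem 3.1 with b0 = b1 = 1
num = sp.expand(sp.series(H, z, 0, n + 1).removeO().coeff(z, n))
den = sp.series(H.subs({u0: 1, u1: 1}), z, 0, n + 1).removeO().coeff(z, n)

# (2.2): P(B_4^0 = 3, B_4^1 = 3) = 1/(n + 1), the uniform law
prob = num.coeff(u0, 3).coeff(u1, 3) / den
```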

We note that the factorial moments of $B_n^{i}$ , $i=0,1,\ldots,m$ , can be derived through the generating function $H(x_0,\ldots,x_m,z)$ . The next theorem provides the details.

Theorem 3.2. The joint ( $r_0,\ldots,r_m$ ) descending factorial moment of $\big(B_n^{0},\ldots,B_n^{m}\big)$ is given by

\begin{equation*} \mathbb{E}\Big[\big( B_n^{0} \big)_{r_0} \cdots \big( B_n^{m} \big)_{r_m} \Big] = \frac{[z^n]\displaystyle{ \frac{\partial^{r_0+ \cdots + r_m}}{\partial x_0^{r_0} \cdots \partial x_m^{r_m}} H(x_0,\ldots,x_m,z)} \bigg|_{x_0 = \cdots = x_m = 1} }{[z^n]H(1,\ldots,1,z)}, \end{equation*}

or equivalently

\begin{equation*} \mathbb{E}\Big[\big( B_n^{0} \big)_{r_0}\cdots\big( B_n^{m} \big)_{r_m}\Big] = \frac{r_0! \cdots r_m!\,\big[(x_0-1)^{r_0} \cdots (x_m-1)^{r_m} z^n\big]H(x_0,\ldots,x_m,z)}{[z^n]H(1,\ldots,1,z)}, \end{equation*}

where $\big(B_n^{i}\big)_{r_i} = B_n^{i} \big( B_n^{i}-1 \big) \cdots \big( B_n^{i}-r_i + 1 \big)$ , $i=0,1,\ldots,m$ .

Proof. In terms of (2.3), we have

\begin{align*} \mathbb{E}\Big[\big( B_n^{0} \big)_{r_0} \cdots \big( B_n^{m} \big)_{r_m}\Big] = \displaystyle{ \frac{\partial^{r_0+ \cdots + r_m} }{\partial x_0^{r_0} \cdots \partial x_m^{r_m}} \mathbb{E} \Big[x_0^{B_n^{0}} x_1^{B_n^{1}} \cdots x_m^{B_n^{m}}\Big]} \bigg|_{x_0 = \cdots = x_m = 1}. \end{align*}

It is easy to see that the two results are equivalent.
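Continuing the identity-matrix Pólya urn example (ours, not from the original text), Theorem 3.2 with $(r_0,r_1)=(1,0)$ gives $\mathbb{E}[B_n^0]=(n+2)/2$, as expected for the uniform law:

```python
import sympy as sp

u0, u1, z = sp.symbols("u0 u1 z")
n = 5

# History generating function of the identity-matrix Polya urn, b0 = b1 = 1
H = u0 * u1 / ((1 - u0 * z) * (1 - u1 * z))

# Theorem 3.2 with (r0, r1) = (1, 0): the first factorial moment E[B_n^0]
dH = sp.diff(H, u0).subs({u0: 1, u1: 1})
num = sp.series(dH, z, 0, n + 1).removeO().coeff(z, n)
den = sp.series(H.subs({u0: 1, u1: 1}), z, 0, n + 1).removeO().coeff(z, n)
mean = num / den                  # E[B_5^0] = 7/2
```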

4. Urn models and partial differential equations

4.1. Generating functions

Consider the joint distribution of $\big(B_n^{1},B_n^{2},\ldots,B_n^{m}\big)$ . The probability generating function of $\big(B_n^{1},B_n^{2},\ldots,B_n^{m}\big)$ is denoted by $\phi_n(\boldsymbol{{t}})$ ; that is,

\begin{align*} \phi_n(\boldsymbol{{t}}) = \mathbb{E}\Big[t_1^{B_n^{1}} \cdots t_m^{B_n^{m}}\Big] = \sum_{i_1,\ldots,i_m \geq 0}\mathbb{P} \big( B_n^{1}=i_1,\ldots,B_n^{m}=i_m\big)t_1^{i_1} \cdots t_m^{i_m},\end{align*}

where $\boldsymbol{{t}} = (t_1,\ldots,t_m)$ .

We define the generating function as $\overline{H}(t_1,\ldots,t_m,z) = H(t_0,t_1,\ldots,t_m,z) \big|_{t_0=1}$ . Immediately, we can observe the following relation:

\begin{align*} H(t_0,t_1,\ldots,t_m,z) = t_0^{\tau_0}\overline{H}\bigg(\frac{t_1}{t_0},\ldots,\frac{t_m}{t_0},t_0^\theta z\bigg).\end{align*}

In terms of (2.2) and (2.3), the joint probability mass function and probability generating function of $\big(B_n^{1},B_n^{2},\ldots,B_n^{m}\big)$ can be expressed as

\begin{align*} \mathbb{P} \big(B_n^{1}=i_1,\ldots,B_n^{m}=i_m\big) & = \frac{[t_1^{i_1} \cdots t_m^{i_m} z^n]\overline{H}(t_1,\ldots,t_m,z)}{[z^n]\overline{H}(1,\ldots,1,z)}, \\[5pt] \phi_n(\boldsymbol{{t}}) & = \frac{[z^n]\overline{H}(t_1,\ldots,t_m,z)}{[z^n]\overline{H}(1,\ldots,1,z)},\end{align*}

respectively.

It is worth mentioning that the generating function $\overline{H}(t_1,\ldots,t_m,z)$ can be captured through the joint probability mass function and the probability generating function of $\big(B_n^{1},\ldots,B_n^{m}\big)$ :

\begin{align*} \overline{H}(t_1,\ldots,t_m,z) & = \sum_{n=0}^{\infty} \, \sum_{i_1,\ldots,i_m \geq 0} \tau_0 \cdots \tau_{n-1}\mathbb{P} \big( B_n^{1}=i_1,\ldots,B_n^{m}=i_m \big) t_1^{i_1} \cdots t_m^{i_m} \frac{z^n}{n!} \\[5pt] & = \sum_{n=0}^{\infty}\tau_0 \tau_1 \cdots \tau_{n-1}\phi_n(\boldsymbol{{t}}) \frac{z^n}{n!}.\end{align*}

4.2. Recurrence relations for probability generating functions

We can evaluate the joint probability mass function of $\big(B_n^{1},\ldots,B_n^{m}\big)$ through recurrence relations that can be established easily using probability generating functions.

Theorem 4.1. Under the initial urn condition $\big(B_0^{0},B_0^{1},\ldots,B_0^{m}\big) = (b_0,b_1,\ldots,b_m)$ and the replacement scheme in (2.1), the probability generating functions $\phi_{n}(\boldsymbol{{t}})$ for $n \geq 0$ satisfy the following recurrence relations:

(4.1) \begin{align} \phi_{n}(\boldsymbol{{t}}) = \alpha_{0}(\boldsymbol{{t}}) \phi_{n-1}(\boldsymbol{{t}}) + \sum_{i=1}^{m}\frac{t_i ( \alpha_{i}(\boldsymbol{{t}})-\alpha_{0}(\boldsymbol{{t}}) )}{\tau_{n-1}} \cdot \frac{\partial \phi_{n-1}(\boldsymbol{{t}})}{\partial t_i}, \qquad n \geq 1, \end{align}
(4.2) \begin{align} \phi_{0}(\boldsymbol{{t}}) = t_1^{b_1} t_2^{b_2} \cdots t_m^{b_m}, \end{align}

where $\alpha_{i}(\boldsymbol{{t}})=\mathbb{E}\left[t_1^{A_{i1}} t_2^{A_{i2}} \cdots t_m^{A_{im}} \right]$ .

Proof. (4.1) is an immediate consequence of the following conditional expectation:

\begin{align*} \phi_{n}(\boldsymbol{{t}}) & = \mathbb{E} \Big[t_1^{B_n^{1}} \cdots t_m^{B_n^{m}}\Big] = \mathbb{E}\Big[ \mathbb{E} \Big[t_1^{B_n^{1}} \cdots t_m^{B_n^{m}} \mid B_{n-1}^{1},\ldots,B_{n-1}^{m}, A_{ij}, \, 0 \leq i,j \leq m\Big] \Big] \\[5pt] & = \sum_{i=0}^{m}\mathbb{E} \bigg[\frac{B_{n-1}^{i}}{\tau_{n-1}}t_1^{B_{n-1}^{1}+A_{i1}} \cdots t_m^{B_{n-1}^{m}+A_{im}}\bigg] \\[5pt] & = \mathbb{E} \Big[t_1^{B_{n-1}^{1}+A_{01}} \cdots t_m^{B_{n-1}^{m}+A_{0m}}\Big] \\[5pt] & \quad + \sum_{i=1}^{m} \mathbb{E} \bigg[\frac{B_{n-1}^{i}}{\tau_{n-1}} \Big(t_1^{B_{n-1}^{1}+A_{i1}} \cdots t_m^{B_{n-1}^{m}+A_{im}} - t_1^{B_{n-1}^{1}+A_{01}} \cdots t_m^{B_{n-1}^{m}+A_{0m}}\Big)\bigg]. \end{align*}

The initial condition (4.2) is easily obtained by the definition of the probability generating function.
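The recurrence (4.1) is straightforward to iterate symbolically. The sketch below (ours) does so for the two-color identity-matrix Pólya urn, where $\alpha_0(t_1)=1$ and $\alpha_1(t_1)=t_1$, and recovers the uniform law of $B_n^{1}$.

```python
import sympy as sp

t1 = sp.symbols("t1")

def pgf(n, alpha0, alpha1, b1, tau0, theta):
    """phi_n for a two-color (m = 1) urn via the recurrence (4.1)-(4.2)."""
    phi = t1**b1
    for k in range(1, n + 1):
        tau = tau0 + (k - 1) * theta
        phi = sp.expand(alpha0 * phi
                        + t1 * (alpha1 - alpha0) / tau * sp.diff(phi, t1))
    return phi

# Identity-matrix Polya urn started with one ball of each color
phi4 = pgf(4, sp.Integer(1), t1, b1=1, tau0=2, theta=1)
```

Here `phi4` equals $\frac{1}{5}(t_1+t_1^2+t_1^3+t_1^4+t_1^5)$: after four draws, $B_4^{1}$ is uniform on $\{1,\ldots,5\}$.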

As a special case of $m=1$ , consider the Pólya urn model containing two different labels: 0 and 1. If $\alpha_0(t_1) \neq \alpha_1(t_1)$ and we can find the antiderivative

\begin{align*}\displaystyle{\int \frac{\alpha_0(t_1)}{t_1(\alpha_1(t_1)-\alpha_0(t_1))} \, \mathrm{d} t_1},\end{align*}

we obtain a simpler recurrence. The transformation

\begin{align*} \psi_n(t_1)=\exp\!\bigg(\int \frac{\tau_{n} \alpha_0(t_1)}{t_1(\alpha_1(t_1)-\alpha_0(t_1))} \, \mathrm{d} t_1\bigg) \phi_n(t_1) \end{align*}

yields

(4.3) \begin{align} \psi_{n}(t_1) = \frac{t_1 ( \alpha_{1}(t_1)-\alpha_{0}(t_1) )}{\tau_{n-1}} I(t_1, \theta ) \cdot \frac{\partial \psi_{n-1}(t_1)}{\partial t_1}, \qquad n \geq 1, \end{align}
(4.4) \begin{align} \psi_{0}(t_1) = t_1^{b_1} I(t_1, \tau_0 ), \end{align}

where

\begin{align*}I(t_1,a) = \exp\! \bigg(\int \frac{a\,\alpha_0(t_1)}{t_1(\alpha_1(t_1)-\alpha_0(t_1))}\,\mathrm{d} t_1\bigg).\end{align*}
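For the identity-matrix Pólya urn ($\alpha_0(t_1)=1$, $\alpha_1(t_1)=t_1$, $\theta=1$, $\tau_0=2$, $b_1=1$, our running example) the antiderivative evaluates to $\log((t_1-1)/t_1)$, so $I(t_1,a)=((t_1-1)/t_1)^a$; the following SymPy check (ours) confirms one step of (4.3):

```python
import sympy as sp

t1 = sp.symbols("t1")

def I(a):
    # I(t1, a) for alpha_0 = 1, alpha_1 = t1: ((t1 - 1)/t1)^a
    return ((t1 - 1) / t1)**a

phi0 = t1                                                       # (4.2), b1 = 1
phi1 = sp.expand(phi0 + t1 * (t1 - 1) / 2 * sp.diff(phi0, t1))  # (4.1), n = 1
psi0 = I(2) * phi0              # psi_0 = t1^{b1} I(t1, tau_0), (4.4)
psi1 = I(3) * phi1              # psi_1 = I(t1, tau_1) phi_1
rhs = t1 * (t1 - 1) / 2 * I(1) * sp.diff(psi0, t1)              # (4.3), n = 1
assert sp.simplify(psi1 - rhs) == 0
```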

By differentiating (4.1) and (4.2), and making use of

\begin{align*} \mathbb{E}[A_{ji}] & = \frac{\partial }{\partial t_i} \alpha_j(\boldsymbol{{t}})\bigg|_{t_1 = \cdots = t_m=1} , \qquad i=1,\ldots,m, \ j=0,\ldots,m, \\[5pt] \mathbb{E}[B_n^{i}] & = \frac{\partial }{\partial t_i} \phi_n( \boldsymbol{{t}})\bigg|_{t_1 = \cdots = t_m=1} , \qquad n \geq 0, \ i=1,\ldots,m,\end{align*}

we can readily establish the recurrence relations for the expected values of $B_n^{i}$ , $n \geq 0$ , $i=1,\ldots,m$ .

Corollary 4.1. For $i=1,2,\ldots,m$ , the expected values $\mathbb{E} \big[ B_n^{i} \big]$ , $n \geq 0$ , satisfy the recurrence relations

\begin{alignat*}{2} \mathbb{E}[B_n^{i}] & = \mathbb{E}[B_{n-1}^{i}] + \mathbb{E}[A_{0i}] + \sum_{j=1}^{m}\frac{\mathbb{E}[A_{ji}] - \mathbb{E}[A_{0i}]}{\tau_{n-1}} \cdot \mathbb{E}[B_{n-1}^{j}], \quad & & n \geq 1, \ i=1,\ldots,m, \\[5pt] \mathbb{E}[B_{0}^{i}] & = b_i, & & i=1,\ldots,m. \end{alignat*}
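Corollary 4.1 yields the means by simple iteration. A plain-Python sketch (ours), applied to the identity-matrix Pólya urn started with one ball of each color, reproduces the known mean $\mathbb{E}[B_n^{1}]=b_1\tau_n/\tau_0=(n+2)/2$:

```python
from fractions import Fraction

def mean_counts(n, EA, b, theta):
    """E[B_n^i], i = 1..m, via Corollary 4.1.
    EA[j][i-1] = E[A_{ji}] for j = 0..m, i = 1..m; b = (b_0, ..., b_m)."""
    m = len(b) - 1
    tau = sum(b)                              # tau_0, then tau_1, ...
    e = [Fraction(bi) for bi in b[1:]]        # E[B_0^i] = b_i
    for _ in range(n):
        new = []
        for i in range(m):
            v = e[i] + EA[0][i]
            for j in range(m):
                v += Fraction(EA[j + 1][i] - EA[0][i], tau) * e[j]
            new.append(v)
        e, tau = new, tau + theta
    return e

# Identity-matrix Polya urn: E[A_01] = 0, E[A_11] = 1, start (1, 1), theta = 1
means = mean_counts(5, EA=[[0], [1]], b=(1, 1), theta=1)
```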

In the case of $m=1$ , [12] studied recurrence relations with a slight modification of (4.1) and (4.2), known as recurrences of Eulerian type, and considered applications to Pólya urn models.

4.3. Partial differential equations

We now find the generating function $\overline{H}(\boldsymbol{{t}},z)$ by employing the partial differential equation approach; specifically, we derive the partial differential equation satisfied by $\overline{H}(\boldsymbol{{t}},z)$ .

Theorem 4.2. Under the initial urn condition $\left(B_0^{0},B_0^{1},\ldots,B_0^{m}\right) = (b_0,b_1,\ldots,b_m)$ and the replacement scheme in (2.1), the generating function $\overline{H}(\boldsymbol{{t}},z)$ satisfies the partial differential equation

(4.5) \begin{align} \tau_{0} \alpha_{0}(\boldsymbol{{t}}) \overline{H}(\boldsymbol{{t}},z) = ( 1-\theta z \alpha_{0}(\boldsymbol{{t}}) )\frac{\partial \overline{H}(\boldsymbol{{t}},z)}{\partial z} - \sum_{i=1}^{m}t_i ( \alpha_{i}(\boldsymbol{{t}})-\alpha_{0}(\boldsymbol{{t}}) ) \cdot\frac{\partial \overline{H}(\boldsymbol{{t}},z)}{\partial t_i}, \end{align}
(4.6) \begin{align} \overline{H}(\boldsymbol{{t}},z) \big|_{z=0} = t_1^{b_1} t_2^{b_2} \cdots t_m^{b_m}, \end{align}

where $\alpha_{i}(\boldsymbol{{t}}) = \mathbb{E}\big[t_1^{A_{i1}} t_2^{A_{i2}} \cdots t_m^{A_{im}} \big]$ .

Proof. From (4.1), we have

\begin{align*} \sum_{n=1}^{\infty} \tau_{0} \cdots \tau_{n-1} \phi_{n}(\boldsymbol{{t}}) \frac{z^{n-1}}{(n-1)!} & = \sum_{n=1}^{\infty} \alpha_{0}(\boldsymbol{{t}}) \tau_{0} \cdots \tau_{n-1} \phi_{n-1}(\boldsymbol{{t}}) \frac{z^{n-1}}{(n-1)!} \\[5pt] & \quad + \sum_{n=1}^{\infty} \sum_{i=1}^{m} t_i ( \alpha_{i}(\boldsymbol{{t}})-\alpha_{0}(\boldsymbol{{t}}) ) \cdot \tau_{0} \cdots \tau_{n-2} \frac{\partial \phi_{n-1}(\boldsymbol{{t}})}{\partial t_i} \frac{z^{n-1}}{(n-1)!}; \end{align*}

that is,

\begin{align*} \frac{\partial \overline{H}(\boldsymbol{{t}},z)}{\partial z} = \tau_0 \alpha_{0}(\boldsymbol{{t}}) \overline{H}(\boldsymbol{{t}},z) + \theta z \alpha_{0}(\boldsymbol{{t}}) \frac{\partial \overline{H}(\boldsymbol{{t}},z)}{\partial z} + \sum_{i=1}^{m} t_i ( \alpha_{i}(\boldsymbol{{t}})-\alpha_{0}(\boldsymbol{{t}}) ) \cdot \frac{\partial \overline{H}(\boldsymbol{{t}},z)}{\partial t_i} , \end{align*}

which rearranges to (4.5).

As a special case of $m=1$ , we examine the Pólya urn model containing the labels 0 and 1. If $\alpha_0(t_1) \neq \alpha_1(t_1)$ and we can obtain the antiderivative

\begin{align*}\displaystyle{\int \frac{\alpha_0(t_1)}{t_1(\alpha_1(t_1)-\alpha_0(t_1))} \, \mathrm{d} t_1},\end{align*}

we obtain a simpler partial differential equation. When

\begin{align*}\Psi(t_1,z) = \sum_{n=0}^{\infty}\tau_0 \tau_1 \cdots \tau_{n-1} \psi_{n}(t_1) \frac{z^n}{n!},\end{align*}

(4.3) and (4.4) yield the following partial differential equation:

(4.7) \begin{align} \frac{\partial \Psi(t_1,z)}{\partial z} & = t_1 ( \alpha_1(t_1) -\alpha_0(t_1) )I(t_1,\theta) \frac{\partial \Psi(t_1,z)}{\partial t_1}, \end{align}
(4.8) \begin{align} \Psi(t_1,z) \big|_{z=0} & = t_1^{b_1} I(t_1, \tau_0 ), \end{align}

where

\begin{align*}I(t_1,a)=\exp\! \bigg(\int \frac{a \, \alpha_0(t_1)}{t_1(\alpha_1(t_1)-\alpha_0(t_1))} \, \mathrm{d} t_1\bigg).\end{align*}

First-order partial differential equations can often be solved by the method of characteristics. The partial differential equation in (4.7) can be written as

\begin{align*} \frac{\mathrm{d} z}{1}=-\frac{\mathrm{d} t_1}{t_1 ( \alpha_1(t_1) -\alpha_0(t_1) )I(t_1,\theta)}.\end{align*}

Note that

\begin{align*} \frac{\mathrm{d}}{\mathrm{d} t_1}\bigg(z + \int \frac{\mathrm{d} t_1}{t_1 ( \alpha_1(t_1) -\alpha_0(t_1) )I(t_1,\theta)}\bigg)=0.\end{align*}

We then see that $\eta(t_1,z)=C$ , where

\begin{align*}\eta(t_1,z) = z + \int \frac{\mathrm{d} t_1}{t_1 ( \alpha_1(t_1) -\alpha_0(t_1) )I(t_1,\theta)}.\end{align*}

Therefore, the generating function $\Psi(t_1,z)$ can be expressed as $\Psi(t_1,z) = G (\eta(t_1,z))$ , where the function G is specified by the initial condition (4.8) as $G(\eta(t_1,0)) = t_1^{b_1} I(t_1, \tau_0 )$ .

5. Applications

5.1. Coupon collector’s urn

Consider the coupon collector’s problem with a loss. Suppose that there are k distinct types of coupons bearing the numbers 1, 2, …, k, and that a coupon is placed in each package. Assume that a coupon of type i is collected with probability $1/k$ , $i=1,2,\ldots,k$ . A customer buys a product daily; they retain the coupon with probability p, or lose it with probability $1-p$ ( $0 < p \leq 1$ ). We are interested in the probability that they collect every type of coupon at least m times after n days. When $p=1$ , the problem corresponds to the classical coupon collector’s problem.

We can formulate this problem with a suitable urn model. Suppose that initially there are k balls all labeled (0) in the urn. When a ball labeled (0) is picked up, its label changes to (1) with probability p or remains the same with probability $1-p$ . When a ball labeled (1) is picked up, its label changes to (2) with probability p or remains (1) with probability $1-p$ , and so on. Once a ball labeled ( $m-1$ ) becomes a ball labeled (m), it remains so forever. This experiment is repeated n times.

The replacement matrix is expressed as

(5.1) \begin{equation}A=\left(\begin{array}{c@{\quad}c@{\quad}c@{\quad}c@{\quad}c}-B(p) & B(p) & 0 &\cdots & 0 \\[5pt] 0 & -B(p) & B(p) & \cdots & 0 \\[5pt] \vdots & \vdots & \ddots & \ddots & \vdots\\[5pt] 0 & \cdots & 0 & -B(p) & B(p) \\[5pt] 0 & 0& \cdots & 0 & 0\end{array}\right)_{(m+1) \times (m+1)}, \end{equation}

where B(p) denotes the Bernoulli random variable taking value 1 with probability p and 0 with probability $1-p$ ( $0 < p \leq 1$ ). The system of differential equations is

\begin{align*} \left\{ \begin{array}{r@{\ }c@{\ }l} \dot{x}_{0}(t) & = & p x_{1}(t) + (1-p) x_{0}(t), \\[5pt] \dot{x}_{1}(t) & = & p x_{2}(t) + (1-p) x_{1}(t), \\[5pt] & \vdots & \\[5pt] \dot{x}_{m-1}(t) & = & p x_{m}(t) + (1-p) x_{m-1}(t), \\[5pt] \dot{x}_{m}(t) & = & x_{m}(t) , \end{array} \right.\end{align*}

with $x_0(0)=u_0$ , $x_1(0)=u_1$ , …, $x_m(0)=u_m$ . Then, the solution is

\begin{align*} x_{\ell}(t) = \mathrm{e}^{(1-p)t} \Bigg[u_{m} \Bigg( \mathrm{e}^{pt}-\sum_{i=0}^{m - \ell - 1} \frac{(pt)^i}{i!} \Bigg) + \sum_{i=\ell}^{m-1} u_{i} \frac{(pt)^{i-\ell}}{(i-\ell)!}\Bigg], \qquad \ell=0,1,\ldots,m.\end{align*}
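The claimed solution can be verified symbolically; the SymPy sketch below (ours) checks, for $m=3$, that each $x_\ell$ satisfies its equation in the differential system above.

```python
import sympy as sp

m = 3
p, t = sp.symbols("p t", positive=True)
u = sp.symbols(f"u0:{m + 1}")          # u0, u1, u2, u3

def x(l):
    # Claimed solution of the coupon-urn differential system
    tail = u[m] * (sp.exp(p * t)
                   - sum((p * t)**i / sp.factorial(i) for i in range(m - l)))
    head = sum(u[i] * (p * t)**(i - l) / sp.factorial(i - l)
               for i in range(l, m))
    return sp.exp((1 - p) * t) * (tail + head)

# x_l' = p x_{l+1} + (1 - p) x_l for l < m, and x_m' = x_m
for l in range(m):
    assert sp.simplify(sp.diff(x(l), t) - p * x(l + 1) - (1 - p) * x(l)) == 0
assert sp.simplify(sp.diff(x(m), t) - x(m)) == 0
```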

Under the initial urn condition $( B_0^{0}, B_0^{1},\ldots, B_0^{m}) = ( k,0,\ldots,0)$ , the urn history generating function is given by

(5.2) \begin{equation} H(u_0,u_1,\ldots,u_m,z) = ( x_0(z) )^k = \mathrm{e}^{(1-p)kz}\Bigg[u_{m} \Bigg( \mathrm{e}^{pz}-\sum_{i=0}^{m-1} \frac{p^i z^i}{i!} \Bigg) + \sum_{i=0}^{m-1} u_{i} \frac{p^i z^i}{i!} \Bigg]^k. \end{equation}

The probability mass function of $B_n^{m}$ can be expressed in terms of the Stirling numbers.

For a real number a, a non-negative integer k, and a positive integer s, the numbers $S_{a,s}(n,k)$ , $n=sk,sk+1,\ldots$ , are defined by their exponential generating function

(5.3) \begin{equation}f_{a,s}(t,k)=\sum_{n=sk}^{\infty}S_{a,s}(n,k) \frac{t^n}{n!}=\mathrm{e}^{at}\frac{1}{k!}\bigg(\mathrm{e}^{t}-1-\frac{t}{1!} - \cdots - \frac{t^{s-1}}{(s-1)!}\bigg)^k. \end{equation}

These numbers are known as the generalized non-central Stirling numbers of the second kind [Reference Koutras19].

Observing that

\begin{align*} \left[u_m^k\right] H(1,1,\ldots,1,u_m,z)=\mathrm{e}^{(1-p)kt}\Bigg( \mathrm{e}^{pt}-\sum_{i=0}^{m-1} \frac{p^i t^i}{i!} \Bigg)^k ,\end{align*}

we have the following result in view of (5.3).

Proposition 5.1. The probability that, after n days, the customer has collected every type of coupon at least m times is

\begin{align*} \mathbb{P}(B_n^{m}=k) = S_{{(1-p)k}/{p},m}(n,k)\frac{k! p^n}{k^n}. \end{align*}

In the case of $m=2$ , [Reference Morcrette and Mahmoud21] considered the urn for a two-type coupon collection. That work also mentioned a possible extension to an urn model with m colors, which stimulated our interest in analytic urns, and discussed the use of asymptotic tools to obtain limit laws.
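As a numerical sanity check, Proposition 5.1 can be evaluated through the recurrence of Proposition A.1 and compared with a direct multinomial computation: on each day, a coupon of type i is recorded with probability ${p}/{k}$ and the coupon is lost with probability $1-p$ , so the recorded counts over n days form a multinomial vector. A minimal Python sketch (the function names are ours, and floating-point arithmetic is used throughout):

```python
from math import comb, factorial

def S_gen(a, s, n_max, k_max):
    """Tabulate S_{a,s}(n, j) via the recurrence of Proposition A.1."""
    S = [[0.0] * (k_max + 1) for _ in range(n_max + 1)]
    for n in range(n_max + 1):
        S[n][0] = a ** n                      # S_{a,s}(n, 0) = a^n
    for n in range(1, n_max + 1):
        for j in range(1, k_max + 1):
            if n >= s * j:                    # S_{a,s}(n, j) = 0 for n < sj
                S[n][j] = (j + a) * S[n - 1][j] \
                    + comb(n - 1, s - 1) * S[n - s][j - 1]
    return S

def p_all_collected(n, k, m, p):
    """P(B_n^m = k) as in Proposition 5.1."""
    a = (1 - p) * k / p
    return S_gen(a, m, n, k)[n][k] * factorial(k) * p ** n / k ** n

def p_all_collected_direct(n, k, m, p):
    """Direct check: multinomial with cells p/k (each type) and 1 - p."""
    # f[t] accumulates the sum over (c_1..c_k), all c_i >= m with
    # c_1 + ... + c_k = t, of prod 1/c_i!
    f = [0.0] * (n + 1)
    f[0] = 1.0
    for _ in range(k):
        g = [0.0] * (n + 1)
        for t in range(n + 1):
            for c in range(m, n - t + 1):
                g[t + c] += f[t] / factorial(c)
        f = g
    return sum(factorial(n) // factorial(n - t) * f[t]
               * (p / k) ** t * (1 - p) ** (n - t) for t in range(n + 1))
```

For instance, with $k=3$ coupon types, $m=2$ , and $p=0.7$ , the two computations agree to machine precision.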

5.2. Coupon collector waiting time problem

Under the same conditions as in Section 5.1, we treat the waiting time problem in coupon collection with losses in the special case $m=1$ . A customer buys a product daily; they retain the coupon with probability p, or lose it with probability $1-p$ ( $0 < p \leq 1$ ). How many days does it take to collect all k types of coupon? Denote this time by $T_k$ .

From the obvious identity $\mathbb{P}( T_k \leq n ) = \mathbb{P}(B_n^{1}=k)$ , $n \geq 1$ , we can express the probability mass function of the waiting time $T_k$ as

\begin{align*} \mathbb{P}( T_k = n ) & = \mathbb{P}( T_k \leq n ) - \mathbb{P}( T_k \leq n-1 ) \\[5pt] & = S_{{(1-p)k}/{p},1}(n,k)\frac{k! p^n}{k^n} - S_{{(1-p)k}/{p},1}(n-1,k)\frac{k! p^{n-1}}{k^{n-1}}, \qquad n \geq 1.\end{align*}

Proposition 5.2. The probability mass function of the waiting time $T_k$ is given by

\begin{equation*} \mathbb{P} \left( T_k = n \right) = S_{{(1-p)k}/{p},1}(n-1,k-1) \frac{k! p^n}{k^n}, \qquad n \geq 1.\end{equation*}

By making use of (A.1) in the appendix, we can proceed to derive the probability generating function of the waiting time $T_k$ as

\begin{equation*} \mathbb{E}[ u^{T_k} ] = k! \sum_{n=k}^{\infty}S_{{(1-p)k}/{p},1}(n-1,k-1) \bigg( \frac{pu}{k} \bigg)^n = \prod_{i=0}^{k-1}\frac{ ( p - ({pi}/{k})) u }{1 - ( 1 - p + ({pi}/{k})) u }.\end{equation*}

This representation implies that $T_k$ can be decomposed as a sum of k independent random variables.

Proposition 5.3. For $i=0,1,\ldots,k-1$ , let $Y_i$ be independent random variables with probability mass functions

\begin{align*} \mathbb{P}(Y_i = y ) = \bigg( p-\frac{pi}{k} \bigg)\bigg( 1-p+\frac{pi}{k} \bigg)^{y-1}, \qquad y=1,2,\ldots \end{align*}

Then, the waiting time $T_k$ can be decomposed as $T_k = Y_0+Y_1 + \cdots + Y_{k-1}$ .

The expected value and variance of $T_k$ are expressed, respectively, as

(5.4) \begin{align} \mathbb{E}(T_k) = \frac{k}{p}\sum_{i=0}^{k-1}\frac{1}{k-i} \sim \frac{k \log k}{p}, \end{align}
(5.5) \begin{align} \mathbb{V}(T_k) & = \frac{k}{p^2}\sum_{i=0}^{k-1}\frac{pi+k(1-p)}{(k-i)^2} = \frac{k^2}{p^2}\sum_{i=0}^{k-1}\frac{1}{(k-i)^2} - \frac{k}{p}\sum_{i=0}^{k-1}\frac{1}{(k-i)} \\[5pt] & \leq \frac{k^2}{p^2} \sum_{i=0}^{k-1} \frac{1}{(k-i)^2} = \frac{\pi^2 k^2}{6 p^2}. \nonumber \end{align}

Remark 5.1. From (5.4) and (5.5), we have

\begin{align*}\frac{{T_k}}{{({ k \log k })/{p}}} \to 1\end{align*}

in probability.

Consider a numerical example in the case of $k=365$ and $p=0.8$ . That is, we are interested in the number of people we should meet until we have seen at least one person with every possible birthday. In this case, the limit theorem yields approximately $({365 \log 365})/{0.8}=2691.83$ trials to obtain a complete set, which is $7.37$ times the number of birthdays. We mention that [Reference Durrett7] considered the numerical example in the case of $k=365$ and $p=1$ .
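For moderate k the exact distribution is also easy to compute, and the decomposition in Proposition 5.3 provides a numerical cross-check of Proposition 5.2: convolving the k geometric laws must reproduce the Stirling-number form of the probability mass function. A short sketch, truncated at a horizon N (the function names are ours):

```python
from math import factorial

def pmf_stirling(N, k, p):
    """P(T_k = n), n = 1..N, via Proposition 5.2 (recurrence with s = 1)."""
    a = (1 - p) * k / p
    S = [[0.0] * k for _ in range(N)]         # S[n][j] = S_{a,1}(n, j)
    for n in range(N):
        S[n][0] = a ** n
    for n in range(1, N):
        for j in range(1, k):
            if n >= j:
                S[n][j] = (j + a) * S[n - 1][j] + S[n - 1][j - 1]
    return [factorial(k) * S[n - 1][k - 1] * p ** n / k ** n
            for n in range(1, N + 1)]

def pmf_geometric(N, k, p):
    """P(T_k = n) by convolving the geometric laws of Proposition 5.3."""
    dist = [1.0] + [0.0] * N                  # point mass at 0
    for i in range(k):
        q = p - p * i / k                     # success probability of Y_i
        geo = [0.0] + [q * (1 - q) ** (y - 1) for y in range(1, N + 1)]
        new = [0.0] * (N + 1)
        for t in range(N + 1):
            for y in range(1, N + 1 - t):
                new[t + y] += dist[t] * geo[y]
        dist = new
    return dist[1:]
```

With $k=4$ and $p=0.6$ the two probability mass functions agree term by term, and their common total mass over $n \leq 80$ is already within $10^{-4}$ of one.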

5.3. Birthday urn

The birthday problem with recording failures can be approached with the same matrix as in (5.1). Assume that each person is equally likely to have any of the 365 days in a year as their birthday. Suppose that we randomly interview people individually. When we ask the person for their birthday, we can record it with probability p, or fail to record it with probability $1-p$ ( $0 < p \leq 1$ ). We need to determine the probability that, after interviews with n persons chosen at random, at least m persons have the same birthday on record.

The probability $\mathbb{P}(B_n^{m}=0)$ can be evaluated effectively. For a real number a, a non-negative integer k, and a positive integer s, the numbers $M_{a,s}(n,k)$ , $n=0,1,\ldots$ , are defined by their exponential generating function

(5.6) \begin{equation} g_{a,s}(t,k) = \sum_{n=0}^{\infty}M_{a,s}(n,k) \frac{t^n}{n!} = \mathrm{e}^{a t}\bigg(1 + \frac{t}{1!} + \frac{t^2}{2!} + \cdots + \frac{t^s}{s!}\bigg)^k. \end{equation}

From (5.2) with $k=365$ , we have

\begin{align*} \left[u_m^{0}\right]H(1,1,\ldots,1,u_m,z) = \mathrm{e}^{365(1-p)t} \bigg(1 + \frac{pt}{1!} + \frac{(pt)^2}{2!} + \cdots + \frac{(pt)^{m-1}}{(m-1)!}\bigg)^{365}.\end{align*}

From (5.6),

\begin{align*} \mathbb{P}(B_n^{m}=0) = \frac{n!}{365^n} \cdot [t^n]g_{{365(1-p)}/{p}, m-1}(pt,365) = \frac{p^n}{365^n} \cdot M_{{365(1-p)}/{p}, m-1}(n,365).\end{align*}

Furthermore, the following result is established.

Proposition 5.4. The probability that, among n persons chosen at random, at least m persons have the same birthday on record is given by

(5.7) \begin{equation} \mathbb{P}(B_n^{m} \geq 1) = 1 - \frac{p^n}{365^n} \cdot M_{{365(1-p)}/{p}, m-1}(n,365). \end{equation}
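This probability is easy to compute from the recurrence of Proposition A.3, and the classical case $p=1$ , $m=2$ then reproduces the familiar birthday threshold (probability about $0.507$ at $n=23$ ). A sketch, written for a general number k of equally likely days (the function names are ours):

```python
from math import comb

def M_gen(a, s, n_max, k_max):
    """Tabulate M_{a,s}(n, j) via the recurrence of Proposition A.3."""
    M = [[0.0] * (k_max + 1) for _ in range(n_max + 1)]
    for j in range(k_max + 1):
        for n in range(n_max + 1):
            if n <= s:
                M[n][j] = (j + a) ** n        # M_{a,s}(n, j) = (j+a)^n
            else:
                M[n][j] = (j + a) * M[n - 1][j]
                if j >= 1:
                    M[n][j] -= j * comb(n - 1, s) * M[n - s - 1][j - 1]
    return M

def p_m_shared(n, k, m, p):
    """Probability that some day has at least m recorded birthdays
    among n interviews (equation (5.7), with 365 replaced by k)."""
    a = k * (1 - p) / p
    return 1 - p ** n / k ** n * M_gen(a, m - 1, n, k)[n][k]
```

For $p=1$ , $m=2$ , $k=365$ the value at $n=23$ is approximately $0.5073$ ; for small k the result can also be checked against a direct multinomial sum.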

5.4. Waiting time in birthday problem

Consider the waiting time birthday problem with failures. Assume that each person is equally likely to have any of the 365 days in the year as their birthday. Suppose that we randomly interview people individually. When we ask each person for their birthday, we can record it with probability p or fail to record it with probability $1-p$ ( $0 < p \leq 1$ ). Let $W_m$ be the waiting time until we find m people with a common birthday on record. We investigate the distribution of the waiting time $W_m$ in terms of $B_n^{m}$ .

It is clear that $\mathbb{P}( W_m \leq n ) = \mathbb{P}(B_n^{m} \geq 1) = 1 - \mathbb{P}(B_n^{m}=0)$ . The distribution of $W_m$ can be evaluated using (5.7).

Proposition 5.5. The probability mass function of the waiting time $W_m$ is given by

\begin{align*} \mathbb{P}(W_m=n) = \frac{p^{n-1}}{k^{n-1}}M_{{(1-p)k}/{p}, m-1}(n-1,k) - \frac{p^n}{k^n}M_{{(1-p)k}/{p}, m-1}(n,k). \end{align*}

5.5. Ehrenfest urn on graphs

Consider five urns labeled 0, 1, 2, 3, and 4, initially containing k balls in total. At each step, a ball is drawn uniformly at random (each ball with probability ${1}/{k}$ ) and moved to an adjacent urn with the probabilities indicated in Figure 1. For $i=0,1,2,3,4$ , let $B_n^{i}$ be the number of balls in the i-th urn after n steps.

Figure 1. The transition graph of the Ehrenfest urn.

The associated system of differential equations is

\begin{align*} \left\{\begin{array}{l}\dot{x}_{0}(t) = \dfrac{1}{2} x_{1}(t) + \dfrac{1}{2} x_{2}(t), \\[12pt] \dot{x}_{1}(t) = \dfrac{1}{3} x_{0}(t) + \dfrac{1}{3} x_{2}(t) + \dfrac{1}{3} x_{3}(t), \\[12pt] \dot{x}_{2}(t) = \dfrac{1}{3} x_{0}(t) + \dfrac{1}{3} x_{1}(t) + \dfrac{1}{3} x_{4}(t), \\[12pt] \dot{x}_{3}(t) = x_{1}(t), \\[5pt] \dot{x}_{4}(t) = x_{2}(t),\end{array}\right.\end{align*}

with $x_0(0)=u_0$ , $x_1(0)=u_1$ , $x_2(0)=u_2$ , $x_3(0)=u_3$ , and $x_4(0)=u_4$ .

For example, the solution $x_0(t)$ is given by

\begin{align*} x_0(t) &= \mathrm{e}^{t} \bigg( \frac{2 u_0 + 3u_1 + 3u_2 + u_3 + u_4}{10} \bigg) \\ & \quad + \!3 \mathrm{e}^{-\frac{2}{3}t}\bigg( \frac{2 u_0 - 2u_1 - 2 u_2 + u_3 + u_4} {20} \bigg) + \frac{2 u_0 -u_3 - u_4}{4}.\end{align*}

Under the initial urn condition $\left( B_0^{0}, B_0^{1}, B_0^{2}, B_0^{3}, B_0^{4} \right) = ( k,0,0,0,0 )$ , the expected values are

\begin{align*} \mathbb{E}[B_n^{0}] & = k \bigg( \frac{1}{5} + \frac{3}{10}\bigg( 1-\frac{5}{3k} \bigg)^n + \frac{1}{2}\bigg( 1-\frac{1}{k} \bigg)^n \bigg) , \\[5pt] \mathbb{E}[B_n^{1}] & = \mathbb{E}[B_n^{2}] = k \bigg( \frac{3}{10} - \frac{3}{10}\bigg( 1-\frac{5}{3k} \bigg)^n\bigg) , \\[5pt] \mathbb{E}[B_n^{3}] & = \mathbb{E}[B_n^{4}] = k \bigg( \frac{1}{10} + \frac{3}{20}\bigg( 1-\frac{5}{3k} \bigg)^n - \frac{1}{4}\bigg( 1-\frac{1}{k} \bigg)^n\bigg) .\end{align*}

Therefore, we have $\mathbb{E}[B_n^{0}] \to {k}/{5}$ , $\mathbb{E}[B_n^{1}] = \mathbb{E}[B_n^{2}] \to {3k}/{10}$ , and $\mathbb{E}[B_n^{3}] = \mathbb{E}[B_n^{4}] \to {k}/{10}$ as $n \to \infty$ . When $k=20$ and $n=30$ , Figure 2 shows the graph of the joint probability mass function of $\left(B_{30}^{1}, B_{30}^{3}\right)$ .

Figure 2. The joint probability mass function $\left(B_{30}^{1}, B_{30}^{3}\right)$ .
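The expected values above can be checked by iterating the exact one-step recursion for the expectations, with the displacement probabilities read off the differential system (a ball drawn from urn 0 moves to urn 1 or 2 with probability ${1}/{2}$ each, and so on). A minimal sketch (the names are ours):

```python
# move[i][j]: probability that a ball drawn from urn i is placed in urn j,
# read off the differential system above.
move = [
    [0, 1 / 2, 1 / 2, 0, 0],
    [1 / 3, 0, 1 / 3, 1 / 3, 0],
    [1 / 3, 1 / 3, 0, 0, 1 / 3],
    [0, 1, 0, 0, 0],
    [0, 0, 1, 0, 0],
]

def expected_counts(k, n):
    """Iterate the exact one-step recursion for the expected counts."""
    e = [float(k), 0.0, 0.0, 0.0, 0.0]
    for _ in range(n):
        e = [e[j] - e[j] / k + sum(e[i] * move[i][j] for i in range(5)) / k
             for j in range(5)]
    return e

def closed_form(k, n):
    """The expected values displayed in the text."""
    lam1, lam2 = (1 - 5 / (3 * k)) ** n, (1 - 1 / k) ** n
    return [k * (1 / 5 + 3 / 10 * lam1 + 1 / 2 * lam2),
            k * (3 / 10 - 3 / 10 * lam1),
            k * (3 / 10 - 3 / 10 * lam1),
            k * (1 / 10 + 3 / 20 * lam1 - 1 / 4 * lam2),
            k * (1 / 10 + 3 / 20 * lam1 - 1 / 4 * lam2)]
```

For $k=20$ and $n=30$ the iteration and the closed forms agree to machine precision, and the five expectations sum to k as they must.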

5.6. Pólya–Ehrenfest urn

Consider two urns labeled 0 and 1. Initially, urn 0 contains $b_0$ balls, and urn 1 contains $b_1$ balls. At each step, a ball is drawn uniformly at random (all balls being equally likely) and is moved, together with one additional ball, to the other urn. Therefore, the total number of balls increases by 1 at each step. This urn model was referred to as the Pólya–Ehrenfest model in [Reference Blom, Holst and Sandell3]. The replacement matrix is expressed as $\left(\begin{smallmatrix}-1 & \hphantom{-}2 \\[5pt] \hphantom{-}2 & -1\end{smallmatrix}\right)$ .

We consider the urn scheme alternating with equal probabilities between $\left(\begin{smallmatrix}-1 & \hphantom{-}2 \\[5pt] \hphantom{-}2 & -1\end{smallmatrix}\right)$ and $\big(\begin{smallmatrix}1 & 0 \\[5pt] 0 & 1\end{smallmatrix}\big)$ , i.e. between a Pólya–Ehrenfest urn with probability $\dfrac{1}{2}$ and a Pólya urn with probability $\dfrac{1}{2}$ .

From the partial differential equations in (4.5) and (4.6) under the initial urn condition $(B_0^{0}, B_0^{1})=(b_0,b_1)$ with $\alpha_{0}(t_1) = {t_1^2}/{2} +\dfrac{1}{2}$ and $\alpha_{1}(t_1) = {1}/{2t_1} + {t_1}/{2}$ , we have

\begin{equation*} \frac{\mathrm{d} \overline{H}(t_1,z)}{ (b_0+b_1) \cdot \frac12({t_1^2+1}) \cdot \overline{H}(t_1,z) } = \frac{\mathrm{d} z}{1 - z \cdot \frac12 ({t_1^2+1})} = -\frac{\mathrm{d} t_1}{\frac12{(1-t_1)(1+t_1^2)}}.\end{equation*}

The solution is given by

\begin{equation*} \overline{H}(t_1,z) = \frac{( 1-t_1 )^{b_0+b_1}\bigg[ \displaystyle{\frac{\tan\!\big(\frac12({1-t_1})z\big) +t_1}{1 - t_1\tan\!\big(\frac12({1-t_1})z\big)}} \bigg]^{b_0}} {\bigg[ 1 - \displaystyle{\frac{\tan\!\big(\frac12({1-t_1})z\big) + t_1}{1-t_1\tan\!\big(\frac12({1-t_1})z\big)}} \bigg]^{b_0+b_1}}.\end{equation*}

From the results presented in Corollary 4.1, the expected value of $B_{n}^{1}$ is

\begin{align*} \mathbb{E}[ B_{n}^{1} ] = \frac{n(n-1)+2(b_0+b_1)n+2(b_0+b_1-1)b_0}{2(b_0+b_1+n-1)}.\end{align*}

In the case where $b_0=1$ , $b_1=1$ , and $n=50$ , we present graphs of the distribution of the number of balls in urn 1 in Figure 3.

Figure 3. The probability mass function of $B_{50}^{1}$ .
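For the symmetric case $b_0 = b_1$ , the expected value above can be cross-checked against the one-step recursion for the expectation that follows from the scheme description: writing $S_n = b_0 + b_1 + n$ for the (deterministic) total number of balls, one obtains $\mathbb{E}[ B_{n+1}^{1} ] = (1 - 1/S_n)\,\mathbb{E}[ B_{n}^{1} ] + 1$ . This recursion is our own derivation from the verbal description of the alternating scheme, and the sketch below is offered only as a plausibility check:

```python
def expected_urn1(b0, b1, n):
    """The closed form displayed above for E[B_n^1]."""
    B = b0 + b1
    return (n * (n - 1) + 2 * B * n + 2 * (B - 1) * b0) / (2 * (B + n - 1))

def expected_urn1_recursion(b0, b1, n):
    """Iterate E_{j+1} = (1 - 1/S_j) E_j + 1 with S_j = b0 + b1 + j."""
    e = float(b1)
    for j in range(n):
        e = (1 - 1 / (b0 + b1 + j)) * e + 1
    return e
```

With $b_0 = b_1 = 1$ , as in Figure 3, both computations give $\mathbb{E}[B_{50}^{1}] = 26$ .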

5.7. Multinomial urn

Assume that the random variables $(A_{i0},A_{i1},\ldots,A_{im})$ have the probability mass function

\begin{align*} \mathbb{P}(A_{i0}=a_{i0},A_{i1}=a_{i1},\ldots,A_{im}=a_{im}) = \left(\begin{array}{c}\theta \\ a_{i0},a_{i1},\ldots,a_{im}\end{array}\right) p_{0}^{a_{i0}} p_{1}^{a_{i1}} \cdots p_{m}^{a_{im}} , \end{align*}

with $p_{0} + p_{1}+ \cdots +p_{m}=1$ , $i=0,1,\ldots,m$ . Then, we have $\alpha_{i}(\boldsymbol{{t}}) = ( p_{0}+p_{1} t_1 + p_{2} t_2 +\cdots + p_{m} t_m )^{\theta}$ , $i=0,1,\ldots,m$ .

From the partial differential equation in (4.5), with (4.6) under the initial urn condition $(B_0^{0},B_0^{1},\ldots,B_0^{m}) = (b_0,b_1,\ldots,b_m)$ , the solution is

\begin{align*} \overline{H}(\boldsymbol{{t}},z) = \frac{t_1^{b_1} t_2^{b_2} \cdots t_m^{b_m} } {[ 1-\theta(p_{0}+p_{1} t_1 + p_{2} t_2 +\cdots + p_{m} t_m )^{\theta} z ]^{({b_0 + b_1 + \cdots + b_m})/{\theta}}\,}.\end{align*}

Proposition 5.6. Under the initial urn condition $(B_0^{0},B_0^{1},\ldots,B_0^{m}) = (b_0,b_1,\ldots,b_m)$ , the probability generating function and probability mass function of ( $B_n^{1},B_n^{2},\ldots,B_n^{m}$ ) are given by

\begin{align*} & \phi_n(\boldsymbol{{t}}) = t_1^{b_1} t_2^{b_2} \cdots t_m^{b_m} (p_{0}+p_{1} t_1 + p_{2} t_2 +\cdots + p_{m} t_m)^{n \theta} , \\[5pt] & \mathbb{P} (B_n^{1}=i_1,B_n^{2}=i_2,\ldots,B_n^{m}=i_m) = \left(\begin{array}{c} n \theta \\ i_0,i_1-b_1,\ldots,i_m-b_m \end{array}\right) p_{0}^{i_0} p_{1}^{i_1-b_1} \cdots p_{m}^{i_m-b_m} , \\[5pt] & i_0 + i_1 + \cdots +i_m-(b_1+\cdots+b_m)=n \theta. \end{align*}
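Proposition 5.6 says that, after shifting by the initial contents, the urn contents follow a multinomial distribution with $n\theta$ trials; equivalently, they are the n-fold convolution of single-draw multinomial batches of $\theta$ balls. A small sketch for $m=2$ verifying this equivalence (the names are ours):

```python
from math import factorial

def multinom(n, ks):
    """Multinomial coefficient n! / (ks[0]! ks[1]! ...), exact integer."""
    c = factorial(n)
    for k in ks:
        c //= factorial(k)
    return c

def pmf_prop(n, theta, b, p):
    """pmf of (B_n^1, B_n^2) from Proposition 5.6 with m = 2;
    b = (b0, b1, b2), p = (p0, p1, p2); returns {(i1, i2): prob}."""
    out = {}
    N = n * theta
    for a1 in range(N + 1):
        for a2 in range(N + 1 - a1):
            a0 = N - a1 - a2
            w = multinom(N, (a0, a1, a2)) * p[0]**a0 * p[1]**a1 * p[2]**a2
            out[(b[1] + a1, b[2] + a2)] = w
    return out

def pmf_conv(n, theta, b, p):
    """n-fold convolution of one multinomial(theta, p) batch per draw."""
    out = {(b[1], b[2]): 1.0}
    for _ in range(n):
        new = {}
        for (i1, i2), q in out.items():
            for a1 in range(theta + 1):
                for a2 in range(theta + 1 - a1):
                    a0 = theta - a1 - a2
                    w = multinom(theta, (a0, a1, a2)) \
                        * p[0]**a0 * p[1]**a1 * p[2]**a2
                    key = (i1 + a1, i2 + a2)
                    new[key] = new.get(key, 0.0) + q * w
        out = new
    return out
```

The two dictionaries have the same support and agree entry by entry, and each sums to one.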

5.8. Uniform urn

Assume that the random variables $(A_{i1},A_{i2},\ldots,A_{im})$ have the probability mass function $\mathbb{P}(A_{i1}=a_{i1},A_{i2}=a_{i2},\ldots,A_{im}=a_{im}) = ( \ell+1 )^{-m}$ , $a_{ij} \in \{0,1,\ldots,\ell\}$ , with $A_{i0}=\theta-\sum_{j=1}^{m} A_{ij}$ , where $\theta=m \ell$ . Then, we have

\begin{equation*} \alpha_{i}(\boldsymbol{{t}}) = \prod_{j=1}^{m}\bigg(\frac{1+ t_j + t_j^2 + \cdots + t_j^\ell }{\ell+1} \bigg), \qquad i=0,1,\ldots,m.\end{equation*}

From the partial differential equation in (4.5), with (4.6) under the initial urn condition $(B_0^{0},B_0^{1},\ldots,B_0^{m}) = (b_0,b_1,\ldots,b_m)$ , we obtain the solution $\overline{H}(\boldsymbol{{t}},z)$ as

\begin{align*} \overline{H}(\boldsymbol{{t}},z) = \frac{t_1^{b_1} t_2^{b_2} \cdots t_m^{b_m} } {\Big[ 1-\theta \prod_{i=1}^{m}\Big(\dfrac{1+ t_i + t_i^2 + \cdots + t_i^\ell}{\ell+1}\Big) z \Big] ^{({b_0 + b_1 + \cdots + b_m})/{\theta}}\,}.\end{align*}

Proposition 5.7. Under the initial urn condition $(B_0^{0},B_0^{1},\ldots,B_0^{m}) = (b_0,b_1,\ldots,b_m)$ , the probability generating function and probability mass function of ( $B_n^{1},B_n^{2},\ldots,B_n^{m}$ ) are given by

\begin{align*} & \phi_n(\boldsymbol{{t}}) = t_1^{b_1} t_2^{b_2} \cdots t_m^{b_m} \prod_{i=1}^{m} \bigg(\frac{1+ t_i + t_i^2 + \cdots + t_i^\ell}{\ell+1}\bigg)^{n}, \\[5pt] & \mathbb{P}(B_n^{1}=i_1,B_n^{2}=i_2,\ldots,B_n^{m}=i_m) = \frac{ \prod_{j=1}^{m} T_{\ell, n, i_j-b_j}}{( \ell+1 )^{nm}}, \end{align*}

where $T_{\ell, n,k} = [t^k](1+t+t^2+\cdots +t^\ell)^n$ .

We note that the numbers $T_{\ell, n,k}$ are classical and have many combinatorial interpretations [Reference Flajolet and Sedgewick10].
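The coefficients $T_{\ell,n,k}$ are easily tabulated by repeated polynomial multiplication, which also lets one verify that the probability mass function of Proposition 5.7 sums to one, the relevant normalization being $(\ell+1)^{nm}$ . A sketch (the names are ours):

```python
from itertools import product

def T_coeffs(ell, n):
    """T_{ell, n, k} = [t^k] (1 + t + ... + t^ell)^n, for all k at once."""
    poly = [1]
    for _ in range(n):
        new = [0] * (len(poly) + ell)
        for i, c in enumerate(poly):
            for j in range(ell + 1):
                new[i + j] += c
        poly = new
    return poly

def pmf_uniform_urn(n, m, ell, b):
    """pmf of (B_n^1, ..., B_n^m) as in Proposition 5.7; b = (b_1,...,b_m).
    Returns a dict {(i_1, ..., i_m): probability}."""
    T = T_coeffs(ell, n)
    denom = (ell + 1) ** (n * m)
    out = {}
    for ks in product(range(len(T)), repeat=m):
        w = 1
        for k in ks:
            w *= T[k]
        out[tuple(bj + kj for bj, kj in zip(b, ks))] = w / denom
    return out
```

For example, $T_{2,3,k}$ gives the trinomial coefficients $1,3,6,7,6,3,1$ , and the resulting pmf for $m=2$ sums to one.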

Appendix A. The partition numbers

In this appendix we examine two families of partition numbers and derive recurrence relations for them [Reference Charalambides4, Reference Charalambides5].

For a real number a, a non-negative integer k, and a positive integer s, the generalized non-central Stirling numbers of the second kind $S_{a,s}(n,k)$ , $n=sk,sk+1,\ldots$ , are defined by their exponential generating function given by (5.3).

Proposition A.1. The numbers $S_{a,s}(n,k)$ satisfy the recurrence relations

\begin{alignat*}{2} S_{a,s}(n,k) & = (k+a)S_{a,s}(n-1,k) + \left(\begin{array}{c} n-1 \\ s-1 \end{array}\right) S_{a,s}(n-s,k-1), \quad & n & \geq sk, \\[5pt] S_{a,s}(n,k) & = 0, & n & < sk, \\[5pt] S_{a,s}(n,0) & = a^n, & n & \geq 0, \\[5pt] S_{a,s}(0,0) & = 1. & & \end{alignat*}

Proof. By differentiating the generating function $f_{a,s}(t,k)$ with respect to t, we have

\begin{align*} f'_{\!\!a,s}(t,k)=(k+a)f_{a,s}(t,k) + \frac{t^{s-1}}{(s-1)!} f_{a,s}(t,k-1).\end{align*}

By equating the coefficients of ${t^n}/{n!}$ on both sides of the above expression, we derive the recurrence relations.

For $s=1$ , we define the generating function of the numbers $S_{a,1}(n,k)$ by $\varphi_{a,1}(u,k)= \sum_{n=k}^{\infty} S_{a,1}(n,k) u^n$ .

Proposition A.2. The generating function of the numbers $S_{a,1}(n,k)$ is given by

(A.1) \begin{equation} \varphi_{a,1}(u,k) = u^k\prod_{i=0}^{k}( 1-au-iu )^{-1}, \qquad k \geq 1. \end{equation}

Proof. For $s=1$ , we have $S_{a,1}(n,k) = (k+a)S_{a,1}(n-1,k) + S_{a,1}(n-1,k-1)$ , $n \geq k$ . By multiplying the above expression by $u^n$ and summing for $n \geq k$ under the initial conditions, we get $\varphi_{a,1}(u,k) = u(k+a)\varphi_{a,1}(u,k)+u\varphi_{a,1}(u,k-1)$ and

\begin{align*} \varphi_{a,1}(u,k) = \frac{u}{1-u(k+a)} \varphi_{a,1}(u,k-1).\end{align*}

By applying this recurrence relation repeatedly with $\varphi_{a,1}(u,0)=\sum_{n=0}^{\infty} a^n u^n = ( 1-au )^{-1}$ , we obtain (A.1).
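Identity (A.1) can be checked numerically by expanding the product on the right-hand side as a power series and comparing with the coefficients produced by the recurrence of Proposition A.1. A sketch (the names are ours):

```python
def coeffs_from_product(a, k, N):
    """First N+1 Taylor coefficients of u^k prod_{i=0}^{k} (1-(a+i)u)^{-1}."""
    c = [0.0] * (N + 1)
    if k <= N:
        c[k] = 1.0
    for i in range(k + 1):
        r = a + i                 # multiply the series by 1/(1 - r u)
        for n in range(1, N + 1):
            c[n] += r * c[n - 1]
    return c

def coeffs_from_recurrence(a, k, N):
    """S_{a,1}(n, k) for n = 0..N via Proposition A.1 with s = 1."""
    S = [[0.0] * (k + 1) for _ in range(N + 1)]
    for n in range(N + 1):
        S[n][0] = a ** n
    for n in range(1, N + 1):
        for j in range(1, k + 1):
            if n >= j:
                S[n][j] = (j + a) * S[n - 1][j] + S[n - 1][j - 1]
    return [S[n][k] for n in range(N + 1)]
```

For $a=0$ and $k=1$ this recovers the ordinary Stirling numbers $S(n,1)=1$ , as it should.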

For a real number a, a non-negative integer k, and a positive integer s, the numbers $M_{a,s}(n,k)$ , $n=0,1,\ldots$ , are defined by their exponential generating function given by (5.6).

Proposition A.3. The numbers $M_{a,s}(n,k)$ satisfy the recurrence relations

\begin{alignat*}{2} M_{a,s}(n,k) & = (k+a)M_{a,s}(n-1,k) - k \left(\begin{array}{c} n-1 \\ s \end{array}\right) M_{a,s}(n-s-1,k-1), \quad & n & \geq s+1, \\[5pt] M_{a,s}(n,k) & = (k+a)^n, & n & \leq s. \end{alignat*}

Proof. As in the proof of Proposition A.1, by differentiating the generating function $g_{a,s}(t,k)$ with respect to t, we have

\begin{align*} g'_{\!\!a,s}(t,k)=(k+a)g_{a,s}(t,k) - k \frac{t^{s}}{s!} g_{a,s}(t,k-1).\end{align*}

By equating the coefficients of ${t^n}/{n!}$ on both sides of the above expression, we derive the recurrence relations.
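The recurrence of Proposition A.3 can likewise be checked against a direct expansion of the generating function (5.6). A sketch (the names are ours):

```python
from math import comb, factorial

def M_direct(a, s, k, N):
    """M_{a,s}(n, k), n = 0..N, by expanding e^{at}(1 + ... + t^s/s!)^k."""
    base = [1 / factorial(i) for i in range(s + 1)]
    poly = [1.0]                              # (sum_{i<=s} t^i/i!)^k
    for _ in range(k):
        new = [0.0] * min(N + 1, len(poly) + s)
        for i, c in enumerate(poly):
            for j, d in enumerate(base):
                if i + j <= N:
                    new[i + j] += c * d
        poly = new
    out = []
    for n in range(N + 1):                    # multiply by e^{at}
        c = sum(a ** (n - i) / factorial(n - i) * poly[i]
                for i in range(min(n, len(poly) - 1) + 1))
        out.append(c * factorial(n))          # read off n! [t^n]
    return out

def M_recurrence(a, s, k, N):
    """M_{a,s}(n, k) via the recurrence of Proposition A.3."""
    M = [[0.0] * (k + 1) for _ in range(N + 1)]
    for j in range(k + 1):
        for n in range(N + 1):
            if n <= s:
                M[n][j] = (j + a) ** n
            else:
                M[n][j] = (j + a) * M[n - 1][j]
                if j >= 1:
                    M[n][j] -= j * comb(n - 1, s) * M[n - s - 1][j - 1]
    return [M[n][k] for n in range(N + 1)]
```

For $a=0$ , $s=1$ this reduces to $M_{0,1}(n,k) = k!/(k-n)!$ , which the sketch reproduces.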

Acknowledgements

The author wishes to thank the editor and referees for the careful review of this paper and the helpful suggestions which led to improved results.

Funding information

There are no funding bodies to thank relating to the creation of this article.

Competing interests

There were no competing interests to declare which arose during the preparation or publication process of this article.

References

Athreya, K. B. and Karlin, S. (1968). Embedding of urn schemes into continuous time Markov branching processes and related limit theorems. Ann. Math. Statist. 39, 1801–1817.
Balaji, S. and Mahmoud, H. (2006). Exact and limiting distributions in diagonal Pólya processes. Ann. Inst. Statist. Math. 58, 171–185.
Blom, G., Holst, L. and Sandell, D. (1994). Problems and Snapshots from the World of Probability. Springer, New York.
Charalambides, Ch. A. (2002). Enumerative Combinatorics. Chapman and Hall/CRC, Boca Raton, FL.
Charalambides, Ch. A. (2005). Combinatorial Methods in Discrete Distributions. John Wiley, New York.
Chen, C. and Zhang, P. (2019). Characterizations of asymptotic distributions of continuous-time Polya processes. Commun. Statist. Theory Meth. 48, 5308–5321.
Durrett, R. (2019). Probability: Theory and Examples, 5th edn. Cambridge University Press.
Flajolet, P., Gabarro, J. and Pekari, H. (2005). Analytic urns. Ann. Prob. 33, 1200–1233.
Flajolet, P., Dumas, P. and Puyhaubert, V. (2006). Some exactly solvable models of urn process theory. In Proc. 4th Colloq. Math. Comp. Sci.: Algorithms, Trees, Combinatorics and Probabilities, ed. P. Chassaing, pp. 59–118.
Flajolet, P. and Sedgewick, R. (2009). Analytic Combinatorics. Cambridge University Press.
Hwang, H.-K., Kuba, M. and Panholzer, A. (2007). Analysis of some exactly solvable diminishing urn models. In Proc. 19th Int. Conf. Formal Power Series and Algebraic Combinatorics, Nankai University, Tianjin.
Hwang, H.-K., Chern, H.-H. and Duh, G.-H. (2020). An asymptotic distribution theory for Eulerian recurrences with applications. To appear in Adv. Appl. Math.
Inoue, K. and Aki, S. (2001). Pólya urn models under general replacement schemes. J. Japan Statist. Soc. 31, 193–205.
Janson, S. (2004). Functional limit theorems for multitype branching processes and generalized Pólya urns. Stoch. Process. Appl. 110, 177–245.
Johnson, N. L. and Kotz, S. (1977). Urn Models and Their Applications. Wiley, New York.
Johnson, N. L., Kotz, S. and Balakrishnan, N. (1997). Discrete Multivariate Distributions. Wiley, New York.
Johnson, N. L., Kemp, A. W. and Kotz, S. (2005). Univariate Discrete Distributions, 3rd edn. Wiley, New York.
Kotz, S. and Balakrishnan, N. (1997). Advances in urn models during the past two decades. In Advances in Combinatorial Methods and Applications to Probability and Statistics, ed. N. Balakrishnan. Birkhäuser, Boston, MA, pp. 203–257.
Koutras, M. V. (1982). Non-central Stirling numbers and some applications. Discrete Math. 42, 73–89.
Mahmoud, H. (2008). Pólya Urn Models. Chapman and Hall, Boca Raton, FL.
Morcrette, B. and Mahmoud, H. (2012). Exactly solvable balanced tenable urns with random entries via the analytic methodology. In Proc. 23rd Int. Meeting Probabilistic, Combinatorial, and Asymptotic Methods in the Analysis of Algorithms, pp. 219–232.
Smythe, R. T. (1996). Central limit theorems for urn models. Stoch. Process. Appl. 65, 115–137.