Probability Distributions Arising in Connection with the Inspection Paradox for the Poisson Process

Abstract

The Inspection Paradox refers to the fact that in a Renewal Process, the length of the interarrival period which contains a fixed time is stochastically larger than the length of a typical interarrival period. To provide a more complete understanding of this phenomenon, conditioning arguments are used to obtain the distributions and moments of the lengths of the interarrival periods other than the one containing this fixed time for the case of the time-homogeneous Poisson Process. Distributions of the waiting times for events that occur both before and after this fixed time are derived. This provides a fairly complete probabilistic analysis of the Inspection Paradox.

Share and Cite:

Marengo, J., Himes, A., Reinberger, W. and Farnsworth, D. (2023) Probability Distributions Arising in Connection with the Inspection Paradox for the Poisson Process. Open Journal of Statistics, 13, 16-24. doi: 10.4236/ojs.2023.131002.

1. Introduction

Consider a Renewal Process with interarrival times ${X}_{1},{X}_{2},\cdots$ and waiting times ${S}_{n}={\sum }_{k=1}^{n}\text{\hspace{0.17em}}{X}_{k}$ for $n=1,2,\cdots$ . Suppose that a piece of equipment, such as a battery, is installed at time zero and is immediately replaced upon failure by a similar battery. When this second battery fails, it is replaced by a similar battery, and this process is continued indefinitely. We can model this situation with a Renewal Process by regarding ${X}_{k}$ as being the lifetime of the k-th battery, ${S}_{n}$ as the time at which the n-th battery fails, and $N\left(t\right)$ as the number of failures (renewals) by time t.

Now, fix $t>0$ and consider the length ${X}_{N\left(t\right)+1}$ of the interarrival period which contains the instant t. Even though ${X}_{1},{X}_{2},\cdots$ are independent and identically distributed, ${X}_{N\left(t\right)+1}$ is stochastically larger than ${X}_{k}$ for every positive integer k. This is the Inspection Paradox. With regard to the batteries, it says that the battery in use at time t tends to have a longer lifetime than an ordinary battery. Its explanation lies in the fact that a longer interarrival period, rather than a shorter one, is more likely to cover the instant t.
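A short simulation makes the paradox concrete. The sketch below (plain Python; the function name and parameter values are ours, chosen for illustration) repeatedly simulates a rate-$\lambda$ Poisson process and averages the length of the interarrival period covering the fixed time t. For large t the average is close to $2/\lambda$, roughly double the mean interarrival time $1/\lambda$, in line with the formulas derived later in this section.

```python
import math
import random

def containing_interval_length(lam: float, t: float, rng: random.Random) -> float:
    """Simulate interarrival times of a rate-lam Poisson process and return
    the length of the interarrival period that covers the fixed time t."""
    s_prev, s = 0.0, rng.expovariate(lam)
    while s <= t:                      # advance until the first arrival after t
        s_prev, s = s, s + rng.expovariate(lam)
    return s - s_prev                  # X_{N(t)+1} = S_{N(t)+1} - S_{N(t)}

rng = random.Random(0)
lam, t, n_sim = 1.0, 10.0, 20000
avg = sum(containing_interval_length(lam, t, rng) for _ in range(n_sim)) / n_sim
# For the Poisson case, E[X_{N(t)+1}] = 2/lam - exp(-lam*t)/lam
theory = 2.0 / lam - math.exp(-lam * t) / lam
```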

The Inspection Paradox can arise in almost any sampling application. For example, there is a well-known result which says that in sampling by groups, one is less likely to encounter people who are in smaller groups [1] . This paradox appears in the sciences [2] and medicine [3] . In some circumstances, it can be remediated by using statistical procedures [4] . Its variations remain an area of study [5] . Instead of being regarded as a threat to the validity of an experiment, it can be used as a tool for gaining efficiency [6] . Our goal is to provide a more complete analysis of the paradox in the time-homogeneous Poisson Process case. We do this by deriving probability distributions and moments for the lengths of time periods that arise in the context of the paradox, and we examine some of their properties.

We write

${X}_{N\left(t\right)+1}={A}_{t}+{Y}_{t},$ (1)

where

${A}_{t}=t-{S}_{N\left(t\right)}\text{\hspace{0.17em}}\text{and}\text{\hspace{0.17em}}{Y}_{t}={S}_{N\left(t\right)+1}-t.$

The random variable ${X}_{N\left(t\right)+1}$ may be regarded as the lifetime of the battery in operation at time t, ${A}_{t}$ is the age at time t of the battery operating at that instant, and ${Y}_{t}$ is the remaining life of this battery.

In the following, take the Renewal Process to be a time-homogeneous Poisson Process with rate $\lambda$ , interarrival times ${X}_{1},{X}_{2},\cdots$ and waiting times ${S}_{n}={\sum }_{k=1}^{n}\text{\hspace{0.17em}}{X}_{k}$ for $n=1,2,\cdots$ . Such a process has independent increments and the number of events in any interval of length c has the Poisson distribution with mean $\lambda c$ . The interarrival times are independent and identically distributed with the exponential distribution with mean $1/\lambda$ . The waiting time ${S}_{n}$ has the $\text{Gamma}\left(n,\lambda \right)$ distribution with probability density function

$f\left(s\right)=\begin{cases}0 & \text{if}\ s<0\\ \lambda {\text{e}}^{-\lambda s}\dfrac{{\left(\lambda s\right)}^{n-1}}{\left(n-1\right)!} & \text{if}\ s\ge 0\end{cases}$

and distribution function

$F\left(s\right)=\begin{cases}0 & \text{if}\ s<0\\ 1-\underset{k=0}{\overset{n-1}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda s}\dfrac{{\left(\lambda s\right)}^{k}}{k!} & \text{if}\ s\ge 0\end{cases}$ .

${S}_{n}$ is called the “waiting time” because, starting at time zero, it is the amount of time that an observer of the process waits before the n-th event occurs.
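The closed form above expresses the $\text{Gamma}\left(n,\lambda \right)$ distribution function through a Poisson tail, and this identity is easy to verify numerically. A minimal sketch (plain Python; the function names and parameter values are illustrative) integrates the stated density with the midpoint rule and compares the result with the series form:

```python
import math

def gamma_pdf(s: float, n: int, lam: float) -> float:
    """Density of S_n, the Gamma(n, lam) distribution, as given above."""
    return lam * math.exp(-lam * s) * (lam * s) ** (n - 1) / math.factorial(n - 1)

def gamma_cdf_series(s: float, n: int, lam: float) -> float:
    """Closed-form distribution function via the Poisson tail."""
    return 1.0 - sum(math.exp(-lam * s) * (lam * s) ** k / math.factorial(k)
                     for k in range(n))

def gamma_cdf_numeric(s: float, n: int, lam: float, steps: int = 100000) -> float:
    """Midpoint-rule integral of the density from 0 to s."""
    h = s / steps
    return h * sum(gamma_pdf((i + 0.5) * h, n, lam) for i in range(steps))
```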

Starting at time t, the time ${Y}_{t}$ until the next renewal is exponentially distributed with mean $1/\lambda$ and is independent of ${A}_{t}$ . This follows from the memoryless property of the exponential distribution. In fact, the independence of ${A}_{t}$ and ${Y}_{t}$ is known to characterize the Poisson Process. The distribution of ${A}_{t}$ is given by

$P\left({A}_{t}\le x\right)=\begin{cases}0 & \text{if}\ x<0\\ 1-{\text{e}}^{-\lambda x} & \text{if}\ 0\le x<t\\ 1 & \text{if}\ x\ge t\end{cases}$

with expectation

$\mathbb{E}\left[{A}_{t}\right]=\frac{1}{\lambda }\left(1-{\text{e}}^{-\lambda t}\right).$

Thus, ${A}_{t}$ has the same distribution as $\mathrm{min}\left\{{X}_{1},t\right\}$ , and ${Y}_{t}$ has the same distribution as ${X}_{1}$ . Each of these assertions separately is also known to characterize the Poisson Process.
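These distributional facts can be checked by simulation. The following sketch (plain Python; names and parameters are illustrative choices of ours) estimates $\mathbb{E}\left[{A}_{t}\right]$ and compares it with $\left(1-{\text{e}}^{-\lambda t}\right)/\lambda$ , the mean of $\mathrm{min}\left\{{X}_{1},t\right\}$ :

```python
import math
import random

def age_at_t(lam: float, t: float, rng: random.Random) -> float:
    """A_t = t - S_{N(t)}: age of the interarrival period in progress at time t."""
    s = 0.0
    while True:
        x = rng.expovariate(lam)
        if s + x > t:          # this interarrival period covers t
            return t - s
        s += x

rng = random.Random(1)
lam, t, n_sim = 2.0, 3.0, 40000
avg_age = sum(age_at_t(lam, t, rng) for _ in range(n_sim)) / n_sim
theory = (1.0 - math.exp(-lam * t)) / lam   # E[A_t] from the text
```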

From (1), the distribution of ${X}_{N\left(t\right)+1}$ is the convolution of the distributions of ${A}_{t}$ and ${Y}_{t}$ . It follows that

$P\left({X}_{N\left(t\right)+1}\le x\right)=\begin{cases}0 & \text{if}\ x<0\\ 1-{\text{e}}^{-\lambda x}\left(1+\lambda x\right) & \text{if}\ 0\le x<t\\ 1-{\text{e}}^{-\lambda x}\left(1+\lambda t\right) & \text{if}\ x\ge t\end{cases}$

with expectation

$\mathbb{E}\left[{X}_{N\left(t\right)+1}\right]=\frac{2}{\lambda }-\frac{1}{\lambda }{\text{e}}^{-\lambda t}.$

As $t\to \infty$ ,

${A}_{t}\stackrel{D}{\to }\text{Exponential}\left(\lambda \right),$

${X}_{N\left(t\right)+1}\stackrel{D}{\to }\text{Gamma}\left(2,\lambda \right),$

and

$\mathbb{E}\left[{X}_{N\left(t\right)+1}\right]\to \frac{2}{\lambda },$ (2)

where $\stackrel{D}{\to }$ indicates convergence in distribution. From (2) it follows that for large t, the expected life of the battery in operation at time t is approximately twice the expected life of a typical battery, whose lifetime has an exponential distribution with mean $1/\lambda$ . This background material is presented in [7] [8] .

We give explicit formulas for the distributions and moments of the lengths of the interarrival periods other than the one containing a fixed instant t when the underlying Renewal Process is a time-homogeneous Poisson Process with rate $\lambda$ . We show that the infinite sequence of lengths of the interarrival periods preceding the one containing t is strictly stochastically decreasing. Also, we demonstrate that the lengths of the interarrival periods succeeding the one containing t are independent exponential random variables with mean $1/\lambda$ , and we derive the distributions and moments for ${S}_{N\left(t\right)+k}$ for every integer k.

Section 2 presents the probability distribution for the length of each interarrival period (e.g., lifetime) that ends before a fixed inspection time t and each one that begins after this time. It is noted that, proceeding backwards in time from the instant t, the lifetimes are strictly stochastically decreasing, and all moments for each of these lifetimes are presented. In light of the memoryless property of a Poisson process, it isn’t surprising that the lifetimes that begin after time t are independent and identically distributed exponential random variables having a distribution that doesn’t depend on t. Section 3 presents the probability distribution for the waiting time (i.e., partial sum) corresponding to each of the interarrival periods mentioned above. Section 4 summarizes these results and discusses the limitations of this study.

2. Interarrival Times

We begin with a definition.

Definition 1. A random variable U is strictly stochastically smaller than a random variable V if for all real x,

$P\left(U\le x\right)\ge P\left(V\le x\right),$

and if the inequality between these two probabilities is strict for some real number x. A sequence ${\left\{{U}_{n}\right\}}_{n=1}^{\infty }$ of random variables is strictly stochastically decreasing if ${U}_{n+1}$ is strictly stochastically smaller than ${U}_{n}$ for $n\ge 1$ .

It is well-known that, given that $N\left(t\right)=n$ , the conditional joint distribution of the waiting times that terminate prior to time t is the same as the joint distribution of the order statistics for n independent random variables that are each uniformly distributed on the interval $\left[0,t\right]$ [8] . It follows that each of the lengths of the corresponding interarrival times has the same beta distribution with pdf ${f}_{t}\left(x\right)=\left(n/{t}^{n}\right){\left(t-x\right)}^{n-1}$ for $0<x<t$ [9] . The following theorem gives the unconditional probability distribution and moments for the length of each of the interarrival periods that end before the fixed time t.

Theorem 1. If $j\in \left\{0,1,\cdots \right\}$ and $t>0$ , then

$P\left({X}_{N\left(t\right)-j}\le x\right)=\begin{cases}0 & \text{if}\ x<0\\ 1-{\text{e}}^{-\lambda x}\underset{k=j+1}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda \left(t-x\right)}\dfrac{{\lambda }^{k}{\left(t-x\right)}^{k}}{k!} & \text{if}\ 0\le x<t\\ 1 & \text{if}\ x\ge t\end{cases}$ (3)

and for $m=1,2,\cdots$ , the m-th moment of ${X}_{N\left(t\right)-j}$ is

$\mathbb{E}\left[{X}_{N\left(t\right)-j}^{m}\right]=\frac{m!}{{\lambda }^{m}}\underset{k=m+j+1}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{k}}{k!}.$

Proof. We define ${X}_{N\left(t\right)-j}=0$ if $j\ge N\left(t\right)$ , and we note that $0\le {X}_{N\left(t\right)-j}\le t$ . By conditioning on the value of $N\left(t\right)$ , we have for $x\in \left[0,t\right]$ that

$P\left({X}_{N\left(t\right)-j}\le x\right)=\underset{n=0}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}P\left({X}_{N\left(t\right)-j}\le x,N\left(t\right)=n\right).$

For $n=0,1,\cdots ,j$ ,

$P\left({X}_{N\left(t\right)-j}\le x,N\left(t\right)=n\right)=P\left(N\left(t\right)=n\right)={\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{n}}{n!},$ (4)

where we have defined ${X}_{k}=0$ for $k\le 0$ .

For $n\ge j+1$ ,

$P\left({X}_{N\left(t\right)-j}\le x,N\left(t\right)=n\right)=P\left({X}_{n-j}\le x,{S}_{n}\le t,{S}_{n+1}>t\right).$ (5)

This probability does not depend on $j\in \left\{0,1,\cdots ,n-1\right\}$ . Setting $j=0$ in the right hand side of (5) and conditioning on the value of ${S}_{n-1}$ , we have

$\begin{array}{l}P\left({X}_{N\left(t\right)-j}\le x,N\left(t\right)=n\right)\\ =P\left({X}_{n}\le x,{S}_{n-1}+{X}_{n}\le t,{S}_{n-1}+{X}_{n}+{X}_{n+1}>t\right)\\ ={\int }_{0}^{t}\text{\hspace{0.17em}}P\left({X}_{n}\le \mathrm{min}\left\{x,t-s\right\},{X}_{n}+{X}_{n+1}>t-s\right)\text{d}{F}_{n-1}\left(s\right)\\ ={\int }_{0}^{t-x}\text{\hspace{0.17em}}P\left({X}_{n}\le x,{X}_{n}+{X}_{n+1}>t-s\right)\text{d}{F}_{n-1}\left(s\right)\\ \text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{\hspace{0.17em}}+{\int }_{t-x}^{t}\text{\hspace{0.17em}}P\left({X}_{n}\le t-s,{X}_{n}+{X}_{n+1}>t-s\right)\text{d}{F}_{n-1}\left(s\right),\end{array}$ (6)

where ${F}_{n-1}\left(s\right)$ is the distribution function of ${S}_{n-1}$ .

For $s\in \left[0,t-x\right]$ ,

$\begin{array}{l}P\left({X}_{n}\le x,{X}_{n}+{X}_{n+1}>t-s\right)\\ =P\left({X}_{1}\le x,{X}_{1}+{X}_{2}>t-s\right)=P\left(N\left(x\right)=1,N\left(t-s\right)-N\left(x\right)=0\right)\\ =P\left(N\left(x\right)=1\right)P\left(N\left(t-s\right)-N\left(x\right)=0\right)={\text{e}}^{-\lambda x}\left(\lambda x\right){\text{e}}^{-\lambda \left(t-s-x\right)}\\ =\lambda x{\text{e}}^{-\lambda \left(t-s\right)},\end{array}$ (7)

where we have used the fact that a Poisson Process has independent increments. For $s\in \left[t-x,t\right]$ ,

$\begin{array}{l}P\left({X}_{n}\le t-s,{X}_{n}+{X}_{n+1}>t-s\right)\\ =P\left({X}_{1}\le t-s,{X}_{1}+{X}_{2}>t-s\right)=P\left(N\left(t-s\right)=1\right)\\ =\lambda \left(t-s\right){\text{e}}^{-\lambda \left(t-s\right)}.\end{array}$ (8)

We may use the fact that ${S}_{n-1}$ has the $\text{Gamma}\left(n-1,\lambda \right)$ distribution to obtain from (6), (7), and (8) that for $n\ge j+1$ ,

$\begin{array}{l}P\left({X}_{N\left(t\right)-j}\le x,N\left(t\right)=n\right)\\ ={\int }_{0}^{t-x}\text{\hspace{0.17em}}\lambda x{\text{e}}^{-\lambda \left(t-s\right)}\text{d}{F}_{n-1}\left(s\right)+{\int }_{t-x}^{t}\text{\hspace{0.17em}}\lambda \left(t-s\right){\text{e}}^{-\lambda \left(t-s\right)}\text{d}{F}_{n-1}\left(s\right)\\ ={\text{e}}^{-\lambda t}\left\{\frac{{\left(\lambda t\right)}^{n}}{n!}-\frac{{\lambda }^{n}{\left(t-x\right)}^{n}}{n!}\right\}.\end{array}$ (9)

Using (4) and (9), we have after summing on n that for $x\in \left[0,t\right]$ ,

$\begin{array}{c}P\left({X}_{N\left(t\right)-j}\le x\right)={\text{e}}^{-\lambda t}\left\{\underset{n=0}{\overset{j}{\sum }}\frac{{\left(\lambda t\right)}^{n}}{n!}+\underset{n=j+1}{\overset{\infty }{\sum }}\frac{{\left(\lambda t\right)}^{n}}{n!}-\underset{n=j+1}{\overset{\infty }{\sum }}\frac{{\lambda }^{n}{\left(t-x\right)}^{n}}{n!}\right\}\\ =1-{\text{e}}^{-\lambda x}\underset{k=j+1}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda \left(t-x\right)}\frac{{\lambda }^{k}{\left(t-x\right)}^{k}}{k!},\end{array}$

which establishes (3). Therefore, we may write

$P\left({X}_{N\left(t\right)-j}>x\right)=\begin{cases}1 & \text{if}\ x<0\\ {\text{e}}^{-\lambda x}\underset{k=j+1}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda \left(t-x\right)}\dfrac{{\lambda }^{k}{\left(t-x\right)}^{k}}{k!} & \text{if}\ 0\le x<t\\ 0 & \text{if}\ x\ge t\end{cases}$ .

Then,

$\begin{array}{c}\mathbb{E}\left[{X}_{N\left(t\right)-j}^{m}\right]={\int }_{0}^{\infty }\text{\hspace{0.17em}}m{x}^{m-1}P\left({X}_{N\left(t\right)-j}>x\right)\text{d}x\\ ={\int }_{0}^{t}\text{\hspace{0.17em}}m{x}^{m-1}{\text{e}}^{-\lambda x}\underset{k=j+1}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda \left(t-x\right)}\frac{{\lambda }^{k}{\left(t-x\right)}^{k}}{k!}\text{d}x\\ ={\int }_{0}^{t}\text{\hspace{0.17em}}m{x}^{m-1}{\text{e}}^{-\lambda x}\left(1-\underset{k=0}{\overset{j}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda \left(t-x\right)}\frac{{\lambda }^{k}{\left(t-x\right)}^{k}}{k!}\right)\text{d}x\\ ={\int }_{0}^{t}\text{\hspace{0.17em}}m{x}^{m-1}{\text{e}}^{-\lambda x}\text{d}x-{\text{e}}^{-\lambda t}\underset{k=0}{\overset{j}{\sum }}\frac{{\lambda }^{k}}{k!}{\int }_{0}^{t}\text{\hspace{0.17em}}m{x}^{m-1}{\left(t-x\right)}^{k}\text{d}x.\end{array}$

Now,

${\int }_{0}^{t}\text{\hspace{0.17em}}m{x}^{m-1}{\text{e}}^{-\lambda x}\text{d}x=\frac{m!}{{\lambda }^{m}}{\int }_{0}^{t}\frac{{\left(\lambda x\right)}^{m-1}}{\left(m-1\right)!}\lambda {\text{e}}^{-\lambda x}\text{d}x=\frac{m!}{{\lambda }^{m}}\left(1-\underset{k=0}{\overset{m-1}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{k}}{k!}\right),$

where the last equality follows from the fact that the last integral is the value at t of the distribution function for the $\text{Gamma}\left(m,\lambda \right)$ distribution. Also,

${\int }_{0}^{t}\text{\hspace{0.17em}}m{x}^{m-1}{\left(t-x\right)}^{k}\text{d}x={t}^{m+k}{\int }_{0}^{1}\text{\hspace{0.17em}}m{y}^{m-1}{\left(1-y\right)}^{k}\text{d}y={t}^{m+k}\frac{m!k!}{\left(m+k\right)!}.$

Therefore,

$\begin{array}{c}\mathbb{E}\left[{X}_{N\left(t\right)-j}^{m}\right]=\frac{m!}{{\lambda }^{m}}\left(1-\underset{k=0}{\overset{m-1}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{k}}{k!}\right)-{\text{e}}^{-\lambda t}\underset{k=0}{\overset{j}{\sum }}\frac{{\lambda }^{k}}{k!}{t}^{m+k}\frac{m!k!}{\left(m+k\right)!}\\ =\frac{m!}{{\lambda }^{m}}\underset{k=m+j+1}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{k}}{k!}.\end{array}$

Remark 1. It follows from Theorem 1 that as $t\to \infty$

${X}_{N\left(t\right)-j}\stackrel{D}{\to }{X}_{1}\text{\hspace{0.17em}}\text{ }\text{and}\text{\hspace{0.17em}}\text{ }\mathbb{E}\left[{X}_{N\left(t\right)-j}^{m}\right]\to \frac{m!}{{\lambda }^{m}}=\mathbb{E}\left[{X}_{1}^{m}\right]\text{\hspace{0.17em}}\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}m=1,2,\cdots .$

Since exponential growth dominates polynomial growth, one minus the sum that appears in the statement of Theorem 1 approaches zero as t approaches infinity for fixed j. So ${X}_{N\left(t\right)-j}$ converges in distribution to ${X}_{1}$ as t approaches infinity. Moreover,

$P\left({X}_{N\left(t\right)-j}\le x\right)>1-{\text{e}}^{-\lambda x}\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}x>0,$

and so ${X}_{N\left(t\right)-j}$ is strictly stochastically smaller than ${X}_{k}$ for every positive integer k. Since $N\left(t\right)$ is a finite random variable, $P\left({X}_{N\left(t\right)-j}=0\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}\text{sufficiently}\text{\hspace{0.17em}}\text{large}\text{\hspace{0.17em}}j\right)=1$ . The sequence ${\left\{{X}_{N\left(t\right)-j}\right\}}_{j=0}^{\infty }$ is strictly stochastically decreasing, and the distribution function of ${X}_{N\left(t\right)-j}$ has a single jump of size

$P\left(N\left(t\right)\le j\right)=\underset{k=0}{\overset{j}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{k}}{k!}$

at zero. Of course, the size of this jump tends to zero as $t\to \infty$ . Routine calculus shows that the probability density function associated with the absolutely continuous part of the distribution function of ${X}_{N\left(t\right)-j}$ is strictly decreasing and concave up on the interval $\left(0,t\right)$ . If $j\ge 1$ , this density is continuous everywhere except at zero where it has a jump of size

$\lambda \left({\sum }_{k=j}^{\infty }\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{k}}{k!}\right)/\left({\sum }_{k=j+1}^{\infty }\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{k}}{k!}\right)$ . If $j=0$ , the density of ${X}_{N\left(t\right)-j}$ is continuous everywhere except at zero, where it has a jump of size $\lambda /\left(1-{\text{e}}^{-\lambda t}\right)$ , and at t, where it has a jump of size $\lambda {\text{e}}^{-\lambda t}/\left(1-{\text{e}}^{-\lambda t}\right)$ .
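Theorem 1 can also be checked numerically. The sketch below (plain Python; the function names and parameter choices are ours) simulates ${X}_{N\left(t\right)-j}$ with the paper's convention ${X}_{N\left(t\right)-j}=0$ when $j\ge N\left(t\right)$ , and compares the empirical mean with the $m=1$ case of the moment formula:

```python
import math
import random

def pre_t_interval(lam: float, t: float, j: int, rng: random.Random) -> float:
    """X_{N(t)-j}, taken to be 0 when j >= N(t)."""
    arrivals = []
    s = rng.expovariate(lam)
    while s <= t:
        arrivals.append(s)          # S_1, S_2, ..., S_{N(t)}
        s += rng.expovariate(lam)
    n = len(arrivals)               # N(t)
    if j >= n:
        return 0.0
    gaps = [arrivals[0]] + [arrivals[i] - arrivals[i - 1] for i in range(1, n)]
    return gaps[n - 1 - j]          # X_{N(t)-j}

def theory_moment(lam: float, t: float, j: int, m: int) -> float:
    """m-th moment from Theorem 1: (m!/lam^m) * P(N(t) >= m+j+1)."""
    head = sum(math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
               for k in range(m + j + 1))
    return math.factorial(m) / lam ** m * (1.0 - head)

rng = random.Random(2)
lam, t, j, n_sim = 1.0, 5.0, 1, 30000
avg = sum(pre_t_interval(lam, t, j, rng) for _ in range(n_sim)) / n_sim
```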

The next theorem gives the probability distribution for the length of each of the interarrival periods that begin after the fixed time t.

Theorem 2. If $j\in \left\{2,3,\cdots \right\}$ and $t>0$ , then ${X}_{N\left(t\right)+j}$ is exponentially distributed with mean $1/\lambda$ .

Proof. We condition on the value of $N\left(t\right)$ to obtain for $x>0$ that

$\begin{array}{c}P\left({X}_{N\left(t\right)+j}\le x\right)=\underset{n=0}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}P\left({X}_{N\left(t\right)+j}\le x,N\left(t\right)=n\right)\\ =\underset{n=0}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}P\left({X}_{n+j}\le x,{S}_{n}\le t,{S}_{n+1}>t\right)\\ =\underset{n=0}{\overset{\infty }{\sum }}\text{\hspace{0.17em}}P\left({X}_{n+j}\le x\right)P\left({S}_{n}\le t,{S}_{n+1}>t\right)\\ =\underset{n=0}{\overset{\infty }{\sum }}\left(1-{\text{e}}^{-\lambda x}\right)P\left(N\left(t\right)=n\right)=1-{\text{e}}^{-\lambda x}.\end{array}$

Remark 2. This proof can be shortened by observing that, since $N\left(t\right)+1$ is a stopping time for ${X}_{1},{X}_{2},\cdots$ , the event $N\left(t\right)=n$ is independent of ${X}_{n+j}$ for $j\ge 2$ .
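A simulation illustrates the contrast between the covering interval and the ones after it. In the sketch below (plain Python; names and parameters are illustrative), j = 1 returns the covering interval ${X}_{N\left(t\right)+1}$ , whose mean exceeds $1/\lambda$ , while j = 2 returns ${X}_{N\left(t\right)+2}$ , whose mean is $1/\lambda$ as Theorem 2 asserts:

```python
import random

def post_t_interval(lam: float, t: float, j: int, rng: random.Random) -> float:
    """X_{N(t)+j} for j >= 1, by simulating the process past time t."""
    s, x = 0.0, rng.expovariate(lam)
    while s + x <= t:
        s += x
        x = rng.expovariate(lam)
    # x is now X_{N(t)+1}, the length of the interval covering t
    for _ in range(j - 1):          # the succeeding interarrival times
        x = rng.expovariate(lam)
    return x

rng = random.Random(3)
lam, t, n_sim = 1.5, 2.0, 30000
avg_next = sum(post_t_interval(lam, t, 2, rng) for _ in range(n_sim)) / n_sim
avg_cover = sum(post_t_interval(lam, t, 1, rng) for _ in range(n_sim)) / n_sim
```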

3. Waiting Times

The next theorem gives the distribution of the waiting time for each event which occurs before the fixed time t.

Theorem 3. If $j\in \left\{0,1,2,\cdots \right\}$ and $t>0$ , then

$P\left(t-{S}_{N\left(t\right)-j}\le s\right)=\begin{cases}0 & \text{if}\ s<0\\ P\left(Y\le s\right) & \text{if}\ 0\le s<t\\ 1 & \text{if}\ s\ge t\end{cases}$ ,

where Y has the $\text{Gamma}\left(j+1,\lambda \right)$ distribution.

Proof. Observe that $0\le {S}_{N\left(t\right)-j}\le t$ . If $0\le s<t$ , then

$\begin{array}{c}P\left(t-{S}_{N\left(t\right)-j}\le s\right)=P\left(N\left(t\right)-N\left(t-s\right)\ge j+1\right)\\ =1-\underset{k=0}{\overset{j}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda s}\frac{{\left(\lambda s\right)}^{k}}{k!}=P\left(Y\le s\right).\end{array}$

Remark 3. From Theorem 3, as $t\to \infty$ ,

$t-{S}_{N\left(t\right)-j}\stackrel{D}{\to }\text{Gamma}\left(j+1,\lambda \right).$

Theorem 3 states that for $s<t$ , $P\left(t-{S}_{N\left(t\right)-j}\le s\right)=P\left(Y\le s\right)$ . So $t-{S}_{N\left(t\right)-j}$ converges in distribution to Y as t approaches infinity, where Y has the $\text{Gamma}\left(j+1,\lambda \right)$ distribution. Since $N\left(t\right)$ is a finite random variable, $P\left({S}_{N\left(t\right)-j}=0\text{\hspace{0.17em}}\text{for}\text{\hspace{0.17em}}\text{sufficiently}\text{\hspace{0.17em}}\text{large}\text{\hspace{0.17em}}j\right)=1$ . The distribution function of $t-{S}_{N\left(t\right)-j}$ has a single jump of size

$P\left(N\left(t\right)\le j\right)={\text{e}}^{-\lambda t}\underset{k=0}{\overset{j}{\sum }}\frac{{\left(\lambda t\right)}^{k}}{k!}$

at t. A straightforward, but tedious, calculation yields that

$\mathbb{E}\left[{\left(t-{S}_{N\left(t\right)-j}\right)}^{m}\right]=\frac{m!}{{\lambda }^{m}}\left\{\left(\begin{array}{c}m+j\\ m\end{array}\right)-\underset{k=0}{\overset{j}{\sum }}\left(\begin{array}{c}m-1+k\\ m-1\end{array}\right)\underset{\ell =0}{\overset{m-1+k}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda t}\frac{{\left(\lambda t\right)}^{\ell }}{\ell !}\right\}.$

It follows that as $t\to \infty$

$\mathbb{E}\left[{\left(t-{S}_{N\left(t\right)-j}\right)}^{m}\right]\to \frac{m!}{{\lambda }^{m}}\left(\begin{array}{c}m+j\\ m\end{array}\right)=\mathbb{E}\left[{Y}^{m}\right],$

where Y has the $\text{Gamma}\left(j+1,\lambda \right)$ distribution.
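The distribution in Theorem 3 can likewise be checked empirically. This sketch (plain Python; names and parameters are illustrative) estimates $P\left(t-{S}_{N\left(t\right)-j}\le s\right)$ at one point $s<t$ and compares it with the $\text{Gamma}\left(j+1,\lambda \right)$ distribution function written as a Poisson tail:

```python
import math
import random

def backward_wait(lam: float, t: float, j: int, rng: random.Random) -> float:
    """t - S_{N(t)-j}, with the convention S_{N(t)-j} = 0 when j >= N(t)."""
    arrivals = []
    s = rng.expovariate(lam)
    while s <= t:
        arrivals.append(s)          # S_1, S_2, ..., S_{N(t)}
        s += rng.expovariate(lam)
    n = len(arrivals)               # N(t)
    return t - (arrivals[n - 1 - j] if j < n else 0.0)

def gamma_cdf(s: float, shape: int, lam: float) -> float:
    """Gamma(shape, lam) distribution function via the Poisson tail."""
    return 1.0 - sum(math.exp(-lam * s) * (lam * s) ** k / math.factorial(k)
                     for k in range(shape))

rng = random.Random(4)
lam, t, j, s0, n_sim = 1.0, 6.0, 1, 2.0, 30000
emp = sum(backward_wait(lam, t, j, rng) <= s0 for _ in range(n_sim)) / n_sim
```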

Our last theorem gives the distribution of the waiting time for each event that occurs after the fixed time t. Although these distributions are of primarily theoretical interest, knowing them provides a more complete understanding of the Poisson Process.

Theorem 4. If $j\in \left\{1,2,\cdots \right\}$ and $t>0$ , then ${S}_{N\left(t\right)+j}-t$ has the $\text{Gamma}\left(j,\lambda \right)$ distribution.

Proof. Note that ${S}_{N\left(t\right)+j}-t>0$ . For $x>0$ ,

$P\left({S}_{N\left(t\right)+j}-t\le x\right)=P\left(N\left(t+x\right)-N\left(t\right)\ge j\right)=1-\underset{k=0}{\overset{j-1}{\sum }}\text{\hspace{0.17em}}{\text{e}}^{-\lambda x}\frac{{\left(\lambda x\right)}^{k}}{k!}.$

Remark 4. The following is another proof of Theorem 4. Recall that ${Y}_{t}$ is exponentially distributed with mean $1/\lambda$ , and, from Theorem 2, observe that for $k=2,\cdots ,j$ , ${X}_{N\left(t\right)+k}$ also has that distribution. Because $N\left(t\right)+1$ is a stopping time for the sequence ${X}_{1},{X}_{2},\cdots$ , the random variables ${Y}_{t},{X}_{N\left(t\right)+2},\cdots$ , and ${X}_{N\left(t\right)+j}$ are independent. Since

${S}_{N\left(t\right)+j}-t={Y}_{t}+\underset{k=2}{\overset{j}{\sum }}\text{\hspace{0.17em}}{X}_{N\left(t\right)+k},$

Theorem 4 follows.
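Theorem 4 admits the same kind of numerical check. The sketch below (plain Python; names and parameters are illustrative) estimates the mean of ${S}_{N\left(t\right)+j}-t$ and compares it with $j/\lambda$ , the mean of the $\text{Gamma}\left(j,\lambda \right)$ distribution:

```python
import random

def forward_wait(lam: float, t: float, j: int, rng: random.Random) -> float:
    """S_{N(t)+j} - t: the time after t until the j-th subsequent event."""
    s = 0.0
    while s <= t:                   # run past time t: s becomes S_{N(t)+1}
        s += rng.expovariate(lam)
    for _ in range(j - 1):          # add j-1 further interarrival times
        s += rng.expovariate(lam)
    return s - t

rng = random.Random(5)
lam, t, j, n_sim = 2.0, 1.0, 3, 30000
avg = sum(forward_wait(lam, t, j, rng) for _ in range(n_sim)) / n_sim
# Theorem 4: mean of Gamma(3, 2) is j/lam = 1.5
```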

4. Conclusion

In summary, we have effectively completed the mathematical study of the Inspection Paradox for the case of the Poisson Process. We have accomplished this by deriving the probability distributions and moments for all of the interarrival periods other than the one containing the fixed inspection time under consideration, and also by finding the distributions and moments of the associated waiting times. We note that our results apply only in the case of a Poisson process and not for an arbitrary renewal process. Since such a process doesn’t necessarily have memoryless interarrival times, the theoretical challenges associated with accomplishing this generalization seem formidable.

Conflicts of Interest

The authors declare no conflicts of interest.

[1] Feld, S. (1991) Why Your Friends Have More Friends Than You Do. American Journal of Sociology, 96, 1464-1477. https://doi.org/10.1086/229693

[2] Morozov, E. and Potakhina, L. (2014) An Application of the Inspection Paradox in Stability Analysis of Optical Systems. 2014 6th International Congress on Ultra Modern Telecommunications and Control Systems and Workshops (ICUMT), St. Petersburg, 6-8 October 2014, 522-525. https://doi.org/10.1109/ICUMT.2014.7002156

[3] Zelen, M. and Feinleib, M. (1969) On the Theory of Screening for Chronic Diseases. Biometrika, 56, 601-614. https://doi.org/10.1093/biomet/56.3.601

[4] Lann, A. and Falk, R. (2006) Tell Me the Method, I'll Give You the Mean. The American Statistician, 60, 322-327. https://doi.org/10.1198/000313006X151460

[5] Ross, S.M. (2003) The Inspection Paradox. Probability in the Engineering and Informational Sciences, 17, 47-51. https://doi.org/10.1017/S0269964803171033

[6] Pal, A., Kostinski, S. and Reuveni, S. (2022) The Inspection Paradox in Stochastic Resetting. Journal of Physics A: Mathematical and Theoretical, 55, 1-25. https://doi.org/10.1088/1751-8121/ac3cdf

[7] Karlin, S. and Taylor, H.M. (1975) A First Course in Stochastic Processes. 2nd Edition, Academic Press, San Diego. https://doi.org/10.1016/B978-0-08-057041-9.50005-2

[8] Ross, S.M. (1993) An Introduction to Probability Models. 5th Edition, Academic Press, San Diego. https://doi.org/10.1016/B978-0-12-598455-3.50004-0

[9] Hogg, R.V., McKean, J.W. and Craig, A.T. (2018) Introduction to Mathematical Statistics. 8th Edition, Pearson, Boston.