Probability Distributions Arising in Connection with the Inspection Paradox for the Poisson Process
1. Introduction
Consider a Renewal Process with interarrival times $X_1, X_2, \ldots$ and waiting times $S_n = X_1 + X_2 + \cdots + X_n$ for $n = 1, 2, \ldots$. Suppose that a piece of equipment, such as a battery, is installed at time zero and is immediately replaced upon failure by a similar battery. When this second battery fails, it is replaced by a similar battery, and this process is continued indefinitely. We can model this situation with a Renewal Process by regarding $X_k$ as being the lifetime of the k-th battery, $S_n$ as the time at which the n-th battery fails, and $N(t) = \max\{n : S_n \le t\}$ as the number of failures (renewals) by time t.
Now, fix $t > 0$ and consider the length $L(t) = X_{N(t)+1}$ of the interarrival period which contains the instant t. Even though $X_1, X_2, \ldots$ are independent and identically distributed, $L(t)$ is stochastically larger than $X_k$ for every positive integer k. This is the Inspection Paradox. With regard to the batteries, it says that the battery in use at time t tends to have a longer lifetime than an ordinary battery. Its explanation lies in the fact that a longer interarrival period, rather than a shorter one, is more likely to cover the instant t.
The Inspection Paradox can arise in almost any sampling application. For example, there is a well-known result which says that in sampling by groups, one is less likely to encounter people who are in smaller groups [1]. This paradox appears in the sciences [2] and medicine [3]. In some circumstances, it can be remediated by using statistical procedures [4]. Its variations remain an area of study [5]. Instead of being regarded as a threat to the validity of an experiment, it can be used as a tool for gaining efficiency [6]. Our goal is to provide a more complete analysis of the paradox in the time-homogeneous Poisson Process case. We do this by deriving probability distributions and moments for the lengths of time periods that arise in the context of the paradox, and we examine some of their properties.
We write
(1)  $L(t) = A(t) + R(t)$,
where $A(t) = t - S_{N(t)}$ and $R(t) = S_{N(t)+1} - t$. The random variable $L(t)$ may be regarded as the lifetime of the battery in operation at time t, $A(t)$ is the age at time t of the battery operating at that instant, and $R(t)$ is the remaining life of this battery.
In the following, take the Renewal Process to be a time-homogeneous Poisson Process with rate $\lambda > 0$, interarrival times $X_1, X_2, \ldots$ and waiting times $S_n$ for $n = 1, 2, \ldots$. Such a process has independent increments, and the number of events in any interval of length c has the Poisson distribution with mean $\lambda c$. The interarrival times are independent and identically distributed with the exponential distribution with mean $1/\lambda$. The waiting time $S_n$ has the $\mathrm{Gamma}(n, \lambda)$ distribution with probability density function
$f_{S_n}(x) = \frac{\lambda^n x^{n-1} e^{-\lambda x}}{(n-1)!}$ for $x > 0$
and distribution function
$F_{S_n}(x) = 1 - \sum_{k=0}^{n-1} e^{-\lambda x} \frac{(\lambda x)^k}{k!}$ for $x \ge 0$.
$S_n$ is called the “waiting time” because, starting at time zero, it is the amount of time that an observer of the process waits before the n-th event occurs.
Starting at time t, the time $R(t)$ until the next renewal is exponentially distributed with mean $1/\lambda$ and is independent of $A(t)$. This follows from the memoryless property of the exponential distribution. In fact, the independence of $A(t)$ and $R(t)$ is known to characterize the Poisson Process. The distribution of $A(t)$ is given by
$P(A(t) \le x) = 1 - e^{-\lambda x}$ for $0 \le x < t$, with $P(A(t) = t) = e^{-\lambda t}$,
with expectation
$E[A(t)] = \frac{1 - e^{-\lambda t}}{\lambda}$.
Thus, $A(t)$ has the same distribution as $\min(X_1, t)$, and $R(t)$ has the same distribution as $X_1$. Each of these assertions separately is also known to characterize the Poisson Process.
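These facts are easy to spot-check numerically. The following sketch (standard-library Python; the helper name `age_and_residual` is ours, not from the paper) simulates many Poisson paths and compares the sample means of $A(t)$ and $R(t)$ with $E[A(t)] = (1 - e^{-\lambda t})/\lambda$ and $E[R(t)] = 1/\lambda$:

```python
import math
import random

def age_and_residual(t, lam, rng):
    """Simulate one Poisson path; return (A(t), R(t))."""
    s_prev, s = 0.0, rng.expovariate(lam)
    while s <= t:
        s_prev, s = s, s + rng.expovariate(lam)
    return t - s_prev, s - t                   # age, remaining life

rng = random.Random(1)
lam, t, trials = 2.0, 1.0, 200_000
ages, residuals = zip(*(age_and_residual(t, lam, rng) for _ in range(trials)))
print(sum(ages) / trials, (1 - math.exp(-lam * t)) / lam)   # E[A(t)]
print(sum(residuals) / trials, 1 / lam)                     # E[R(t)]
```

With $\lambda = 2$ and $t = 1$, both printed pairs should agree to about two decimal places.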
From (1), the distribution of $L(t)$ is the convolution of the distributions of $A(t)$ and $R(t)$. It follows that
$P(L(t) > x) = [1 + \lambda \min(x, t)]\, e^{-\lambda x}$ for $x \ge 0$,
with expectation
$E[L(t)] = \frac{2 - e^{-\lambda t}}{\lambda}$.
As $t \to \infty$, $A(t) \xrightarrow{D} X_1$ and
(2)  $L(t) \xrightarrow{D} S_2$,
where $\xrightarrow{D}$ indicates convergence in distribution. From (2) it follows that for large t, the expected life of the battery in operation at time t is approximately twice the expected life of a typical battery, whose lifetime has an exponential distribution with mean $1/\lambda$. This background material is presented in [7] [8].
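As a sanity check on the convolution of the age and residual-life distributions, one can estimate both the tail $P(L(t) > x) = [1 + \lambda \min(x, t)]e^{-\lambda x}$ and the mean $E[L(t)] = (2 - e^{-\lambda t})/\lambda$ by simulation. A minimal sketch (hypothetical helper name, standard-library Python only):

```python
import math
import random

def covering_interval(t, lam, rng):
    """Return L(t) = X_{N(t)+1}, the interarrival period covering t."""
    s_prev, s = 0.0, rng.expovariate(lam)
    while s <= t:
        s_prev, s = s, s + rng.expovariate(lam)
    return s - s_prev

rng = random.Random(2)
lam, t, trials = 1.0, 2.0, 200_000
ls = [covering_interval(t, lam, rng) for _ in range(trials)]

mean_exact = (2 - math.exp(-lam * t)) / lam                 # E[L(t)]
x = 1.0
tail_exact = (1 + lam * min(x, t)) * math.exp(-lam * x)     # P(L(t) > x)
print(sum(ls) / trials, mean_exact)
print(sum(v > x for v in ls) / trials, tail_exact)
```

Note that the empirical mean sits well above $1/\lambda$, which is the Inspection Paradox in numerical form.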
We give explicit formulas for the distributions and moments of the lengths of the interarrival periods other than the one containing a fixed instant t when the underlying Renewal Process is a time-homogeneous Poisson Process with rate $\lambda$. We show that the infinite sequence of lengths of the interarrival periods preceding the one containing t is strictly stochastically decreasing. Also, we demonstrate that the lengths of the interarrival periods succeeding the one containing t are independent exponential random variables with mean $1/\lambda$, and we derive the distributions and moments for the waiting times $S_{N(t)+k}$ for every integer k.
Section 2 presents the probability distribution for the length of each interarrival period (e.g., lifetime) that ends before a fixed inspection time t and each one that begins after this time. It is noted that, proceeding backwards in time from the instant t, the lifetimes are strictly stochastically decreasing, and all moments for each of these lifetimes are presented. In light of the memoryless property of a Poisson process, it isn’t surprising that the lifetimes that begin after time t are independent and identically distributed exponential random variables having a distribution that doesn’t depend on t. Section 3 presents the probability distribution for the waiting time (i.e., partial sum) corresponding to each of the interarrival periods mentioned above. Section 4 summarizes these results and discusses the limitations of this study.
2. Interarrival Times
We begin with a definition.
Definition 1. A random variable U is strictly stochastically smaller than a random variable V if for all real x,
$P(U > x) \le P(V > x)$,
and if the inequality between these two probabilities is strict for some real number x. A sequence $V_1, V_2, \ldots$ of random variables is strictly stochastically decreasing if $V_{k+1}$ is strictly stochastically smaller than $V_k$ for $k = 1, 2, \ldots$.
It is well-known that, given that $N(t) = n$, the conditional joint distribution of the waiting times that terminate prior to time t is the same as the joint distribution of the order statistics for n independent random variables that are each uniformly distributed on the interval $(0, t)$ [8]. It follows that each of the lengths of the corresponding interarrival times has the same beta distribution with pdf
$f(x) = \frac{n}{t}\left(1 - \frac{x}{t}\right)^{n-1}$ for $0 < x < t$
[9]. The following theorem gives the unconditional probability distribution and moments for the length of each of the interarrival periods that end before the fixed time t.
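This conditional spacing distribution can be checked directly from the order-statistics characterization: given $N(t) = n$, the first interarrival time is distributed as the minimum of n i.i.d. uniforms on $(0, t)$, with survival function $(1 - x/t)^n$. A small Monte Carlo sketch (our variable names, standard-library Python):

```python
import random

# Given N(t) = n, the first interarrival time before t is distributed like
# the minimum of n i.i.d. uniforms on (0, t): survival function (1 - x/t)^n.
rng = random.Random(7)
t_val, n_pts, x = 3.0, 4, 1.0
trials = 200_000
hits = sum(min(rng.uniform(0.0, t_val) for _ in range(n_pts)) > x
           for _ in range(trials))
mc = hits / trials
exact = (1.0 - x / t_val) ** n_pts             # (2/3)^4 = 16/81, about 0.1975
print(mc, exact)
```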
Theorem 1. If $t > 0$ and $j \ge 0$, then for $0 \le x < t$,
(3)  $P\big(X_{N(t)-j} > x\big) = e^{-\lambda x} - e^{-\lambda t} \sum_{n=0}^{j} \frac{[\lambda(t-x)]^n}{n!}$,
and for $m = 1, 2, \ldots$, the m-th moment of $X_{N(t)-j}$ is
$E\big[X_{N(t)-j}^m\big] = \frac{m!}{\lambda^m} \left[ 1 - e^{-\lambda t} \sum_{n=0}^{m+j} \frac{(\lambda t)^n}{n!} \right]$.
Proof. We define $X_{N(t)-j} = 0$ if $N(t) \le j$, and we note that $0 \le X_{N(t)-j} \le t$. By conditioning on the value of $N(t)$, we have for $0 \le x < t$ that
(4)  $P\big(X_{N(t)-j} > x\big) = \sum_{n=j+1}^{\infty} P\big(X_{n-j} > x,\ N(t) = n\big)$.
For $n \ge j+1$, given $N(t) = n$ the interarrival times $X_1, \ldots, X_n$ are exchangeable, so
(5)  $P\big(X_{n-j} > x,\ N(t) = n\big) = P\big(X_i > x,\ N(t) = n\big)$ for $i = 1, 2, \ldots, n$.
This probability does not depend on $i$. Setting $i = 1$ in the right hand side of (5) and conditioning on the value of $X_1$, we have
(6)  $P\big(X_1 > x,\ N(t) = n\big) = \int_x^{\infty} P\big(N(t) = n \mid X_1 = u\big)\, dF(u)$,
where $F$ is the distribution function of $X_1$.
For $x < u \le t$,
(7)  $P\big(N(t) = n \mid X_1 = u\big) = e^{-\lambda(t-u)} \frac{[\lambda(t-u)]^{n-1}}{(n-1)!}$,
where we have used the fact that a Poisson Process has independent increments. For $u > t$,
(8)  $P\big(N(t) = n \mid X_1 = u\big) = 0$.
We may use the fact that $X_1$ has the exponential distribution with mean $1/\lambda$ to obtain from (6), (7), and (8) that for $0 \le x < t$,
(9)  $P\big(X_1 > x,\ N(t) = n\big) = \int_x^t e^{-\lambda(t-u)} \frac{[\lambda(t-u)]^{n-1}}{(n-1)!}\, \lambda e^{-\lambda u}\, du = e^{-\lambda t} \frac{[\lambda(t-x)]^n}{n!}$.
Using (4) and (9), we have after summing on n that for $0 \le x < t$,
$P\big(X_{N(t)-j} > x\big) = e^{-\lambda t} \sum_{n=j+1}^{\infty} \frac{[\lambda(t-x)]^n}{n!} = e^{-\lambda x} - e^{-\lambda t} \sum_{n=0}^{j} \frac{[\lambda(t-x)]^n}{n!}$,
which establishes (3). Therefore, we may write
$E\big[X_{N(t)-j}^m\big] = \int_0^t m x^{m-1} P\big(X_{N(t)-j} > x\big)\, dx$.
Then,
$E\big[X_{N(t)-j}^m\big] = \int_0^t m x^{m-1} e^{-\lambda x}\, dx - e^{-\lambda t} \sum_{n=0}^{j} \frac{\lambda^n}{n!} \int_0^t m x^{m-1} (t-x)^n\, dx$.
Now,
$\int_0^t m x^{m-1} e^{-\lambda x}\, dx = \frac{m!}{\lambda^m} \int_0^t \frac{\lambda^m x^{m-1} e^{-\lambda x}}{(m-1)!}\, dx = \frac{m!}{\lambda^m} \left[ 1 - e^{-\lambda t} \sum_{n=0}^{m-1} \frac{(\lambda t)^n}{n!} \right]$,
where the last equality follows from the fact that the last integral is the value at t of the distribution function for the $\mathrm{Gamma}(m, \lambda)$ distribution. Also,
$\int_0^t m x^{m-1} (t-x)^n\, dx = m\, t^{m+n} B(m, n+1) = \frac{m!\, n!\, t^{m+n}}{(m+n)!}$.
Therefore,
$E\big[X_{N(t)-j}^m\big] = \frac{m!}{\lambda^m} \left[ 1 - e^{-\lambda t} \sum_{n=0}^{m-1} \frac{(\lambda t)^n}{n!} - e^{-\lambda t} \sum_{n=0}^{j} \frac{(\lambda t)^{m+n}}{(m+n)!} \right] = \frac{m!}{\lambda^m} \left[ 1 - e^{-\lambda t} \sum_{n=0}^{m+j} \frac{(\lambda t)^n}{n!} \right]$.
Remark 1. It follows from Theorem 1 that as $t \to \infty$,
$P\big(X_{N(t)-j} > x\big) \to e^{-\lambda x}$ for every $x \ge 0$.
Since exponential growth dominates polynomial growth, the term $e^{-\lambda t} \sum_{n=0}^{j} [\lambda(t-x)]^n / n!$ that appears in the statement of Theorem 1 approaches zero as t approaches infinity for fixed j. So $X_{N(t)-j}$ converges in distribution to $X_1$ as t approaches infinity. Moreover,
$P\big(X_{N(t)-j} > x\big) < e^{-\lambda x} = P(X_k > x)$ for all $x \ge 0$,
and so $X_{N(t)-j}$ is strictly stochastically smaller than $X_k$ for every positive integer k. Since $N(t)$ is a finite random variable, $P\big(X_{N(t)-j} = 0\big) = P(N(t) \le j) > 0$. The sequence $X_{N(t)}, X_{N(t)-1}, X_{N(t)-2}, \ldots$ is strictly stochastically decreasing, and the distribution function of $X_{N(t)-j}$ has a single jump of size
$e^{-\lambda t} \sum_{n=0}^{j} \frac{(\lambda t)^n}{n!}$
at zero. Of course, the size of this jump tends to zero as $t \to \infty$. Routine calculus shows that the probability density function associated with the absolutely continuous part of the distribution function of $X_{N(t)-j}$ is strictly decreasing and concave up on the interval $(0, t)$. If $j \ge 1$, this density is continuous everywhere except at zero, where it has a jump of size
$\lambda \left[ 1 - e^{-\lambda t} \sum_{n=0}^{j-1} \frac{(\lambda t)^n}{n!} \right]$.
If $j = 0$, the density of $X_{N(t)}$ is continuous everywhere except at zero, where it has a jump of size $\lambda$, and at t, where it has a jump of size $\lambda e^{-\lambda t}$.
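Both the tail formula (3) and the moment formula of Theorem 1 can be verified by simulation. The sketch below (the helper name `x_before` is ours; standard-library Python) samples $X_{N(t)-j}$ from simulated Poisson paths and compares the empirical tail at one point, and the empirical mean, against the closed forms:

```python
import math
import random

def x_before(t, lam, j, rng):
    """Return X_{N(t)-j} from one simulated Poisson path (0 if N(t) <= j)."""
    arrivals = []
    s = rng.expovariate(lam)
    while s <= t:
        arrivals.append(s)
        s += rng.expovariate(lam)
    k = len(arrivals) - j                      # N(t) - j
    if k <= 0:
        return 0.0
    return arrivals[k - 1] - (arrivals[k - 2] if k >= 2 else 0.0)

rng = random.Random(99)
lam, t, j, m, trials = 1.0, 2.0, 1, 1, 200_000
samples = [x_before(t, lam, j, rng) for _ in range(trials)]

x = 0.5
tail = math.exp(-lam * x) - math.exp(-lam * t) * sum(
    (lam * (t - x)) ** n / math.factorial(n) for n in range(j + 1))
moment = (math.factorial(m) / lam ** m) * (1 - math.exp(-lam * t) * sum(
    (lam * t) ** n / math.factorial(n) for n in range(m + j + 1)))
print(sum(s_ > x for s_ in samples) / trials, tail)    # empirical vs (3)
print(sum(samples) / trials, moment)                   # empirical vs m-th moment
```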
The next theorem gives the probability distribution for the length of each of the interarrival periods that begin after the fixed time t.
Theorem 2. If $t > 0$ and $j \ge 2$, then $X_{N(t)+j}$ is exponentially distributed with mean $1/\lambda$.
Proof. We condition on the value of $N(t)$ to obtain for $x \ge 0$ that
$P\big(X_{N(t)+j} > x\big) = \sum_{n=0}^{\infty} P\big(X_{n+j} > x,\ N(t) = n\big) = \sum_{n=0}^{\infty} P(X_{n+j} > x)\, P(N(t) = n) = e^{-\lambda x}$,
since the event $\{N(t) = n\}$ is determined by $X_1, \ldots, X_{n+1}$, and $j \ge 2$ implies that $X_{n+j}$ is independent of these.
Remark 2. This proof can be shortened by observing that, since $N(t) + 1$ is a stopping time for the sequence $X_1, X_2, \ldots$, the event $\{N(t) = n\}$ is independent of $X_{n+j}$ for $j \ge 2$.
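The contrast between Theorem 2 and the Inspection Paradox is easy to see numerically: if one pre-generates a path and then selects the interarrival time at the random index $N(t)+j$, the selected value is length-biased for $j = 1$ but remains exponential for $j \ge 2$. A sketch (hypothetical helper, standard-library Python):

```python
import random

def interval_after(t, lam, j, rng, horizon=30):
    """Pre-generate `horizon` interarrival times, locate N(t), and return
    X_{N(t)+j}.  Indexing into a pre-drawn path keeps the check honest:
    the covering interval X_{N(t)+1} is length-biased, while X_{N(t)+j}
    for j >= 2 is not."""
    xs = [rng.expovariate(lam) for _ in range(horizon)]
    s, n = 0.0, 0
    for x in xs:
        if s + x > t:
            break
        s += x
        n += 1
    return xs[n + j - 1]                       # X_{N(t)+j}, 1-based index n + j

rng = random.Random(5)
lam, t, trials = 1.0, 3.0, 100_000
covering = [interval_after(t, lam, 1, rng) for _ in range(trials)]
following = [interval_after(t, lam, 2, rng) for _ in range(trials)]
print(sum(covering) / trials)    # near 2 - e^{-3}, well above 1/lambda = 1
print(sum(following) / trials)   # near 1/lambda = 1
```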
3. Waiting Times
The next theorem gives the distribution of the waiting time for each event which occurs before the fixed time t.
Theorem 3. If $t > 0$ and $j \ge 0$, then
$P\big(t - S_{N(t)-j} \le x\big) = P(Y \le x)$ for $0 \le x < t$,
where Y has the $\mathrm{Gamma}(j+1, \lambda)$ distribution (here $S_{N(t)-j} = 0$ when $N(t) \le j$).
Proof. Observe that $S_{N(t)-j} \ge t - x$ if and only if at least $j+1$ events occur in the interval $(t-x, t]$. If $0 \le x < t$, then
$P\big(t - S_{N(t)-j} \le x\big) = P\big(N(t) - N(t-x) \ge j+1\big) = P\big(N(x) \ge j+1\big) = P(S_{j+1} \le x) = P(Y \le x)$,
where the second equality uses the stationarity of the increments of the Poisson Process.
Remark 3. From Theorem 3, as $t \to \infty$,
$t - S_{N(t)-j} \xrightarrow{D} Y$.
Theorem 3 states that for $0 \le x < t$, $P\big(t - S_{N(t)-j} \le x\big) = P(Y \le x)$. So $t - S_{N(t)-j}$ converges in distribution to Y as t approaches infinity. But Y has the $\mathrm{Gamma}(j+1, \lambda)$ distribution. Since $N(t)$ is a finite random variable, $P\big(t - S_{N(t)-j} = t\big) = P(N(t) \le j) > 0$. The distribution function of $t - S_{N(t)-j}$ has a single jump of size
$e^{-\lambda t} \sum_{n=0}^{j} \frac{(\lambda t)^n}{n!}$
at t. A straightforward, but tedious, calculation yields that
$E\big[\big(t - S_{N(t)-j}\big)^m\big] = \frac{(m+j)!}{j!\, \lambda^m} \left[ 1 - e^{-\lambda t} \sum_{n=0}^{m+j} \frac{(\lambda t)^n}{n!} \right] + t^m e^{-\lambda t} \sum_{n=0}^{j} \frac{(\lambda t)^n}{n!}$.
It follows that as $t \to \infty$,
$E\big[\big(t - S_{N(t)-j}\big)^m\big] \to \frac{(m+j)!}{j!\, \lambda^m} = E[Y^m]$,
where Y has the $\mathrm{Gamma}(j+1, \lambda)$ distribution.
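A simulation check of the claims above (the helper name `back_gap` is ours; standard-library Python): for $x < t$ the empirical cdf of $t - S_{N(t)-j}$ should match the $\mathrm{Gamma}(j+1, \lambda)$ cdf, and the atom at t should have mass $P(N(t) \le j)$.

```python
import math
import random

def back_gap(t, lam, j, rng):
    """Return t - S_{N(t)-j}, with the convention S_n = 0 for n <= 0
    (so the value is exactly t whenever N(t) <= j)."""
    arrivals = []
    s = rng.expovariate(lam)
    while s <= t:
        arrivals.append(s)
        s += rng.expovariate(lam)
    k = len(arrivals) - j                      # N(t) - j
    return t - (arrivals[k - 1] if k >= 1 else 0.0)

rng = random.Random(11)
lam, t, j, trials = 1.0, 4.0, 1, 200_000
gaps = [back_gap(t, lam, j, rng) for _ in range(trials)]

x = 2.0
gamma_cdf = 1 - math.exp(-lam * x) * sum(
    (lam * x) ** n / math.factorial(n) for n in range(j + 1))
atom = math.exp(-lam * t) * sum(
    (lam * t) ** n / math.factorial(n) for n in range(j + 1))
print(sum(g <= x for g in gaps) / trials, gamma_cdf)   # cdf at x < t
print(sum(g == t for g in gaps) / trials, atom)        # jump at t
```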
Our last theorem gives the distribution of the waiting time for each event that occurs after the fixed time t. Although these distributions are of primarily theoretical interest, knowing them provides a more complete understanding of the Poisson Process.
Theorem 4. If $t > 0$ and $j \ge 1$, then $S_{N(t)+j} - t$ has the $\mathrm{Gamma}(j, \lambda)$ distribution.
Proof. Note that $S_{N(t)+j} > t$. For $x \ge 0$,
$P\big(S_{N(t)+j} - t > x\big) = P\big(N(t+x) - N(t) \le j-1\big) = P\big(N(x) \le j-1\big) = \sum_{n=0}^{j-1} e^{-\lambda x} \frac{(\lambda x)^n}{n!}$,
which is the survival function of the $\mathrm{Gamma}(j, \lambda)$ distribution.
Remark 4. The following is another proof of Theorem 4. Recall that $R(t)$ is exponentially distributed with mean $1/\lambda$, and, from Theorem 2, observe that for $2 \le i \le j$, $X_{N(t)+i}$ also has that distribution. Because $N(t) + 1$ is a stopping time for the sequence $X_1, X_2, \ldots$, the random variables $R(t), X_{N(t)+2}, \ldots, X_{N(t)+j}$ are independent. Since
$S_{N(t)+j} - t = R(t) + X_{N(t)+2} + \cdots + X_{N(t)+j}$,
Theorem 4 follows.
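This result can likewise be spot-checked by comparing empirical moments of $S_{N(t)+j} - t$ with those of the $\mathrm{Gamma}(j, \lambda)$ distribution ($E[Y] = j/\lambda$ and $E[Y^2] = j(j+1)/\lambda^2$). A sketch with a hypothetical helper name, standard-library Python:

```python
import random

def forward_gap(t, lam, j, rng):
    """Return S_{N(t)+j} - t from one simulated Poisson path (j >= 1)."""
    s = rng.expovariate(lam)
    while s <= t:
        s += rng.expovariate(lam)
    for _ in range(j - 1):                     # s is now S_{N(t)+1}
        s += rng.expovariate(lam)              # add X_{N(t)+2}, ..., X_{N(t)+j}
    return s - t

rng = random.Random(13)
lam, t, j, trials = 2.0, 1.0, 3, 200_000
gaps = [forward_gap(t, lam, j, rng) for _ in range(trials)]
print(sum(gaps) / trials, j / lam)                                # mean
print(sum(g * g for g in gaps) / trials, j * (j + 1) / lam ** 2)  # 2nd moment
```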
4. Conclusion
In summary, we have effectively completed the mathematical study of the Inspection Paradox for the case of the Poisson Process. We have accomplished this by deriving the probability distributions and moments for all of the interarrival periods other than the one containing the fixed inspection time under consideration, and also by finding the distributions and moments of the associated waiting times. We note that our results apply only in the case of a Poisson Process and not for an arbitrary Renewal Process. Since an arbitrary Renewal Process doesn’t necessarily have memoryless interarrival times, the theoretical challenges associated with accomplishing this generalization seem formidable.