The Conditional Poisson Process and the Erlang and Negative Binomial Distributions

Abstract

It is a well-known fact that for the hierarchical model of a Poisson random variable Y whose mean has an Erlang distribution, the unconditional distribution of Y is negative binomial. However, the proofs in the literature [1] [2] provide no intuitive understanding as to why this result should be true. The purpose of this manuscript is to give a new proof of this result which provides such an understanding. The memoryless property of the exponential distribution allows one to conclude that the events in two independent Poisson processes may be regarded as Bernoulli trials, and this fact is used to establish the result. A second goal of this manuscript is to give a proof of this last fact that does not rely on the memoryless property.

Citation

Agarwal, A., Bajorski, P., Farnsworth, D., Marengo, J. and Qian, W. (2017) The Conditional Poisson Process and the Erlang and Negative Binomial Distributions. Open Journal of Statistics, 7, 16-22. doi: 10.4236/ojs.2017.71002.

1. Introduction

There is much current interest in compounding or mixing distributions and their applications. Indeed, the early history of statistics was greatly concerned with this problem [3]. The work by Greenwood and Yule [4] in the more modern era has been followed up with new results and extensive applications [5], including many based on the Poisson distribution because of its centrality in statistical analysis and probability modeling [6] [7] [8] [9]. The present derivations supply new insights into the structure of this type of modeling by revealing how compounded Poisson variables produce a negative binomial distribution.

There is a relatively simple fact about mixture distributions which says that if the mean of a conditional Poisson random variable has an Erlang distribution, then the unconditional distribution of this variable is negative binomial. In particular,

Theorem 1. Let $m \in \mathbb{N}$ and $\theta \in \mathbb{R}^{+}$. Suppose that the random variable $\Lambda$ has the Erlang distribution with probability density function

$$f(\lambda) = \frac{\theta e^{-\theta\lambda}(\theta\lambda)^{m-1}}{(m-1)!} \quad \text{for } \lambda > 0 \tag{1}$$

and that, given that $\Lambda = \lambda$, $N(t)$ has the Poisson distribution with probability mass function

$$p(k \mid \lambda) = \frac{e^{-\lambda t}(\lambda t)^{k}}{k!}. \tag{2}$$

Then, the unconditional distribution of $N(t)$ is that of the number of failures before the $m$th success in Bernoulli trials with success probability $p = \frac{\theta}{\theta+t}$. That is,

$$P(N(t) = k) = \binom{k+m-1}{k}\left(\frac{\theta}{\theta+t}\right)^{m}\left(\frac{t}{\theta+t}\right)^{k}. \tag{3}$$

The proof involves an application of the law of total probability which conditions on the value of $\Lambda$. That is, an integration of the product of (1) and (2) yields (3). This result appears in many different settings ([10], pp. 194-195; [11], p. 191; [2], pp. 332-333; [12], p. 84).
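Explicitly, the law of total probability gives

$$P(N(t)=k) = \int_0^\infty \frac{e^{-\lambda t}(\lambda t)^{k}}{k!}\cdot\frac{\theta e^{-\theta\lambda}(\theta\lambda)^{m-1}}{(m-1)!}\,d\lambda = \frac{t^{k}\theta^{m}}{k!\,(m-1)!}\int_0^\infty \lambda^{k+m-1} e^{-(\theta+t)\lambda}\,d\lambda = \binom{k+m-1}{k}\left(\frac{\theta}{\theta+t}\right)^{m}\left(\frac{t}{\theta+t}\right)^{k},$$

since the gamma integral evaluates to $(k+m-1)!/(\theta+t)^{k+m}$.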

Could this result have been guessed? This proof provides no intuitive understanding as to why it is true. One purpose of this paper is to give another proof of Theorem 1 which provides such an understanding. In that proof, two independent Poisson processes are carefully chosen, and the memoryless property of their exponentially distributed interarrival times is used to conclude that the events in these processes may be regarded as Bernoulli trials. In Section 3, another proof that these events are Bernoulli trials is given, one that does not use the memoryless property.

2. Proof of Theorem 1

This section contains an alternative proof of Theorem 1 which can facilitate one’s intuitive understanding of this result. The proof uses properties of the Poisson process and exponential distribution to obtain (3).

Fix $t > 0$, let $\{N_1(u), u \ge 0\}$ be a Poisson process with rate $\theta$, and let $\{N_2(u), u \ge 0\}$ be an independent Poisson process with rate $t$. One may think of the events in the first process as “successes” and those in the second as “failures”. Using the well-known facts [2] that the interarrival times in the first (second) process are independent exponentially distributed random variables with rate $\theta$ (rate $t$) and the memoryless property of the exponential distribution ([2], pp. 150, 159; [10], p. 102), one may regard these successes and failures as being Bernoulli trials. That is, the trials are independent, and the probability of success is the same on each trial. Intuitively, the process probabilistically restarts itself at any point in time. Specifically, suppose an event has just occurred in one of the two processes. Then, regardless of the amount of time that has elapsed since the last event in the other process, the distribution of the amount of time remaining until the next event occurs in the other process is exponential with the rate for that process. Hence, independently of what has occurred up to that point in time, the probability that the next event is a success is the probability that an exponential random variable with rate $\theta$ is less than an independent exponential random variable with rate $t$, and this probability is easily seen to be $\frac{\theta}{\theta+t}$ ([12], p. 287). A proof (which does not make direct reference to the memoryless property) that the events in the two processes constitute Bernoulli trials is given in Section 3.
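For the record, the race probability is a one-line computation: if $X$ and $Y$ are independent exponential random variables with rates $\theta$ and $t$, respectively, then conditioning on $X$ gives

$$P(X < Y) = \int_0^\infty P(Y > x)\,\theta e^{-\theta x}\,dx = \int_0^\infty e^{-tx}\,\theta e^{-\theta x}\,dx = \frac{\theta}{\theta+t}.$$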

Since the sum of $m$ independent exponential random variables each having rate $\theta$ has the Erlang distribution in (1), one may think of $\Lambda$ as being the time of occurrence of the $m$th event in the process $\{N_1(u), u \ge 0\}$. That is, $\Lambda$ is the occurrence time of the $m$th success ([2], p. 150). Given that $\Lambda = \lambda$, the conditional distributions of both $N(t)$ and $N_2(\lambda)$ are the same Poisson distribution with mean $\lambda t$. By conditioning on $\Lambda$, the unconditional distribution of $N(t)$ is the same as the unconditional distribution of $N_2(\Lambda)$. The proof now follows by observing that $N_2(\Lambda)$ is the number of failures before the time of the $m$th success.
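This hierarchical construction is easy to check numerically. The following sketch (assuming NumPy and SciPy are available; the parameter values are arbitrary choices for illustration) draws $\Lambda$ from the Erlang density in (1), draws $N(t)$ from the conditional Poisson distribution in (2), and compares the empirical probabilities with the negative binomial pmf in (3):

```python
import numpy as np
from scipy.stats import nbinom

# Illustrative parameters (assumed values, not from the paper).
m, theta, t = 3, 2.0, 1.5
rng = np.random.default_rng(0)
n_samples = 200_000

# Draw Lambda ~ Erlang(m, rate=theta), i.e. Gamma(shape=m, scale=1/theta).
lam = rng.gamma(shape=m, scale=1.0 / theta, size=n_samples)

# Draw N(t) | Lambda ~ Poisson(Lambda * t).
n_t = rng.poisson(lam * t)

# Theorem 1: N(t) counts failures before the m-th success,
# with success probability p = theta / (theta + t).
p = theta / (theta + t)
for k in range(6):
    print(f"k={k}: empirical={np.mean(n_t == k):.4f}, "
          f"negative binomial={nbinom.pmf(k, m, p):.4f}")
```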

3. Proof That the Trials Are Bernoulli

This section contains a proof, which does not depend on the memoryless property, that the events in two independent Poisson processes may be regarded as Bernoulli trials.

Theorem 2. Consider two independent Poisson processes with respective rates $\lambda_1$ and $\lambda_2$, in which the events that occur in either process are called trials and are referred to as successes or failures according as they come from the first or second process. Then the trials are independent and the probability of success is $\frac{\lambda_1}{\lambda_1+\lambda_2}$ on each trial. That is, these trials are Bernoulli trials.

Before proving Theorem 2, the following two lemmas are needed.

Lemma 1. For nonnegative integers $m$ and $n$,

$$\sum_{k=0}^{n}\binom{m+k}{m} = \binom{m+n+1}{m+1}.$$

Proof. The number of possible choices of $m+1$ distinct numbers from the set $\{1, 2, 3, \ldots, m+n+1\}$ is $\binom{m+n+1}{m+1}$. By conditioning on the value of the largest number chosen, one can see that this number of choices is also given by $\sum_{k=0}^{n}\binom{m+k}{m}$: if the largest number chosen is $m+k+1$ for some $k \in \{0, 1, \ldots, n\}$, the remaining $m$ numbers must come from $\{1, \ldots, m+k\}$, which can be done in $\binom{m+k}{m}$ ways. □
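Identities of this kind are easy to spot-check. A minimal sketch:

```python
from math import comb

# Spot-check Lemma 1 (a form of the hockey-stick identity) for small m, n.
for m in range(6):
    for n in range(6):
        lhs = sum(comb(m + k, m) for k in range(n + 1))
        rhs = comb(m + n + 1, m + 1)
        assert lhs == rhs, (m, n, lhs, rhs)
print("Lemma 1 holds for all 0 <= m, n <= 5")
```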

Lemma 2. Using the terminology in Theorem 2, let $E_{n,k}$ be the event that there are exactly $k$ successes among the first $n$ trials. Then

$$P(E_{n,k} \text{ and success on trial } n+1) = \binom{n}{k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k+1}\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k}.$$

Proof. Let $T_{i,j}$ be the time between the $(i-1)$st event and the $i$th event (i.e., the $i$th interarrival time) in the $j$th process. Also, let $S_j$ and $F_j$ be the respective times until the $j$th success and $j$th failure, so that

$$S_j = \sum_{i=1}^{j} T_{i,1} \quad \text{and} \quad F_j = \sum_{i=1}^{j} T_{i,2}.$$

Then,

$$P(E_{n,k} \text{ and success on trial } n+1) = P(F_{n-k} < S_{k+1} < F_{n-k+1}) = P(F_{n-k} < S_k + T_{k+1,1} < F_{n-k} + T_{n-k+1,2}).$$

By conditioning on the independent random variables $S_k$ and $F_{n-k}$, and using the fact that the interarrival times in a Poisson process are independent random variables, it follows that the last probability is

$$\int_0^\infty \int_0^\infty P(v < u + T_{k+1,1} < v + T_{n-k+1,2})\, f_{S_k}(u)\, f_{F_{n-k}}(v)\,du\,dv, \tag{4}$$

where $f_{S_k}$ and $f_{F_{n-k}}$ are the pdf's of $S_k$ and $F_{n-k}$, respectively. Now, use the fact that $T_{k+1,1}$ and $T_{n-k+1,2}$ are independent and have respective exponential distributions with failure rates $\lambda_1$ and $\lambda_2$. If $u \le v$, it follows that

$$P(v < u + t < v + T_{n-k+1,2}) = \begin{cases} 0 & \text{if } t \le v-u \\ e^{-\lambda_2(t+u-v)} & \text{if } t > v-u. \end{cases}$$

By conditioning on the value of $T_{k+1,1}$, it can be concluded that

$$P(v < u + T_{k+1,1} < v + T_{n-k+1,2}) = \int_{v-u}^{\infty} e^{-\lambda_2(t+u-v)}\,\lambda_1 e^{-\lambda_1 t}\,dt = \frac{\lambda_1}{\lambda_1+\lambda_2}\, e^{-\lambda_1(v-u)}. \tag{5}$$

Similarly, if $u > v$,

$$P(v < u + t < v + T_{n-k+1,2}) = e^{-\lambda_2(t+u-v)},$$

and hence

$$P(v < u + T_{k+1,1} < v + T_{n-k+1,2}) = \int_{0}^{\infty} e^{-\lambda_2(t+u-v)}\,\lambda_1 e^{-\lambda_1 t}\,dt = \frac{\lambda_1}{\lambda_1+\lambda_2}\, e^{-\lambda_2(u-v)}. \tag{6}$$

Using the fact that $S_k$ and $F_{n-k}$ have Erlang distributions with respective shape parameters $k$ and $n-k$ and respective rate parameters $\lambda_1$ and $\lambda_2$, substituting (5) and (6) into (4) leads to

$$\begin{aligned} P(E_{n,k} \text{ and success on trial } n+1) = \frac{\lambda_1}{\lambda_1+\lambda_2} \Bigg( &\int_0^\infty \int_u^\infty e^{-\lambda_1(v-u)}\, \frac{(\lambda_1 u)^{k-1}}{(k-1)!}\lambda_1 e^{-\lambda_1 u}\, \frac{(\lambda_2 v)^{n-k-1}}{(n-k-1)!}\lambda_2 e^{-\lambda_2 v}\,dv\,du \\ + &\int_0^\infty \int_v^\infty e^{-\lambda_2(u-v)}\, \frac{(\lambda_1 u)^{k-1}}{(k-1)!}\lambda_1 e^{-\lambda_1 u}\, \frac{(\lambda_2 v)^{n-k-1}}{(n-k-1)!}\lambda_2 e^{-\lambda_2 v}\,du\,dv \Bigg). \tag{7} \end{aligned}$$

To evaluate the first double integral in (7), one can use the fact that the waiting time until the $(n-k)$th event in a Poisson process with rate $\lambda_1+\lambda_2$ exceeds $u$ if and only if the number of events in this process that occur by time $u$ is at most $n-k-1$, and that this number has a Poisson distribution with mean $(\lambda_1+\lambda_2)u$; that is, the Erlang tail probability equals the Poisson sum $\sum_{j=0}^{n-k-1} e^{-(\lambda_1+\lambda_2)u}\,\frac{((\lambda_1+\lambda_2)u)^{j}}{j!}$. Hence the first integral in (7) is

$$\begin{aligned} &\frac{\lambda_1^{k}\lambda_2^{n-k}}{(\lambda_1+\lambda_2)^{n-k}} \int_0^\infty \frac{u^{k-1}}{(k-1)!} \int_u^\infty \frac{((\lambda_1+\lambda_2)v)^{n-k-1}}{(n-k-1)!}\,(\lambda_1+\lambda_2)\, e^{-(\lambda_1+\lambda_2)v}\,dv\,du \\ &\quad = \frac{\lambda_1^{k}\lambda_2^{n-k}}{(\lambda_1+\lambda_2)^{n-k}} \int_0^\infty \frac{u^{k-1}}{(k-1)!} \sum_{j=0}^{n-k-1} e^{-(\lambda_1+\lambda_2)u}\,\frac{((\lambda_1+\lambda_2)u)^{j}}{j!}\,du \\ &\quad = \frac{\lambda_1^{k}\lambda_2^{n-k}}{(\lambda_1+\lambda_2)^{n}} \sum_{j=0}^{n-k-1} \binom{k-1+j}{k-1} \int_0^\infty \frac{((\lambda_1+\lambda_2)u)^{j+k-1}}{(j+k-1)!}\,(\lambda_1+\lambda_2)\, e^{-(\lambda_1+\lambda_2)u}\,du \\ &\quad = \sum_{j=0}^{n-k-1} \binom{k-1+j}{k-1} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k} \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k} \\ &\quad = \binom{n-1}{k} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k} \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k}, \tag{8} \end{aligned}$$

where the penultimate equation follows from the fact that the integral in the preceding expression is one, and the last equation follows by an application of Lemma 1. By interchanging $\lambda_1$ with $\lambda_2$ and $k$ with $n-k$, and once again applying Lemma 1, it can be concluded in a similar manner that the value of the second double integral in (7) is

$$\sum_{j=0}^{k-1} \binom{n-k-1+j}{n-k-1} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k} \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k} = \binom{n-1}{k-1} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k} \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k}. \tag{9}$$

From (7), (8) and (9), it now follows that

$$P(E_{n,k} \text{ and success on trial } n+1) = \left[\binom{n-1}{k} + \binom{n-1}{k-1}\right] \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k+1} \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k} = \binom{n}{k} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k+1} \left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k},$$

where the last equality is Pascal's rule.

The case $n = 0$ is easy and left to the reader. The argument just presented assumes that $0 < k < n$. The cases $k = 0$ and $k = n$ are simpler and are also left to the reader. The proof of Lemma 2 is complete. □
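Lemma 2 can also be checked by direct simulation of the two processes. A minimal sketch (the rates and the choice of $n$ and $k$ are arbitrary, for illustration):

```python
import numpy as np
from math import comb

lam1, lam2 = 1.3, 0.7   # illustrative rates
n, k = 4, 2             # check P(E_{4,2} and success on trial 5)
rng = np.random.default_rng(1)
trials, hits = 200_000, 0

for _ in range(trials):
    # First n+1 arrival times in each process (cumulative exponentials).
    s = np.cumsum(rng.exponential(1 / lam1, size=n + 1))  # successes
    f = np.cumsum(rng.exponential(1 / lam2, size=n + 1))  # failures
    # Label the first n+1 trials of the merged process in time order.
    labels = np.concatenate([np.ones(n + 1), np.zeros(n + 1)])
    order = np.argsort(np.concatenate([s, f]))
    first = labels[order][: n + 1]
    if first[:n].sum() == k and first[n] == 1:
        hits += 1

p = lam1 / (lam1 + lam2)
exact = comb(n, k) * p ** (k + 1) * (1 - p) ** (n - k)
print(f"empirical={hits / trials:.4f}, exact={exact:.4f}")
```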

Proof of Theorem 2. It will now be shown by induction on $m$ that for $m \ge 2$, the first $m$ trials are independent and that the probability of success on each of these trials is $\frac{\lambda_1}{\lambda_1+\lambda_2}$.

First, suppose $m = 2$. Set $n = k = 0$ in Lemma 2 to conclude that

$$P(\text{success on the 1st trial}) = \frac{\lambda_1}{\lambda_1+\lambda_2}, \tag{10}$$

and hence $P(\text{failure on the 1st trial}) = 1 - \frac{\lambda_1}{\lambda_1+\lambda_2} = \frac{\lambda_2}{\lambda_1+\lambda_2}$.

Set $n = 1$ and $k = 1$ in Lemma 2 to see that

$$P(\text{success on each of the first two trials}) = \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{2}, \tag{11}$$

and set $n = 1$ and $k = 0$ to obtain

$$P(\text{failure on the 1st trial and success on the 2nd trial}) = \frac{\lambda_1\lambda_2}{(\lambda_1+\lambda_2)^{2}}.$$

Adding the last two probabilities gives

$$P(\text{success on the 2nd trial}) = \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{2} + \frac{\lambda_1\lambda_2}{(\lambda_1+\lambda_2)^{2}} = \frac{\lambda_1}{\lambda_1+\lambda_2}. \tag{12}$$

The independence of the first two trials now follows from (10), (11) and (12), and consequently Theorem 2 is true for $m = 2$.

Suppose that Theorem 2 is true for some $m \ge 2$. The first $m$ trials are therefore Bernoulli trials, so that the number of successes has a binomial distribution. Specifically,

$$P(E_{m,k}) = \binom{m}{k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k}\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{m-k}. \tag{13}$$

From Lemma 2, we can condition on the number of successes in the first $m$ trials to see that

$$P(\text{success on trial } m+1) = \sum_{k=0}^{m} P(E_{m,k} \text{ and success on trial } m+1) = \sum_{k=0}^{m} \binom{m}{k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{k+1}\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{m-k} = \frac{\lambda_1}{\lambda_1+\lambda_2}. \tag{14}$$
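The last equality in (14) is the binomial theorem: with $p = \frac{\lambda_1}{\lambda_1+\lambda_2}$ and $q = \frac{\lambda_2}{\lambda_1+\lambda_2}$,

$$\sum_{k=0}^{m}\binom{m}{k} p^{k+1} q^{m-k} = p \sum_{k=0}^{m}\binom{m}{k} p^{k} q^{m-k} = p\,(p+q)^{m} = p.$$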

It follows that on each of the first $m+1$ trials, the probability of success is $\frac{\lambda_1}{\lambda_1+\lambda_2}$. Furthermore, it follows from Lemma 2, (13), and (14) that for $k = 0, 1, \ldots, m$,

$$P(E_{m,k} \text{ and success on trial } m+1) = P(E_{m,k})\, P(\text{success on trial } m+1),$$

from which one may conclude that the first $m+1$ trials are independent. The proof of Theorem 2 is now complete.

4. Conclusion

In this paper, a new proof has been provided for the fact that, in the hierarchical model of a Poisson random variable Y whose mean has an Erlang distribution, the unconditional distribution of Y is negative binomial. A new proof that the events in two independent Poisson processes may be regarded as Bernoulli trials has also been provided. The distinguishing feature of this proof is that it does not make use of the memoryless property of the exponential distribution.

Conflicts of Interest

The authors declare no conflicts of interest.

References

[1] Karlin, S. and Taylor, H.M. (1975) A First Course in Stochastic Processes. 2nd Edition, Academic Press, San Diego, CA.
[2] Ross, S.M. (2014) Introduction to Probability Models. 11th Edition, Academic Press, San Diego, CA.
[3] Stigler, S.M. (1986) The History of Statistics: The Measurement of Uncertainty before 1900. Belknap Press of Harvard University Press, Cambridge, MA.
[4] Greenwood, M. and Yule, G.U. (1920) An Inquiry into the Nature of Frequency Distributions of Multiple Happenings, with Particular Reference to the Occurrence of Multiple Attacks of Disease or Repeated Accidents. Journal of the Royal Statistical Society A, 83, 255-279.
https://doi.org/10.2307/2341080
[5] Sundt, B. and Vernic, R. (2009) Recursions for Convolutions and Compound Distributions with Insurance Applications. Springer-Verlag, Berlin Heidelberg.
[6] Albrecht, P. (1982) On Some Statistical Methods Connected with the Mixed Poisson Process. Scandinavian Actuarial Journal, No. 1, 1-14.
https://doi.org/10.1080/03461238.1982.10405427
[7] Antzoulakos, D. and Chadjiconstantinidis, S. (2004) On Mixed and Compound Mixed Poisson Distributions. Scandinavian Actuarial Journal, No. 3, 161-188.
https://doi.org/10.1080/03461230110106525
[8] Nadarajah, S. and Kotz, S. (2006) Compound Mixed Poisson Distributions I. Scandinavian Actuarial Journal, No. 3, 141-162.
https://doi.org/10.1080/03461230600783384
[9] Nadarajah, S. and Kotz, S. (2006) Compound Mixed Poisson Distributions II. Scandinavian Actuarial Journal, No. 3, 163-181.
https://doi.org/10.1080/03461230600715253
[10] Casella, G. and Berger, R.L. (1990) Statistical Inference. Wadsworth & Brooks/Cole, Pacific Grove, CA.
[11] Hogg, R., McKean, J.W. and Craig, A.T. (2005) Introduction to Mathematical Statistics. 6th Edition, Pearson, Upper Saddle River, NJ.
[12] Taylor, H.M. and Karlin, S. (1998) An Introduction to Stochastic Modeling. 3rd Edition, Academic Press, San Diego, CA.
