Uniformly Minimum-Variance Unbiased Estimator (UMVUE) for the Gamma Cumulative Distribution Function with Known and Integer Shape Parameter (k)
1. Introduction
Statistical inference is an important topic in scientific studies, from both theoretical and applied standpoints. Studying efficient estimators for the probability density function (PDF) or for the cumulative distribution function (CDF) is useful in various applications, such as the estimation of Fisher information or the estimation of quantiles. Other applications are mentioned in [1].
Some studies on the estimation of the CDF of continuous distributions have appeared in the recent literature, for instance for the Pareto-Rayleigh distribution [2], the exponentiated Burr XII distribution [1], and general distributions [3].
The purpose of this paper is to present the uniformly minimum-variance unbiased estimator (UMVUE) for the gamma cumulative distribution function with a known and integer shape parameter.
The gamma distribution is a member of a two-parameter family of continuous probability distributions. Two different parameterizations are currently in use. In this article we focus on the shape (k) and scale (θ) parameters, so X ~ Gamma(k, θ), with k > 0 and θ > 0, and the probability density function of X is defined by

f(x; k, \theta) = \frac{1}{\Gamma(k)\,\theta^{k}}\, x^{k-1} e^{-x/\theta}, \qquad x > 0.

If k is a positive integer, such a distribution is called an Erlang distribution and represents the sum of k independent exponential random variables, each of which has mean equal to θ.
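For later reference, when k is a positive integer the gamma CDF admits a closed form (a standard identity, recorded here for the reader's convenience; it is not part of the original text):

```latex
F(x; k, \theta) \;=\; 1 - e^{-x/\theta} \sum_{j=0}^{k-1} \frac{(x/\theta)^{j}}{j!},
\qquad x > 0, \quad k \in \{1, 2, 3, \dots\}.
```

This finite-sum form is what makes the integer-k case especially tractable.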
Let X_1, X_2, ..., X_n be a random sample of size n from the population X. The most common way to estimate the cumulative distribution function F of X is through the empirical distribution function, defined by

F_n(x) = \frac{1}{n} \sum_{i=1}^{n} I_{(-\infty,\, x]}(X_i),

where I_A is the indicator of the event A. This paper presents another estimator for F when k is a known integer parameter, which is referred to as the uniformly minimum-variance unbiased estimator (UMVUE).
The paper is organized as follows. Section 2 presents two lemmas needed for the main result. Section 3 states and proves the main result of this paper. Finally, Section 4 presents a simple example that compares the empirical distribution function with the UMVUE of F.
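As a quick illustration (not part of the original paper, whose example in Section 4 uses R), the empirical distribution function above can be sketched in a few lines of Python; the sample values here are hypothetical:

```python
def ecdf(sample, x):
    """Empirical distribution function F_n(x) = (1/n) * #{i : X_i <= x}."""
    return sum(1 for xi in sample if xi <= x) / len(sample)

# Hypothetical sample of size n = 5 (not the paper's R-generated sample).
sample = [2.0, 1.5, 3.7, 0.8, 2.9]
print(ecdf(sample, 2.0))  # fraction of observations <= 2.0, here 3/5 = 0.6
```

Note that F_n is a step function of x, which is the feature the continuous UMVUE of Section 3 improves upon.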
2. Preliminaries
The following lemmas will be used for establishing the main results.
Lemma 1 Let
,
,
and
. Then,
Proof. The proof goes by induction on b. Let’s call
.
Base case: check the result for
. That is,
For this, an integration by substitution will be performed.
Inductive step: show that the result is true for b + 1 if it is true for b. That is, assuming that
we will conclude that
To solve
an integration by parts will be done. Consider
and
, then
and
.
Replace j + 1 with l in the above summation:
which completes the proof by induction.
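The displayed formulas of Lemma 1 are missing from this copy. An identity of exactly this shape — proved by induction on b, with an integration by parts producing a sum whose index j + 1 is relabeled l — is the standard lower incomplete gamma formula. It is offered here only as a plausible reconstruction, with a > 0 and b a non-negative integer as assumed symbols:

```latex
\int_{0}^{t} x^{b} e^{-x/a}\, dx \;=\; b!\; a^{\,b+1}
\left( 1 - e^{-t/a} \sum_{l=0}^{b} \frac{(t/a)^{l}}{l!} \right),
\qquad t > 0.
```

For b = 0 the right-hand side reduces to a(1 − e^{−t/a}), matching the base case obtained by direct substitution.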
Lemma 2 Let X_1, X_2, ..., X_n be a random sample of size n from the population X ~ Gamma(k, θ). Let T = X_1 + ... + X_n. Then,
Proof. First, notice that
Set
,
and
,
.
Then,
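The displayed formulas of Lemma 2 are likewise missing from this copy. In derivations of this UMVUE, the role of such a lemma is typically played by the conditional law of one observation given T = X_1 + ... + X_n; the following standard result is a plausible reconstruction (an assumption, not the paper's verbatim statement):

```latex
f_{X_1 \mid T}(x \mid t) \;=\;
\frac{\Gamma(nk)}{\Gamma(k)\,\Gamma\!\big((n-1)k\big)}
\cdot \frac{x^{k-1}\,(t-x)^{(n-1)k-1}}{t^{\,nk-1}},
\qquad 0 < x < t,
```

equivalently, X_1 / T ~ Beta(k, (n − 1)k), free of θ, which is what allows the conditional expectation given T to be computed explicitly.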
3. Main Result
Theorem 1 Let X_1, X_2, ..., X_n be a random sample of size n from the population X ~ Gamma(k, θ). Consider k a known parameter. For any x > 0, let q = F(x) be an unknown parameter. The uniformly minimum-variance unbiased estimator (UMVUE) for q is given by
If
,
.
Proof. The Rao-Blackwell Theorem [4] [5] states that if δ is an unbiased estimator for a parameter q and T is a sufficient statistic for q, then E(δ | T) is an unbiased estimator for q based on T, with variance no larger than that of δ. The Lehmann-Scheffé Theorem [6] [7] states that if δ is an unbiased estimator for q and T is a complete sufficient statistic for q, then E(δ | T) is the (unique) uniformly minimum-variance unbiased estimator (UMVUE) for q. Let us find an unbiased estimator and a complete sufficient statistic for q in order to deduce its UMVUE.
First, note that δ = I_{(-\infty,\, x]}(X_1) is an unbiased estimator for q, since E(δ) = P(X_1 ≤ x) = F(x) = q.
Also note that the Gamma(k, θ) distributions, where k is a known parameter, belong to a uniparametric exponential family. It is known that for exponential families a sufficient and complete statistic can be found directly [8]. In the case of X, T = X_1 + ... + X_n is a complete sufficient statistic. Then, E(δ | T) is the UMVUE for q. Let us show that E(δ | T) has the expression given in the statement.
Before proceeding with the computations, we develop the expression of E(δ | T). For this, we use Lemma 2.
Assuming 0 < x < T, and using Lemma 1,
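The closing expression of the proof is missing from this copy. Conditioning the unbiased indicator of the event {X_1 ≤ x} on T = X_1 + ... + X_n, and using that X_1/T follows a Beta(k, (n − 1)k) law for a gamma sample with known shape k, leads to the classical closed form below; it is offered as a plausible reconstruction, not as the paper's verbatim expression:

```latex
\widehat{F}(x) \;=\; P(X_1 \le x \mid T) \;=\;
\begin{cases}
I_{x/T}\big(k, (n-1)k\big), & 0 < x < T, \\[4pt]
1, & x \ge T,
\end{cases}
```

where I_z(a, b) denotes the regularized incomplete beta function; for integer k, I_{x/T}(k, (n − 1)k) reduces to a finite binomial sum, presumably the explicit expression in the statement of Theorem 1.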
4. A Very Simple Example
Consider the population X ~ Gamma(3, 1) and the random sample of size 5:
which was generated in the R Program [9] by the command
set.seed(1);rgamma(5,3,1).
For this sample,
and the uniformly minimum-variance unbiased estimator for the gamma cumulative distribution function with shape parameter k = 3
is
Figure 1. Gamma cumulative distribution function—approximate and exact curves.
In Figure 1, the curve of the uniformly minimum-variance unbiased estimator of the gamma cumulative distribution function (UMVUE) can be compared with the empirical cumulative distribution function (ECF), both built from the random sample presented above. The black dotted line represents the exact curve (Real), computed with the parameter values used to generate the sample.
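To complement the figure, here is a small Python sketch of this estimator (the paper itself uses R; the sample below is hypothetical, and the closed form used — the regularized incomplete beta I_{x/T}(k, (n − 1)k), computed via a binomial sum valid for integer arguments — is a reconstruction, not the paper's verbatim formula):

```python
from math import comb

def reg_inc_beta_int(z, a, b):
    """Regularized incomplete beta I_z(a, b) for integer a, b >= 1,
    via I_z(a, b) = sum_{j=a}^{a+b-1} C(a+b-1, j) z^j (1-z)^(a+b-1-j)."""
    m = a + b - 1
    return sum(comb(m, j) * z**j * (1 - z)**(m - j) for j in range(a, m + 1))

def umvue_gamma_cdf(x, sample, k):
    """Sketch of the UMVUE of F(x) for a gamma sample with known integer shape k:
    F_hat(x) = I_{x/T}(k, (n-1)k) for x < T, and 1 for x >= T, where T = sum(sample)."""
    t = sum(sample)
    n = len(sample)
    if x >= t:
        return 1.0
    return reg_inc_beta_int(x / t, k, (n - 1) * k)

# Hypothetical sample of size 5 (not the R sample from the paper).
sample = [2.0, 1.5, 3.7, 0.8, 2.9]
for x in (1.0, 2.0, 3.0):
    print(x, umvue_gamma_cdf(x, sample, 3))
```

Unlike the empirical distribution function, this estimator is a continuous, strictly increasing function of x on (0, T).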
5. Conclusions
In the previous sections, we discussed the construction of the UMVUE for the gamma cumulative distribution function with known and integer shape parameter. This is a point estimator of F(x), for each x > 0, where X ~ Gamma(k, θ) and k is a known parameter.
The advantage of the UMVUE is that, besides being the uniformly minimum-variance unbiased estimator, it is a continuous estimator as a function of x, in contrast to the step-shaped empirical distribution function. Moreover, even with small samples the results are already satisfactory, as can be seen in Section 4. The disadvantage of the present approach is that for large values of k the estimator has a complex expression, and if the sample size is also large, it may not be simple to compute.