Inferences under a Class of Finite Mixture Distributions Based on Generalized Order Statistics

The main purpose of this paper is to obtain estimates of the parameters, reliability and hazard rate functions of a heterogeneous population represented by a finite mixture of two general components. A doubly Type II censoring scheme based on generalized order statistics is used. Maximum likelihood and Bayes methods of estimation are employed for this purpose, and the two methods are compared via a Monte Carlo simulation study.


Introduction
Let the random variable (rv) T follow a class that includes some well-known lifetime models. Its cumulative distribution function (CDF) is given by

F(t) = 1 − exp[−λ(t)], t > 0,  (1)

its probability density function (PDF) by

f(t) = λ′(t) exp[−λ(t)],  (2)

and its reliability function by

R(t) = exp[−λ(t)],  (3)

where λ′(t) is the derivative of λ(t) with respect to t, and λ(t) is a nonnegative, continuous, increasing function of t with λ(0⁺) = 0 and λ(∞) = ∞.

Bayesian inferences based on finite mixture distributions have been discussed by several authors. Bayesian estimation of the mixing parameter, mean and reliability function of a mixture of two exponential lifetime distributions based on right censored samples was considered by [1]. [2] estimated the survival and hazard functions of a finite mixture of two Gompertz components using Type I and Type II censored samples, applying both the maximum likelihood (ML) and Bayes methods. Based on Type I censored samples from a finite mixture of two truncated Type I generalized logistic components, [3] computed the Bayes estimates of the parameters, reliability and hazard rate functions.
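As a quick numerical illustration, the class CDF and PDF translate directly into code. The sketch below uses the exponential member, with λ(t) = θt, purely as an example; the choice of λ and the parameter value are our assumptions, not part of the paper:

```python
import math

def class_cdf(t, lam):
    # CDF of the class: F(t) = 1 - exp(-lambda(t)), t > 0
    return 1.0 - math.exp(-lam(t))

def class_pdf(t, lam, lam_prime):
    # PDF of the class: f(t) = lambda'(t) * exp(-lambda(t))
    return lam_prime(t) * math.exp(-lam(t))

# exponential member: lambda(t) = theta * t, lambda'(t) = theta
theta = 2.0
lam = lambda t: theta * t
lam_prime = lambda t: theta

print(class_cdf(1.0, lam))             # 1 - exp(-2)
print(class_pdf(1.0, lam, lam_prime))  # 2 * exp(-2)
```

Any member of the class is obtained by swapping in a different λ(t) and its derivative, which is exactly how the Gompertz example later in the paper arises.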
[4] considered estimation for the mixed exponential distribution based on record statistics. [5] considered Bayes inference under a model of a finite mixture of two compound Gompertz components. [6] studied some properties of the mixture of two inverse Weibull distributions and obtained estimates of the unknown parameters via the EM algorithm.
Mixtures of distributions arise frequently in life testing, reliability, and the biological and physical sciences. Some of the most important references that discuss different types of mixtures of distributions are the monographs [21-23].
The PDF, CDF, RF and HRF of a finite mixture of two components of the class under study are given, respectively, by

f(t) = p_1 f_1(t) + p_2 f_2(t),  (5)
F(t) = p_1 F_1(t) + p_2 F_2(t),  (6)
R(t) = 1 − F(t),  (7)
H(t) = f(t)/R(t),  (8)

where, for j = 1, 2, the mixing proportions p_j satisfy p_1 + p_2 = 1 and 0 ≤ p_1 ≤ 1, and f_j(t), F_j(t) and R_j(t) are obtained from (1), (2) and (3) after using λ_j(t) in place of λ(t).

The property of identifiability is an important consideration in estimating the parameters of a mixture of distributions. Moreover, testing of hypotheses and classification of random variables can be meaningfully discussed only if the class of all finite mixtures is identifiable. Identifiability of mixtures has been discussed by several authors, including [24-26].
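The four mixture functions can be sketched in a few lines. The example again uses two exponential members of the class as stand-ins for the general components; the component forms and parameter values are illustrative assumptions:

```python
import math

def mix_pdf(t, p, f1, f2):
    # f(t) = p1*f1(t) + p2*f2(t), with p2 = 1 - p1
    return p * f1(t) + (1.0 - p) * f2(t)

def mix_cdf(t, p, F1, F2):
    return p * F1(t) + (1.0 - p) * F2(t)

def mix_rf(t, p, F1, F2):
    # R(t) = 1 - F(t)
    return 1.0 - mix_cdf(t, p, F1, F2)

def mix_hrf(t, p, f1, f2, F1, F2):
    # H(t) = f(t) / R(t)
    return mix_pdf(t, p, f1, f2) / mix_rf(t, p, F1, F2)

# two exponential components with rates 1 and 2 (illustrative values)
f1 = lambda t: math.exp(-t)
F1 = lambda t: 1.0 - math.exp(-t)
f2 = lambda t: 2.0 * math.exp(-2.0 * t)
F2 = lambda t: 1.0 - math.exp(-2.0 * t)

print(mix_hrf(0.0, 0.5, f1, f2, F1, F2))  # 1.5: average of the two hazards at t = 0
```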
Our aim in this paper is the estimation of the parameters, and of functions of these parameters, of a class of finite mixture distributions based on doubly Type II censored gos's, using the ML and Bayes methods. An illustrative example based on the Gompertz distribution is given, and the results are compared with those obtained by previous researchers.

Maximum Likelihood Estimation
Let T(s+1, n, m, k), …, T(r, n, m, k) be the (r − s) gos's drawn from a mixture of two components of the class (2). Based on this doubly Type II censored sample, the likelihood function can be written [27] as (9), where Θ denotes the parameter space.
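The censoring scheme itself is easy to state in code: a doubly Type II censored sample discards the s smallest and the n − r largest order statistics, keeping the (s+1)-th through r-th. A small helper (the function name is ours, not the paper's):

```python
def doubly_type2_censor(sample, s, r):
    # keep the (s+1)-th through r-th order statistics: r - s observed values
    ordered = sorted(sample)
    return ordered[s:r]

print(doubly_type2_censor([5.0, 1.0, 4.0, 2.0, 3.0], s=1, r=4))  # [2.0, 3.0, 4.0]
```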
The likelihood function (9) and the maximum likelihood estimates (MLE's) are obtained by using (1) and (5), in two cases according to the value of m, as follows.

MLE's When m ≠ −1
In this case, substituting (1) and (5) in (9), the likelihood function takes the form (10). Taking the logarithm of (10) gives (11), and differentiating with respect to the parameters and equating to zero yields the likelihood equations (12). The numerical solution of the five nonlinear likelihood equations (12) yields the MLE's of the five unknown parameters.
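Since the likelihood equations have no closed-form solution, they must be solved numerically. The sketch below illustrates the idea on a deliberately reduced problem, which is our own toy setup rather than the paper's: a complete sample from a two-exponential mixture with known component rates, where the single likelihood equation ∂log L/∂p = 0 is solved by bisection (the full problem solves five such equations simultaneously):

```python
import math, random

random.seed(1)
th1, th2, p_true = 1.0, 3.0, 0.4
data = [random.expovariate(th1 if random.random() < p_true else th2)
        for _ in range(2000)]

def f(t, th):
    return th * math.exp(-th * t)

def score(p):
    # dlogL/dp for the mixture p*f1 + (1-p)*f2; logL is concave in p,
    # so the score is monotone decreasing and bisection applies
    return sum((f(t, th1) - f(t, th2)) / (p * f(t, th1) + (1.0 - p) * f(t, th2))
               for t in data)

lo, hi = 1e-6, 1.0 - 1e-6
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if score(mid) > 0.0:
        lo = mid
    else:
        hi = mid
p_hat = 0.5 * (lo + hi)
print(p_hat)
```

In the paper's setting the same root-finding idea is applied jointly to all five equations, which is why a numerical method is unavoidable.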

MLE's When m = −1
In this case the likelihood function takes the form (14). So, from (14), taking the logarithm gives (15). Differentiating (15) with respect to the parameters p, α_j and β_j and equating to zero gives the likelihood equations (16); their numerical solution yields the corresponding MLE's.

Bayes Estimation
In this section, Bayesian estimation of the parameters of a class of finite mixture distributions is considered under the squared error and Linex (linear-exponential) loss functions. We use the conjugate prior density suggested by [29], written in the form (18), where the vector of prior parameters ranges over the hyperparameter space.

The Bayes estimator of a function u(θ) under the squared error and Linex loss functions is given, respectively, by (23) and (24), where the integral is taken over the five-dimensional parameter space and a ≠ 0 is the Linex shape parameter. To compute these integrals we use the Monte Carlo Integration (MCI) method, based on draws θ_i, i = 1, 2, …, M, generated from the prior (18); for more details see [30].

Bayes Estimates When m ≠ −1
It follows, from (10) and (18), that the posterior density function is given by (19). Under the squared error and Linex loss functions, the Bayes estimator of the parameter p is obtained by generating parameter values from the prior (18) and substituting in (23) and (24); the Bayes estimates of the remaining parameters are computed similarly.
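The MCI recipe can be sketched for a toy one-parameter problem: draws come from the prior, each draw is weighted by its likelihood, and the two loss functions give the posterior mean and the Linex estimator −(1/a)·log E[e^(−aθ) | data]. The exponential data model, the Gamma(2, 1) prior and all numerical values below are illustrative assumptions, not the paper's setup:

```python
import math, random

random.seed(2)
theta_true = 2.0
data = [random.expovariate(theta_true) for _ in range(50)]  # toy exponential data
n, s = len(data), sum(data)

def log_like(th):
    return n * math.log(th) - th * s

# Monte Carlo integration: sample from the prior, weight by the likelihood
M = 20000
draws = [random.gammavariate(2.0, 1.0) for _ in range(M)]   # assumed Gamma(2,1) prior
logw = [log_like(th) for th in draws]
mx = max(logw)
w = [math.exp(lw - mx) for lw in logw]                      # stabilized weights
W = sum(w)

# squared error loss -> posterior mean
theta_se = sum(th * wi for th, wi in zip(draws, w)) / W

# Linex loss with shape a -> -(1/a) * log E[exp(-a*theta) | data]
a = 1.0
theta_linex = -(1.0 / a) * math.log(
    sum(math.exp(-a * th) * wi for th, wi in zip(draws, w)) / W)

print(theta_se, theta_linex)
```

For a > 0 the Linex estimator sits below the posterior mean, reflecting the heavier penalty on overestimation; this ordering is visible in the two printed values.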

Bayes Estimates When m = −1
The posterior density function can be obtained from (14) and (18). Under the squared error and Linex loss functions, the Bayes estimator of the parameter p is obtained by generating parameter values from the prior (18) and substituting in (23) and (24); the Bayes estimates of the remaining parameters are computed similarly.

Maximum Likelihood Estimation
Suppose that, for j = 1, 2 and t ≥ 0, λ_j(t) is chosen as the Gompertz exponent function, so that the j-th subpopulation is a Gompertz distribution with parameter β_j > 0. Substituting λ_j(t) and λ′_j(t) in (12), we obtain nonlinear equations whose solution gives the MLE's p̂, β̂_1 and β̂_2. Also, for m = −1, substituting λ_j(t) and λ′_j(t) in (13), (16) and (17), we obtain the corresponding nonlinear equations, and substituting similarly in (27) gives the corresponding equations for the Bayes analysis.
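For concreteness, the Gompertz member can be coded. The parameterization F_j(t) = 1 − exp[−(β_j/α_j)(e^(α_j t) − 1)] used below is one common choice and is our assumption; the paper's exact parameterization is not recoverable from the source:

```python
import math

def gompertz_lambda(t, alpha, beta):
    # lambda(t) = (beta/alpha) * (exp(alpha*t) - 1)
    return (beta / alpha) * (math.exp(alpha * t) - 1.0)

def gompertz_pdf(t, alpha, beta):
    # f(t) = lambda'(t) * exp(-lambda(t)), with lambda'(t) = beta * exp(alpha*t)
    return beta * math.exp(alpha * t) * math.exp(-gompertz_lambda(t, alpha, beta))

def gompertz_cdf(t, alpha, beta):
    return 1.0 - math.exp(-gompertz_lambda(t, alpha, beta))

print(gompertz_cdf(0.0, 1.0, 2.0))  # 0.0
print(gompertz_pdf(0.0, 1.0, 2.0))  # 2.0 (equals beta at t = 0)
```

Note that the hazard rate of this member is λ′(t) = β e^(αt), the familiar exponentially increasing Gompertz hazard.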

Substituting in (29), we obtain nonlinear equations; the solution of the nonlinear likelihood equations (33) gives the MLE's p̂, β̂_1 and β̂_2.

Bayes Estimation
Substituting the Gompertz expressions in (27), the posterior density function takes the corresponding form, and the Bayes estimates follow from (23) and (24) as before.

Simulation Study
The later steps of the simulation algorithm are: 3) the above steps are repeated n times to get a sample of size n; 4) the sample obtained in Step 3 is ordered; 5) the MLE's of the parameters and the Bayes estimates under the squared error and Linex loss functions (BES, BEL) are computed. Steps (3)-(9) are repeated 1000 times. The averages and the estimated risks (ER) are then computed over the 1000 repetitions by averaging the estimates and the squared deviations, respectively. All computations were carried out using Mathematica 6.0. In all of the above cases the prior parameters were held at fixed chosen values.
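The repetition-and-averaging loop for the estimated risk can be sketched as follows. To keep the example self-contained it uses complete samples from a single Gompertz component with α known, so the MLE of β has the closed form β̂ = nα/Σ(e^(α t_i) − 1); this reduced setup is our assumption, whereas the paper's study uses the mixture, censoring, and the full estimation procedure over 1000 repetitions:

```python
import math, random

random.seed(3)
alpha, beta = 1.0, 2.0
reps, n = 1000, 40

def gompertz_draw(alpha, beta):
    # inverse-CDF draw from F(t) = 1 - exp(-(beta/alpha)(e^{alpha t} - 1))
    u = random.random()
    return math.log(1.0 + (alpha / beta) * (-math.log(1.0 - u))) / alpha

def beta_mle(sample, alpha):
    # closed-form MLE of beta when alpha is known
    return len(sample) * alpha / sum(math.exp(alpha * t) - 1.0 for t in sample)

sq_dev = []
for _ in range(reps):
    sample = [gompertz_draw(alpha, beta) for _ in range(n)]
    sq_dev.append((beta_mle(sample, alpha) - beta) ** 2)

er = sum(sq_dev) / reps  # estimated risk: average squared deviation
print(er)
```

The same skeleton, with the MLE step replaced by the numerical solution of the likelihood equations and the Bayes estimators added alongside, reproduces the structure of the paper's study.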

Figure 2. Estimated Risks (ER) of the estimates for complete samples.

Figure 3. Estimated Risks (ER) of the estimates.