In the presence of multicollinearity in logistic regression, the variance of the Maximum Likelihood Estimator (MLE) becomes inflated. Şiray et al. (2015) proposed a restricted Liu estimator for the logistic regression model with exact linear restrictions. However, in some situations the linear restrictions are stochastic. In this paper, we propose a Stochastic Restricted Maximum Likelihood Estimator (SRMLE) for the logistic regression model with stochastic linear restrictions to overcome this issue. Moreover, a Monte Carlo simulation is conducted to compare the performances of the MLE, the Restricted Maximum Likelihood Estimator (RMLE), the Ridge Type Logistic Estimator (LRE), the Liu Type Logistic Estimator (LLE), and the SRMLE for the logistic regression model using the Scalar Mean Squared Error (SMSE).

Keywords: Logistic Regression, Multicollinearity, Stochastic Restricted Maximum Likelihood Estimator, Scalar Mean Squared Error
1. Introduction

In many fields of study, such as medicine and epidemiology, it is very important to predict a binary response variable, or to compute the probability of occurrence of an event, in terms of the values of a set of related explanatory variables. For example, the probability of suffering a heart attack is computed in terms of the levels of a set of risk factors such as cholesterol and blood pressure. The logistic regression model serves this purpose admirably and is the most widely used model in such cases.

The general form of the logistic regression model is

$y_i = \pi_i + \varepsilon_i, \quad i = 1, 2, \ldots, n,$  (1)

which follows the Bernoulli distribution with parameter $\pi_i$ as

$\pi_i = \frac{\exp(x_i'\beta)}{1 + \exp(x_i'\beta)},$  (2)

where $x_i'$ is the $i$-th row of $X$, which is an $n \times (p+1)$ data matrix with $p$ explanatory variables, $\beta$ is a $(p+1) \times 1$ vector of coefficients, and $\varepsilon_i$ is independent with mean zero and variance $\pi_i(1-\pi_i)$ of the response $y_i$. The maximum likelihood method is the most common technique for estimating the parameter $\beta$, and the Maximum Likelihood Estimator (MLE) of $\beta$ can be obtained as follows:

$\hat{\beta}_{MLE} = (X'\hat{W}X)^{-1}X'\hat{W}Z,$  (3)

where $Z$ is the column vector whose $i$-th element equals $\operatorname{logit}(\hat{\pi}_i) + \dfrac{y_i - \hat{\pi}_i}{\hat{\pi}_i(1-\hat{\pi}_i)}$ and $\hat{W} = \operatorname{diag}\{\hat{\pi}_i(1-\hat{\pi}_i)\}$, where $\hat{\pi}_i$ is an unbiased estimate of $\pi_i$. The asymptotic covariance matrix of $\hat{\beta}_{MLE}$ is

$\operatorname{Cov}(\hat{\beta}_{MLE}) = (X'\hat{W}X)^{-1} = C^{-1}.$  (4)
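As an illustration of how the MLE in (3) is typically computed, the following sketch (our illustration, not code from the paper; NumPy assumed) iterates the reweighted least squares step with the working response Z and weight matrix Ŵ described above:

```python
import numpy as np

def logistic_mle(X, y, tol=1e-8, max_iter=100):
    """MLE of beta via iteratively reweighted least squares:
    beta <- (X' W X)^{-1} X' W Z, with W = diag(pi(1 - pi)) and
    Z the working response eta + (y - pi) / (pi(1 - pi))."""
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))
        w = pi * (1.0 - pi)                 # diagonal of W-hat
        z = eta + (y - pi) / w              # working response Z
        new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
        if np.max(np.abs(new - beta)) < tol:
            return new
        beta = new
    return beta
```

At convergence the score equations $X'(y - \hat{\pi}) = 0$ are satisfied, which is a convenient self-check.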

As many authors have stated (Hosmer and Lemeshow (1989) and Ryan (1997), among others), the logistic regression model becomes unstable when there exists strong dependence among the explanatory variables (multicollinearity). For example, suppose that the probability of a person surviving 10 or more extra years is modelled using three predictors: sex, diastolic blood pressure and body mass index. Since the response "whether the person survives 10 or more extra years" is binary, the logistic regression model is appropriate for this problem. However, the predictors sex, diastolic blood pressure and body mass index may have some inter-relationship within each person. In this case, the estimation of the model parameters becomes inaccurate because of the need to invert near-singular information matrices. Consequently, the interpretation of the relationship between the response and each explanatory variable in terms of odds ratios may be erroneous. As a result, the estimates have large variances and wide confidence intervals, which produce inefficient estimates.
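A small numerical sketch (illustrative assumptions: a hypothetical two-predictor design and fixed weights at $\pi = 0.5$) shows how the total MLE variance, tr$(C^{-1})$ with $C = X'WX$, inflates as the correlation between two predictors grows:

```python
import numpy as np

# Illustrative sketch (not from the paper): as the correlation rho between
# two predictors grows, the information matrix C = X'WX becomes nearly
# singular, so tr(C^{-1}) -- the total variance of the MLE -- blows up.
def total_mle_variance(rho, n=500, seed=1):
    rng = np.random.default_rng(seed)
    z1, z2 = rng.normal(size=n), rng.normal(size=n)
    x1 = z1
    x2 = rho * z1 + np.sqrt(1.0 - rho**2) * z2   # corr(x1, x2) = rho
    X = np.column_stack([np.ones(n), x1, x2])
    W = 0.25 * np.eye(n)          # weights pi(1 - pi) at pi = 0.5, for simplicity
    C = X.T @ W @ X
    return np.trace(np.linalg.inv(C))

for rho in (0.0, 0.9, 0.99):
    print(rho, round(total_mle_variance(rho), 4))
```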

To overcome the problem of multicollinearity in logistic regression, many estimators have been proposed as alternatives to the MLE. The most popular approach is Ridge Logistic Regression (RLR), first proposed by Schaefer et al. (1984). Later, the Principal Component Logistic Estimator (PCLE) by Aguilera et al. (2006), the Modified Logistic Ridge Regression Estimator (MLRE) by Nja et al. (2013), the Liu estimator by Mansson et al. (2012), and the Liu-type estimator by Inan and Erdogan (2013) were proposed for logistic regression.

An alternative technique to resolve the multicollinearity problem is to consider parameter estimation with a priori available linear restrictions on the unknown parameters, which may be exact or stochastic. That is, in some practical situations there exist different sets of prior information from different sources, such as past experience, long association of the experimenter with the experiment, or similar experiments conducted in the past. When exact linear restrictions are available in addition to the logistic regression model, many authors have proposed different estimators for the respective parameter. Duffy and Santner (1989) introduced a Restricted Maximum Likelihood Estimator (RMLE) by incorporating exact linear restrictions on the unknown parameters. Recently, Şiray et al. (2015) proposed a new estimator called the Restricted Liu Estimator (RLE) by replacing the MLE with the RMLE in the logistic Liu estimator.

In this paper we propose a new estimator, called the Stochastic Restricted Maximum Likelihood Estimator (SRMLE), for the case where linear stochastic restrictions are available in addition to the logistic regression model. The rest of the paper is organized as follows. The proposed estimator and its asymptotic properties are given in Section 2. In Section 3, the mean square error matrix and the scalar mean square error for the new estimator are obtained. Section 4 describes some important existing estimators for logistic regression models. The performance of the proposed estimator with respect to the Scalar Mean Squared Error (SMSE) is compared with that of some existing estimators through a Monte Carlo simulation study in Section 5. The conclusions of the study are presented in Section 6.

2. The Proposed Estimator and its Asymptotic Properties

First consider the multiple linear regression model

$y = X\beta + \varepsilon,$  (5)

where $y$ is an $n \times 1$ observable random vector, $X$ is an $n \times p$ known design matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters and $\varepsilon$ is an $n \times 1$ vector of disturbances.

The Ordinary Least Squares Estimator (OLSE) of $\beta$ is given by

$\hat{\beta}_{OLSE} = S^{-1}X'y,$  (6)

where $S = X'X$.

In addition to the sample model (5), consider the following linear stochastic restriction on the parameter space:

$r = R\beta + v,$  (7)

where $r$ is a $q \times 1$ stochastic known vector, $R$ is a $q \times p$ matrix of full rank with known elements and $v$ is a $q \times 1$ random vector of disturbances with mean $0$ and dispersion matrix $\Omega$, which is assumed to be a known positive definite matrix. Further, it is assumed that $v$ is stochastically independent of $\varepsilon$, i.e. $E(\varepsilon v') = 0$.

The Restricted Ordinary Least Squares Estimator (ROLSE) due to the exact prior restriction (i.e. $r = R\beta$) in (7) is given by

$\hat{\beta}_{ROLSE} = \hat{\beta}_{OLSE} + S^{-1}R'(RS^{-1}R')^{-1}(r - R\hat{\beta}_{OLSE}).$  (8)

Theil and Goldberger (1961) proposed the Mixed Regression Estimator (ME) for the regression model (5) with the stochastic restricted prior information (7):

$\hat{\beta}_{ME} = \hat{\beta}_{OLSE} + S^{-1}R'(\Omega + RS^{-1}R')^{-1}(r - R\hat{\beta}_{OLSE}).$  (9)
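The mixed estimator (9) can be sketched in code as follows (an illustration under the assumption that the disturbance variance is absorbed into $\Omega$; not the authors' implementation):

```python
import numpy as np

def mixed_estimator(X, y, R, r, Omega):
    """Theil-Goldberger mixed estimator (sketch; disturbance variance
    absorbed into Omega): the update form of Equation (9),
    b_ME = b_OLS + S^{-1} R' (Omega + R S^{-1} R')^{-1} (r - R b_OLS)."""
    S = X.T @ X
    b_ols = np.linalg.solve(S, X.T @ y)
    Sinv_Rt = np.linalg.solve(S, R.T)          # S^{-1} R'
    gain = Omega + R @ Sinv_Rt                 # Omega + R S^{-1} R'
    return b_ols + Sinv_Rt @ np.linalg.solve(gain, r - R @ b_ols)
```

As $\Omega \to 0$ the restriction is enforced exactly, reproducing (8); as $\Omega$ grows the restriction is down-weighted and the ME approaches the OLSE.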

Suppose that the following linear prior information is given in addition to the general logistic regression model (1):

$h = H\beta + v,$  (10)

where $h$ is a $q \times 1$ stochastic known vector, $H$ is a $q \times (p+1)$ matrix of full rank with known elements and $v$ is a $q \times 1$ random vector of disturbances with mean $0$ and dispersion matrix $\Psi$, which is assumed to be a known positive definite matrix. Further, it is assumed that $v$ is stochastically independent of $\varepsilon$, i.e. $E(\varepsilon v') = 0$.

Duffy and Santner (1989) proposed the Restricted Maximum Likelihood Estimator (RMLE) for the logistic regression model (1) with the exact prior restriction (i.e. $h = H\beta$) in (10):

$\hat{\beta}_{RMLE} = \hat{\beta}_{MLE} + C^{-1}H'(HC^{-1}H')^{-1}(h - H\hat{\beta}_{MLE}),$  (11)

where $C = X'\hat{W}X$.

Following the RMLE in (11) and the Mixed Estimator (ME) in (9) in the linear regression model, we propose a new estimator, named the Stochastic Restricted Maximum Likelihood Estimator (SRMLE), when the linear stochastic restriction (10) is available in addition to the logistic regression model (1):

$\hat{\beta}_{SRMLE} = \hat{\beta}_{MLE} + C^{-1}H'(\Psi + HC^{-1}H')^{-1}(h - H\hat{\beta}_{MLE}).$  (12)
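A minimal sketch of the SRMLE (our illustration, assuming the MLE coefficients and the information matrix $C = X'\hat{W}X$ have already been computed):

```python
import numpy as np

def srmle(b_mle, C, H, h, Psi):
    """Stochastic Restricted MLE, Equation (12):
    b_SR = b_MLE + C^{-1} H' (Psi + H C^{-1} H')^{-1} (h - H b_MLE),
    where C = X' W_hat X from the MLE fit."""
    Cinv_Ht = np.linalg.solve(C, H.T)          # C^{-1} H'
    gain = Psi + H @ Cinv_Ht                   # Psi + H C^{-1} H'
    return b_mle + Cinv_Ht @ np.linalg.solve(gain, h - H @ b_mle)
```

As $\Psi \to 0$ this recovers the RMLE (11), while a large $\Psi$ leaves the MLE nearly unchanged, mirroring the behaviour of the mixed estimator (9).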

Asymptotic properties of the SRMLE:

1) The SRMLE $\hat{\beta}_{SRMLE}$ is asymptotically unbiased, since $E(h - H\hat{\beta}_{MLE}) = 0$.

2) The asymptotic covariance matrix of the SRMLE equals

$\operatorname{Cov}(\hat{\beta}_{SRMLE}) = (C + H'\Psi^{-1}H)^{-1}.$

3. Mean Square Error Matrix Comparisons

To compare different estimators of the same parameter vector $\beta$ in the regression model, one can use the well-known Mean Square Error Matrix (MSE) and/or Scalar Mean Square Error (SMSE) criteria. The MSE matrix of an estimator $\hat{\beta}$ of $\beta$ is defined as

$\operatorname{MSE}(\hat{\beta}, \beta) = D(\hat{\beta}) + b(\hat{\beta})b(\hat{\beta})',$

where $D(\hat{\beta})$ is the dispersion matrix and $b(\hat{\beta}) = E(\hat{\beta}) - \beta$ denotes the bias vector.

The Scalar Mean Square Error (SMSE) of the estimator $\hat{\beta}$ can be defined as

$\operatorname{SMSE}(\hat{\beta}, \beta) = \operatorname{tr}[\operatorname{MSE}(\hat{\beta}, \beta)].$

For two given estimators $\hat{\beta}_1$ and $\hat{\beta}_2$, the estimator $\hat{\beta}_2$ is said to be superior to $\hat{\beta}_1$ under the MSE criterion if and only if

$\operatorname{MSE}(\hat{\beta}_1, \beta) - \operatorname{MSE}(\hat{\beta}_2, \beta) \geq 0,$ i.e. the difference is non-negative definite.

Since the SRMLE is asymptotically unbiased, the MSE and SMSE of the proposed estimator SRMLE are

$\operatorname{MSE}(\hat{\beta}_{SRMLE}) = (C + H'\Psi^{-1}H)^{-1}$ and $\operatorname{SMSE}(\hat{\beta}_{SRMLE}) = \operatorname{tr}[(C + H'\Psi^{-1}H)^{-1}].$

Note that the difference $\operatorname{MSE}(\hat{\beta}_{MLE}) - \operatorname{MSE}(\hat{\beta}_{SRMLE}) = C^{-1} - (C + H'\Psi^{-1}H)^{-1}$ given in (20) is non-negative definite. Thus, by the MSE criterion, the SRMLE has smaller mean square error than the MLE.
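The MSE matrix, the SMSE, and the non-negative-definiteness check behind the superiority criterion can be sketched as (illustrative helpers, not the paper's code):

```python
import numpy as np

def mse_matrix(cov, bias):
    """MSE(beta_hat) = D(beta_hat) + bias bias'."""
    return cov + np.outer(bias, bias)

def smse(cov, bias):
    """SMSE = tr[MSE(beta_hat)]."""
    return float(np.trace(mse_matrix(cov, bias)))

def superior(mse_1, mse_2, tol=1e-10):
    """Estimator 2 beats estimator 1 in the MSE sense iff
    MSE_1 - MSE_2 is non-negative definite (smallest eigenvalue >= 0)."""
    diff = mse_1 - mse_2
    return np.min(np.linalg.eigvalsh((diff + diff.T) / 2.0)) >= -tol
```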

4. Some Existing Logistic Estimators

To examine the performance of the proposed estimator SRMLE over some existing estimators, the following estimators are considered.

1) Logistic Ridge Estimator

Schaefer et al. (1984) proposed a ridge estimator for the logistic regression model (1):

$\hat{\beta}_{LRE} = (C + kI)^{-1}C\hat{\beta}_{MLE},$

where $k \geq 0$ is the ridge parameter and $C = X'\hat{W}X$.

The asymptotic MSE and SMSE of $\hat{\beta}_{LRE}$ are

$\operatorname{MSE}(\hat{\beta}_{LRE}) = (C + kI)^{-1}C(C + kI)^{-1} + k^2(C + kI)^{-1}\beta\beta'(C + kI)^{-1}$

and

$\operatorname{SMSE}(\hat{\beta}_{LRE}) = \sum_{j=1}^{p+1}\frac{\lambda_j}{(\lambda_j + k)^2} + k^2\beta'(C + kI)^{-2}\beta,$

where $\lambda_j$ is the $j$-th eigenvalue of $C$.

2) Logistic Liu Estimator

Following Liu (1993) and Urgan and Tez (2008), Mansson et al. (2012) examined the Liu estimator for the logistic regression model, which is defined as

$\hat{\beta}_{LLE} = (C + I)^{-1}(C + dI)\hat{\beta}_{MLE},$

where $0 < d < 1$ is a shrinkage parameter and $C = X'\hat{W}X$.

The asymptotic MSE and SMSE of $\hat{\beta}_{LLE}$ are

$\operatorname{MSE}(\hat{\beta}_{LLE}) = (C + I)^{-1}(C + dI)C^{-1}(C + dI)(C + I)^{-1} + (d - 1)^2(C + I)^{-1}\beta\beta'(C + I)^{-1}$

and

$\operatorname{SMSE}(\hat{\beta}_{LLE}) = \sum_{j=1}^{p+1}\frac{(\lambda_j + d)^2}{\lambda_j(\lambda_j + 1)^2} + (d - 1)^2\beta'(C + I)^{-2}\beta,$

where $\lambda_j$ is the $j$-th eigenvalue of $C$.
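Both shrinkage estimators of this section can be sketched as one-liners given the MLE and $C$ (our illustration, not the authors' code):

```python
import numpy as np

def lre(b_mle, C, k):
    """Logistic ridge estimator: (C + kI)^{-1} C b_MLE, k >= 0."""
    return np.linalg.solve(C + k * np.eye(C.shape[0]), C @ b_mle)

def lle(b_mle, C, d):
    """Logistic Liu estimator: (C + I)^{-1} (C + dI) b_MLE, 0 < d < 1."""
    I = np.eye(C.shape[0])
    return np.linalg.solve(C + I, (C + d * I) @ b_mle)
```

Setting $k = 0$ (or $d = 1$) reproduces the MLE, while $k > 0$ (or $d < 1$) shrinks the coefficient vector toward zero.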

3) Restricted MLE

As mentioned in Section 2, Duffy and Santner (1989) proposed the Restricted Maximum Likelihood Estimator (RMLE) in (11) for the logistic regression model (1) with the exact prior restriction (i.e. $h = H\beta$) in (10).

Since the RMLE is unbiased when the exact restriction holds, its asymptotic MSE and SMSE are

$\operatorname{MSE}(\hat{\beta}_{RMLE}) = C^{-1} - C^{-1}H'(HC^{-1}H')^{-1}HC^{-1}$

and

$\operatorname{SMSE}(\hat{\beta}_{RMLE}) = \operatorname{tr}[C^{-1} - C^{-1}H'(HC^{-1}H')^{-1}HC^{-1}],$

where $C = X'\hat{W}X$.

Mean Squared Error Comparisons

・ SRMLE versus LRE

Consider the difference $\operatorname{MSE}(\hat{\beta}_{LRE}) - \operatorname{MSE}(\hat{\beta}_{SRMLE}) = [D(\hat{\beta}_{LRE}) + b_1b_1'] - D(\hat{\beta}_{SRMLE})$, where $b_1$ denotes the bias vector of the LRE. One can obviously say that $D(\hat{\beta}_{LRE})$ and $D(\hat{\beta}_{SRMLE})$ are positive definite and $b_1b_1'$ is a non-negative definite matrix. Further, by Theorem 1 (see Appendix 1), it is clear that $D(\hat{\beta}_{LRE}) + b_1b_1'$ is a positive definite matrix. By Lemma 1 (see Appendix 1), if $\lambda_{\max} < 1$, where $\lambda_{\max}$ is the largest eigenvalue of $D(\hat{\beta}_{SRMLE})[D(\hat{\beta}_{LRE}) + b_1b_1']^{-1}$, then $\operatorname{MSE}(\hat{\beta}_{LRE}) - \operatorname{MSE}(\hat{\beta}_{SRMLE})$ is a positive definite matrix. Based on the above arguments, the following theorem can be stated.

Theorem 4.1. The estimator SRMLE is superior to the LRE if and only if $\lambda_{\max}(D(\hat{\beta}_{SRMLE})[D(\hat{\beta}_{LRE}) + b_1b_1']^{-1}) < 1$.

・ SRMLE versus LLE

Consider the difference $\operatorname{MSE}(\hat{\beta}_{LLE}) - \operatorname{MSE}(\hat{\beta}_{SRMLE}) = [D(\hat{\beta}_{LLE}) + b_2b_2'] - D(\hat{\beta}_{SRMLE})$, where $b_2$ denotes the bias vector of the LLE. One can obviously say that $D(\hat{\beta}_{LLE})$ and $D(\hat{\beta}_{SRMLE})$ are positive definite and $b_2b_2'$ is a non-negative definite matrix. Further, by Theorem 1 (see Appendix 1), it is clear that $D(\hat{\beta}_{LLE}) + b_2b_2'$ is a positive definite matrix. By Lemma 1 (see Appendix 1), if $\lambda_{\max} < 1$, where $\lambda_{\max}$ is the largest eigenvalue of $D(\hat{\beta}_{SRMLE})[D(\hat{\beta}_{LLE}) + b_2b_2']^{-1}$, then $\operatorname{MSE}(\hat{\beta}_{LLE}) - \operatorname{MSE}(\hat{\beta}_{SRMLE})$ is a positive definite matrix. Based on the above arguments, the following theorem can be stated.

Theorem 4.2. The estimator SRMLE is superior to the LLE if and only if $\lambda_{\max}(D(\hat{\beta}_{SRMLE})[D(\hat{\beta}_{LLE}) + b_2b_2']^{-1}) < 1$.

・ SRMLE versus RMLE

Consider the difference $\operatorname{MSE}(\hat{\beta}_{RMLE}) - \operatorname{MSE}(\hat{\beta}_{SRMLE}) = [D(\hat{\beta}_{RMLE}) + b_3b_3'] - D(\hat{\beta}_{SRMLE})$, where $b_3$ denotes the bias vector of the RMLE (nonzero when the exact restriction does not hold). One can obviously say that $D(\hat{\beta}_{RMLE})$ and $D(\hat{\beta}_{SRMLE})$ are positive definite and $b_3b_3'$ is a non-negative definite matrix. Further, by Theorem 1 (see Appendix 1), it is clear that $D(\hat{\beta}_{RMLE}) + b_3b_3'$ is a positive definite matrix. By Lemma 1 (see Appendix 1), if $\lambda_{\max} < 1$, where $\lambda_{\max}$ is the largest eigenvalue of $D(\hat{\beta}_{SRMLE})[D(\hat{\beta}_{RMLE}) + b_3b_3']^{-1}$, then $\operatorname{MSE}(\hat{\beta}_{RMLE}) - \operatorname{MSE}(\hat{\beta}_{SRMLE})$ is a positive definite matrix. Based on the above arguments, the following theorem can be stated.

Theorem 4.3. The estimator SRMLE is superior to the RMLE if and only if $\lambda_{\max}(D(\hat{\beta}_{SRMLE})[D(\hat{\beta}_{RMLE}) + b_3b_3']^{-1}) < 1$.

Based on the above results one can say that the new estimator SRMLE is superior to the other estimators with respect to the mean squared error matrix sense under certain conditions. To check the superiority of the estimators numerically, we then consider a simulation study in the next section.

5. A Simulation Study

A Monte Carlo simulation is carried out to illustrate the performance of the new estimator SRMLE over the MLE, RMLE, LRE and LLE by means of the Scalar Mean Square Error (SMSE). Following McDonald and Galarneau (1975), the data are generated as follows:

$x_{ij} = (1 - \gamma^2)^{1/2}z_{ij} + \gamma z_{i,p+1}, \quad i = 1, 2, \ldots, n; \; j = 1, 2, \ldots, p,$  (33)

where $z_{ij}$ are pseudo-random numbers from the standardized normal distribution and $\gamma^2$ represents the correlation between any two explanatory variables. Four explanatory variables are generated using (33). We consider four different values of $\gamma^2$: 0.70, 0.80, 0.90 and 0.99, and four different values of $n$: 20, 40, 50 and 100. The dependent variable $y_i$ in (1) is obtained from the Bernoulli($\pi_i$) distribution, where $\pi_i = \exp(x_i'\beta)/(1 + \exp(x_i'\beta))$. The parameter values of $\beta$ are chosen so that $\beta'\beta = 1$.

Moreover, for the restriction (10), suitable values of $h$, $H$ and $\Psi$ are chosen.

Further, for the ridge parameter $k$ and the Liu parameter $d$, selected values are chosen so that $0 \leq k, d \leq 1$.

The experiment is replicated 3000 times by generating new pseudo-random numbers, and the estimated SMSE is obtained as

$\widehat{\operatorname{SMSE}}(\hat{\beta}) = \frac{1}{3000}\sum_{r=1}^{3000}(\hat{\beta}_r - \beta)'(\hat{\beta}_r - \beta),$

where $\hat{\beta}_r$ is the estimate of $\beta$ obtained in the $r$-th replication.
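The data-generating scheme in (33) and the Monte Carlo SMSE estimate can be sketched as follows (illustrative helper names; $\gamma$ is the correlation parameter):

```python
import numpy as np

def gen_X(n, p, gamma, rng):
    """McDonald-Galarneau scheme of Equation (33):
    x_ij = sqrt(1 - gamma^2) z_ij + gamma z_{i,p+1},
    so any two explanatory variables have correlation gamma^2."""
    z = rng.normal(size=(n, p + 1))
    return np.sqrt(1.0 - gamma**2) * z[:, :p] + gamma * z[:, [p]]

def estimated_smse(estimates, beta):
    """Monte Carlo SMSE: mean of (b_r - beta)'(b_r - beta) over replications."""
    d = np.asarray(estimates) - beta
    return float(np.mean(np.sum(d * d, axis=1)))
```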

The simulation results are listed in Tables A1-A16 (Appendix 3) and also displayed in Figures A1-A4 (Appendix 2). From Figures A1-A4, it can be noticed that, in general, an increase in the degree of correlation between the explanatory variables inflates the estimated SMSE of all the estimators, while an increase in the sample size $n$ reduces it. Further, the new estimator SRMLE has smaller SMSE than the MLE for all the values of $\gamma^2$ and $n$. Moreover, for certain ranges of $\gamma^2$ and $n$, the SRMLE also performs better than the estimators LRE and LLE. From Table A17 (Appendix 3), it is clear that when $k$ and $d$ are small the LLE is better than the other estimators in the SMSE sense, and the LRE is better when $k$ and $d$ are large. For moderate $k$ and $d$ values the proposed estimator is good, but this changes with the $n$ and $\gamma^2$ values. Therefore, we analyse the estimators LRE, LLE and SRMLE further by using different $k$ and $d$ values; the results are summarized in Table A18 and Table A19 (Appendix 3). According to these results, the SRMLE is even superior to the LRE and LLE for certain values of $k$ and $d$.

6. Concluding Remarks

In this research, we introduced the Stochastic Restricted Maximum Likelihood Estimator (SRMLE) for the logistic regression model when a linear stochastic restriction is available. The performance of the SRMLE over the MLE, LRE, RMLE and LLE in the logistic regression model was investigated by a Monte Carlo simulation study, considering different degrees of correlation, different numbers of observations and different values of the parameters $k$ and $d$. It was noted that the SMSE of the MLE was inflated when multicollinearity was present, and severely so for small samples. The simulation results showed that the proposed estimator SRMLE had smaller SMSE than the MLE for all the values of $n$ and $\gamma^2$. Further, the proposed estimator SRMLE was superior to the estimators LLE and LRE for some $k$ and $d$ values, depending on $\gamma^2$ and $n$.

Acknowledgements

We thank the editor and the referee for their comments and suggestions, and the Postgraduate Institute of Science, University of Peradeniya, Sri Lanka for providing necessary facilities to complete this research.

Cite this paper

Nagarajah, V. and Wijekoon, P. (2015) Stochastic Restricted Maximum Likelihood Estimator in Logistic Regression Model. Open Journal of Statistics, 5, 837-851. doi: 10.4236/ojs.2015.57082

Appendix 1

Theorem 1. Let $A$ and $B$ be $n \times n$ matrices such that $A > 0$ and $B \geq 0$. Then $A + B > 0$ (Rao and Toutenburg, 1995).

Lemma 1. Let the two matrices $M_1 > 0$ and $M_2 \geq 0$. Then $M_1 > M_2$ if and only if $\lambda_{\max}(M_2M_1^{-1}) < 1$, where $\lambda_{\max}(M_2M_1^{-1})$ is the largest eigenvalue of $M_2M_1^{-1}$ (Rao et al., 2008).

k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 2.6097 2.6097 2.6097 2.6097 2.6097 2.6097 2.6097 2.6097 2.6097 2.6097 2.6097 2.6097
LRE 2.6097 2.1774 1.8688 1.6361 1.4543 1.3084 1.1892 1.0901 1.0068 0.9361 0.8812 0.8755
RMLE 2.2682 2.2682 2.2682 2.2682 2.2682 2.2682 2.2682 2.2682 2.2682 2.2682 2.2682 2.2682
LLE 0.8755 0.9995 1.1355 1.2835 1.4435 1.6156 1.7997 1.9958 2.2039 2.4240 2.6325 2.6562
SRMLE 1.2274 1.2274 1.2274 1.2274 1.2274 1.2274 1.2274 1.2274 1.2274 1.2274 1.2274 1.2274
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 3.7509 3.7509 3.7509 3.7509 3.7509 3.7509 3.7509 3.7509 3.7509 3.7509 3.7509 3.7509
LRE 3.7509 2.8786 2.3312 1.9525 1.6750 1.4633 1.2971 1.1637 1.0548 0.9646 0.8960 0.8890
RMLE 2.2452 2.2452 2.2452 2.2452 2.2452 2.2452 2.2452 2.2452 2.2452 2.2452 2.2452 2.2452
LLE 0.8890 1.0689 1.2733 1.5023 1.7558 2.0340 2.3367 2.6640 3.0158 3.3923 3.7521 3.7933
SRMLE 1.4179 1.4179 1.4179 1.4179 1.4179 1.4179 1.4179 1.4179 1.4179 1.4179 1.4179 1.4179
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 7.2447 7.2447 7.2447 7.2447 7.2447 7.2447 7.2447 7.2447 7.2447 7.2447 7.2447 7.2447
LRE 7.2447 4.4635 3.1455 2.4005 1.9193 1.5859 1.3436 1.1611 1.0199 0.9083 0.8267 0.8186
RMLE 2.2263 2.2263 2.2263 2.2263 2.2263 2.2263 2.2263 2.2263 2.2263 2.2263 2.2263 2.2263
LLE 0.8186 1.1287 1.5135 1.9731 2.5075 3.1165 3.8003 4.5589 5.3922 6.3002 7.1813 7.2829
SRMLE 1.7693 1.7693 1.7693 1.7693 1.7693 1.7693 1.7693 1.7693 1.7693 1.7693 1.7693 1.7693
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 70.5890 70.5890 70.5890 70.5890 70.5890 70.5890 70.5890 70.5890 70.5890 70.5890 70.5890 70.5890
LRE 70.5890 6.5671 2.6098 1.4620 0.9711 0.7153 0.5650 0.4692 0.4045 0.3589 0.3288 0.3259
RMLE 2.2118 2.2118 2.2118 2.2118 2.2118 2.2118 2.2118 2.2118 2.2118 2.2118 2.2118 2.2118
LLE 0.3259 1.5179 4.0071 7.7935 12.8770 19.2580 26.9360 35.9120 46.1840 57.7540 69.2758 70.6209
SRMLE 2.5410 2.5410 2.5410 2.5410 2.5410 2.5410 2.5410 2.5410 2.5410 2.5410 2.5410 2.5410
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 0.8648 0.8648 0.8648 0.8648 0.8648 0.8648 0.8648 0.8648 0.8648 0.8648 0.8648 0.8648
LRE 0.8648 0.8210 0.7811 0.7447 0.7113 0.6807 0.6525 0.6266 0.6025 0.5803 0.5617 0.5597
RMLE 2.1057 2.1057 2.1057 2.1057 2.1057 2.1057 2.1057 2.1057 2.1057 2.1057 2.1057 2.1057
LLE 0.5597 0.5875 0.6161 0.6454 0.6756 0.7065 0.7383 0.7708 0.8042 0.8383 0.8697 0.8732
SRMLE 0.6141 0.6141 0.6141 0.6141 0.6141 0.6141 0.6141 0.6141 0.6141 0.6141 0.6141 0.6141
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 1.2320 1.2320 1.2320 1.2320 1.2320 1.2320 1.2320 1.2320 1.2320 1.2320 1.2320 1.2320
LRE 1.2320 1.1399 1.0593 0.9882 0.9252 0.8690 0.8186 0.7733 0.7324 0.6953 0.6648 0.6600
RMLE 2.0967 2.0967 2.0967 2.0967 2.0967 2.0967 2.0967 2.0967 2.0967 2.0967 2.0967 2.0967
LLE 0.6615 0.7101 0.7607 0.8134 0.8682 0.9250 0.9839 1.0448 1.1077 1.1728 1.2330 1.2398
SRMLE 0.7675 0.7675 0.7675 0.7675 0.7675 0.7675 0.7675 0.7675 0.7675 0.7675 0.7675 0.7675
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 2.3557 2.3557 2.3557 2.3557 2.3557 2.3557 2.3557 2.3557 2.3557 2.3557 2.3557 2.3557
LRE 2.3557 2.0182 1.7556 1.5460 1.3754 1.2343 1.1161 1.0158 0.9301 0.8560 0.7976 0.7916
RMLE 2.0892 2.0892 2.0892 2.0892 2.0892 2.0892 2.0892 2.0892 2.0892 2.0892 2.0892 2.0892
LLE 0.7916 0.9067 1.0313 1.1651 1.3082 1.4607 1.6226 1.7937 1.9742 2.1640 2.3428 2.3631
SRMLE 1.0961 1.0961 1.0961 1.0961 1.0961 1.0961 1.0961 1.0961 1.0961 1.0961 1.0961 1.0961
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 22.7202 22.7202 22.7202 22.7202 22.7202 22.7202 22.7202 22.7202 22.7202 22.7202 22.7202 22.7202
LRE 22.7202 7.1920 3.6283 2.2157 1.5085 1.1030 0.8486 0.6784 0.5590 0.4719 0.4124 0.4066
RMLE 2.0835 2.0835 2.0835 2.0835 2.0835 2.0835 2.0835 2.0835 2.0835 2.0835 2.0835 2.0835
LLE 0.4066 1.0428 2.0336 3.3790 5.0791 7.1337 9.5430 12.3070 15.4255 18.8990 22.3278 22.7265
SRMLE 2.1982 2.1982 2.1982 2.1982 2.1982 2.1982 2.1982 2.1982 2.1982 2.1982 2.1982 2.1982
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 0.5544 0.5544 0.5544 0.5544 0.5544 0.5544 0.5544 0.5544 0.5544 0.5544 0.5544 0.5544
LRE 0.5544 0.5368 0.5202 0.5046 0.4898 0.4758 0.4626 0.4501 0.4382 0.4271 0.4174 0.4164
RMLE 2.0701 2.0701 2.0701 2.0701 2.0701 2.0701 2.0701 2.0701 2.0701 2.0701 2.0701 2.0701
LLE 0.4164 0.4295 0.4429 0.4565 0.4703 0.4844 0.4987 0.5135 0.5280 0.5430 0.5567 0.5582
SRMLE 0.4368 0.4368 0.4368 0.4368 0.4368 0.4368 0.4368 0.4368 0.4368 0.4368 0.4368 0.4368
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 0.7833 0.7833 0.7833 0.7833 0.7833 0.7833 0.7833 0.7833 0.7833 0.7833 0.7833 0.7833
LRE 0.7833 0.7511 0.7169 0.6853 0.6561 0.6289 0.6037 0.5803 0.5584 0.5380 0.5208 0.5189
RMLE 2.0642 2.0642 2.0642 2.0642 2.0642 2.0642 2.0642 2.0642 2.0642 2.0642 2.0642 2.0642
LLE 0.5189 0.5433 0.5684 0.5941 0.6204 0.6474 0.6750 0.7032 0.7321 0.7617 0.7888 0.7918
SRMLE 0.5620 0.5620 0.5620 0.5620 0.5620 0.5620 0.5620 0.5620 0.5620 0.5620 0.5620 0.5620
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 1.5040 1.5040 1.5040 1.5040 1.5040 1.5040 1.5040 1.5040 1.5040 1.5040 1.5040 1.5040
LRE 1.5040 1.3650 1.2461 1.1434 1.0541 0.9756 0.9064 0.8451 0.7904 0.7414 0.7015 0.6973
RMLE 2.0593 2.0593 2.0593 2.0593 2.0593 2.0593 2.0593 2.0593 2.0593 2.0593 2.0593 2.0593
LLE 0.6973 0.7632 0.8324 0.9049 0.9809 1.0602 1.1429 1.2290 1.3184 1.4112 1.4976 1.5074
SRMLE 0.8479 0.8479 0.8479 0.8479 0.8479 0.8479 0.8479 0.8479 0.8479 0.8479 0.8479 0.8479
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 14.4762 14.4762 14.4762 14.4762 14.4762 14.4762 14.4762 14.4762 14.4762 14.4762 14.4762 14.4762
LRE 14.4762 6.4739 3.7327 2.4479 1.7388 1.3051 1.0201 0.8226 0.6801 0.5738 0.4997 0.4925
RMLE 2.0555 2.0555 2.0555 2.0555 2.0555 2.0555 2.0555 2.0555 2.0555 2.0555 2.0555 2.0555
LLE 0.4925 0.9960 1.6984 2.5998 3.7000 4.9992 6.4974 8.1944 10.0904 12.1850 14.2409 14.4792
SRMLE 2.0212 2.0212 2.0212 2.0212 2.0212 2.0212 2.0212 2.0212 2.0212 2.0212 2.0212 2.0212
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 0.4083 0.4083 0.4083 0.4083 0.4083 0.4083 0.4083 0.4083 0.4083 0.4083 0.4083 0.4083
LRE 0.4083 0.3989 0.3899 0.3812 0.3730 0.3651 0.3574 0.3502 0.3431 0.3364 0.3306 0.3300
RMLE 2.0524 2.0524 2.0524 2.0524 2.0524 2.0524 2.0524 2.0524 2.0524 2.0524 2.0524 2.0524
LLE 0.3300 0.3376 0.3453 0.3531 0.3611 0.3691 0.3772 0.3853 0.3936 0.4020 0.4096 0.4105
SRMLE 0.3400 0.3400 0.3400 0.3400 0.3400 0.3400 0.3400 0.3400 0.3400 0.3400 0.3400 0.3400
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 0.5801 0.5801 0.5801 0.5801 0.5801 0.5801 0.5801 0.5801 0.5801 0.5801 0.5801 0.5801
LRE 0.5801 0.5602 0.5413 0.5236 0.5068 0.4911 0.4760 0.4618 0.4484 0.4356 0.4274 0.4235
RMLE 2.0481 2.0481 2.0481 2.0481 2.0481 2.0481 2.0481 2.0481 2.0481 2.0481 2.0481 2.0481
LLE 0.4235 0.4381 0.4531 0.4682 0.4836 0.4993 0.5153 0.5316 0.5481 0.5650 0.5803 0.5821
SRMLE 0.4454 0.4454 0.4454 0.4454 0.4454 0.4454 0.4454 0.4454 0.4454 0.4454 0.4454 0.4454
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 1.1056 1.1056 1.1056 1.1056 1.1056 1.1056 1.1056 1.1056 1.1056 1.1056 1.1056 1.1056
LRE 1.1056 1.0302 0.9629 0.9025 0.8481 0.7989 0.7542 0.7135 0.6763 0.6422 0.6139 0.6109
RMLE 2.0444 2.0444 2.0444 2.0444 2.0444 2.0444 2.0444 2.0444 2.0444 2.0444 2.0444 2.0444
LLE 0.6109 0.6534 0.6975 0.7432 0.7905 0.8393 0.8898 0.9418 0.9955 1.0507 1.1017 1.1075
SRMLE 0.6967 0.6967 0.6967 0.6967 0.6967 0.6967 0.6967 0.6967 0.6967 0.6967 0.6967 0.6967
k,d 0.0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 0.99 1.0
MLE 10.6280 10.6280 10.6280 10.6280 10.6280 10.6280 10.6280 10.6280 10.6280 10.6280 10.6280 10.6280
LRE 10.6280 5.7256 3.6179 2.5069 1.8466 1.4212 1.1308 0.9235 0.7703 0.6538 0.5714 0.5632
RMLE 2.0416 2.0416 2.0416 2.0416 2.0416 2.0416 2.0416 2.0416 2.0416 2.0416 2.0416 2.0416
LLE 0.5632 0.9868 1.5399 2.2227 3.0350 3.9769 5.0483 6.2494 7.5800 9.0402 10.4652 10.6300
SRMLE 1.8827 1.8827 1.8827 1.8827 1.8827 1.8827 1.8827 1.8827 1.8827 1.8827 1.8827 1.8827
Summary of the Tables A1-A16 (Tables A17-A19): the best estimator in the SMSE sense among the LLE, SRMLE and LRE, reported for each sample size (n = 20, 50, 75, 100) and each range of the parameters k and d.
References

Şiray, G.U., Toker, S. and Kaçıranlar, S. (2015) On the Restricted Liu Estimator in Logistic Regression Model. Communications in Statistics—Simulation and Computation, 44, 217-232. http://dx.doi.org/10.1080/03610918.2013.771742

Hosmer, D.W. and Lemeshow, S. (1989) Applied Logistic Regression. Wiley, New York.

Ryan, T.P. (1997) Modern Regression Methods. Wiley, New York.

Schaefer, R.L., Roi, L.D. and Wolfe, R.A. (1984) A Ridge Logistic Estimator. Communications in Statistics—Theory and Methods, 13, 99-113. http://dx.doi.org/10.1080/03610928408828664

Aguilera, A.M., Escabias, M. and Valderrama, M.J. (2006) Using Principal Components for Estimating Logistic Regression with High-Dimensional Multicollinear Data. Computational Statistics & Data Analysis, 50, 1905-1924. http://dx.doi.org/10.1016/j.csda.2005.03.011

Nja, M.E., Ogoke, U.P. and Nduka, E.C. (2013) The Logistic Regression Model with a Modified Weight Function. Journal of Statistical and Econometric Methods, 2, 161-171.

Mansson, G., Kibria, B.M.G. and Shukur, G. (2012) On Liu Estimators for the Logit Regression Model. The Royal Institute of Technology, Centre of Excellence for Science and Innovation Studies (CESIS), Paper No. 259. http://dx.doi.org/10.1016/j.econmod.2011.11.015

Inan, D. and Erdogan, B.E. (2013) Liu-Type Logistic Estimator. Communications in Statistics—Simulation and Computation, 42, 1578-1586. http://dx.doi.org/10.1080/03610918.2012.667480

Duffy, D.E. and Santner, T.J. (1989) On the Small Sample Properties of Norm-Restricted Maximum Likelihood Estimators for Logistic Regression Models. Communications in Statistics—Theory and Methods, 18, 959-980. http://dx.doi.org/10.1080/03610928908829944

Theil, H. and Goldberger, A.S. (1961) On Pure and Mixed Estimation in Economics. International Economic Review, 2, 65-77. http://dx.doi.org/10.2307/2525589

Liu, K. (1993) A New Class of Biased Estimate in Linear Regression. Communications in Statistics—Theory and Methods, 22, 393-402. http://dx.doi.org/10.1080/03610929308831027

Urgan, N.N. and Tez, M. (2008) Liu Estimator in Logistic Regression When the Data Are Collinear. International Conference on Continuous Optimization and Knowledge-Based Technologies, Vilnius, Lithuania, Selected Papers, 323-327.

McDonald, G.C. and Galarneau, D.I. (1975) A Monte Carlo Evaluation of Some Ridge-Type Estimators. Journal of the American Statistical Association, 70, 407-416. http://dx.doi.org/10.1080/01621459.1975.10479882

Rao, C.R. and Toutenburg, H. (1995) Linear Models: Least Squares and Alternatives. 2nd Edition, Springer-Verlag, New York.

Rao, C.R., Toutenburg, H., Shalabh and Heumann, C. (2008) Linear Models and Generalizations. Springer, Berlin.