TITLE:
L1/2 Regularization Based on Bayesian Empirical Likelihood
AUTHORS:
Yuan Wang, Wanzhou Ye
KEYWORDS:
Bayesian Empirical Likelihood, Generalized Gaussian Prior, L1/2 Regularization, MCMC Method
JOURNAL NAME:
Advances in Pure Mathematics, Vol.12 No.5, May 31, 2022
ABSTRACT: Bayesian empirical likelihood is a semiparametric method that combines a parametric prior with a nonparametric likelihood: the parametric likelihood function in Bayes' theorem is replaced by a nonparametric empirical likelihood function, so the method can be applied without assuming a distribution for the data. This effectively avoids the problems caused by model misspecification. In variable selection based on Bayesian empirical likelihood, the penalty term is introduced into the model as a prior on the parameters. In this paper, we propose a novel variable selection method: L1/2 regularization based on Bayesian empirical likelihood. The L1/2 penalty is introduced into the model through a scale mixture of uniforms representation of the generalized Gaussian prior, and the posterior distribution is then sampled with an MCMC method. Simulations demonstrate that the proposed method achieves better predictive performance when the errors violate the zero-mean normality assumption of the standard parametric model, while still performing variable selection.
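The abstract's key ingredient, the scale mixture of uniforms representation of the generalized Gaussian prior, can be sketched as follows. This is a minimal illustrative simulation, not the paper's actual sampler: for a prior p(beta) proportional to exp(-|beta|^q) with q = 1/2 (the L1/2 penalty), the standard identity gives u ~ Gamma(1 + 1/q, 1) = Gamma(3, 1) and beta | u ~ Uniform(-u^{1/q}, u^{1/q}) = Uniform(-u^2, u^2); marginalizing over u recovers the L1/2-type prior. All variable names and the chosen sample size are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's code): sample from the
# generalized Gaussian prior p(beta) ∝ exp(-|beta|^{1/2}) via its scale
# mixture of uniforms representation.
rng = np.random.default_rng(0)
n = 200_000

# Mixing variable: u ~ Gamma(1 + 1/q, 1) with q = 1/2, i.e. Gamma(3, 1).
u = rng.gamma(shape=3.0, scale=1.0, size=n)

# Conditional draw: beta | u ~ Uniform(-u^2, u^2).
beta = rng.uniform(-u**2, u**2)

# Sanity check against the closed-form marginal: if p(beta) ∝ exp(-|beta|^{1/2}),
# then s = |beta|^{1/2} has density ∝ s * exp(-s), i.e. s ~ Gamma(2, 1),
# so its mean should be close to 2.
s = np.sqrt(np.abs(beta))
print(round(s.mean(), 2))
```

In a full Gibbs sampler of the kind the abstract describes, the conditional uniform draw would additionally be truncated by the empirical-likelihood posterior for beta; the snippet above only verifies that the mixture reproduces the intended marginal prior.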