A Bayesian Lasso via reversible-jump MCMC

Xiaohui Chen, Z. Jane Wang, Martin J. McKeown
DOI: 10.1016/j.sigpro.2011.02.014

Variable selection is a topic of great importance in high-dimensional statistical modeling and has a wide range of real-world applications. Many variable selection techniques have been proposed in the context of linear regression, and the Lasso model is probably one of the most popular penalized regression techniques. In this paper, we propose a new, fully hierarchical, Bayesian version of the Lasso model by employing flexible sparsity promoting priors. To obtain the Bayesian Lasso estimate, a reversible-jump MCMC algorithm is developed for joint posterior inference over both discrete and continuous parameter spaces. Simulations demonstrate that the proposed RJ-MCMC-based Bayesian Lasso yields smaller estimation errors and more accurate sparsity pattern detection when compared with state-of-the-art optimization-based Lasso-type methods, a standard Gibbs sampler-based Bayesian Lasso and the Binomial–Gaussian prior model. To demonstrate the applicability and estimation stability of the proposed Bayesian Lasso, we examine a benchmark diabetes data set and real functional Magnetic Resonance Imaging data. As an extension of the proposed RJ-MCMC framework, we also develop an MCMC-based algorithm for the Binomial–Gaussian prior model and illustrate its improved performance over the non-Bayesian estimate via simulations.
Journal: Signal Processing, vol. 91, no. 8, pp. 1920-1932, 2011
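
To make the sampling scheme described in the abstract concrete, below is a minimal sketch of a generic reversible-jump MCMC sampler for variable selection in a Gaussian linear model with a Laplace (Lasso-type) prior on the active coefficients. It is not the authors' algorithm: the fixed noise variance sigma2, the Gaussian birth proposal with scale tau, the Bernoulli inclusion probability w, and the names rj_lasso, log_lik, and log_laplace are all illustrative assumptions for this sketch.

import numpy as np

rng = np.random.default_rng(0)

def log_lik(y, X, beta, sigma2):
    # Gaussian log-likelihood of y given mean X @ beta and noise variance sigma2.
    r = y - X @ beta
    return -0.5 * len(y) * np.log(2 * np.pi * sigma2) - 0.5 * (r @ r) / sigma2

def log_laplace(b, lam):
    # Log density of the Laplace(0, 1/lam) prior placed on each active coefficient.
    return np.log(lam / 2.0) - lam * abs(b)

def log_normal(b, tau):
    # Log density of the N(0, tau^2) birth proposal.
    return -0.5 * np.log(2 * np.pi * tau ** 2) - 0.5 * (b / tau) ** 2

def rj_lasso(y, X, n_iter=5000, lam=1.0, w=0.5, tau=1.0, sigma2=1.0, step=0.2):
    # Reversible-jump sampler: birth/death moves change the active set, and a
    # random-walk move updates one active coefficient within the current model.
    # sigma2 is treated as known here for simplicity (an assumption of this sketch).
    n, p = X.shape
    beta = np.zeros(p)                 # current coefficients (0.0 = excluded)
    active = np.zeros(p, dtype=bool)   # inclusion indicators
    draws = np.zeros((n_iter, p))
    for t in range(n_iter):
        j = rng.integers(p)            # coordinate chosen uniformly; cancels in the ratio
        if not active[j]:              # birth move: propose a new coefficient for j
            b_new = rng.normal(0.0, tau)
            prop = beta.copy()
            prop[j] = b_new
            log_alpha = (log_lik(y, X, prop, sigma2) - log_lik(y, X, beta, sigma2)
                         + log_laplace(b_new, lam) + np.log(w / (1.0 - w))
                         - log_normal(b_new, tau))
            if np.log(rng.random()) < log_alpha:
                beta, active[j] = prop, True
        else:                          # death move: set coefficient j back to zero
            b_old = beta[j]
            prop = beta.copy()
            prop[j] = 0.0
            log_alpha = (log_lik(y, X, prop, sigma2) - log_lik(y, X, beta, sigma2)
                         - log_laplace(b_old, lam) + np.log((1.0 - w) / w)
                         + log_normal(b_old, tau))
            if np.log(rng.random()) < log_alpha:
                beta, active[j] = prop, False
        idx = np.flatnonzero(active)
        if idx.size:                   # within-model random-walk update of one active coefficient
            k = rng.choice(idx)
            prop = beta.copy()
            prop[k] = beta[k] + rng.normal(0.0, step)
            log_alpha = (log_lik(y, X, prop, sigma2) - log_lik(y, X, beta, sigma2)
                         + log_laplace(prop[k], lam) - log_laplace(beta[k], lam))
            if np.log(rng.random()) < log_alpha:
                beta = prop
        draws[t] = beta
    return draws

# Toy usage: sparse ground truth, then posterior inclusion frequencies from the draws.
X = rng.normal(size=(100, 10))
beta_true = np.zeros(10)
beta_true[0], beta_true[3] = 2.0, -1.5
y = X @ beta_true + rng.normal(scale=1.0, size=100)
draws = rj_lasso(y, X)
print("inclusion frequency per predictor:", (draws[1000:] != 0).mean(axis=0).round(2))

The birth and death moves jump between models of different dimension (the discrete part of the parameter space), while the random-walk step updates coefficients within the current model (the continuous part); inclusion frequencies computed from the retained draws give an estimate of the sparsity pattern.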