Open Access
September 2010
Bayesian regularized quantile regression
Qing Li, Nan Lin, Ruibin Xi
Bayesian Anal. 5(3): 533-556 (September 2010). DOI: 10.1214/10-BA521

Abstract

Regularization, e.g. the lasso, has been shown to be effective at improving prediction accuracy in quantile regression (Li and Zhu, 2008; Wu and Liu, 2009). This paper studies regularization in quantile regression from a Bayesian perspective. By proposing a hierarchical model framework, we give a generic treatment to a set of regularization approaches, including the lasso, group lasso and elastic net penalties. Gibbs samplers are derived for all cases. This is the first work to discuss regularized quantile regression with the group lasso penalty and the elastic net penalty. Both simulated and real data examples show that Bayesian regularized quantile regression methods often outperform both quantile regression without regularization and their non-Bayesian regularized counterparts.
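As a point of orientation for the penalties named above, the short Python sketch below (not taken from the paper; the data and parameter values are hypothetical) writes out the lasso-penalized quantile regression objective minimized by the non-Bayesian methods cited in the abstract. In the Bayesian framework, the check-loss term is typically recast through an asymmetric Laplace working likelihood and the penalty through a corresponding prior on the coefficients, which is what makes Gibbs sampling possible.

import numpy as np

def check_loss(u, tau):
    # Quantile check loss: rho_tau(u) = u * (tau - 1{u < 0})
    return u * (tau - (u < 0).astype(float))

def lasso_qr_objective(beta, X, y, tau, lam):
    # Lasso-penalized quantile regression objective (cf. Li and Zhu, 2008):
    #   sum_i rho_tau(y_i - x_i' beta) + lam * sum_j |beta_j|
    resid = y - X @ beta
    return check_loss(resid, tau).sum() + lam * np.abs(beta).sum()

# Hypothetical simulated data, only to illustrate the call signature.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + rng.normal(size=100)
print(lasso_qr_objective(np.zeros(5), X, y, tau=0.5, lam=1.0))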

Citation


Qing Li, Nan Lin, Ruibin Xi. "Bayesian regularized quantile regression." Bayesian Anal. 5(3): 533-556, September 2010. https://doi.org/10.1214/10-BA521

Information

Published: September 2010
First available in Project Euclid: 22 June 2012

zbMATH: 1330.62143
MathSciNet: MR2719666
Digital Object Identifier: 10.1214/10-BA521

Keywords: Bayesian analysis, elastic net, Gibbs sampler, group lasso, lasso, quantile regression, regularization

Rights: Copyright © 2010 International Society for Bayesian Analysis
