[1]
R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 58(1): 267–288, (1996).
DOI: 10.1111/j.2517-6161.1996.tb02080.x
[2]
H. Zou and T. Hastie. Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67(2): 301–320, (2005).
DOI: 10.1111/j.1467-9868.2005.00503.x
[3]
H. Zou. The adaptive lasso and its oracle properties. Journal of the American Statistical Association, 101(476): 1418–1429, (2006).
DOI: 10.1198/016214506000000735
[4]
H. Zou and H.H. Zhang. On the adaptive elastic-net with a diverging number of parameters. The Annals of Statistics, 37(4): 1733–1751, (2009).
[5]
R. Tibshirani, M. Saunders, S. Rosset, J. Zhu, and K. Knight. Sparsity and smoothness via the fused lasso. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 67(1): 91–108, (2005).
DOI: 10.1111/j.1467-9868.2005.00490.x
[6]
Z.J. Daye and X.J. Jeng. Shrinkage and model selection with correlated variables via weighted fusion. Computational Statistics & Data Analysis, 53(4): 1284–1298, (2009).
DOI: 10.1016/j.csda.2008.11.007
[7]
B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani. Least angle regression. The Annals of Statistics, 32(2): 407–499, (2004).
DOI: 10.1214/009053604000000067
[8]
J. Fan and R. Li. Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association, 96(456): 1348–1360, (2001).
DOI: 10.1198/016214501753382273