Abstract
In this paper, we investigate the objective function and the deflation process for sparse Partial Least Squares (PLS) regression with multiple components. While many variations on the sparse PLS objective have been proposed, the deflation process has received far less attention. Our work highlights a flaw in the Statistically Inspired Modification of Partial Least Squares (SIMPLS) deflation method when it is applied in sparse PLS regression, and we also examine the Nonlinear Iterative Partial Least Squares (NIPALS) deflation in this setting. To remedy the flaw in SIMPLS, we propose a new sparse PLS method in which the direction vectors are constrained to be sparse and to lie in a chosen subspace. We give insight into this procedure and show, through examples and simulation studies, that it can outperform alternative sparse PLS techniques in coefficient estimation. Moreover, our analysis reveals a simple renormalization step that improves the estimation of sparse PLS direction vectors produced by any convex relaxation method.
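The abstract does not spell out the algorithmic details, but a minimal sketch may help fix ideas. The code below is an assumption-laden illustration of a generic sparse PLS with NIPALS-type deflation, in which each direction vector is obtained by soft-thresholding X'y (a lasso-style convex relaxation) and then rescaled to unit norm. The function name `sparse_pls_nipals`, the penalty parameter `lam`, and the toy data are hypothetical; the unit-norm rescaling shown is a common generic device, not the specific renormalization step derived in the paper.

```python
import numpy as np

def soft_threshold(v, lam):
    """Elementwise soft-thresholding, the proximal map of the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sparse_pls_nipals(X, y, n_components=2, lam=1.0):
    """Illustrative sparse PLS with NIPALS-style deflation (not the paper's estimator).

    Each direction vector comes from soft-thresholding X'y (lasso-style
    relaxation) followed by rescaling to unit length; X and y are then
    deflated with respect to the resulting score vector.
    """
    X = X - X.mean(axis=0)          # center predictors
    y = y - y.mean()                # center response
    Xk, yk = X.copy(), y.copy()
    W, T, P = [], [], []
    for _ in range(n_components):
        w = soft_threshold(Xk.T @ yk, lam)   # sparse direction via convex relaxation
        if np.allclose(w, 0.0):              # penalty removed every variable
            break
        w /= np.linalg.norm(w)               # rescale to unit norm
        t = Xk @ w                           # score vector
        p = Xk.T @ t / (t @ t)               # X loadings
        Xk = Xk - np.outer(t, p)             # NIPALS deflation of X
        yk = yk - t * (t @ yk) / (t @ t)     # deflate the response
        W.append(w); T.append(t); P.append(p)
    return np.column_stack(W), np.column_stack(T), np.column_stack(P)

# Toy usage: 100 observations, 20 predictors, only the first 3 relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
y = X[:, :3] @ np.array([2.0, -1.0, 1.5]) + 0.1 * rng.standard_normal(100)
W, T, P = sparse_pls_nipals(X, y, n_components=2, lam=50.0)
print(W.shape, T.shape)
```

With a sufficiently large `lam`, most entries of each direction vector are exactly zero, which is the behaviour the sparse PLS literature targets; the deflation shown here is the NIPALS variant discussed in the paper, whereas the SIMPLS variant would instead project subsequent directions to be orthogonal to earlier loadings.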
| Original language | English |
| --- | --- |
| Pages (from-to) | 1005-1019 |
| Number of pages | 15 |
| Journal | Journal of Statistical Computation and Simulation |
| Volume | 89 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 2019 |
| Externally published | Yes |
Keywords
- Deflation
- Lasso
- Partial Least Squares
- Penalized matrix decomposition