On partial least squares estimation in scalar-on-function regression models

Semanur Saricam, Ufuk Beyaztas*, Baris Asikgil, Han Lin Shang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Scalar-on-function regression, where the response is scalar-valued and the predictors are random functions, is one of the most important tools for exploring the relationship between a scalar response and one or more functional predictors. The functional partial least squares method improves the estimation accuracy of the regression coefficient function compared with other existing methods, such as least squares, maximum likelihood, and maximum penalized likelihood. Functional partial least squares is often based on the SIMPLS or NIPALS algorithm, but these algorithms can be computationally slow when analyzing large datasets. In this study, we propose two modified functional partial least squares methods to estimate the regression coefficient function efficiently in the scalar-on-function regression model. In the proposed methods, the infinite-dimensional functional predictors are first projected onto a finite-dimensional space using a basis expansion. Two partial least squares algorithms, based on re-orthogonalization of the score and loading vectors, are then used to estimate the linear relationship between the scalar response and the basis coefficients of the functional predictors. The finite-sample performance and computing speed of the proposed methods are evaluated through a series of Monte Carlo simulation studies and a sugar process dataset.
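To make the setting concrete, the scalar-on-function model relates a scalar response y_i to a functional predictor X_i(t) through y_i = ∫ X_i(t) β(t) dt + ε_i, where β(t) is the coefficient function to be estimated. The sketch below illustrates the two-step strategy described in the abstract on simulated data: each curve is projected onto a finite basis, and partial least squares then links the response to the basis coefficients. The orthonormal Fourier basis, all variable names, and the use of scikit-learn's NIPALS-based PLSRegression are illustrative assumptions; the paper's own re-orthogonalized algorithms based on bidiagonalization (Bidiag1/Bidiag2) are not reproduced here.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Simulated stand-in data: n curves observed on a common grid of m points.
    rng = np.random.default_rng(1)
    n, m, K = 200, 101, 7          # sample size, grid length, number of basis functions
    t = np.linspace(0.0, 1.0, m)

    def fourier_basis(t, K):
        """Evaluate the first K orthonormal Fourier basis functions on [0, 1]."""
        cols = [np.ones_like(t)]
        j = 1
        while len(cols) < K:
            cols.append(np.sqrt(2.0) * np.sin(2.0 * np.pi * j * t))
            cols.append(np.sqrt(2.0) * np.cos(2.0 * np.pi * j * t))
            j += 1
        return np.column_stack(cols[:K])

    Phi = fourier_basis(t, K)                    # m x K basis evaluation matrix

    # Generate functional predictors X_i(t) and responses from a known beta(t).
    C_true = rng.normal(size=(n, K))             # true basis coefficients of the curves
    X = C_true @ Phi.T + 0.05 * rng.normal(size=(n, m))
    beta_true = np.sin(2.0 * np.pi * t)          # true coefficient function
    dt = t[1] - t[0]
    y = (X * beta_true).sum(axis=1) * dt + 0.1 * rng.normal(size=n)  # Riemann sum of the integral

    # Step 1: basis expansion -- project each observed curve onto the K basis
    # functions, reducing the infinite-dimensional predictor to K coefficients.
    C_hat = np.linalg.lstsq(Phi, X.T, rcond=None)[0].T   # n x K coefficient matrix

    # Step 2: partial least squares between the scalar response and the basis
    # coefficients (sklearn's PLS stands in for the paper's proposed algorithms).
    pls = PLSRegression(n_components=3).fit(C_hat, y)
    gamma = pls.coef_.ravel()                    # K regression coefficients

    # Recover the coefficient function: beta_hat(t) = sum_k gamma_k * phi_k(t).
    beta_hat = Phi @ gamma
    print("max |beta_hat - beta_true|:", np.abs(beta_hat - beta_true).max())

Swapping PLSRegression for an implementation of Bidiag1 or Bidiag2 would change only Step 2; the basis-projection step and the recovery of β(t) from the fitted coefficients stay the same.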
Original language: English
Article number: e3452
Pages (from-to): 1-16
Number of pages: 16
Journal: Journal of Chemometrics
Volume: 36
Issue number: 12
Early online date: 12 Oct 2022
DOIs
Publication status: Published - Dec 2022

Keywords

  • Bidiag1
  • Bidiag2
  • NIPALS
  • SIMPLS
  • bidiagonalization
