Penalized nonparametric scalar-on-function regression via principal coordinates

Philip T. Reiss*, David L. Miller, Pei Shien Wu, Wen Yu Hua

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This article introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. In the proposed method, which we call principal coordinate ridge regression, one regresses the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast optimal tuning parameter selection and for extensions to multiple functional predictors, exponential family-valued responses, and mixed-effects models. In an application to signature verification data, principal coordinate ridge regression, with dynamic time warping distance used to define the principal coordinates, is shown to outperform a functional generalized linear model. Supplementary materials for this article are available online.
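The core recipe described above (compute a distance matrix among the functional predictors, extract leading principal coordinates via classical multidimensional scaling, then fit a ridge-penalized regression on those coordinates) can be illustrated with a minimal sketch. This is not the authors' implementation, which builds on generalized additive modeling software; it is a hypothetical NumPy illustration that assumes curves observed on a common grid with L2 distance in place of dynamic time warping, and a fixed tuning parameter rather than optimal selection.

```python
# Hypothetical sketch of principal coordinate ridge regression (not the
# authors' GAM-based implementation): distance matrix -> classical MDS
# (principal coordinates) -> ridge regression on leading coordinates.
import numpy as np

def principal_coordinates(D, k):
    """Leading k principal coordinates (classical MDS) of a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]         # largest eigenvalues first
    vals, vecs = vals[order], vecs[:, order]
    # clip small negative eigenvalues (possible for non-Euclidean distances)
    return vecs * np.sqrt(np.maximum(vals, 0.0))

def ridge_fit(Z, y, lam):
    """Ridge-penalized least squares on coordinates Z, intercept unpenalized."""
    n, k = Z.shape
    X = np.column_stack([np.ones(n), Z])
    P = lam * np.eye(k + 1)
    P[0, 0] = 0.0                              # do not penalize the intercept
    return np.linalg.solve(X.T @ X + P, X.T @ y)

# Toy data: 30 curves on a 50-point grid, scalar response, L2 distance.
rng = np.random.default_rng(0)
curves = rng.standard_normal((30, 50))
y = curves[:, :10].mean(axis=1) + 0.1 * rng.standard_normal(30)
D = np.linalg.norm(curves[:, None, :] - curves[None, :, :], axis=2)
Z = principal_coordinates(D, k=5)
beta = ridge_fit(Z, y, lam=1.0)
yhat = np.column_stack([np.ones(30), Z]) @ beta
```

In the paper's application, the L2 distance here would be replaced by a dynamic time warping distance among the signature curves, and the ridge parameter would be tuned automatically rather than fixed.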

Original language: English
Pages (from-to): 569-587
Number of pages: 10
Journal: Journal of Computational and Graphical Statistics
Issue number: 3
Early online date: 2 Aug 2016
Publication status: Published - 2017


  • Dynamic time warping
  • Functional regression
  • Generalized additive model
  • Kernel ridge regression
  • Multidimensional scaling


