
Regression splines: a review
David Ruppert
Cornell University

Nonparametric estimation, also called smoothing, has been studied for many decades. Although smoothing is now applied to a wide range of problems, many statisticians are still not sufficiently familiar with smoothing techniques to use them with confidence. Regression splines are an extension of classical linear statistical models, which makes them relatively easy for statisticians to understand and to use.

A pth-degree univariate spline is a function whose derivatives up to order p-1 are continuous everywhere and whose pth derivative is continuous everywhere except for jumps at a finite set of ``knots.'' Given a set of knots, the collection of all splines with those knots is a vector space. Therefore, splines can be used as a linear statistical model, and in this capacity they are often called regression splines. Most work on regression splines has addressed knot selection, typically using variable subset selection techniques, with the assumption that after the knots are chosen the coefficients will be estimated by least squares.
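As an illustration, the following sketch (not taken from the talk; the basis, knots, and simulated data are illustrative assumptions) builds a pth-degree truncated power basis with a small set of fixed knots and estimates the coefficients by ordinary least squares, exactly as in multiple linear regression.

    # Minimal sketch: regression spline fit by least squares on a
    # truncated power basis of degree p with fixed knots.
    import numpy as np

    def truncated_power_basis(x, knots, p=3):
        """Columns: 1, x, ..., x^p, (x - knot_1)_+^p, ..., (x - knot_K)_+^p."""
        cols = [x**j for j in range(p + 1)]
        cols += [np.maximum(x - k, 0.0)**p for k in knots]
        return np.column_stack(cols)

    # Simulated data for illustration; in practice x, y come from the application.
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0, 1, 200))
    y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

    knots = np.linspace(0.1, 0.9, 5)               # a small, fixed set of knots
    X = truncated_power_basis(x, knots, p=3)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least squares
    fitted = X @ beta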

An idea that has been neglected until recently is to bypass knot selection by using a large set of fixed knots and a penalized estimator.
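For comparison, this sketch (again an illustrative assumption, not the speaker's code) reuses the truncated power basis and data above but takes deliberately many knots and applies a ridge-type penalty to the knot coefficients, so that no knot selection is needed.

    # Minimal sketch of the penalty approach: many fixed knots plus a
    # ridge-type penalty on the knot coefficients (penalized least squares).
    # Assumes truncated_power_basis, x, y from the previous sketch.
    import numpy as np

    p = 3
    knots = np.linspace(0.02, 0.98, 30)     # deliberately many knots
    X = truncated_power_basis(x, knots, p=p)

    lam = 1e-3                              # smoothing parameter; in practice
                                            # chosen by CV, GCV, or similar
    D = np.diag([0.0] * (p + 1) + [1.0] * len(knots))   # penalize only knot terms
    beta = np.linalg.solve(X.T @ X + lam * D, X.T @ y)  # penalized least squares
    fitted = X @ beta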

The talk will review both the knot selection and penalty function approaches and discuss extensions to multivariate splines. No knowledge of splines or of smoothing is assumed; only familiarity with multiple linear regression and the analysis of factorial designs is required.



 