# fregre.pc {fda.usc}

Functional Regression with scalar response using Principal Components Analysis.
Package: fda.usc
Version: 1.2.3

### Description

Computes functional (ridge or penalized) regression between functional explanatory variable X(t) and scalar response Y using Principal Components Analysis.

Y = <X, β> + ε

where <.,.> denotes the inner product on L_2 and ε are random errors with mean zero, finite variance σ^2 and E[X(t)ε] = 0.

### Usage

```
fregre.pc(fdataobj, y, l = NULL, lambda = 0, P = c(1, 0, 0),
  weights = rep(1, len = n), ...)
```

### Arguments

fdataobj
`fdata` class object or `fdata.comp` class object created
by `create.pc.basis` function.
y
Scalar response with length `n`.
l
Index of components to include in the model. If `l` is `NULL` (the default), `l = 1:3` is used.
lambda
Amount of penalization. Default value is 0, i.e. no penalization is used.
P
If `P` is a vector, its entries are the coefficients that define the penalty matrix (see `P.penalty`). If `P` is a matrix, it is used directly as the penalty matrix.
weights
Optional weights for the observations; defaults to `rep(1, len = n)`, i.e. equal weight for every observation.
...
Further arguments passed to or from other methods.

### Details

The function computes the orthonormal basis of functional principal components ν_1, ..., ν_∞, representing the functional data as X(t) = ∑_{k=1}^∞ γ_k ν_k(t) and the functional parameter as β(t) = ∑_{k=1}^∞ β_k ν_k(t), where γ_k = <X, ν_k> and β_k = <β, ν_k>. In practice the sums are truncated at the components selected by `l`.
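This expansion can be illustrated with a minimal base-R sketch (not the package's internal code); the grid weighting of the L_2 inner product is ignored for simplicity, and all object names are illustrative:

```r
# Illustrative sketch: eigenfunctions nu_k and scores gamma_k from a sample
# of discretized curves X (n curves x m grid points), via the SVD of the
# centered data matrix. Grid weights of the L_2 inner product are omitted.
set.seed(1)
n <- 50; m <- 101
tt <- seq(0, 1, length.out = m)
X  <- outer(rnorm(n), sin(2 * pi * tt)) + outer(rnorm(n), cos(2 * pi * tt))
Xc <- sweep(X, 2, colMeans(X))     # center the sample of curves
sv <- svd(Xc)
nu    <- sv$v[, 1:2]               # first two eigenfunctions (orthonormal columns)
gamma <- Xc %*% nu                 # scores gamma_k = <X_i - mean, nu_k>
# This rank-2 sample is reproduced exactly by its first two components:
max(abs(Xc - gamma %*% t(nu)))     # numerically zero
```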

The response can be fitted by (writing ν for the n × k matrix of scores and v for the matrix of eigenfunctions evaluated on `argvals`):

• λ = 0, no penalization: y.est = ν(ν'ν)^{-1}ν'y
• Ridge regression, λ > 0 and P = 1: y.est = ν(ν'ν + λI)^{-1}ν'y
• Penalized regression, λ > 0 and P ≠ 0: for example, P = c(0,0,1) penalizes the second derivative (curvature) through `P = P.penalty(fdataobj["argvals"], P)`, giving y.est = ν(ν'ν + λv'Pv)^{-1}ν'y
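The fits above reduce to ordinary linear algebra on the matrix of PC scores. A minimal base-R sketch (not the package's internal code; all names are illustrative, and the ridge case penalizes every score equally):

```r
# Illustrative sketch of the fits above: Z plays the role of the n x k score
# matrix; replace diag(k) by an eigenfunction-based penalty matrix for the
# derivative-penalized case.
set.seed(2)
n <- 60; m <- 101
tt <- seq(0, 1, length.out = m)
X  <- outer(rnorm(n), sin(2 * pi * tt)) + outer(rnorm(n), cos(2 * pi * tt)) +
      outer(rnorm(n), sin(4 * pi * tt))
y  <- drop(X %*% sin(2 * pi * tt)) / m + rnorm(n, sd = 0.05)  # <X, beta> + noise
Xc <- sweep(X, 2, colMeans(X)); yc <- y - mean(y)             # center X and y
Z  <- Xc %*% svd(Xc)$v[, 1:3]          # scores on the first 3 components

fit <- function(lambda, P = diag(ncol(Z))) {
  H <- Z %*% solve(crossprod(Z) + lambda * P) %*% t(Z)  # hat matrix
  list(y.est = drop(H %*% yc), df = sum(diag(H)))       # df = trace of H
}
ols   <- fit(0)    # lambda = 0: y.est = Z (Z'Z)^{-1} Z' y, df equals k = 3
ridge <- fit(5)    # lambda > 0, P = I: ridge shrinkage, effective df < 3
```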

### Value

Returns an object of class `fregre.fd` with the following components:

call
The matched call of `fregre.pc` function.
coefficients
A named vector of coefficients.
residuals
Residuals, `y` minus `fitted.values`.
fitted.values
Estimated scalar response.
beta.est
Estimated beta parameter, of class `fdata`.
df
The residual degrees of freedom. In ridge regression, `df(rn)` is the effective degrees of freedom.
r2
Coefficient of determination.
sr2
Residual variance.
Vp
Estimated covariance matrix for the parameters.
H
Hat matrix.
l
Index of principal components selected.
lambda
Amount of shrinkage.
P
Penalty matrix.
fdata.comp
Fitted object returned by the `fdata2pc` function.
lm
`lm` object.
fdataobj
Functional explanatory data.
y
Scalar response.

### References

Cai TT, Hall P. 2006. Prediction in functional linear regression. Annals of Statistics 34: 2159-2179.

Cardot H, Ferraty F, Sarda P. 1999. Functional linear model. Statistics and Probability Letters 45: 11-22.

Hall P, Hosseini-Nasab M. 2006. On properties of functional principal components analysis. Journal of the Royal Statistical Society B 68: 109-126.

Febrero-Bande, M., Oviedo de la Fuente, M. (2012). Statistical Computing in Functional Data Analysis: The R Package fda.usc. Journal of Statistical Software, 51(4), 1-28. http://www.jstatsoft.org/v51/i04/

N. Kraemer, A.-L. Boulesteix, and G. Tutz (2008). Penalized Partial Least Squares with Applications to B-Spline Transformations and Functional Data. Chemometrics and Intelligent Laboratory Systems, 94, 60-69. http://dx.doi.org/10.1016/j.chemolab.2008.06.009

See Also: `fregre.pc.cv`, `summary.fregre.fd` and `predict.fregre.fd`.
Alternative methods: `fregre.basis` and `fregre.np`.

### Examples

```
## Not run:
data(tecator)
absorp <- tecator$absorp.fdata
ind <- 1:129
x <- absorp[ind, ]
y <- tecator$y$Fat[ind]
res <- fregre.pc(x, y)
summary(res)
res2 <- fregre.pc(x, y, l = c(1, 3, 4))
summary(res2)
# Functional Ridge Regression
res3 <- fregre.pc(x, y, l = c(1, 3, 4), lambda = 1, P = 1)
summary(res3)
# Functional Regression with 2nd derivative penalization
res4 <- fregre.pc(x, y, l = c(1, 3, 4), lambda = 1, P = c(0, 0, 1))
summary(res4)
betas <- c(res$beta.est, res2$beta.est, res3$beta.est, res4$beta.est)
plot(betas)
## End(Not run)
```

### Author(s)

Manuel Febrero-Bande and Manuel Oviedo de la Fuente <manuel.oviedo@usc.es>

Documentation reproduced from package fda.usc, version 1.2.3. License: GPL-2