
Residuals of regression models with ARIMA noises

Description of the model

We consider the following regression model with stationary ARMA noise

$$y = X\beta + \mu$$

with $\mu \sim N(0, \sigma^2\Omega)$, $y$ being $n \times 1$ and $X$ being $n \times n_X$.

The GLS estimator of $\beta$ is defined by

$$\hat{\beta} = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y$$
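As a minimal numerical sketch of this formula (the AR(1) covariance $\Omega$, the regressors and the sample size are illustrative assumptions, not part of the model above):

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_x = 100, 2

# Illustrative AR(1) covariance: Omega[i, j] = rho ** |i - j|
rho = 0.6
Omega = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

# Hypothetical design: an intercept and a linear trend
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.multivariate_normal(np.zeros(n), Omega)

# GLS: beta_hat = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
Oinv_X = np.linalg.solve(Omega, X)
beta_hat = np.linalg.solve(X.T @ Oinv_X, Oinv_X.T @ y)
```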

An alternative solution consists in applying the inverse of the Cholesky factor of the covariance matrix to the model and then applying OLS.

More exactly, we suppose that $\Omega = LL'$. Then, we consider the regression model

$$L^{-1}y = L^{-1}X\beta + L^{-1}\mu$$

or

$$y_l = X_l\beta + \epsilon$$

with $\epsilon \sim N(0, \sigma^2 I)$.

We define the raw residuals as $r_{raw} = y - X\hat{\beta}$.

We define the full residuals as $r_{full} = y_l - X_l\hat{\beta} = L^{-1}r_{raw}$.
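Continuing the same sketch, the whitening step and both sets of residuals can be checked numerically (reusing `Omega`, `X`, `y` and `beta_hat` from the previous block):

```python
# Cholesky factor of the covariance: Omega = L L'
L = np.linalg.cholesky(Omega)

# Whitened model: y_l = X_l beta + eps, with eps ~ N(0, sigma^2 I)
y_l = np.linalg.solve(L, y)
X_l = np.linalg.solve(L, X)

# OLS on the whitened model reproduces the GLS estimator
beta_ols, *_ = np.linalg.lstsq(X_l, y_l, rcond=None)
assert np.allclose(beta_ols, beta_hat)

r_raw = y - X @ beta_hat             # raw residuals
r_full = np.linalg.solve(L, r_raw)   # full residuals, L^{-1} r_raw
assert np.allclose(r_full, y_l - X_l @ beta_hat)
```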

The least squares problem $\min_\beta \|y_l - X_l\beta\|$ is usually solved by means of the QR algorithm.

More exactly, we can find an orthogonal matrix $Q$ and an upper triangular matrix $R$ such that $X_l = QR$.

We then have:

$$\|y_l - X_l\beta\| = \|Q'y_l - R\beta\|$$

which is minimized for

$$\hat{\beta} = R_I^{-1}(Q'y_l)_I$$

We can then consider the QR residuals, defined by $r_{QR} = (Q'y_l)_{II}$. The subscript $I$ refers to the first $n_X$ rows and $II$ to the remaining rows.

$r_{full}$ is $n \times 1$ and $r_{QR}$ is $(n - n_X) \times 1$. We have of course that $\sum_i r_{full,i}^2 = \sum_i r_{QR,i}^2$.
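A sketch of the QR route on the whitened model, reusing `X_l`, `y_l`, `r_full` and `beta_hat` from the blocks above; the last assertion illustrates the equality of the sums of squares:

```python
# Complete QR decomposition of the whitened regressors: X_l = Q R
Q, R = np.linalg.qr(X_l, mode="complete")   # Q: n x n, R: n x n_x
Qty = Q.T @ y_l

# Block I = first n_x rows, block II = remaining n - n_x rows
beta_qr = np.linalg.solve(R[:n_x], Qty[:n_x])
r_qr = Qty[n_x:]

assert np.allclose(beta_qr, beta_hat)
# ||r_full||^2 == ||r_qr||^2
assert np.allclose(np.sum(r_full**2), np.sum(r_qr**2))
```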

Finally, we can consider the one-step-ahead forecast errors: $r_{KF,i} = y_i - E(y_i \mid y_1, \ldots, y_{i-1})$. Those residuals can be generated by the Kalman filter.
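As a final sketch, a scalar Kalman filter for the illustrative AR(1) noise produces these forecast errors; here $\beta$ is simply fixed at its GLS estimate and the AR(1) parameters are assumed known, whereas an actual implementation would handle the regression effects inside an augmented or diffuse filter. With $\beta$ fixed this way, the standardized forecast errors coincide with the full residuals.

```python
# One-step-ahead forecast errors of the AR(1) noise via a scalar
# Kalman filter; rho and the innovation variance are assumed known,
# beta is fixed at beta_hat (a simplification of a real implementation).
sig2_eta = 1.0 - rho**2      # so that the marginal noise variance is 1

a, P = 0.0, 1.0              # stationary initial state: N(0, 1)
r_kf = np.empty(n)
F = np.empty(n)
for i in range(n):
    # Forecast error: y_i - E(y_i | i-1), with variance F_i
    r_kf[i] = y[i] - X[i] @ beta_hat - a
    F[i] = P
    # Measurement update: y_i reveals the noise exactly (no obs. noise)
    a, P = a + r_kf[i], 0.0
    # Time update for the AR(1) state
    a, P = rho * a, rho**2 * P + sig2_eta

# Standardized forecast errors coincide with the full residuals
assert np.allclose(r_kf / np.sqrt(F), r_full)
```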