Residuals of regression models with ARIMA noises
Description of the model
We consider the following regression model with stationary ARMA noises
$$y = X\beta + \mu \quad \text{with} \quad \mu \sim N(0, \sigma^2\Omega), \qquad y \sim n \times 1, \quad X \sim n \times n_X$$
The GLS estimator of $\beta$ is defined by
$$\hat\beta = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y$$

An alternative solution consists in applying the inverse of the Cholesky factor of the covariance matrix to the model and then applying OLS.
More exactly, we suppose that $\Omega = LL'$. Then, we consider the regression model
$$L^{-1}y = L^{-1}X\beta + L^{-1}\mu$$

or
$$y_l = X_l\beta + \epsilon \quad \text{with} \quad \epsilon \sim N(0, \sigma^2 I)$$
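As an illustration, here is a minimal sketch (not from the original text; the AR(1) coefficient, sample size and design matrix are arbitrary choices) checking numerically that the direct GLS formula and the Cholesky-plus-OLS route yield the same estimator for a regression with AR(1) noise.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_x, phi = 200, 2, 0.6

# Design matrix X (constant + trend) and true coefficients (illustrative values)
X = np.column_stack([np.ones(n), np.arange(n)])
beta_true = np.array([1.0, 0.05])

# Covariance of a stationary AR(1) noise with unit innovation variance:
# Omega[i, j] = phi**|i - j| / (1 - phi**2)
Omega = phi ** np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / (1 - phi**2)
mu = rng.multivariate_normal(np.zeros(n), Omega)
y = X @ beta_true + mu

# Direct GLS: beta_hat = (X' Omega^-1 X)^-1 X' Omega^-1 y
Omega_inv = np.linalg.inv(Omega)
beta_gls = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)

# Equivalent route: whiten with the Cholesky factor L (Omega = L L'), then OLS
L = np.linalg.cholesky(Omega)
y_l = np.linalg.solve(L, y)      # L^-1 y
X_l = np.linalg.solve(L, X)      # L^-1 X
beta_ols, *_ = np.linalg.lstsq(X_l, y_l, rcond=None)

print(np.allclose(beta_gls, beta_ols))   # True: both routes give the same estimator
```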
We define the raw residuals as $r_{raw} = y - X\hat\beta$.
We define the full residuals as $r_{full} = y_l - X_l\hat\beta = L^{-1}r_{raw}$.
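Continuing the sketch above, both sets of residuals can be computed directly; the full residuals are indeed the whitened raw residuals.

```python
# Raw residuals on the original scale, full residuals on the whitened scale
r_raw = y - X @ beta_gls
r_full = y_l - X_l @ beta_gls
print(np.allclose(r_full, np.linalg.solve(L, r_raw)))  # True: r_full = L^-1 r_raw
```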
The least squares problem $\min_\beta \lVert y_l - X_l\beta \rVert$ is usually solved by means of the QR algorithm.
More exactly, we can find an orthogonal matrix $Q$ and an upper triangular matrix $R$ such that $X_l = QR$.
We then have:
$$\lVert y_l - X_l\beta \rVert = \lVert Q'y_l - R\beta \rVert$$

which is minimized for
$$\hat\beta = R_I^{-1}(Q'y_l)_I$$

We can then consider the QR residuals defined by $r_{QR} = (Q'y_l)_{II}$, where $I$ refers to the first $n_X$ rows and $II$ to the next rows.
$r_{full} \sim n \times 1$ and $r_{QR} \sim (n - n_X) \times 1$. We have of course that $\sum_i r_{full,i}^2 = \sum_i r_{QR,i}^2$.
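Still using the example above, a sketch of the QR route: the resulting estimator coincides with the GLS one, and the sum of squares of the QR residuals equals that of the full residuals.

```python
# QR decomposition of the whitened regressors X_l; "complete" mode returns
# the full n x n orthogonal factor Q.
Q, R = np.linalg.qr(X_l, mode="complete")
z = Q.T @ y_l                                   # Q' y_l
beta_qr = np.linalg.solve(R[:n_x, :], z[:n_x])  # R_I^{-1} (Q' y_l)_I
r_qr = z[n_x:]                                  # r_QR = (Q' y_l)_II, length n - n_X

print(np.allclose(beta_qr, beta_gls))                  # same estimator
print(np.isclose(np.sum(r_full**2), np.sum(r_qr**2)))  # equal sums of squares
```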
Finally, we can consider the one-step-ahead forecast errors: $r_{KF,i} = y_i - E(y_i \mid y_1, \ldots, y_{i-1})$. These residuals can be generated by the Kalman filter.
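The following is a minimal, self-contained sketch of how such forecast errors can be produced for the AR(1) example above. It is not the routine of any particular library: the regression effect is removed using the estimated coefficients and treated as known, whereas a full treatment would handle the regression coefficients inside the filter (e.g. through a diffuse initialization).

```python
def ar1_kalman_errors(v, phi, sigma2=1.0):
    """One-step-ahead forecast errors v_i - E(v_i | v_1, ..., v_{i-1})
    for an AR(1) series observed without measurement noise."""
    a = 0.0                      # predicted state E(alpha_i | past observations)
    p = sigma2 / (1 - phi**2)    # predicted state variance (stationary start)
    errors = np.empty(len(v))
    for i, obs in enumerate(v):
        f = p                    # variance of the one-step-ahead forecast error
        errors[i] = obs - a      # forecast error v_i - E(v_i | past)
        k = p / f                # Kalman gain (equal to 1 here: obs = state, no noise)
        a_filt = a + k * errors[i]
        p_filt = p - k * p
        a = phi * a_filt         # predict the next state
        p = phi**2 * p_filt + sigma2
    return errors

# Regression effect removed with the estimated coefficients (a simplification)
r_kf = ar1_kalman_errors(y - X @ beta_gls, phi)
```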