Residuals of regression models with ARIMA noise

Description of the model

We consider the following regression model with stationary ARMA noise:

\[y = X \beta + u\]

with $u \sim N(0, \sigma^2\Omega)$, where $y$ is $n \times 1$ and $X$ is $n \times n_X$.

The GLS estimator of $\beta$ is defined by

\[\hat\beta= \left(X'\Omega^{-1}X\right)^{-1} X'\Omega^{-1}y\]
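The formula above can be sketched numerically. The snippet below is a minimal illustration, not a production implementation: it assumes, purely for concreteness, AR(1) noise with coefficient `phi`, for which $\Omega_{ij} = \varphi^{|i-j|}/(1-\varphi^2)$; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_X, phi = 50, 2, 0.6

# Regressors: constant and linear trend (illustrative choice)
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
beta_true = np.array([1.0, 0.5])

# AR(1) covariance matrix: Omega[i, j] = phi**|i-j| / (1 - phi**2)
idx = np.arange(n)
Omega = phi ** np.abs(idx[:, None] - idx[None, :]) / (1.0 - phi**2)

# Simulate y = X beta + u with u ~ N(0, Omega)
u = np.linalg.cholesky(Omega) @ rng.standard_normal(n)
y = X @ beta_true + u

# GLS: beta_hat = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
Oi = np.linalg.inv(Omega)
beta_hat = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)
```

In practice one would avoid forming $\Omega^{-1}$ explicitly; the Cholesky-based route described next does exactly that.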

An alternative solution consists of applying the inverse of the Cholesky factor of the covariance matrix to the model and then applying OLS.

More exactly, we suppose that $\Omega = LL'$. Then, we consider the regression model

\[L^{-1}y = L^{-1} X \beta + L^{-1}u\]

or

\[y_l = X_l \beta + \epsilon\]

with $\epsilon \sim N(0, \sigma^2 I)$.

We define the raw residuals as $r_{raw}=y-X \hat\beta$

We define the full residuals as $r_{full}=y_l-X_l \hat\beta= L^{-1}r_{raw}$
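The Cholesky route and the two kinds of residuals can be sketched as follows. As before, this is an illustrative example that assumes an AR(1) covariance; the identity $r_{full} = L^{-1} r_{raw}$ and the agreement with the direct GLS formula are checked numerically.

```python
import numpy as np

rng = np.random.default_rng(1)
n, phi = 40, 0.5
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])

# AR(1) covariance and its Cholesky factor: Omega = L L'
idx = np.arange(n)
Omega = phi ** np.abs(idx[:, None] - idx[None, :]) / (1.0 - phi**2)
L = np.linalg.cholesky(Omega)
y = X @ np.array([2.0, -0.3]) + L @ rng.standard_normal(n)

# Whitened model: L^{-1} y = L^{-1} X beta + eps, eps ~ N(0, sigma^2 I)
y_l = np.linalg.solve(L, y)
X_l = np.linalg.solve(L, X)
beta_hat, *_ = np.linalg.lstsq(X_l, y_l, rcond=None)

r_raw = y - X @ beta_hat        # raw residuals
r_full = y_l - X_l @ beta_hat   # full residuals = L^{-1} r_raw
```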

The least squares problem $\min_\beta \lVert y_l-X_l\beta \rVert$ is usually solved by means of the QR decomposition.

More exactly, we can find an orthogonal matrix $Q$ and an upper triangular matrix $R$ such that $X_l=QR$.

We have then:

\[\lVert y_l-X_l \beta\rVert=\lVert Q'y_l-R\beta \rVert\]

which is minimized for

\[\hat\beta=R_I^{-1}(Q'y_l)_I\]

We can then consider the QR residuals, defined by $r_{QR}=(Q'y_l)_{II}$, where $I$ refers to the first $n_X$ rows and $II$ to the remaining rows.

$r_{full}$ is $n \times 1$ and $r_{QR}$ is $(n-n_X) \times 1$. Since $Q$ is orthogonal, we have of course that $\sum_i{r_{full, i}^2}=\sum_i{r_{QR, i}^2}$.
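A minimal sketch of the QR residuals, under the assumption that the whitening step has already been applied (so `X_l` and `y_l` are simply drawn at random here for illustration); the equality of the sums of squares is checked numerically.

```python
import numpy as np

rng = np.random.default_rng(2)
n, n_X = 30, 3
X_l = rng.standard_normal((n, n_X))
y_l = X_l @ rng.standard_normal(n_X) + rng.standard_normal(n)

# Full QR decomposition: Q is n x n orthogonal, R is n x n_X upper triangular
Q, R = np.linalg.qr(X_l, mode="complete")
Qty = Q.T @ y_l

# beta_hat = R_I^{-1} (Q'y_l)_I, using the first n_X rows
beta_hat = np.linalg.solve(R[:n_X, :], Qty[:n_X])

# QR residuals: the remaining n - n_X rows of Q'y_l
r_QR = Qty[n_X:]
r_full = y_l - X_l @ beta_hat
```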

Finally, we can consider the one-step-ahead forecast errors: $r_{KF,i}=y_i-E(y_i\mid i-1)$. These residuals can be generated by the Kalman filter.
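For a simple special case the connection between the filter output and the residuals above can be made concrete. The sketch below assumes pure AR(1) noise without regressors, for which the one-step-ahead forecast errors reduce to $r_{KF,1}=y_1$ and $r_{KF,i}=y_i-\varphi y_{i-1}$; after standardization by the forecast error standard deviations they coincide with the Cholesky-whitened series $L^{-1}y$. All names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n, phi = 25, 0.6

# Simulate a stationary AR(1) series with unit innovation variance
y = np.zeros(n)
y[0] = rng.standard_normal() / np.sqrt(1.0 - phi**2)
for i in range(1, n):
    y[i] = phi * y[i - 1] + rng.standard_normal()

# Whitened residuals L^{-1} y from the AR(1) covariance
idx = np.arange(n)
Omega = phi ** np.abs(idx[:, None] - idx[None, :]) / (1.0 - phi**2)
L = np.linalg.cholesky(Omega)
r_full = np.linalg.solve(L, y)

# One-step-ahead forecast errors of the AR(1) model
r_KF = np.empty(n)
r_KF[0] = y[0]                 # no past information for the first point
r_KF[1:] = y[1:] - phi * y[:-1]

# One-step-ahead forecast error variances (diffuse start at the first point)
F = np.ones(n)
F[0] = 1.0 / (1.0 - phi**2)
```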