
Ljung-Box algorithm for pure MA models

Short description

We consider the following stationary moving average model of order $q$:

$$y_t=\Theta(B)\epsilon_t$$
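
With the sign convention implied by the matrix $A$ below, $\Theta(B)=1+\theta_1B+\dots+\theta_qB^q$, so that, written out, the model is

$$y_t=\epsilon_t+\theta_1\epsilon_{t-1}+\dots+\theta_q\epsilon_{t-q},\qquad \epsilon_t\sim N(0,\sigma^2).$$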

First, considering the initial innovations $\epsilon_t$ defined for $-q\le t<0$, collected in the vector $\epsilon^*$, and the corresponding pre-sample values collected in $y^*$, we can write

$$\begin{pmatrix}y^*\\ y\end{pmatrix}=A\begin{pmatrix}\epsilon^*\\ \epsilon\end{pmatrix}$$

where the $(n+q)\times(n+q)$ matrix $A$ is defined by

$$A=\begin{pmatrix}
1 & & & & & 0\\
\theta_1 & 1 & & & & \\
\theta_2 & \theta_1 & 1 & & & \\
\vdots & & \ddots & \ddots & & \\
\theta_q & \cdots & \theta_1 & 1 & & \\
 & \ddots & & \ddots & \ddots & \\
0 & & \theta_q & \cdots & \theta_1 & 1
\end{pmatrix}$$
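
As an illustration only (not the implementation of any particular package), the band matrix $A$ can be built numerically as follows, assuming numpy; the helper name build_A is hypothetical:

```python
import numpy as np

def build_A(theta, n):
    """(n+q) x (n+q) lower-triangular band matrix: ones on the diagonal,
    theta_1..theta_q on the first q sub-diagonals."""
    q = len(theta)
    m = n + q
    A = np.eye(m)
    for i, th in enumerate(theta, start=1):
        A += th * np.eye(m, k=-i)   # fill the i-th sub-diagonal with theta_i
    return A
```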

Using the $\pi$-weights of the model, we obtain $G=A^{-1}$:

$$G=\begin{pmatrix}
1 & & & & 0\\
\pi_1 & 1 & & & \\
\pi_2 & \pi_1 & \ddots & & \\
\vdots & & \ddots & \ddots & \\
\pi_{n+q-1} & \pi_{n+q-2} & \cdots & \pi_1 & 1
\end{pmatrix}$$
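
Since $G=A^{-1}$ is again lower-triangular Toeplitz, its sub-diagonals hold the $\pi$-weights, i.e. the coefficients of $1/\Theta(B)$, which satisfy $\pi_0=1$ and $\pi_j=-\sum_{i=1}^{\min(j,q)}\theta_i\pi_{j-i}$. A minimal sketch, again assuming numpy (pi_weights and build_G are hypothetical names):

```python
def pi_weights(theta, m):
    """First m coefficients of 1/Theta(B): pi_0 = 1, pi_j = -sum_i theta_i * pi_{j-i}."""
    q = len(theta)
    pi = np.zeros(m)
    pi[0] = 1.0
    for j in range(1, m):
        pi[j] = -sum(theta[i - 1] * pi[j - i] for i in range(1, min(j, q) + 1))
    return pi

def build_G(theta, n):
    """G = A^{-1}: lower-triangular Toeplitz matrix whose j-th sub-diagonal holds pi_j."""
    m = n + len(theta)
    pi = pi_weights(theta, m)
    G = np.zeros((m, m))
    for i in range(m):
        G[i, : i + 1] = pi[i::-1]   # row i = (pi_i, pi_{i-1}, ..., pi_1, 1)
    return G
```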

and we have that

$$\begin{pmatrix}\epsilon^*\\ \epsilon\end{pmatrix}=G\begin{pmatrix}y^*\\ y\end{pmatrix}=G_1y^*+G_2y$$

where $G_1$ contains the first $q$ columns of $G$ (those multiplying $y^*$) and $G_2$ the remaining $n$ columns.
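
In code, the partition is just a column split of the matrix produced by the build_G sketch above (variable names are illustrative):

```python
q = len(theta)                 # MA order
G = build_G(theta, n)          # n = number of observations
G1, G2 = G[:, :q], G[:, q:]    # columns multiplying y* and y, respectively
```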

Writing $(\epsilon^*, \epsilon)$ as $\bar{\epsilon}$ and $(y^*, y)$ as $\bar{y}$, we have that

$$p(\bar\epsilon)=p(\bar y)=p(y)\,p(y^*\mid y)$$

(the Jacobian of the transformation $\bar\epsilon=G\bar y$ is $1$, since $G$ is unit lower triangular), and

$$p(\bar\epsilon)=(2\pi\sigma^2)^{-(n+q)/2}\,e^{-\bar\epsilon'\bar\epsilon/2\sigma^2}=(2\pi\sigma^2)^{-(n+q)/2}\,e^{-\bar y'G'G\bar y/2\sigma^2}$$

Considering the equation $G_2y=-G_1y^*+\bar\epsilon$, by standard regression theory, we get:

$$\hat y^*=E(y^*\mid y)=-(G_1'G_1)^{-1}G_1'G_2y$$

$$\mathrm{var}(y^*\mid y)=(G_1'G_1)^{-1}\sigma^2$$
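
As a rough numerical sketch of this regression step (assuming numpy; presample_estimate is a hypothetical name), $\hat y^*$ can be obtained as the least-squares solution of $G_2y\approx-G_1y^*$:

```python
def presample_estimate(G1, G2, y, sigma2=1.0):
    """Estimate y* and its conditional covariance from G2 y = -G1 y* + eps."""
    # Least-squares fit of (-G1) y* to G2 y, i.e. y*_hat = -(G1'G1)^{-1} G1' G2 y.
    y_star_hat, *_ = np.linalg.lstsq(-G1, G2 @ y, rcond=None)
    cov = sigma2 * np.linalg.inv(G1.T @ G1)   # var(y* | y)
    return y_star_hat, cov
```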

We can now split $p(\bar\epsilon)$ into

$$p(y^*\mid y)=(2\pi\sigma^2)^{-q/2}\,|G_1'G_1|^{1/2}\,e^{-(y^*-\hat y^*)'G_1'G_1(y^*-\hat y^*)/2\sigma^2}$$

$$p(y)=(2\pi\sigma^2)^{-n/2}\,|G_1'G_1|^{-1/2}\,e^{-(G_1\hat y^*+G_2y)'(G_1\hat y^*+G_2y)/2\sigma^2}$$
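
Taking logs of the second factor gives the log-likelihood of the observed series:

$$\log p(y)=-\frac n2\log(2\pi\sigma^2)-\frac12\log|G_1'G_1|-\frac{(G_1\hat y^*+G_2y)'(G_1\hat y^*+G_2y)}{2\sigma^2}$$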

So, the different steps of the algorithm for computing the likelihood follow directly from the decomposition above; a sketch of the full computation is given below.

The processing defines the linear transformation $Tz_t=(\hat a^*\;\;\hat a_t)$ and the determinant required by the likelihood.
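
Putting the pieces together, the following minimal sketch (assuming numpy and reusing the hypothetical build_G and presample_estimate helpers above; it illustrates the decomposition, not the implementation of any particular package) evaluates the log-likelihood:

```python
def ma_loglikelihood(theta, y, sigma2):
    """Gaussian log-likelihood of a pure MA(q) model via the decomposition above."""
    n, q = len(y), len(theta)
    G = build_G(theta, n)                  # G = A^{-1}, built from the pi-weights
    G1, G2 = G[:, :q], G[:, q:]            # columns acting on y* and on y
    y_star_hat, _ = presample_estimate(G1, G2, y, sigma2)
    e = G1 @ y_star_hat + G2 @ y           # residuals entering p(y)
    _, logdet = np.linalg.slogdet(G1.T @ G1)
    return (-0.5 * n * np.log(2 * np.pi * sigma2)
            - 0.5 * logdet
            - 0.5 * (e @ e) / sigma2)

# Hypothetical usage: simulate an MA(2) series and evaluate its log-likelihood.
rng = np.random.default_rng(0)
theta = np.array([0.4, -0.3])
eps = rng.normal(size=203)
y = eps[2:] + theta[0] * eps[1:-1] + theta[1] * eps[:-2]
print(ma_loglikelihood(theta, y, sigma2=1.0))
```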