We consider the following stationary moving average model of order $q$ for a series of $n$ observations:
$$y_t = \Theta(B)\epsilon_t$$
where $B$ denotes the backshift operator.
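Written out, and assuming the sign convention $\Theta(B) = 1 + \theta_1 B + \cdots + \theta_q B^q$ (the one consistent with the matrix $A$ below), the model is
$$y_t = \epsilon_t + \theta_1\epsilon_{t-1} + \cdots + \theta_q\epsilon_{t-q},$$
with the innovations $\epsilon_t$ taken to be i.i.d. $N(0,\sigma^2)$, as the Gaussian likelihood derived below assumes.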
First, considering the initial innovations $\epsilon^*_t$ defined for $-q \le t < 0$, we can write
$$\begin{pmatrix} y^* \\ y \end{pmatrix} = A \begin{pmatrix} \epsilon^* \\ \epsilon \end{pmatrix}$$
where the $(n+q)\times(n+q)$ matrix $A$ is defined by
$$A = \begin{pmatrix}
1 & 0 & \cdots & \cdots & \cdots & \cdots & 0 \\
\theta_1 & 1 & 0 & \cdots & \cdots & \cdots & \vdots \\
\theta_2 & \theta_1 & 1 & 0 & \cdots & \cdots & \vdots \\
\vdots & \ddots & \ddots & \ddots & \ddots & \ddots & \vdots \\
\theta_q & \cdots & \theta_1 & 1 & 0 & \cdots & \vdots \\
\vdots & \ddots & \ddots & \ddots & \ddots & \ddots & 0 \\
0 & \cdots & 0 & \theta_q & \cdots & \theta_1 & 1
\end{pmatrix}$$
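As a concrete illustration, the band structure of $A$ is easy to set up numerically; in the sketch below, `build_A` is just an illustrative helper name, not something defined in the text:

```python
import numpy as np

def build_A(theta, n):
    """(n+q) x (n+q) matrix A: ones on the diagonal, theta_i on the
    i-th sub-diagonal (illustrative helper, not part of the text)."""
    q = len(theta)
    m = n + q
    A = np.eye(m)
    for i, th in enumerate(theta, start=1):
        A += th * np.eye(m, k=-i)
    return A

print(build_A([0.5, 0.2], n=3))   # a small MA(2) example with n = 3
```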
Using the $\pi$-weights of the model, we obtain $G = A^{-1}$:
$$G = \begin{pmatrix}
1 & 0 & \cdots & \cdots & 0 \\
\pi_1 & 1 & 0 & \cdots & 0 \\
\pi_2 & \pi_1 & 1 & \ddots & \vdots \\
\vdots & \vdots & \ddots & \ddots & 0 \\
\pi_{n+q-1} & \pi_{n+q-2} & \cdots & \pi_1 & 1
\end{pmatrix}$$
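Under the sign convention assumed above, the $\pi$-weights satisfy $\Theta(B)(1 + \pi_1 B + \pi_2 B^2 + \cdots) = 1$, i.e. $\pi_j = -(\theta_1\pi_{j-1} + \cdots + \theta_q\pi_{j-q})$ with $\pi_0 = 1$. The sketch below computes them this way and checks them against a column of $A^{-1}$:

```python
import numpy as np

def pi_weights(theta, m):
    """First m+1 pi-weights: pi_0 = 1, pi_j = -(theta_1 pi_{j-1} + ... + theta_q pi_{j-q})."""
    q = len(theta)
    pi = np.zeros(m + 1)
    pi[0] = 1.0
    for j in range(1, m + 1):
        pi[j] = -sum(theta[i - 1] * pi[j - i] for i in range(1, min(j, q) + 1))
    return pi

# G = A^{-1} carries these weights down its sub-diagonals; check the first column
theta, n = [0.5, 0.2], 4
q = len(theta)
A = np.eye(n + q)
for i, th in enumerate(theta, start=1):
    A += th * np.eye(n + q, k=-i)
G = np.linalg.inv(A)
print(np.allclose(G[:, 0], pi_weights(theta, n + q - 1)))   # expect: True
```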
Partitioning $G$ into $G_1$ (its first $q$ columns) and $G_2$ (its remaining $n$ columns), we have
$$\begin{pmatrix} \epsilon^* \\ \epsilon \end{pmatrix} = G \begin{pmatrix} y^* \\ y \end{pmatrix} = G_1 y^* + G_2 y$$
Writing $(\epsilon^*, \epsilon)$ as $\bar{\epsilon}$ and $(y^*, y)$ as $\bar{y}$, we have that
$$p(\bar{\epsilon}) = p(\bar{y}) = p(y)\,p(y^*\mid y)$$
since the Jacobian of the transformation is $|\det G| = 1$. Moreover,
$$p(\bar{\epsilon}) = (2\pi\sigma^2)^{-(n+q)/2} e^{-\bar{\epsilon}'\bar{\epsilon}/2\sigma^2} = (2\pi\sigma^2)^{-(n+q)/2} e^{-\bar{y}'G'G\bar{y}/2\sigma^2}$$
Considering the equation $G_2 y = -G_1 y^* + \bar{\epsilon}$ and applying standard regression theory, we get:
$$\hat{y}^* = E(y^*\mid y) = -(G_1'G_1)^{-1}G_1'G_2\,y$$
$$\mathrm{var}(y^*\mid y) = (G_1'G_1)^{-1}\sigma^2$$
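The split performed next is simply the completion of the square in $y^*$ behind these regression results: with $\hat{y}^*$ as above,
$$\bar{\epsilon}'\bar{\epsilon} = (G_1 y^* + G_2 y)'(G_1 y^* + G_2 y) = (y^* - \hat{y}^*)'G_1'G_1(y^* - \hat{y}^*) + (G_1\hat{y}^* + G_2 y)'(G_1\hat{y}^* + G_2 y),$$
where the second term no longer depends on $y^*$.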
We can now split $p(\bar{\epsilon})$ into
$$p(y^*\mid y) = (2\pi\sigma^2)^{-q/2}\,|G_1'G_1|^{1/2}\, e^{-(y^*-\hat{y}^*)'G_1'G_1(y^*-\hat{y}^*)/2\sigma^2}$$
$$p(y) = (2\pi\sigma^2)^{-n/2}\,|G_1'G_1|^{-1/2}\, e^{-(G_1\hat{y}^* + G_2 y)'(G_1\hat{y}^* + G_2 y)/2\sigma^2}$$
So, the different steps of the algorithm for computing the likelihood are:
1. Solve $V = G'$.
2. Compute, by Cholesky, the solution of $G'G\,\hat{z}^* = G'a$ and the determinant $|G'G|$.

The processing thus defines the linear transformation $Tz_t = (\hat{a}^* \;\; \hat{a}_t)$ and the required determinant.
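A minimal numpy sketch of the whole computation follows, written in the $G_1$, $G_2$, $\hat{y}^*$ notation of the derivation above rather than the shorthand of the step list. The function name `exact_ma_loglik` is illustrative, and dense `inv`/`solve` calls stand in for the banded Cholesky routines a real implementation would use:

```python
import numpy as np

def exact_ma_loglik(y, theta, sigma2):
    """Exact Gaussian log-likelihood of an MA(q) sample (illustrative sketch;
    a production version would exploit the band structure and Cholesky)."""
    y = np.asarray(y, dtype=float)
    n, q = len(y), len(theta)
    m = n + q

    # A: unit lower-triangular, theta_i on the i-th sub-diagonal
    A = np.eye(m)
    for i, th in enumerate(theta, start=1):
        A += th * np.eye(m, k=-i)

    # G = A^{-1}, partitioned into G1 (first q columns) and G2 (last n columns)
    G = np.linalg.inv(A)
    G1, G2 = G[:, :q], G[:, q:]

    # Residuals with y* = 0, then back-cast y* by least squares
    a = G2 @ y
    M = G1.T @ G1
    y_star_hat = -np.linalg.solve(M, G1.T @ a)

    # Unconditional sum of squares and log|G1'G1|
    e_hat = G1 @ y_star_hat + a
    S = e_hat @ e_hat
    _, logdet = np.linalg.slogdet(M)

    # log p(y) = -n/2 log(2*pi*sigma2) - 1/2 log|G1'G1| - S/(2*sigma2)
    return -0.5 * (n * np.log(2 * np.pi * sigma2) + logdet + S / sigma2)

# Consistency check against the Gaussian density of y implied by cov(y_bar) = sigma2 * A A'
rng = np.random.default_rng(1)
theta, sigma2, n = [0.6, -0.3], 1.3, 50
q = len(theta)
eps = rng.normal(scale=np.sqrt(sigma2), size=n + q)
y = eps[q:] + sum(th * eps[q - i:n + q - i] for i, th in enumerate(theta, 1))

A = np.eye(n + q)
for i, th in enumerate(theta, 1):
    A += th * np.eye(n + q, k=-i)
Sigma = sigma2 * (A @ A.T)[q:, q:]
_, logdet_S = np.linalg.slogdet(Sigma)
ref = -0.5 * (n * np.log(2 * np.pi) + logdet_S + y @ np.linalg.solve(Sigma, y))
print(np.isclose(exact_ma_loglik(y, theta, sigma2), ref))   # expect: True
```

The final check compares the result with the Gaussian log-density of $y$ whose covariance is the lower-right $n \times n$ block of $\sigma^2 A A'$, which the derivation above implies should match exactly.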