A process $\{X_t\}$ is said to be an ARMA(p,q) process if
- $\{X_t\}$ is stationary;
- $\forall t$: $X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} = a_t + \theta_1 a_{t-1} + \dots + \theta_q a_{t-q}$.

Using backward shift operator notation $B^h X_t = X_{t-h}$: $\Phi(B) X_t = (1 - \phi_1 B - \dots - \phi_p B^p) X_t = (1 + \theta_1 B + \dots + \theta_q B^q) a_t = \Theta(B) a_t$, where $a_t \sim \mathrm{NID}(0, \sigma^2)$.
$\{X_t\}$ is an ARMA(p,q) process with mean $\mu$ if $\{X_t - \mu\}$ is an ARMA(p,q) process.
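As a concrete illustration of the defining recursion, here is a minimal Python sketch that simulates an ARMA(1,1) path; the parameter values $\phi = 0.6$, $\theta = 0.3$, $\sigma = 1$ are hypothetical.

```python
import numpy as np

# Simulate X_t - phi*X_{t-1} = a_t + theta*a_{t-1} directly from the
# recursion, with hypothetical parameters phi = 0.6, theta = 0.3.
rng = np.random.default_rng(0)
n, phi, theta, sigma = 500, 0.6, 0.3, 1.0

a = rng.normal(0.0, sigma, size=n)      # a_t ~ NID(0, sigma^2)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + a[t] + theta * a[t - 1]

print(x[:5])
```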
Moving average model (MA(q))
MA(∞): If $\{a_t\} \sim \mathrm{NID}(0, \sigma^2)$, then we say that $\{X_t\}$ is an MA(∞) process of $\{a_t\}$ if there exists a sequence $\{\psi_j\}$ with $\sum_{j=0}^{\infty} |\psi_j| < \infty$ and $X_t = \sum_{j=0}^{\infty} \psi_j a_{t-j}$, where $t \in \mathbb{Z}$.
We can calculate the ACF of a stochastic process $\{X_t\}$ as long as $\{X_t\}$ can be written in the form of an MA(∞) process.
Also, having an MA(∞) representation is a necessary condition for $\{X_t\}$ to be stationary.
Theorem. The MA(∞) process is stationary with mean 0 and autocovariance function $\gamma(k) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+|k|}$.
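A quick numerical check of this autocovariance formula, using the AR(1) case $\psi_j = \phi^j$ (hypothetical $\phi$, truncated at a large $J$), where $\gamma(k)$ has the closed form $\sigma^2 \phi^{|k|} / (1 - \phi^2)$:

```python
import numpy as np

# Check gamma(k) = sigma^2 * sum_j psi_j psi_{j+|k|} on a truncated MA(inf).
# For an AR(1) with phi = 0.6, psi_j = phi^j, so gamma(k) should equal
# sigma^2 * phi^|k| / (1 - phi^2).  (Illustrative parameters.)
phi, sigma2, J = 0.6, 1.0, 200          # J = truncation length
psi = phi ** np.arange(J)

def gamma(k):
    k = abs(k)
    return sigma2 * np.sum(psi[: J - k] * psi[k:])

for k in range(4):
    closed_form = sigma2 * phi ** k / (1 - phi ** 2)
    print(k, gamma(k), closed_form)
```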
MA(q): $X_t = \sum_{i=0}^{q} \theta_i a_{t-i} = \Theta(B) a_t$, where $\theta_0 = 1$, $B$ is the backward shift operator ($B^h X_t = X_{t-h}$), and $a_t \sim \mathrm{NID}(0, \sigma^2)$.
AR(p): given $X_t = \left(\sum_{i=1}^{p} \phi_i X_{t-i}\right) + a_t$, the process is stationary if all $p$ roots of $\Phi(B) = 0$ lie outside the unit circle.
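A sketch of this root check with numpy for a hypothetical AR(2); note that numpy.roots expects coefficients ordered from the highest power down:

```python
import numpy as np

# Stationarity check for an AR(2): Phi(z) = 1 - phi1*z - phi2*z^2 = 0
# must have all roots strictly outside the unit circle.
# (Hypothetical coefficients.)
phi = np.array([0.5, 0.3])                   # phi1, phi2
poly = np.concatenate(([1.0], -phi))[::-1]   # numpy wants highest power first
roots = np.roots(poly)
print(roots, np.all(np.abs(roots) > 1.0))    # True -> stationary
```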
Yule-Walker equations. For the first $p$ autocorrelations:
$\rho(k) = \sum_{i=1}^{p} \phi_i \rho_{|k-i|}, \quad k = 1, \dots, p$
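Used in the forward direction, these equations give the first autocorrelations from the AR coefficients; a small sketch for a hypothetical AR(2):

```python
# Forward use of the Yule-Walker equations for an AR(2): given phi1, phi2
# (hypothetical values), solve for the first two autocorrelations from
#   rho1 = phi1 + phi2*rho1,   rho2 = phi1*rho1 + phi2.
phi1, phi2 = 0.5, 0.3
rho1 = phi1 / (1.0 - phi2)
rho2 = phi1 * rho1 + phi2
print(rho1, rho2)
```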
Partial Autocorrelation Function (PACF)
$\phi_{kk} = \mathrm{corr}(X_t, X_{t+k} \mid X_{t+1}, \dots, X_{t+k-1})$: the correlation between $X_t$ and $X_{t+k}$ after their mutual linear dependency on the intervening variables has been removed.
For a given lag $k$ and all $j \in \{1, 2, \dots, k\}$:
$\rho_j = \sum_{i=1}^{k} \phi_{ki} \rho_{j-i}$
We regard the ACFs as given, treat the $\phi_{ki}$ as regression parameters, and solve for $\phi_{kk}$; together these form the Yule-Walker equations.
Example. For lag 1: $\rho_1 = \phi_{11} \rho_0 \Rightarrow \rho_1 = \phi_{11}$ (since $\rho_0 = 1$).
For lag 2:
$\rho_1 = \phi_{21} + \phi_{22} \rho_1$
$\rho_2 = \phi_{21} \rho_1 + \phi_{22}$
$\Rightarrow \phi_{22} = \dfrac{\rho_2 - \rho_1^2}{1 - \rho_1^2}$
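The same lag-2 system can also be solved numerically; a sketch with illustrative autocorrelation values, checking the closed form for $\phi_{22}$:

```python
import numpy as np

# Solve the lag-2 Yule-Walker system for phi_21, phi_22 and compare
# phi_22 with the closed form (rho2 - rho1^2) / (1 - rho1^2).
# (Illustrative autocorrelations.)
rho1, rho2 = 0.5, 0.4
R = np.array([[1.0, rho1],
              [rho1, 1.0]])
phi_21, phi_22 = np.linalg.solve(R, np.array([rho1, rho2]))
print(phi_22, (rho2 - rho1 ** 2) / (1 - rho1 ** 2))   # should match
```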
Causal and invertible
Causal/stationary if $X_t$ can be expressed as an MA(∞) process.
Invertible if $X_t$ can be expressed as an AR(∞) process.
Duality between AR and MA processes
A finite-order stationary AR(p) process corresponds to an MA(∞) process, and a finite-order invertible MA(q) process corresponds to an AR(∞) process.
Example
Given the model $X_t - \phi_1 X_{t-1} - \phi_2 X_{t-2} = a_t + \theta a_{t-1}$.
Assume the process is causal. Then $X_t = \sum_{i=0}^{\infty} \psi_i a_{t-i} = \left(\sum_{i=0}^{\infty} \psi_i B^i\right) a_t = \Psi(B) a_t$ by causality, and $\Phi(B) X_t = \Theta(B) a_t \Rightarrow X_t = \dfrac{\Theta(B)}{\Phi(B)} a_t$ by the ARMA model, so $\Theta(B)/\Phi(B) = \Psi(B)$.
Substituting back into the model: $1 + \theta B = \left(\sum_{i=0}^{\infty} \psi_i B^i\right)\left(1 - \phi_1 B - \phi_2 B^2\right)$.
Assume the process is invertible. Then $a_t = \sum_{i=0}^{\infty} \pi_i X_{t-i} = \left(\sum_{i=0}^{\infty} \pi_i B^i\right) X_t = \Pi(B) X_t$, and similarly we get $\Phi(B) = \Theta(B) \Pi(B)$.
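Matching coefficients of $B^j$ in $\Theta(B) = \Phi(B)\Psi(B)$ yields a recursion for the $\psi$-weights; a sketch for this ARMA(2,1) example with hypothetical parameter values:

```python
import numpy as np

# psi-weights for the ARMA(2,1) example: matching coefficients in
# Theta(B) = Phi(B) * Psi(B) gives
#   psi_0 = 1,  psi_1 = phi1 + theta,
#   psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}   for j >= 2.
# (Hypothetical parameter values.)
phi1, phi2, theta, J = 0.5, 0.2, 0.4, 10
psi = np.zeros(J)
psi[0] = 1.0
psi[1] = phi1 + theta
for j in range(2, J):
    psi[j] = phi1 * psi[j - 1] + phi2 * psi[j - 2]
print(psi)
```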
Wold Decomposition
Any zero-mean process $\{X_t\}$ which is not deterministic can be expressed as a sum $X_t = U_t + V_t$, where $\{U_t\}$ is an MA(∞) process and $\{V_t\}$ is a deterministic process uncorrelated with $\{U_t\}$.
- A process is deterministic if the values $X_{n+j}$, $j \ge 1$, of the process $\{X_t\}$ are perfectly predictable in terms of $\mathcal{M}_n = \overline{\mathrm{sp}}\{X_t, t \le n\}$.
- If $X_n$ comes from a deterministic process, it can be predicted (or determined) from past observations of the process.
Model identification
| process | ACF | PACF |
| --- | --- | --- |
| AR(p) | tails off | cuts off after lag p |
| MA(q) | cuts off after lag q | tails off |
| ARMA(p,q) | tails off after lag (q−p) | tails off after lag (p−q) |
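A numerical illustration of these signatures: a sketch using statsmodels' acf and pacf on a simulated AR(2) with hypothetical coefficients, where the PACF should be near zero beyond lag 2 while the ACF decays gradually.

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

# Identification sketch: simulate an AR(2) (hypothetical coefficients) and
# check that the sample ACF tails off while the PACF cuts off after lag 2.
rng = np.random.default_rng(1)
n, phi1, phi2 = 2000, 0.5, 0.3
a = rng.normal(size=n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + a[t]

print(np.round(acf(x, nlags=5), 2))    # decays gradually
print(np.round(pacf(x, nlags=5), 2))   # near zero beyond lag 2
```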
Model Adequacy
The overall tests that check an entire group of residual autocorrelation functions are called portmanteau tests.
Box and Pierce: $Q = n \sum_{k=1}^{m} \hat{\rho}_k^2 \sim \chi^2_{m-(p+q)}$
Ljung and Box: $Q = n(n+2) \sum_{k=1}^{m} \dfrac{\hat{\rho}_k^2}{n-k} \sim \chi^2_{m-(p+q)}$
where $n$ is the number of observations, $m$ is the maximum lag, and $p$, $q$ are the orders of the fitted model.
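A sketch computing the Ljung-Box statistic directly from its definition; the residuals here are stand-in white noise, and $p = q = 1$ is an illustrative choice:

```python
import numpy as np
from scipy.stats import chi2

# Ljung-Box portmanteau test computed from its definition:
#   Q = n(n+2) * sum_{k=1}^m rhohat_k^2 / (n-k),  Q ~ chi^2_{m-(p+q)}.
def ljung_box(resid, m, p, q):
    n = len(resid)
    r = resid - resid.mean()
    c0 = np.sum(r ** 2) / n
    rho = np.array([np.sum(r[k:] * r[:-k]) / (n * c0) for k in range(1, m + 1)])
    Q = n * (n + 2) * np.sum(rho ** 2 / (n - np.arange(1, m + 1)))
    pval = chi2.sf(Q, df=m - (p + q))
    return Q, pval

resid = np.random.default_rng(2).normal(size=300)  # stand-in white-noise residuals
print(ljung_box(resid, m=10, p=1, q=1))
```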
Model selection
$\mathrm{AIC} = -2 \log(\text{maximized likelihood}) + 2k$
$\mathrm{BIC} = -2 \log(\text{maximized likelihood}) + k \log n$
where $k$ is the number of estimated parameters and $n$ the number of observations. BIC penalizes the number of parameters more heavily than AIC, since $k \log n > 2k$ once $n \ge 8$.
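Both criteria are simple functions of the maximized log-likelihood; a minimal sketch with illustrative numbers:

```python
import numpy as np

# AIC/BIC from a model's maximized log-likelihood (illustrative values):
# k = number of estimated parameters, n = number of observations.
def aic(loglik, k):
    return -2.0 * loglik + 2.0 * k

def bic(loglik, k, n):
    return -2.0 * loglik + k * np.log(n)

print(aic(-512.3, k=3), bic(-512.3, k=3, n=400))
```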
Example: Application of ARMA in Investment
Alternative assets modeling
$y_t$ and $r_t$ denote the observable appraisal returns and the latent economic returns, respectively.
Goal: infer the unobservable economic returns from the appraisal returns.
Geltner method (commercial real estate)
$y_t = \phi y_{t-1} + (1-\phi) r_t = \sum_{j=0}^{\infty} \phi^j (1-\phi) r_{t-j} = \sum_{j=0}^{\infty} w_j r_{t-j}$
(by recursively substituting $y_{t-1}$), where $\phi \in (0,1)$ and $w_j := \phi^j (1-\phi)$ are the weights.
Fit $y_t = \hat{\phi} y_{t-1} + \hat{a}_t$, then recover $\hat{r}_t = \dfrac{\hat{a}_t}{1 - \hat{\phi}}$, so that
$\mathrm{var}(\hat{r}_t) = \dfrac{\sigma^2}{(1 - \hat{\phi})^2}$
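A sketch of the whole unsmoothing pipeline on synthetic appraisal returns (all parameter values hypothetical): estimate $\hat{\phi}$ by regressing $y_t$ on $y_{t-1}$, then rescale the residuals.

```python
import numpy as np

# Geltner unsmoothing sketch: estimate phi by regressing y_t on y_{t-1}
# (no intercept), then recover r_hat_t = a_hat_t / (1 - phi_hat).
# Synthetic data generated from the smoothing model, for illustration only.
rng = np.random.default_rng(3)
phi_true, sigma_r, n = 0.7, 0.02, 500
r = rng.normal(0.0, sigma_r, size=n)       # latent economic returns
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + (1 - phi_true) * r[t]

phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)  # OLS slope
a_hat = y[1:] - phi_hat * y[:-1]
r_hat = a_hat / (1.0 - phi_hat)
print(phi_hat, r_hat.std(), sigma_r)       # r_hat.std() should be near sigma_r
```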
Getmansky, Lo, and Makarov
$y_t = \sum_{i=0}^{q} w_i r_{t-i}$
where $w_i \in (0,1)$ and $\sum_i w_i = 1$. Since $y_t$ is a linear combination of white noise terms, it follows an MA(q) process.
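A small numerical check of this smoothing effect with hypothetical weights: since $r_t$ is white noise, the smoothed series has variance $\left(\sum_i w_i^2\right)\mathrm{var}(r_t)$, which is smaller than $\mathrm{var}(r_t)$.

```python
import numpy as np

# Smoothing sketch in the spirit of Getmansky, Lo, and Makarov: observed
# returns are a weighted average of current and lagged true returns.
# The weights below are hypothetical.
rng = np.random.default_rng(4)
w = np.array([0.5, 0.3, 0.2])              # w_i in (0,1), sum to 1
r = rng.normal(0.0, 0.02, size=10000)      # latent true returns (white noise)
y = np.convolve(r, w, mode="valid")        # y_t = sum_i w_i r_{t-i}
print(y.var(), (w ** 2).sum() * r.var())   # the two should be close
```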