Autoregressive (AR) and Moving Average (MA) model
A process \(\{X_t\}\) is said to be an ARMA(p,q) process if
- \(\{X_t\}\) is stationary
- \(\forall t,\; X_t - \phi_1X_{t-1}-...-\phi_pX_{t-p} = a_t + \theta_1a_{t-1}+...+\theta_qa_{t-q}\)
using backward shift operator notation \(B^hX_t = X_{t-h}\):
\(\Phi(B)x_t = (1-\phi_1B - ... - \phi_p B^p)x_t = (1+\theta_1B + ...+\theta_qB^q)a_t = \Theta(B)a_t\) where \(a_t \sim NID(0, \sigma^2)\)
\(\{X_t\}\) is an ARMA(p,q) process with mean \(\mu\) if \(\{X_t-\mu\}\) is an ARMA(p,q) process.
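As a sketch, an ARMA(1,1) process can be simulated directly from the defining recursion; the parameters \(\phi=0.5\), \(\theta=0.3\), \(\sigma=1\) below are illustrative, not from the notes.

```python
import numpy as np

# Simulate ARMA(1,1): X_t - phi*X_{t-1} = a_t + theta*a_{t-1}
# (phi, theta, sigma are assumed for illustration)
rng = np.random.default_rng(0)
phi, theta, sigma = 0.5, 0.3, 1.0
n, burn = 5000, 500
a = rng.normal(0.0, sigma, n + burn)
x = np.zeros(n + burn)
for t in range(1, n + burn):
    x[t] = phi * x[t - 1] + a[t] + theta * a[t - 1]
x = x[burn:]  # drop burn-in so the retained series is approximately stationary
print(round(x.mean(), 2))  # sample mean should be near the process mean 0
```

Dropping a burn-in period removes the influence of the arbitrary zero initial condition.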
Moving average model (MA(q))
MA(\(\infty\)) If \(\{a_t\}\sim NID(0, \sigma^2)\), then we say that \(\{X_t\}\) is an MA(\(\infty\)) process of \(\{a_t\}\) if \(\exists\{\psi_j\}\) with \(\sum_{j=0}^\infty |\psi_j|<\infty\) and \(X_t = \sum_{j=0}^\infty \psi_j a_{t-j}\), \(t\in\mathbb{Z}\).
We can calculate the ACF of a stochastic process \(\{X_t\}\) as long as \(\{X_t\}\) can be written in the form of an MA(\(\infty\)) process.
Moreover, having an MA(\(\infty\)) representation is the standard route for establishing that \(\{X_t\}\) is stationary.
Theorem The MA(\(\infty\)) process is stationary with zero mean and autocovariance function \(\gamma(k) = \sigma^2 \sum_{j=0}^\infty \psi_j\psi_{j+|k|}\)
MA(q) \(X_t = \sum_{i=0}^q \theta_i a_{t-i} = \Theta(B)a_t\) \(\theta_0 = 1, B\) is the backward shift operator, \(B^hX_t = X_{t-h}\) and \(a_t\sim NID(0, \sigma^2)\)
Under the MA(q) model, the variance is \(\gamma(0) = var(X_t) = \sigma^2\sum_{i=0}^q \theta_i^2\).
Similarly, the autocovariance at lag \(k\) is \(\gamma(k) = \sigma^2\sum_{i=0}^{q-k}\theta_i\theta_{i+k}\) for \(0\leq k\leq q\), and \(\gamma(k) = 0\) for \(k>q\).
Then, the autocorrelation function (ACF) will be \(\rho(k) = \gamma(k)/\gamma(0) = \sum_{i=0}^{q-k}\theta_i\theta_{i+k}\big/\sum_{i=0}^q\theta_i^2\) for \(k\leq q\) and 0 otherwise, i.e. the ACF cuts off after lag \(q\).
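The cutoff of the MA(q) ACF after lag \(q\) can be checked numerically. A minimal sketch for an MA(2) with assumed coefficients \(\theta_1=0.6\), \(\theta_2=0.4\):

```python
import numpy as np

# Simulate MA(2) and compare sample ACF at lags 1..3 with theory.
# theta values are illustrative assumptions.
rng = np.random.default_rng(1)
theta = np.array([1.0, 0.6, 0.4])  # theta_0 = 1
a = rng.normal(size=20000 + 2)
x = theta[0] * a[2:] + theta[1] * a[1:-1] + theta[2] * a[:-2]

def acf(x, k):
    # sample autocorrelation at lag k
    x = x - x.mean()
    return (x[:-k] * x[k:]).sum() / (x * x).sum()

# Theory: rho(1) = (0.6 + 0.6*0.4)/1.52 ~ 0.55, rho(2) = 0.4/1.52 ~ 0.26,
# rho(3) = 0 since 3 > q = 2.
print([round(acf(x, k), 2) for k in (1, 2, 3)])
```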
Autoregressive model of order p (AR(p))
\(X_t - \phi_1X_{t-1}-...-\phi_pX_{t-p} = \Phi(B)X_t = a_t\)
where \(a_t\sim NID(0, \sigma^2), B^hX_t = X_{t-h}, h\in\mathbb{Z}, \Phi(B)=(1-\phi_1B-...-\phi_p B^p)\)
AR(1)
Notice that for an \(AR(1)\) process, \(a_t\sim NID(0, \sigma^2)\) and \(a_t\) is uncorrelated with all previous \(X_s,\ s<t\).
Iterating the recursion, \(X_t = a_t + \phi X_{t-1} = a_t + \phi a_{t-1} + \phi^2 X_{t-2} = \sum_{i=0}^\infty \phi^i a_{t-i}\), which is an \(MA(\infty)\) process provided \(|\phi|<1\).
An \(AR(1)\) process is causal, or future independent, when \(|\phi|< 1\).
Checking stationarity of AR(p)
\(\Phi(B) = 1-\phi_1B-...-\phi_pB^p=0\) must have all of its roots lying outside the unit circle.
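A minimal sketch of this root check using `numpy.roots` on the coefficients of \(\Phi(B)\); the example coefficient values are assumed for illustration.

```python
import numpy as np

# Stationarity check for AR(p): all roots of
# Phi(B) = 1 - phi_1*B - ... - phi_p*B^p must lie outside the unit circle.
def is_stationary(phis):
    # numpy.roots expects the highest-degree coefficient first,
    # i.e. coefficients of -phi_p*B^p - ... - phi_1*B + 1
    coeffs = [-p for p in phis[::-1]] + [1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.5]))        # AR(1) with |phi| < 1: root at B = 2
print(is_stationary([1.2]))        # AR(1) with |phi| > 1: root inside circle
print(is_stationary([0.5, 0.3]))   # AR(2): roots checked numerically
```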
ACF
AR(1) Case
For \(k\in\mathbb{Z}^+\), multiply both sides by \(X_{t-k}\)
Taking expectation, consider \(E(a_tX_{t-k})\)
\begin{align} cov(a_t, X_{t-k}) &= E(a_t X_{t-k})-E(a_t)E(X_{t-k})\\ &= E(a_t X_{t-k}) - 0\\ &= cov\left(a_t, \sum_{i=0}^\infty \phi^i a_{t-k-i}\right) = 0 \end{align}
since \(a_t\) is uncorrelated with all previous \(a\)'s.
Since \(cov(X_t,X_{t-k}) = E(X_tX_{t-k})-0 = \gamma(k)\), taking expectations of \(X_tX_{t-k} = \phi X_{t-1}X_{t-k} + a_tX_{t-k}\) gives \(\gamma(k) = \phi\,\gamma(k-1)\).
By induction, \(\gamma(k)=\phi^k\gamma(0)\), so the ACF is \(\rho(k) = \phi^k\).
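A quick numerical check of \(\rho(k)=\phi^k\) on a simulated AR(1); the value \(\phi=0.7\) is an assumption for illustration.

```python
import numpy as np

# Simulate AR(1) with assumed phi = 0.7 and compare sample ACF to phi^k.
rng = np.random.default_rng(2)
phi, n, burn = 0.7, 50000, 500
a = rng.normal(size=n + burn)
x = np.zeros(n + burn)
for t in range(1, n + burn):
    x[t] = phi * x[t - 1] + a[t]
x = x[burn:]  # discard burn-in

def acf(x, k):
    # sample autocorrelation at lag k
    x = x - x.mean()
    return (x[:-k] * x[k:]).sum() / (x * x).sum()

for k in (1, 2, 3):
    print(k, round(acf(x, k), 2), round(phi**k, 2))  # sample vs. theoretical
```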
AR(2) Case
Multiply both sides by \(X_t\) and take expectations; note that \(X_t\) is a linear combination of the \(a\)'s, so \(E(a_tX_t)=\sigma^2\):
\(\gamma(0) = \phi_1\gamma(1) + \phi_2\gamma(2) + \sigma^2\)
Multiply both sides by \(X_{t-1}\) and take expectations:
\(\gamma(1) = \phi_1\gamma(0) + \phi_2\gamma(1)\)
Multiply both sides by \(X_{t-2}\) and take expectations:
\(\gamma(2) = \phi_1\gamma(1) + \phi_2\gamma(0)\)
... Using this pattern, for \(k\geq 2\): \(\gamma(k) = \phi_1\gamma(k-1) + \phi_2\gamma(k-2)\), equivalently \(\rho(k) = \phi_1\rho(k-1) + \phi_2\rho(k-2)\),
with base case \(\rho(0) = 1\) and \(\rho(1) = \phi_1/(1-\phi_2)\).
AR(p) case
Given \(X_t = (\sum_{i=1}^p \phi_iX_{t-i}) + a_t\), the process is stationary if all \(p\) roots of \(\Phi(B)=0\) lie outside the unit circle.
Yule-Walker equations
For the first \(p\) autocorrelations: \(\rho(k) = \phi_1\rho(k-1) + \phi_2\rho(k-2) + ... + \phi_p\rho(k-p)\) for \(k = 1, ..., p\), with \(\rho(0) = 1\) and \(\rho(-j) = \rho(j)\).
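The Yule-Walker equations form a linear system in the AR coefficients. A sketch for an AR(2) with assumed \(\phi_1=0.5,\ \phi_2=0.3\): first generate \(\rho(1),\rho(2)\) from the equations, then recover the coefficients.

```python
import numpy as np

# Yule-Walker for AR(2) with assumed coefficients.
phi1, phi2 = 0.5, 0.3
rho1 = phi1 / (1 - phi2)       # from rho(1) = phi1 + phi2*rho(1)
rho2 = phi1 * rho1 + phi2      # from rho(2) = phi1*rho(1) + phi2

# Matrix form: R @ [phi1, phi2] = [rho1, rho2], with R_ij = rho(|i-j|)
R = np.array([[1.0, rho1],
              [rho1, 1.0]])
phis = np.linalg.solve(R, np.array([rho1, rho2]))
print(np.round(phis, 3))  # recovers [0.5, 0.3]
```

The same construction works for any \(p\): build the \(p\times p\) Toeplitz matrix of autocorrelations and solve.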
Partial Autocorrelation Function (PACF)
\(\phi_{kk} = corr(X_t, X_{t+k}\mid X_{t+1},...,X_{t+k-1})\)
the correlation between \(X_t, X_{t+k}\) after their mutual linear dependency on the intervening variables has been removed.
For a given lag \(k\), regress \(X_{t+k}\) on the intervening variables: \(X_{t+k} = \phi_{k1}X_{t+k-1} + \phi_{k2}X_{t+k-2} + ... + \phi_{kk}X_t + a_{t+k}\).
Multiplying by \(X_{t+k-j}\) and taking expectations gives, \(\forall j \in \{1,2,...,k\}\): \(\rho_j = \phi_{k1}\rho_{j-1} + \phi_{k2}\rho_{j-2} + ... + \phi_{kk}\rho_{j-k}\).
We regard the ACFs as given, treat the regression parameters \(\phi_{ki}\) as unknowns, and solve for \(\phi_{kk}\); all together these form the Yule-Walker equations.
Example
For lag 1, \(\rho_1 = \phi_{11}\rho_0 \Rightarrow \phi_{11} = \rho_1\)
For lag 2, \(\rho_1 = \phi_{21} + \phi_{22}\rho_1\) and \(\rho_2 = \phi_{21}\rho_1 + \phi_{22}\), which solve to \(\phi_{22} = \dfrac{\rho_2 - \rho_1^2}{1-\rho_1^2}\).
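The lag-2 solution can be sanity-checked numerically; the values \(\rho_1=0.6,\ \rho_2=0.4\) below are illustrative.

```python
# Lag-2 PACF from the Yule-Walker system (rho values are assumed):
# rho1 = phi21 + phi22*rho1,  rho2 = phi21*rho1 + phi22
rho1, rho2 = 0.6, 0.4
phi21 = rho1 * (1 - rho2) / (1 - rho1**2)
phi22 = (rho2 - rho1**2) / (1 - rho1**2)

# Verify both equations are satisfied by the closed-form solution.
print(round(phi21 + phi22 * rho1, 3), round(phi21 * rho1 + phi22, 3))
```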
Causal and invertible
Causal/stationary if \(X_t\) can be expressed as an MA(\(\infty\)) process
Invertible if \(X_t\) can be expressed as an AR(\(\infty\)) process.
Duality between AR and MA processes
A finite-order stationary AR(p) process corresponds to a MA(\(\infty\)) process, and a finite-order invertible MA(q) corresponds to an AR(\(\infty\)) process.
Example
Given model \(X_t - \phi_1 X_{t-1} - \phi_2 X_{t-2} = a_t + \theta a_{t-1}\)
Assume the process is causal; then \(X_t = \sum_{i=0}^\infty \psi_i a_{t-i} = \left(\sum_{i=0}^\infty \psi_i B^i\right)a_t = \Psi(B)a_t\).
\(\Phi(B)X_t = \Theta(B) a_t \Rightarrow X_t = \frac{\Theta(B)}{\Phi(B)}a_t\) by the ARMA model
\(\Rightarrow \Psi(B) = \Theta(B)/\Phi(B)\)
Substituting back into the model: \(1+\theta B = \left(\sum_{i=0}^\infty \psi_iB^i\right)(1-\phi_1B - \phi_2B^2)\)
Matching coefficients of \(B\): \(\theta B = \psi_1B - \phi_1B \Rightarrow \psi_1 = \phi_1 + \theta\)
Matching coefficients of \(B^2\): \(0 = \psi_2 - \phi_1\psi_1 - \phi_2 \Rightarrow \psi_2 = \phi_2 + \phi_1(\phi_1 + \theta)\)
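The coefficient matching above can be carried out mechanically; a sketch with illustrative parameter values (not from the notes):

```python
# psi-weights of the causal ARMA(2,1) example, by matching coefficients in
# (1 + theta*B) = Psi(B) * (1 - phi1*B - phi2*B^2).
# phi1, phi2, theta are assumed for illustration.
phi1, phi2, theta = 0.5, 0.3, 0.4

psi = [1.0]                        # psi_0 = 1
psi.append(phi1 + theta)           # B coefficient:   psi_1 = phi1 + theta
psi.append(phi2 + phi1 * psi[1])   # B^2 coefficient: psi_2 = phi2 + phi1*psi_1
# for j >= 3 no theta terms remain: psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}
for j in range(3, 6):
    psi.append(phi1 * psi[-1] + phi2 * psi[-2])

print([round(p, 3) for p in psi[:4]])
```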
Assume the process is invertible, then
\(a_t = \sum_{i=0}^\infty \pi_i X_{t-i} = \left(\sum_{i=0}^\infty \pi_i B^i\right)X_t = \Pi(B)X_t\),
and similarly we get \(\Phi(B)=\Theta(B)\Pi(B)\).
Wold Decomposition
Any zero-mean stationary process \(\{X_t\}\) which is not deterministic can be expressed as a sum \(X_t = U_t + V_t\), where \(\{U_t\}\) is an MA(\(\infty\)) process and \(\{V_t\}\) is a deterministic process uncorrelated with \(\{U_t\}\).
- A process is deterministic if the values \(X_{n+j}, j\geq 1\) are perfectly predictable in terms of \(\mathcal{M}_n = sp\{X_j, j\leq n\}\), the past of the process.
- If \(X_n\) comes from a deterministic process, it can be predicted (or determined) by past observations of the process.
Model identification
process | ACF | PACF |
---|---|---|
AR(p) | tails off | cuts off after lag p |
MA(q) | cuts off after lag q | tails off |
ARMA(p,q) | tails off after (q-p) | tails off after (p-q) |
Model Adequacy
The overall tests that check an entire group of residual autocorrelation functions are called portmanteau tests.
Box and Pierce \(Q = n \sum_{k=1}^m \hat\rho_k^2 \sim \chi^2_{m-(p+q)}\)
Ljung and Box \(Q = n(n+2)\sum_{k=1}^m \frac{\hat\rho_k^2}{n-k}\sim \chi^2_{m-(p+q)}\)
\(n\) is the number of observations
\(m\) is the max lag
\(p,q\) are the orders of the fitted ARMA model
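The Ljung-Box statistic is straightforward to compute from residual ACFs. A sketch on simulated white-noise residuals (so \(Q\) should be unremarkable relative to its \(\chi^2\) reference distribution):

```python
import numpy as np

# Ljung-Box portmanteau statistic computed directly from residual ACFs.
# The residuals here are simulated white noise for illustration.
rng = np.random.default_rng(3)
res = rng.normal(size=500)
n, m = len(res), 10

def acf(x, k):
    # sample autocorrelation at lag k
    x = x - x.mean()
    return (x[:-k] * x[k:]).sum() / (x * x).sum()

Q = n * (n + 2) * sum(acf(res, k) ** 2 / (n - k) for k in range(1, m + 1))
print(round(Q, 1))  # compare with a chi-square(m - p - q) critical value
```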
Model selection
BIC puts a heavier penalty on the number of parameters than AIC does.
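A sketch of the penalty difference: AIC \(= -2\ell + 2k\) versus BIC \(= -2\ell + k\log n\), so BIC's per-parameter penalty exceeds AIC's once \(n > e^2 \approx 8\). The log-likelihood values below are illustrative placeholders, not computed from data.

```python
import math

# Compare AIC and BIC for two candidate fits (loglik values are assumed).
n = 200
loglik = {"ARMA(1,1)": -310.0, "ARMA(2,2)": -308.5}  # illustrative
k = {"ARMA(1,1)": 3, "ARMA(2,2)": 5}                 # AR+MA params + variance

aic = {m_: -2 * loglik[m_] + 2 * k[m_] for m_ in loglik}
bic = {m_: -2 * loglik[m_] + k[m_] * math.log(n) for m_ in loglik}
print(aic)  # a small log-likelihood gain can still favor the bigger model
print(bic)  # the log(n) penalty pushes BIC toward the smaller model
```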
Example: Application of ARMA in Investment
Alternative assets modeling
Let \(y_t\) and \(r_t\) denote the observable appraisal return and the latent economic return, respectively.
Goal: infer the unobservable economic returns from the appraisal returns.
Geltner method (commercial real estate)
\(y_t = (1-\phi)r_t + \phi y_{t-1} = (1-\phi)r_t + \phi(1-\phi)r_{t-1} + \phi^2 y_{t-2} = \sum_{j=0}^\infty w_j r_{t-j}\) (by substituting \(y_{t-1}\) recursively), where \(\phi\in (0,1)\) and \(w_j := \phi^j (1-\phi)\) is the weight.
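A sketch of the Geltner unsmoothing step: given the appraisal recursion \(y_t = (1-\phi)r_t + \phi y_{t-1}\), the economic return is recovered as \(r_t = (y_t - \phi y_{t-1})/(1-\phi)\). The return series, \(\phi=0.4\), and the initial condition below are all illustrative assumptions.

```python
import numpy as np

# Geltner unsmoothing: smooth returns forward, then invert the recursion.
phi = 0.4
r = np.array([0.02, -0.01, 0.03, 0.01])   # latent economic returns (assumed)
y = np.zeros_like(r)
y[0] = r[0]                               # assumed initial condition
for t in range(1, len(r)):
    y[t] = (1 - phi) * r[t] + phi * y[t - 1]   # appraisal smoothing

r_hat = (y[1:] - phi * y[:-1]) / (1 - phi)     # unsmoothed returns
print(np.round(r_hat, 3))  # recovers r[1:] exactly
```

In practice \(\phi\) is not known and is typically estimated as the lag-1 autocorrelation of the appraisal series.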
Getmansky, Lo, & Makarov
\(y_t = w_0 r_t + w_1 r_{t-1} + ... + w_q r_{t-q}\), where \(w_i\in(0,1),\ \sum_{i=0}^q w_i = 1\)
Since \(y_t\) is a linear combination of white noise terms, \(\{y_t\}\) follows an MA(q) process.
Factor Modeling
The economic returns can be regressed on the market returns.