# Analysis of an AR(1) Model in PyMC3

```python
%matplotlib inline
import pymc3 as pm
import numpy as np
import matplotlib.pyplot as plt
import arviz as az

plt.style.use('seaborn-darkgrid')
print('Running on PyMC3 v{}'.format(pm.__version__))
```

```
Running on PyMC3 v3.7
```


Consider the following AR(1) process, initialized in the infinite past:

$$y_t = \theta y_{t-1} + \epsilon_t,$$

where $\epsilon_t \overset{iid}{\sim} {\cal N}(0,1)$. Suppose you'd like to learn about $\theta$ from a sample of observations $Y^T = \{ y_0, y_1, \ldots, y_T \}$.

First, let’s generate our sample.

```python
np.random.seed(seed=42)

T = 100
y = np.zeros((T,))

for i in range(1, T):
    y[i] = 0.95 * y[i-1] + np.random.normal()

plt.plot(y);
```

Consider the following prior for $\theta$: $\theta \sim {\cal N}(0,\tau^2)$. We can show that the posterior distribution of $\theta$ is of the form

$$\theta \mid Y^T \sim {\cal N}( \tilde{\theta}_T, \tilde{V}_T),$$

where

$$\begin{aligned}
\tilde{\theta}_T &= \left( \sum_{t=1}^T y_{t-1}^2 + \tau^{-2} \right)^{-1} \sum_{t=1}^T y_{t} y_{t-1}, \\
\tilde{V}_T &= \left( \sum_{t=1}^T y_{t-1}^2 + \tau^{-2} \right)^{-1}.
\end{aligned}$$
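These formulas follow from standard Gaussian conjugacy. Conditioning on $y_0$ and completing the square in $\theta$ (a sketch of the derivation, using only quantities already defined above):

$$\begin{aligned}
p(\theta \mid Y^T) &\propto \exp\left( -\frac{\theta^2}{2\tau^2} \right) \prod_{t=1}^T \exp\left( -\frac{(y_t - \theta y_{t-1})^2}{2} \right) \\
&\propto \exp\left( -\frac{1}{2} \left[ \theta^2 \left( \sum_{t=1}^T y_{t-1}^2 + \tau^{-2} \right) - 2\theta \sum_{t=1}^T y_t y_{t-1} \right] \right) \\
&\propto \exp\left( -\frac{(\theta - \tilde{\theta}_T)^2}{2 \tilde{V}_T} \right),
\end{aligned}$$

which is the kernel of the normal posterior stated above.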
```python
tau = 1.0
with pm.Model() as ar1:
    beta = pm.Normal('beta', mu=0, sigma=tau)
    likelihood = pm.AR('likelihood', beta, sigma=1.0, observed=y)
```

```python
with ar1:
    trace = pm.sample(1000, tune=2000, cores=4)
```

```
Auto-assigning NUTS sampler...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [beta]
Sampling 4 chains, 0 divergences: 100%|██████████| 12000/12000 [00:01<00:00, 7564.71draws/s]
```

```python
az.plot_trace(trace);
```

We can compare the sampled posterior against the exact posterior mean and standard deviation:

```python
mup = ((y[:-1]**2).sum() + tau**-2)**-1 * np.dot(y[:-1], y[1:])
Vp = ((y[:-1]**2).sum() + tau**-2)**-1

print('Mean: {:5.3f} (exact = {:5.3f})'.format(trace['beta'].mean(), mup))
print('Std: {:5.3f} (exact = {:5.3f})'.format(trace['beta'].std(), np.sqrt(Vp)))
```

```
Mean: 0.956 (exact = 0.956)
Std: 0.032 (exact = 0.032)
```


## Extension to AR(p)

We can instead estimate an AR(2) model using PyMC3:

$$y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \epsilon_t.$$

The `AR` distribution infers the order of the process from the size of the `rho` argument passed to `AR`.
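As a quick cross-check independent of PyMC3 (a sketch, not part of the original notebook — the simulated series and its coefficient values are illustrative assumptions), the conditional least-squares estimate of AR(2) coefficients can be computed with plain NumPy by regressing $y_t$ on its two lags:

```python
import numpy as np

# Simulate an AR(2) series (illustrative parameters, not from the notebook)
rng = np.random.default_rng(42)
T = 500
phi1, phi2 = 0.5, 0.3
y = np.zeros(T)
for t in range(2, T):
    y[t] = phi1 * y[t-1] + phi2 * y[t-2] + rng.normal()

# Conditional least squares: regress y_t on (y_{t-1}, y_{t-2})
X = np.column_stack([y[1:-1], y[:-2]])
target = y[2:]
phi_hat, *_ = np.linalg.lstsq(X, target, rcond=None)
print(phi_hat)  # should be close to (0.5, 0.3)
```

With a flat prior, the Bayesian posterior mean coincides with this least-squares estimate, so it is a useful sanity check on the sampled trace.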

```python
with pm.Model() as ar2:
    beta = pm.Normal('beta', mu=0, sigma=tau, shape=2)
    likelihood = pm.AR('likelihood', beta, sigma=1.0, observed=y)
```

```python
with ar2:
    trace = pm.sample(1000, tune=2000, cores=4)
```

```
Auto-assigning NUTS sampler...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [beta]
Sampling 4 chains, 0 divergences: 100%|██████████| 12000/12000 [00:03<00:00, 3772.52draws/s]
The number of effective samples is smaller than 25% for some parameters.
```

```python
az.plot_trace(trace);
```

You can also pass the set of AR parameters as a list.

```python
with pm.Model() as ar3:
    beta0 = pm.Normal('beta0', mu=0, sigma=tau)
    beta1 = pm.Uniform('beta1', -1, 1)
    likelihood = pm.AR('y', [beta0, beta1], sigma=1.0, observed=y)
```

```python
with ar3:
    trace = pm.sample(1000, tune=2000, cores=4)
```

```
Auto-assigning NUTS sampler...
```

```python
az.plot_trace(trace);
```
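A practical follow-up (not in the original notebook) is to check whether estimated coefficients imply a stationary process — the "initialized in the infinite past" framing above implicitly assumes stationarity. An AR(2) process is stationary exactly when both roots of $z^2 - \phi_1 z - \phi_2$ lie strictly inside the unit circle. A minimal NumPy sketch, with illustrative coefficient values rather than values taken from the trace above:

```python
import numpy as np

def ar2_is_stationary(phi1, phi2):
    """AR(2) is stationary iff both roots of z^2 - phi1*z - phi2
    lie strictly inside the unit circle."""
    roots = np.roots([1.0, -phi1, -phi2])
    return bool(np.all(np.abs(roots) < 1.0))

print(ar2_is_stationary(0.95, 0.0))  # True: the AR(1)-like case above
print(ar2_is_stationary(1.2, 0.0))   # False: explosive
```

Applying such a check to posterior draws (e.g. the fraction of draws that are stationary) is a simple way to probe whether the data support a stationary model.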