Autoregressive Models

When we use regression analysis to model the trend of a time series, the dependent variable is the observed series Yt and the independent variable is simply the time period, t. An alternative approach is to model Yt as a function of one or more prior observations of Y. As with exponential smoothing, there are many variations on this basic idea of an autoregressive model. In this section, we'll examine one such model from the general class of AutoRegressive Integrated Moving Average, or ARIMA, models. These are also sometimes called Box-Jenkins models, after the two statisticians George Box and Gwilym Jenkins, who developed the techniques in the 1960s and early 1970s.

ARIMA analysis is properly performed on a stationary (trend-free) time series. If there is a trend, the Box-Jenkins approach first transforms Y into a stationary series, usually by taking differences. That is, if Y were a steadily increasing series, we might find that the first differences, Y*t = Yt – Yt-1, form a stationary series.
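To see concretely how differencing works, here is a minimal sketch in Python (not part of the JMP workflow, and using a made-up trending series rather than the chapter's data):

    import numpy as np

    # A hypothetical, steadily increasing series: linear trend plus noise
    rng = np.random.default_rng(seed=1)
    t = np.arange(60)
    y = 100 + 0.8 * t + rng.normal(scale=2.0, size=60)

    # First differences Y*t = Yt - Yt-1: the trend drops out, leaving a
    # series that fluctuates around a constant level (about 0.8 here)
    y_star = np.diff(y)
    print(y_star.mean(), y_star.std())

Because y_star hovers around a constant mean with no systematic drift, it is approximately stationary and suitable for the autoregressive step.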

An ARIMA model has three parameters:

Parameter   Role Within the ARIMA Model
p           Autoregressive order: the number of lagged (prior) values of Y to include in the model. A model that uses only one prior period has AR order=1. If the model includes the two most recent periods, it has AR order=2, and so on.
d           Differencing order: the number of times Y is differenced to generate the stationary series Y*.
q           Moving average order: the number of lagged values of the error terms (residuals) to include in the model.

The typical notation for an ARIMA model references all three parameters, or at least the nonzero parameters. Thus, an AR(1) model uses one lagged value of the observed series Y; an AR(1,1) model uses one lagged value of the first differences of Y, and so forth.
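Outside JMP, most software asks for all three orders at once. As a hedged illustration (assuming the statsmodels library is installed and y is a one-dimensional array of observations), the two models just described would be specified like this:

    from statsmodels.tsa.arima.model import ARIMA

    # AR(1): one lagged value of Y, no differencing, no MA terms
    ar1_spec = ARIMA(y, order=(1, 0, 0))

    # AR(1,1) in this chapter's notation: one lagged value of the
    # first differences of Y, i.e., p = 1 and d = 1
    ar11_spec = ARIMA(y, order=(1, 1, 0))

The order=(p, d, q) tuple matches the table above parameter for parameter.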

Our time series is plainly not stationary, so we will want to use differencing to "flatten" the curve. However, to illustrate the techniques, we'll first estimate an AR(1) model and then an AR(1,1) model. We'll compare the results of the two models to each other and then to our exponential smoothing estimates.

  1. Return to the Time Series results window.

  2. Once again, click the red triangle at the top of the results window and now select ARIMA.

  3. This opens a new panel; complete it as shown in Figure 17.9 and click Estimate.

Figure 17.9. Specifying an AR(1) Model

Autoregressive models are regression models. The AR(1) model (see results in Figure 17.10) can be expressed as the following equation:

Yt = β0 + β1Yt-1 + εt
Figure 17.10. Parameter Estimates for This AR(1) Model

In this case, both the estimated intercept and slope are significant, and we see that, at the margin, the estimated index in the next period is nearly 96% of the prior period's index. As with the smoothing methods in the Time Series platform, we can save the computations from this model by clicking the red triangle at the top of the results panel and choosing Save Columns.
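For readers who prefer to check the logic outside JMP, a comparable AR(1) fit with statsmodels might look like the sketch below. The series here is a hypothetical stand-in, not the chapter's index, so the estimates will differ:

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Hypothetical upward-drifting series standing in for the index
    rng = np.random.default_rng(seed=2)
    y = 100 + np.cumsum(rng.normal(loc=0.5, scale=1.0, size=120))

    fit = ARIMA(y, order=(1, 0, 0)).fit()
    print(fit.params)       # intercept (constant) and the AR(1) slope
    print(fit.forecast(3))  # next three one-step-ahead forecasts

The AR(1) slope plays the same role as the roughly 0.96 coefficient reported in Figure 17.10.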

Before discussing the summary statistics for this model, let's estimate the AR(1,1) model and then compare the results. Recall that this next model uses the monthly change in the index (the "first difference") rather than the index itself as the basic element in the model.

  1. Click on the red triangle, and select ARIMA again.

  2. This time, set both the Autoregressive Order and the Differencing Order (see Figure 17.9) to 1. The parameter estimates for this model should look like Figure 17.11.

Figure 17.11. Parameter Estimates for AR(1,1) Model

We cannot directly compare these estimates to those for the AR(1) model, because JMP calculated the first differences prior to estimating the regression. At the margin, the forecasted monthly change in the index is about 75% of the prior month's change in magnitude, but with the opposite sign. The negative coefficient leads to oscillations: if one month has a positive change, the forecast for the following period is a reduction, and if a month has a negative change, the next forecast is positive. Although the computations are made on the relatively stationary series of first differences, JMP reconstructs the original time series for the sake of the forecast graph, showing the general upward trend.
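A few lines of arithmetic make the oscillation concrete. This is a toy illustration with an assumed coefficient of -0.75 standing in for the estimate in Figure 17.11:

    # Iterate the fitted difference equation: change_t = phi * change_(t-1)
    phi = -0.75           # assumed AR coefficient on the first differences
    change = 2.0          # suppose this month's index rose by 2 points
    for step in range(1, 6):
        change = phi * change
        print(step, round(change, 3))   # -1.5, 1.125, -0.844, 0.633, -0.475

Each forecasted change flips sign and shrinks, which is exactly the damped oscillation described above.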

Now let's compare our autoregressive models to each other and to the smoothing models. Everything we need is in the Model Comparison panel, as shown in Figure 17.12. Because the panel is so wide, we've eliminated some of the reported statistics, focusing on the ones that we've discussed earlier.

Figure 17.12. Comparing Four Time Series Methods

Every time series is different, and we can't say categorically that one method will fit better than another. In this example, the AR(1,1) model outperforms the others on the four criteria that we have studied: it has the smallest variance, MAPE, and MAE of the four, as well as the highest RSquare. Although we did not discuss the AIC (Akaike Information Criterion), we might note that the AR(1,1) model ranks second among the four models by that standard.
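If you ever need to reproduce such a comparison outside JMP, the error-based criteria are straightforward to compute from a model's actual and fitted values. Here is a minimal sketch, assuming the two are aligned numpy arrays with no zero actuals:

    import numpy as np

    def comparison_stats(actual, fitted):
        """Return MAE and MAPE as reported in a model-comparison table."""
        err = actual - fitted
        mae = np.mean(np.abs(err))
        mape = 100 * np.mean(np.abs(err / actual))  # undefined if actual has zeros
        return mae, mape

With statsmodels, the fitted values and AIC are available directly on a fitted model as fit.fittedvalues and fit.aic.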
