Autoregression

An autoregression is a time series model that uses previous values of the same series as explanatory variables in a regression that predicts the next value. Suppose we have measured and tracked a metric over time, called y_t, where t denotes the time of the measurement, and we regress this value on previous values from the same series. For example, regressing y_t on y_{t-1} gives:

y_t = β_0 + β_1 y_{t-1} + ε_t

As shown in the preceding equation, the previous value y_{t-1} has become the predictor and y_t is the response value to be predicted, while ε_t is normally distributed with a mean of zero and a variance of 1. The order of the autoregression model is defined by the number of previous values the model uses to determine the next value, so the preceding equation is a first-order autoregression, or AR(1). Generalizing, a kth-order autoregression, written as AR(k), is a multiple linear regression in which the value of the series at any time t is a linear function of the values at times t-1, t-2, ..., t-k.
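As a quick illustration (not taken from this text), the following Python sketch simulates an AR(1) process with assumed coefficients β_0 = 2.0 and β_1 = 0.7 and then fits a first-order autoregression to recover them. It assumes numpy and statsmodels are available; all names and values are illustrative.

```python
# A minimal sketch (not from this text): simulate an AR(1) process
# y_t = beta_0 + beta_1 * y_{t-1} + eps_t and recover its coefficients.
# Assumes numpy and statsmodels are installed; names/values are illustrative.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(42)
beta_0, beta_1, n = 2.0, 0.7, 500      # assumed "true" coefficients

y = np.empty(n)
y[0] = beta_0 / (1 - beta_1)           # start near the process mean
for t in range(1, n):
    # eps_t ~ N(0, 1), matching the assumption in the text
    y[t] = beta_0 + beta_1 * y[t - 1] + rng.normal(0, 1)

# AutoReg(y, lags=1) regresses y_t on a constant and y_{t-1}, i.e. an AR(1)
res = AutoReg(y, lags=1).fit()
print(res.params)                      # should be close to [beta_0, beta_1]
```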

The following list shows what different values of β_0 and β_1 imply for an AR(1) model (a simulation of each case is sketched after the list):

  • When β_1 = 0, y_t is equivalent to white noise
  • When β_1 = 1 and β_0 = 0, y_t is equivalent to a random walk
  • When β_1 = 1 and β_0 ≠ 0, y_t is equivalent to a random walk with drift
  • When β_1 < 0, y_t tends to oscillate between positive and negative values
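To make these cases concrete, here is a small sketch (again not from this text, assuming only numpy) that simulates the AR(1) recursion y_t = β_0 + β_1 y_{t-1} + ε_t under each of the parameter settings listed above:

```python
# A minimal sketch (not from this text, assuming only numpy) that simulates
# the AR(1) recursion y_t = beta_0 + beta_1 * y_{t-1} + eps_t for each of
# the parameter settings described in the list above.
import numpy as np

def simulate_ar1(beta_0, beta_1, n=200, seed=0):
    """Simulate n steps of an AR(1) process starting at y_0 = 0."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = beta_0 + beta_1 * y[t - 1] + rng.normal(0, 1)
    return y

cases = {
    "white noise (beta_1 = 0)": simulate_ar1(0.0, 0.0),
    "random walk (beta_1 = 1, beta_0 = 0)": simulate_ar1(0.0, 1.0),
    "random walk with drift (beta_1 = 1, beta_0 = 0.5)": simulate_ar1(0.5, 1.0),
    "oscillation (beta_1 = -0.8)": simulate_ar1(0.0, -0.8),
}

for name, series in cases.items():
    print(f"{name:<50} mean={series.mean():8.2f} last={series[-1]:8.2f}")
```

Plotting each series (for example, with matplotlib) makes the qualitative differences easy to see: stationary noise, trending random walks, and a sign-flipping oscillating series.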