Other time series models

In this chapter, we spent most of our time studying models that describe a time series in terms of the patterns of correlation between different points in time. This approach led us to the ARIMA family of models, which we have seen are highly configurable and have been successfully employed in many real-world problems. There is, however, a diverse array of other methods that have been applied to time series, and in fact we have seen a few elsewhere in this book as well.

The neural networks that we studied in Chapter 4, Neural Networks, and the hidden Markov models that we saw in Chapter 8, Probabilistic Graphical Models, are two such examples. Sometimes, we can treat a time series as a regression problem, and so techniques from this area can be leveraged too.

One other important class of methods is exponential smoothing. There are two key premises behind methods that use this approach. The first is that a time series can usually be decomposed into a number of different components. These include the trend, which describes a gradual shift in the mean of the time series, as well as the seasonal and cyclical components, which contain patterns that repeat.
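To make this concrete, the following is a minimal sketch, not taken from the text, that uses base R's stl() function to decompose the built-in AirPassengers dataset into trend, seasonal, and remainder components; the choice of dataset and the log transform are assumptions made purely for illustration.

data(AirPassengers)
# stl() uses loess smoothing to split a seasonal series into components;
# the log transform stabilizes the growing seasonal amplitude
passenger_fit <- stl(log(AirPassengers), s.window = "periodic")
plot(passenger_fit)  # panels: data, seasonal, trend, remainder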

The second main idea is that once we remove trends and repeating patterns, we can predict the next time point as a weighted sum of the most recent time points, where the weights decay exponentially as we move into the past. Thus, a forecast is computed as an exponentially weighted moving average. It has been shown that there are overlaps between the ARIMA models we have seen in this chapter and exponential smoothing methods.
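As an illustration, in simple exponential smoothing the one-step-ahead forecast is a weighted average of the latest observation and the previous forecast, with weight alpha on the former; unrolling this recursion gives weights alpha, alpha(1 − alpha), alpha(1 − alpha)², and so on, on successively older observations. The following sketch uses base R's HoltWinters() function, which extends this idea with trend and seasonal terms; the dataset and the 12-month forecast horizon are illustrative assumptions, not choices made in the text.

# Fit a Holt-Winters exponential smoothing model to the (log) series
hw_fit <- HoltWinters(log(AirPassengers))
# Exponentially weighted forecasts for the next 12 months
hw_forecast <- predict(hw_fit, n.ahead = 12)
plot(hw_fit, hw_forecast)  # fitted values together with the forecast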

All the examples we have mentioned so far involve modeling how a time series behaves in time. Thus, the methods in question are known as time-domain methods. A radically different approach to modeling time series is to study their frequency properties using frequency-domain methods, also known as spectral methods. The key intuition behind these methods is that we can express virtually any time series as a linear combination of sine and cosine waves of varying frequency, phase, and amplitude. This in turn stems from the intuition of decomposing a time series to find its periodic components, that is to say, the components that tend to repeat consistently over time.
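The following small sketch, an illustration not drawn from the text, builds a synthetic series from two sine waves plus noise and uses base R's fft() function to recover their frequencies as peaks in the squared magnitude of the discrete Fourier transform; all names and parameter values here are assumptions.

set.seed(1)
n <- 256
t <- 1:n
# Two periodic components at frequencies 0.05 and 0.20 cycles per observation
x <- 2 * sin(2 * pi * 0.05 * t) + sin(2 * pi * 0.20 * t) + rnorm(n, sd = 0.5)
power <- Mod(fft(x))^2 / n        # raw periodogram ordinates
freq <- (0:(n - 1)) / n           # frequency of each Fourier component
plot(freq[1:(n / 2)], power[1:(n / 2)], type = "h",
     xlab = "Frequency", ylab = "Power")  # peaks appear near 0.05 and 0.20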

Just as we used ACF and PACF plots to assess the contribution of different time lags to the autocorrelation structure of a time series, spectral methods make use of spectral density plots that show the contribution of the different frequencies that make up a time series. The field of spectral analysis of time series has much in common with digital signal processing, drawing on important tools such as the Fast Fourier Transform and wavelets.
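As a quick illustration of a spectral density plot, the following sketch applies base R's spectrum() function to the built-in monthly sunspot series; the dataset and the smoothing spans are assumptions chosen only for demonstration.

data(sunspot.month)
# Smoothed periodogram; peaks mark frequencies that dominate the series,
# such as the roughly 11-year sunspot cycle
spectrum(sunspot.month, spans = c(5, 5),
         main = "Smoothed periodogram of monthly sunspot numbers")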

Note

Aside from the references given earlier in this chapter, an online textbook that discusses exponential smoothing as well as other techniques not mentioned here is Forecasting: principles and practice, by Rob J. Hyndman and George Athanasopoulos. The URL is https://www.otexts.org/book/fpp. To learn more about digital signal processing, the definitive text introducing the field is Digital Signal Processing, by John G. Proakis and Dimitris G. Manolakis, Prentice Hall.
