%matplotlib inline function 218
ACF (autocorrelation function) 41 – 42, 62, 83, 103
add_changepoints_to_plot method 374
add_country_holidays method 385
ADF (augmented Dickey-Fuller) test 38, 108, 147, 188, 205, 222, 362, 411
AIC (Akaike information criterion) 205, 361, 411
selecting model using 114 – 116
antidiabetic drug prescriptions, forecasting 216 – 229
forecasting and evaluating performance 225 – 229
importing required libraries and loading data 218 – 219
conducting residual analysis 224
performing model selection 222 – 224
visualizing series and components 219 – 220
ARIMA (autoregressive integrated moving average) model 142 – 143
ARLSTM (autoregressive LSTM) model 320 – 328
household electric power consumption forecast 355 – 356
ARMA (autoregressive moving average) process
identifying stationary 106 – 111
ARMA (p,q) (autoregressive moving average) model 61, 129, 140, 157, 167
AR(p) (autoregressive) model 61, 84, 140
augmented Dickey-Fuller test. See ADF (augmented Dickey-Fuller) test
autocorrelation function. See ACF (autocorrelation function)
automating time series forecasting 361 – 395
automated forecasting libraries 362 – 363
advanced functionality 370 – 381
basic forecasting with 365 – 369
implementing robust forecasting process 381 – 393
autoregressive process 81 – 100
predicting average weekly foot traffic 82 – 83
partial autocorrelation function 89 – 92
average weekly foot traffic, predicting 82 – 83
bandwidth usage, forecasting 102 – 105, 132 – 136
changepoint_prior_scale parameter 379
CNNs (convolutional neural networks) 305 – 320, 412
household electric power consumption forecast 351 – 355
as multi-output model 315 – 317
as single-step model 310 – 313
compile_and_fit function 272, 295, 309, 322, 332
ARMA (autoregressive moving average) process
devising modeling procedure 111 – 132
Akaike information criterion 113 – 116
forecasting bandwidth usage 102 – 105, 132 – 136
consumption, forecasting 203 – 213
CPI (consumer price index) 397
DataWindow class 249, 253 – 260, 273, 295, 309, 322, 343 – 345
deep learning
applying baseline models 260 – 268
multi-output baseline model 266 – 268
multi-step baseline models 263 – 266
single-step baseline model 260 – 262
how deep learning models are trained 249 – 253
implementing DataWindow class 253 – 260
household electric power consumption forecast
preparing for modeling 342 – 346
implementing deep neural network 276 – 283
as multi-output model 282 – 283
as single-step model 278 – 280
implementing linear model 271 – 276
implementing multi-output linear model 275 – 276
implementing single-step linear model 272
Dense layer 273, 296, 298, 311, 321, 349 – 350
disposable income, forecasting 203 – 213
DNNs (deep neural networks) 270, 276 – 283
household electric power consumption forecast 350 – 351
implementing as multi-output model 282 – 283
implementing as multi-step model 281 – 282
implementing as single-step model 278 – 280
EPS (earnings per share) 15, 142
exogenous predictors or input variables 180
external variables, adding to model 180 – 196
GDP, forecasting using SARIMAX model 186 – 195
Granger causality test 201 – 203
grangercausalitytests function 207
GRU (gated recurrent unit) 288
historical mean, forecasting 17 – 22
implementing historical mean baseline 19 – 22
setting up for baseline implementations 17 – 19
holidays_prior_scale parameter 379
household electric power consumption, forecasting 329 – 358
data wrangling and preprocessing 333 – 338
identifying seasonal period 339 – 341
removing unnecessary columns 338
splitting and scaling data 341 – 342
combining CNN with LSTM 354 – 355
selecting best model 356 – 358
objective of project 331 – 332
preparing for modeling 342 – 346
defining DataWindow class 343 – 345
hyperbolic tangent (tanh) activation function 292
Interpretable Machine Learning (Molnar) 317
last known value, predicting using
multi-step baseline models 263 – 264
naive prediction of the future 25 – 26
LDA (linear discriminant analysis) 245
Ljung-Box test 120 – 121, 150, 170, 191, 208, 224, 406, 412
LSTM layer 296, 314, 321, 351
LSTM (long short-term memory) model 287 – 304
household electric power consumption forecast 351, 354 – 355
as multi-output model 299 – 302
as single-step model 295 – 297
recurrent neural networks 288 – 290
MAE (mean absolute error) 77, 98, 136, 261, 285, 298, 318, 328, 342, 368, 397
make_future_dataframe method 367
MA (moving average) model 55, 63
MAPE (mean absolute percentage error) 19, 176, 194, 212, 368
MA(q) (moving average) model 61, 81, 101, 140
mean_absolute_error function 77
mean_squared_error function 51, 74, 96
mean squared error (MSE) 11, 51, 74, 93, 261, 271, 342, 400
monthly air passengers, forecasting number of 163 – 178
monthly average retail price, forecasting 396 – 409
data preprocessing and visualization 398 – 400
developing SARIMA model 404 – 409
modeling with Prophet 400 – 403
objective of project 397 – 398
moving average (MA) model 55, 63
moving average (MA(q)) model 61, 81, 101, 140
moving average process 61 – 80
MSE (mean squared error) 11, 51, 74, 93, 261, 271, 342, 400
multi-output models
implementing CNNs as 315 – 317
implementing DNNs as 282 – 283
implementing LSTM model as 299 – 302
multiple time series 197 – 215
forecasting real disposable income and real consumption 203 – 213
vector autoregression model 199 – 201
designing modeling procedure for 201 – 203
Granger causality test 201 – 203
MultiStepLastBaseline class 263
multi-step models
implementing CNNs as 314 – 315
implementing DNNs as 281 – 282
implementing LSTM model as 297 – 299
predicting last known value 263 – 264
repeating input sequence 264 – 266
naive prediction of the future 14 – 29
defining baseline model 16 – 17
forecasting historical mean 17 – 22
implementing historical mean baseline 19 – 22
setting up for baseline implementations 17 – 19
forecasting last year's mean 23 – 25
implementing naive seasonal forecast 26 – 28
predicting using last known value 25 – 26
naive seasonal forecast, implementing 26 – 28
non-stationary time series 140 – 155
autoregressive integrated moving average model 142 – 143
modifying general modeling procedure 143 – 145
numpy library 19, 33, 67, 127, 333, 365, 418
optimize_ARIMA function 149, 168
optimize_ARMA function 115, 149
optimize_SARIMAX function 188, 223, 390, 405
PACF (partial autocorrelation function) 89 – 92
performance_metrics function 377
plot_diagnostics method 123, 150, 224
plot method 262, 273, 296, 311, 325, 347, 370
Prophet
advanced functionality 370 – 381
cross-validation and performance metrics 374 – 378
hyperparameter tuning 379 – 381
visualization capabilities 370 – 374
basic forecasting with 365 – 369
implementing robust forecasting process 381 – 393
predicting popularity searches on Google 381 – 388
monthly average retail price forecast 400 – 403
PyTorch Forecasting website 362
Q-Q (quantile-quantile) plot 111, 118 – 120, 411
random walk
forecasting random walks 47 – 54
identifying random walks 35 – 47
autocorrelation function 41 – 42
testing for stationarity 38 – 41
recursive_forecast function 193
ReLU (Rectified Linear Unit) activation function 278
residual analysis
antidiabetic drug prescriptions forecast 224
quantile-quantile plot 118 – 120
RNNs (recurrent neural networks) 288 – 290, 305
rolling_forecast function 72, 132, 192, 210, 225
SARIMA(p,d,q)(P,D,Q)m (seasonal autoregressive integrated moving average) model 156
SARIMA (seasonal autoregressive integrated moving average) model
monthly average retail price forecast 404 – 409
SARIMAX (seasonal autoregressive integrated moving average with exogenous variables) model
forecasting real GDP using 186 – 195
forecasting number of monthly air passengers 163 – 178
comparing performance of methods 176 – 178
forecasting with an ARIMA model 165 – 171
forecasting with SARIMA model 171 – 175
seasonality
identifying seasonal patterns in time series 160 – 163
naive seasonal forecast 26 – 28
seasonality_mode parameter 379
seasonality_prior_scale parameter 379
Sequential model 273, 296, 311
single-step models
implementing CNNs as 310 – 313
implementing DNNs as 278 – 280
implementing LSTM model as 295 – 297
split_to_inputs_labels function 255
stationarity
autoregressive moving average process 106 – 111
finding order of stationary autoregressive process 85 – 92
statsmodels library 42, 67, 91, 108, 161, 183, 203, 207, 418
tanh (hyperbolic tangent) activation function 292
timeseries_dataset_from_array function 256
time series forecasting 3 – 13, 410 – 417
automating with Prophet 361 – 395
compared to other regression tasks 12 – 13
time series sometimes do not have features 13
components of time series 5 – 8
deep learning for 233 – 247, 412 – 413
filtering with convolutional neural networks 305 – 319
if forecasting doesn't work 413 – 415
multiple time series 197 – 215
non-stationary time series 140 – 155
other applications of time series data 415 – 416
overview 4 – 8
sources of time series data 416 – 417
statistical methods for 411 – 412
steps in 8 – 11
determining what must be forecast to achieve goals 9
developing forecasting model 10 – 11
setting horizon of forecast 10
VAR (vector autoregression) model 199 – 201
designing modeling procedure for 201 – 203
Granger causality test 201 – 203