Applied Time Series Analysis
In-Class Activities and Homework Exercises
Course Content
Course Outcomes
Chapter 1
Lesson 1
Introduce the course structure and syllabus
- Get to know each other
- Describe key concepts in time series analysis
- Explore an example time series interactively
Lesson 2
Use technical language to describe the main features of time series data
- Define time series analysis
- Define time series
- Define sampling interval
- Define serial dependence or autocorrelation
- Define a time series trend
- Define seasonal variation
- Define cycle
- Differentiate between deterministic and stochastic trends
Plot time series data to visualize trends, seasonal patterns, and potential outliers
- Plot a “ts” object
- Plot the estimated trend of a time series by computing the mean across one full period
Lesson 3
Decompose time series into trends, seasonal variation, and residuals
- Define smoothing
- Compute the centered moving average for a time series
- Estimate the trend component using moving averages
Plot time series data to visualize trends, seasonal patterns, and potential outliers
- Plot the estimated trend of a time series using a moving average
- Make box plots to examine seasonality
- Interpret the trend and seasonal pattern observed in a time series
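The centered moving average used above to estimate the trend can be sketched as follows (the course itself works in R; this is an illustrative pure-Python version for an even period such as 4 or 12):

```python
def centered_ma(x, period):
    """Centered moving average of even order `period`: average two
    adjacent windows of length `period`, offset by one observation,
    so the result is centered on an actual time point."""
    half = period // 2
    out = []
    for t in range(half, len(x) - half):
        w1 = sum(x[t - half : t + half]) / period
        w2 = sum(x[t - half + 1 : t + half + 1]) / period
        out.append((w1 + w2) / 2)
    return out

# a purely seasonal toy series with period 4: the smoother should
# recover a flat trend equal to the within-period mean (1.5)
series = [float(i % 4) for i in range(12)]
trend = centered_ma(series, 4)
```

Note that `half` observations are lost at each end of the series, which is why the estimated trend is shorter than the original data.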
Lesson 4
Use R to describe key features of time series data
- Import CSV data and convert to tsibble format
Decompose time series into trends, seasonal variation, and residuals
- Implement additive decomposition
- Explain how to remove seasonal variation using an estimate of the seasonal component of a time series
- Compute the estimators of seasonal variation for an additive model
- Calculate the random component for an additive model
- Compute a seasonally-adjusted time series based on an additive model
Lesson 5
Decompose time series into trends, seasonal variation, and residuals
- Explain the differences between additive and multiplicative models
- Implement multiplicative decomposition
- Compute the estimators of seasonal variation for a multiplicative model
- Calculate the random component for a multiplicative model
- Compute a seasonally-adjusted time series based on a multiplicative model
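The seasonal estimators for the additive and multiplicative models differ only in whether the trend is subtracted or divided out. A minimal Python sketch (the course uses R; this ignores the missing trend values at the series ends that a moving-average trend would produce):

```python
def seasonal_estimates(x, trend, period, model="additive"):
    """Average the detrended series within each season, then normalise:
    additive effects sum to zero, multiplicative factors average to one."""
    detr = [(xi - ti) if model == "additive" else (xi / ti)
            for xi, ti in zip(x, trend)]
    by_season = [[] for _ in range(period)]
    for t, d in enumerate(detr):
        by_season[t % period].append(d)
    s = [sum(v) / len(v) for v in by_season]
    m = sum(s) / period
    return [si - m for si in s] if model == "additive" else [si / m for si in s]

trend = [10.0] * 12
x = [9.0, 10.0, 11.0, 10.0] * 3    # toy series with period 4
s_add = seasonal_estimates(x, trend, 4, "additive")
s_mul = seasonal_estimates(x, trend, 4, "multiplicative")
```

Subtracting `s_add` (or dividing by `s_mul`) season by season then yields the seasonally adjusted series.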
Chapter 2
Lesson 1
Compute the key statistics used to describe the linear relationship between two variables
- Compute the sample mean
- Compute the sample variance
- Compute the sample standard deviation
- Compute the sample covariance
- Compute the sample correlation coefficient
- Explain sample covariance using a scatter plot
Interpret the key statistics used to describe sample data
- Interpret the sample mean
- Interpret the sample variance
- Interpret the sample standard deviation
- Interpret the sample covariance
- Interpret the sample correlation coefficient
Lesson 2
Define key terms in time series analysis
- Define the ensemble of a time series
- Define the expected value (or mean function) of a time series model
- Define the sample estimate of the population mean of a time series
- Define the variance function of a time series model
- State the constant variance estimator for a time series model
- Explain the stationarity assumption
- Explain the stationary variance assumption
- Define lag
- Define autocorrelation
- Define the second-order stationary time series
- Explain the autocovariance function in Equation (2.11)
- Explain the lag k autocorrelation function in Equation (2.12)
- Define the autocovariance function, acvf
- Define the sample autocorrelation function, acf
Calculate sample estimates of autocovariance and autocorrelation functions from time series data
- Define the sample autocovariance function, c_k
- Define the sample autocorrelation function, r_k
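The sample estimators c_k and r_k above can be sketched directly from their definitions (in R these come from acf(); this is an illustrative Python version using the usual 1/n divisor):

```python
def acvf(x, k):
    """Sample autocovariance at lag k:
    c_k = (1/n) * sum_{t} (x_t - xbar)(x_{t+k} - xbar)."""
    n = len(x)
    xbar = sum(x) / n
    return sum((x[t] - xbar) * (x[t + k] - xbar) for t in range(n - k)) / n

def acf(x, k):
    """Sample autocorrelation r_k = c_k / c_0."""
    return acvf(x, k) / acvf(x, 0)

# an alternating series is strongly negatively autocorrelated at lag 1
x = [1.0, -1.0] * 10
r1 = acf(x, 1)
```

By construction r_0 = 1, and for this alternating series r_1 is close to -1 (exactly -19/20 with the 1/n divisor).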
Lesson 3
Explain the theoretical implications of autocorrelation for the estimation of time series statistics
- Explain how positive autocorrelation leads to underestimation of variance in short time series
- Explain how negative autocorrelation can improve efficiency of sample mean estimate
Interpret correlograms to identify significant lags, correlations, trends, and seasonality
- Create a correlogram
- Interpret a correlogram
- Define a sampling distribution
- State the sampling distribution of r_k
- Explain the concept of a confidence interval
- Conduct a single hypothesis test using a correlogram
- Describe the problems associated with multiple hypothesis testing in a correlogram
- Differentiate statistical and practical significance
- Diagnose non-stationarity using a correlogram
Chapter 3
Lesson 1
Explain the purpose and limitations of forecasting
- Define lead time
- Define forecasting
- Differentiate causation from correlation
Explain why there is not one correct model to describe a time series
- Explain why there can be several suitable models for a given time series
Use cross-correlation analysis to quantify lead/lag relationships
- Explain forecasting by leading indicators
- Define the population k-lag ccvf
- Define the population k-lag ccf
- Define the sample k-lag ccvf
- Define the sample k-lag ccf
- Estimate a ccf for two time series
- Interpret whether a variable is a leading indicator using a cross-correlogram
Evaluate the limitations of forecasting models based on past trends
- Explain how unexpected future events may invalidate forecast trends
- Avoid over-extrapolation of fitted trends beyond reasonable time horizons
Lesson 2
Implement simple exponential smoothing to estimate local mean levels
- Explain forecasting by extrapolation
- State the assumptions of exponential smoothing
- Define exponential weighted moving average (EWMA)
- State the exponential smoothing forecasting equation
- State the EWMA in geometric series form (in terms of x_t only, Eq. 3.18)
- Explain the EWMA intuitively
- Define the one-step-ahead prediction error (1PE)
- State the SS1PE used to estimate the smoothing parameter of an EWMA
- Indicate when the EWMA smoothing parameter is optimally set as 1/n
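The smoothing recursion and the SS1PE criterion can be sketched in a few lines (the course uses R's HoltWinters() with beta and gamma disabled; this Python version initialises the level at the first observation, one common convention):

```python
def ewma(x, alpha):
    """Simple exponential smoothing: a_t = alpha*x_t + (1-alpha)*a_{t-1},
    initialised at a_1 = x_1.  The forecast of x_{t+1} is a_t."""
    a = [x[0]]
    for xt in x[1:]:
        a.append(alpha * xt + (1 - alpha) * a[-1])
    return a

def ss1pe(x, alpha):
    """Sum of squared one-step-ahead prediction errors (1PE), the
    criterion minimised to choose the smoothing parameter alpha."""
    a = ewma(x, alpha)
    return sum((x[t] - a[t - 1]) ** 2 for t in range(1, len(x)))
```

For a constant series every one-step-ahead forecast is exact, so SS1PE is zero for any alpha; on real data one would minimise `ss1pe` over a grid of alpha values.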
Lesson 3
Implement the Holt-Winters method to forecast time series
- Justify the need for the Holt-Winters method
- Describe how to obtain initial parameters for the Holt-Winters algorithm
- Explain the Holt-Winters update equations for additive decomposition models
- Explain the purpose of the parameters \(\alpha\), \(\beta\), and \(\gamma\)
- Interpret the coefficient estimates \(a_t\), \(b_t\), and \(s_t\) of the Holt-Winters algorithm
- Explain the Holt-Winters forecasting equation for additive decomposition models, Equation (3.22)
Lesson 4
Implement the Holt-Winters method to forecast time series
- Compute the Holt-Winters estimate by hand
- Use HoltWinters() to forecast additive model time series
- Plot the Holt-Winters decomposition of a time series (see Fig 3.10)
- Plot the Holt-Winters fitted values versus the original time series (see Fig 3.11)
- Superimpose plots of the Holt-Winters predictions with the time series realizations (see Fig 3.13)
Lesson 5
Implement the Holt-Winters method to forecast time series
- Explain the Holt-Winters method equations for multiplicative decomposition models
- Explain the purpose of the parameters \(\alpha\), \(\beta\), and \(\gamma\)
- Interpret the coefficient estimates \(a_t\), \(b_t\), and \(s_t\) of the Holt-Winters smoothing algorithm
- Explain the Holt-Winters forecasting equation for multiplicative decomposition models, Equation (3.23)
- Use HoltWinters() to forecast multiplicative model time series
- Plot the Holt-Winters decomposition of a time series (see Fig 3.10)
- Plot the Holt-Winters fitted values versus the original time series (see Fig 3.11)
- Superimpose plots of the Holt-Winters predictions with the time series realizations (see Fig 3.13)
Chapter 4
Lesson 1
Characterize the properties of discrete white noise
- Define residual error
- Define discrete white noise (DWN)
- Define Gaussian white noise
- Simulate Gaussian white noise with R
- Plot DWN simulation results
- State DWN second order properties
- Explain how to estimate (or fit) a DWN process
- State the assumptions needed to categorize residual error series as white noise
Characterize the properties of a random walk
- Define a random walk
Simulate realizations from basic time series models in R
- Simulate a random walk
- Plot a random walk
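The two simulations in this lesson are a one-liner each: Gaussian white noise is a sequence of independent normal draws, and a random walk is its cumulative sum. A Python sketch (in R this would use rnorm() and cumsum()):

```python
import random

random.seed(1)

# Gaussian white noise: independent N(0, 1) draws
w = [random.gauss(0, 1) for _ in range(500)]

# random walk: x_t = x_{t-1} + w_t, i.e. the cumulative sum of the noise
x = []
total = 0.0
for wt in w:
    total += wt
    x.append(total)
```

Plotting `x` against time shows the characteristic wandering behaviour; the value at any time is just the sum of all noise terms so far.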
Lesson 2
Characterize the properties of a random walk
- Define the second order properties of a random walk
- Define the backward shift operator
- Use the backward shift operator to state a random walk as a sequence of white noise realizations
- Define a random walk with drift
Simulate realizations from basic time series models in R
- Simulate a random walk
- Plot a random walk
Fit time series models to data and interpret fitted parameters
- Motivate the need for differencing in time series analysis
- Define the difference operator
- Explain the relationship between the difference operator and the backward shift operator
- Test whether a series is a random walk using first differences
- Explain how to estimate a random walk with increasing slope using Holt-Winters
- Estimate the drift parameter of a random walk
Lesson 3
Characterize the properties of an \(AR(p)\) stochastic process
- Define an \(AR(p)\) stochastic process
- Express an \(AR(p)\) process using the backward shift operator
- State an \(AR(p)\) forecast (or prediction) function
- Identify stationarity of an \(AR(p)\) process using the backward shift operator
- Determine the stationarity of an \(AR(p)\) process using a characteristic equation
Check model adequacy using diagnostic plots like correlograms of residuals
- Characterize a random walk’s second order characteristics using a correlogram
- Define partial autocorrelations
- Explain how to use a partial correlogram to decide what model would be suitable to estimate an \(AR(p)\) process
- Demonstrate the use of partial correlogram via simulation
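The simulation demonstration can be sketched as follows: for an AR(1) process the acf decays geometrically as alpha^k, while the partial autocorrelation cuts off after lag 1. A Python sketch (in R one would use set.seed(), arima.sim(), and pacf(); the lag-2 partial autocorrelation is computed from its closed form for this sketch):

```python
import random

random.seed(7)
alpha = 0.6

# simulate an AR(1): x_t = alpha * x_{t-1} + w_t
x, prev = [], 0.0
for _ in range(5000):
    prev = alpha * prev + random.gauss(0, 1)
    x.append(prev)

def acf(series, k):
    """Sample autocorrelation at lag k (1/n divisor)."""
    n = len(series)
    m = sum(series) / n
    c0 = sum((s - m) ** 2 for s in series) / n
    ck = sum((series[t] - m) * (series[t + k] - m) for t in range(n - k)) / n
    return ck / c0

r1, r2 = acf(x, 1), acf(x, 2)
# lag-2 partial autocorrelation; should be near zero for an AR(1)
pacf2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
```

Here r_1 is near alpha = 0.6 and r_2 near alpha^2 = 0.36, while `pacf2` hovers around zero, which is exactly the cut-off pattern a partial correlogram uses to suggest p = 1.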
Lesson 4
Fit time series models to data and interpret fitted parameters
- Fit an \(AR(p)\) model to simulated data
- Explain the difference between parameters of the data generating process and estimates
- Calculate confidence intervals for AR coefficient estimates
- Interpret AR coefficient estimates in the context of the source and nature of historical data
Check model adequacy using diagnostic plots like correlograms of residuals
- Compare AR fitted models to an underlying data generating process
- Explain the limitations of stochastic model fitting as evidence for or against real-world arguments
Chapter 5
Lesson 1
Explain the difference between stochastic and deterministic trends in time series
- Describe deterministic trends as smooth, predictable changes over time
- Define stochastic trends as random, unpredictable fluctuations
- Explain the different treatment of stochastic and deterministic trends when forecasting
Fit linear regression models to time series data
- Define a linear time series model
- Explain why ordinary linear regression systematically underestimates the standard error of parameter estimates when the error terms are autocorrelated
- Apply generalized least squares (GLS) in R to estimate linear regression model parameters
- Explain how to estimate the autocorrelation input for the GLS algorithm
- Compare GLS and OLS standard error estimates to evaluate autocorrelation bias
- Identify an appropriate function to model the trend in a given time series
- Represent seasonal factors in a regression model using indicator variables
- Fit a linear model for a simulated time series with linear time trend and \(AR(p)\) error
- Use acf and pacf to test for autocorrelation in the residuals
- Estimate a seasonal indicator model using GLS
- Forecast using a fitted GLS model with seasonal indicator variables
Apply differencing to nonstationary time series
- Transform a non-stationary series with a linear trend into a stationary process using differencing
- State how to remove a polynomial trend of order \(m\)
Simulate time series
- Simulate a time series with a linear time trend and an \(AR(p)\) error
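The simulation outcome above can be sketched in Python (the course uses R; all parameter values here are hypothetical). The OLS slope estimate remains roughly unbiased despite the AR(1) errors, but its naive standard error would be too small, which is the motivation for GLS:

```python
import random

random.seed(3)
beta0, beta1, alpha = 2.0, 0.05, 0.7    # illustrative parameter choices

# x_t = beta0 + beta1*t + z_t, with AR(1) errors z_t = alpha*z_{t-1} + w_t
z, x = 0.0, []
for t in range(1, 301):
    z = alpha * z + random.gauss(0, 1)
    x.append(beta0 + beta1 * t + z)

# OLS slope via the closed form cov(t, x) / var(t)
n = len(x)
ts = list(range(1, n + 1))
tbar, xbar = sum(ts) / n, sum(x) / n
slope_hat = (sum((ti - tbar) * (xi - xbar) for ti, xi in zip(ts, x))
             / sum((ti - tbar) ** 2 for ti in ts))
```

With 300 observations `slope_hat` lands close to the true slope 0.05; the point of the GLS comparison in this lesson is the standard error, not the point estimate.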
Lesson 2
Fit linear regression models to time series data
- Describe a Fourier series
- Explain how a few terms in a Fourier series can be used to fit a seasonal component
- Motivate the use of the harmonic seasonal model
- Represent seasonal factors using harmonic seasonal terms
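Building the harmonic regressors is mechanical: each harmonic i contributes a sine and a cosine term at frequency i cycles per period. A Python sketch of the regressor construction (function name illustrative):

```python
import math

def harmonics(t, n_terms, period=12):
    """Harmonic seasonal regressors at time t: the pair
    sin(2*pi*i*t/period), cos(2*pi*i*t/period) for i = 1..n_terms."""
    out = []
    for i in range(1, n_terms + 1):
        out.append(math.sin(2 * math.pi * i * t / period))
        out.append(math.cos(2 * math.pi * i * t / period))
    return out
```

Each of the `2 * n_terms` columns enters a regression with its own coefficient; a parsimonious model keeps only the harmonics whose coefficients are significant, rather than all period-1 indicator variables.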
Lesson 3
Fit linear regression models to time series data
- State the additive model with harmonic seasonal component
- Simulate a time series with harmonic seasonal components
- Identify an appropriate function to model the trend in a given time series
- Identify a parsimonious set of harmonic terms for use in a regression model
- Fit the additive model with harmonic seasonal component to real-world data
- Evaluate residuals using a correlogram and partial correlogram to ensure they meet the assumptions
Apply model selection criteria
- Use AIC to aid in model selection
Lesson 4
Apply logarithmic transformations to time series
- Explain when to use a log-transformation
- Estimate a harmonic seasonal model using GLS with a log-transformed series
- Explain how to use logarithms to linearize certain non-linear trends
Apply non-linear models to time series
- Explain when to use non-linear models
- Simulate a time series with an exponential trend
- Fit a time series model with an exponential trend
Lesson 5
Apply logarithmic transformations to time series
- Apply a log-transformation to a multiplicative time series
Apply the bias correction factor for inverse transformations
- State the bias correction procedure for log-transform estimates
- Explain when to use the bias correction factor
- Use the bias correction factor for a log-transform model
- Forecast using the inverse-transform and bias correction of a log-transformed model
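The bias correction for inverse-transforming a log-scale forecast follows from the lognormal mean: if the log-scale errors are normal with variance sigma^2, naively exponentiating the log-scale prediction underestimates the mean by the factor exp(sigma^2 / 2). A sketch with hypothetical values:

```python
import math

# log-scale forecast and residual variance (illustrative values)
log_pred = 2.0
sigma2 = 0.25    # variance of the log-scale residuals

naive = math.exp(log_pred)                     # biased low on average
corrected = math.exp(log_pred + sigma2 / 2)    # lognormal mean correction
```

In practice sigma^2 is estimated from the residuals of the fitted log-scale model, and the correction matters most when the residual variance is large.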
Chapter 6
Lesson 1
Characterize the properties of moving average (MA) models
- Define a moving average (MA) process
- Write an MA(q) model in terms of the backward shift operator
- State the mean and variance of an MA(q) process
- Explain the autocorrelation function of an MA(q) process
- Define an invertible MA process
Fit time series models to data and interpret fitted parameters
- Determine an appropriate MA(q) model to fit to a time series based on the ACF plot
- Fit an MA(q) model to data in R using the arima() function
- Assess model fit by examining residual diagnostic plots
- Interpret the fitted MA coefficients
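The ACF-based identification of an MA(q) model rests on a key fact: the theoretical acf of an MA(q) cuts off after lag q. A simulation sketch for MA(1), where rho_1 = beta / (1 + beta^2) and rho_k = 0 for k > 1 (in R this would use arima.sim() and acf(); beta = 0.8 is illustrative):

```python
import random

random.seed(11)
beta = 0.8

# simulate an MA(1): x_t = w_t + beta * w_{t-1}
w = [random.gauss(0, 1) for _ in range(5001)]
x = [w[t] + beta * w[t - 1] for t in range(1, len(w))]

def acf(series, k):
    """Sample autocorrelation at lag k (1/n divisor)."""
    n = len(series)
    m = sum(series) / n
    c0 = sum((s - m) ** 2 for s in series) / n
    ck = sum((series[t] - m) * (series[t + k] - m) for t in range(n - k)) / n
    return ck / c0

# theory: rho_1 = 0.8 / 1.64 ~= 0.488, rho_2 = 0
r1, r2 = acf(x, 1), acf(x, 2)
```

The sharp drop to zero after lag 1 in the correlogram is what suggests fitting an MA(1) with arima() rather than an AR model.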
Lesson 2
Define autoregressive moving average (ARMA) models
- Write the equation for an ARMA(p,q) model
- Express an ARMA model in terms of the backward shift operators for the AR and MA components
- State facts about ARMA processes related to stationarity, invertibility, special cases, parsimony, and parameter redundancy
- Use ACF and PACF plots to determine if an AR, MA or ARMA model is appropriate for a time series
Apply an iterative time series modeling process
- Fit a regression model to capture trend and seasonality
- Examine residual diagnostic plots to assess autocorrelation
- Fit an ARMA model to the residuals if needed
- Check the residuals of the ARMA model for white noise
- Forecast the original series by combining the regression and ARMA model forecasts
Chapter 7
Lesson 1
Explain the concept of non-stationarity in time series
- Define non-stationarity and its implications for time series analysis
- Identify non-stationary behavior in time series plots
Apply differencing to remove non-stationarity
- Explain the concept of differencing and its role in removing non-stationarity
- Use differencing to transform non-stationary time series into stationary ones
- Interpret the results of differencing on time series plots and ACF/PACF
Identify integrated models and ARIMA notation
- Define integrated models and their relationship to differencing
- Explain the ARIMA notation and its components (p, d, q)
- Recognize the role of the ‘d’ parameter in ARIMA models
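The 'd' parameter counts applications of the difference operator (1 - B); d-th differencing removes a polynomial trend of order d. A small Python sketch (in R this is diff(x, differences = d)):

```python
def diff(x, d=1):
    """Apply the difference operator d times: (1 - B)^d applied to x."""
    for _ in range(d):
        x = [x[t] - x[t - 1] for t in range(1, len(x))]
    return x

# a deterministic quadratic trend is reduced to a constant by d = 2,
# losing one observation per differencing pass
quad = [0.5 * t * t for t in range(10)]
d2 = diff(quad, 2)
```

An ARIMA(p, d, q) model is then just an ARMA(p, q) model fitted to the d-times-differenced series.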
Lesson 2
Identify seasonal ARIMA models
- Define seasonal ARIMA models and their notation (p, d, q)(P, D, Q)[m]
- Identify the need for seasonal ARIMA models in time series with seasonal patterns
Apply the fitting procedure for seasonal ARIMA models
- Describe the steps involved in fitting seasonal ARIMA models
- Determine the appropriate order of differencing (d and D) based on ACF/PACF plots
- Select the order of AR and MA terms (p, q, P, Q) using ACF/PACF plots and model selection criteria
Fit seasonal ARIMA models to time series data using R
- Use R to fit seasonal ARIMA models
- Interpret the output for the ARIMA models, including coefficients and model diagnostics
- Forecast future values using the fitted seasonal ARIMA model
Lesson 3
Explain the concept of volatility in time series
- Define volatility and its importance in financial and climate time series
- Identify patterns of volatility in time series plots
Interpret ARCH and GARCH models
- Define ARCH models and their extensions, including GARCH models
- Explain the role of ARCH/GARCH models in capturing time-varying volatility
Simulate and fit GARCH models using R
- Simulate GARCH processes using R
- Fit GARCH models to time series data using R
- Interpret the R output for GARCH models, including coefficients and model diagnostics
Apply GARCH modeling to real-world time series
- Use GARCH models to analyze volatility in financial time series
- Apply GARCH models to climate time series to understand changing variability
- Incorporate GARCH models into forecasts and simulations for improved accuracy
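The GARCH(1,1) recursion itself is short enough to sketch directly (the course fits these models with R packages; coefficient values here are hypothetical, chosen so the unconditional variance a0/(1 - a1 - b1) equals 1):

```python
import math
import random

random.seed(5)
a0, a1, b1 = 0.1, 0.2, 0.7    # illustrative GARCH(1,1) coefficients

# GARCH(1,1): x_t = sqrt(h_t) * w_t,  h_t = a0 + a1*x_{t-1}^2 + b1*h_{t-1}
h_prev = a0 / (1 - a1 - b1)   # start at the unconditional variance (1.0 here)
x_prev = 0.0
xs = []
for _ in range(20000):
    h = a0 + a1 * x_prev ** 2 + b1 * h_prev
    x_t = math.sqrt(h) * random.gauss(0, 1)
    xs.append(x_t)
    h_prev, x_prev = h, x_t

# the sample variance should be near the unconditional variance
var_hat = sum(v * v for v in xs) / len(xs)
```

The simulated series shows the volatility clustering this lesson describes: quiet stretches interrupted by bursts of large movements, even though the unconditional variance is constant.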