Definitions

Chapter 1

  • Sampling interval: The fixed interval at which a variable is measured sequentially in time.
  • Discrete-time stochastic process: A sequence of random variables defined at fixed sampling intervals.
  • Serially dependent: Observations close together in time tend to be correlated.
  • Seasonal variation: A repeating pattern within each year, although the term is applied more generally to repeating patterns within any fixed period, such as restaurant bookings on different days of the week.
  • Multiple time series: Time series of several different variables observed over the same period, which can be analysed together.
  • Stochastic trends: Trends that seem to change direction at unpredictable times rather than displaying relatively consistent patterns.
  • Random walk: A mathematical model used to describe data with stochastic trends (see the sketch after this list).
  • Price index: The ratio of the cost of a basket of goods now to its cost in some base year.
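
A minimal sketch in R (not from the text) of the random walk idea above: it assumes only base R and an illustrative step distribution, and builds a random walk by accumulating independent normal steps, which produces the kind of stochastic trend described in this chapter.

```r
## Minimal sketch (assumptions: base R only, illustrative step distribution).
set.seed(1)                 # for reproducibility
w <- rnorm(1000)            # independent steps (discrete white noise)
x <- cumsum(w)              # random walk: x_t = x_{t-1} + w_t
plot(ts(x), ylab = "x", main = "Simulated random walk")
```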

Chapter 2

  • Asymptotically: As the sample size approaches infinity.
  • Ensemble mean: The average taken across all the possible time series that might have been produced.
  • Ergodic (in the mean): A model that is stationary in the mean is ergodic in the mean if the time average for a single time series tends to the ensemble mean as the length of the series increases.
  • Second-order properties: Include the mean, variance, and serial correlation.
  • Lag: The number of time steps separating the two observations being compared.
  • Autocorrelation: A correlation of a variable with itself at different times. Also known as serial correlation.
  • Autocovariance function (acvf): \(\gamma_{k} = E[(x_t - \mu)(x_{t+k} - \mu)]\)
  • Autocorrelation function (acf): \(\rho_k = \frac{\gamma_{k}}{\sigma^2}\)
  • Sample acf: \(r_k = \frac{c_k}{c_0}\), where \(c_k\) is the sample autocovariance at lag k and \(c_0\) is the sample variance.
  • Correlogram: A plot of \(r_k\) against k. The x-axis gives the lag (k) and the y-axis gives the autocorrelation (\(r_k\)) at each lag (see the sketch after this list).
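
As a concrete illustration of the sample acf and correlogram, here is a minimal R sketch (not from the text). The simulated AR(1) series and the choice of six lags are assumptions made for the example; the manual calculation of \(r_k = c_k / c_0\) is compared with base R's acf(), which draws the correlogram.

```r
## Minimal sketch (assumptions: simulated AR(1) data, base R only).
set.seed(1)
x <- as.numeric(arima.sim(model = list(ar = 0.7), n = 200))  # serially dependent series
n <- length(x); xbar <- mean(x)

# sample autocovariance c_k = (1/n) * sum_{t=1}^{n-k} (x_t - xbar)(x_{t+k} - xbar)
c_k <- function(k) sum((x[1:(n - k)] - xbar) * (x[(1 + k):n] - xbar)) / n
r_k <- sapply(0:5, c_k) / c_k(0)   # sample acf r_k = c_k / c_0
round(r_k, 3)

acf(x)                             # correlogram: r_k plotted against lag k
```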

Chapter 3

  • Leading variables: Variables whose changes tend to precede (lead) changes in another variable, making them useful for forecasting that variable.
  • Cross-correlation function (ccf): The correlation between two series as a function of the lag k; \(\rho_k(x, y) = \frac{\gamma_k(x, y)}{\sigma_x \sigma_y}\), where \(\gamma_k(x, y) = E[(x_{t+k} - \mu_x)(y_t - \mu_y)]\) is the cross-covariance function.
  • Cross-correlogram: A plot of the sample cross-correlation function against the lag k.
  • Second-order stationary: A model whose mean and variance are constant in time and whose autocovariance depends only on the lag k, not on the absolute time t.
  • Exponential smoothing: A forecasting procedure in which the estimate of the level at time t is a weighted average of the current observation and the previous estimate: \(a_t = \alpha x_t + (1 - \alpha) a_{t-1}\), with smoothing parameter \(0 < \alpha < 1\).
  • Holt-Winters: An extension of exponential smoothing that also updates estimates of the trend (slope) and seasonal effects, each with its own smoothing parameter (see the sketch after this list).
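
A minimal R sketch (not from the text) tying a few of these terms together. The simulated leading/following pair and the use of the built-in AirPassengers data are assumptions made for illustration; ccf() draws the cross-correlogram and HoltWinters() fits level, trend and seasonal components.

```r
## Minimal sketch (assumptions: simulated data, built-in AirPassengers series).
set.seed(1)
x <- rnorm(120)                                     # a leading variable
y <- c(rep(0, 2), x[1:118]) + rnorm(120, sd = 0.5)  # y follows x two steps later
ccf(x, y)   # cross-correlogram: a spike at the lag corresponding to the two-step lead

hw <- HoltWinters(AirPassengers, seasonal = "multiplicative")
plot(hw)    # observed series with the Holt-Winters fitted values
```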

Chapter 4

  • Autoregressive process (AR(p)): A process in which the current value is a linear combination of the p most recent past values plus white noise: \(x_t = \alpha_1 x_{t-1} + \dots + \alpha_p x_{t-p} + w_t\).
  • White noise: Used to refer to series that contained all frequencies in equal proportions, analogous to white light.
  • Purely random: Another term for white noise.
  • Random walk: A fundamental non-stationary model based on discrete white noise.
  • Discrete white noise: A time series with variables \(w_1, w_2, \dots, w_n\) that are independent and identically distributed with a mean of zero.
  • Synthetic series: Time series simulated using a model.
  • Bootstrapping: Simulations used to generate plausible future scenarios and to construct confidence intervals for model parameters (see the sketch after this list).
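
A minimal R sketch (not from the text) illustrating discrete white noise, a synthetic series, and a simple parametric bootstrap of a model parameter. The AR(1) model, its parameter value, and the number of bootstrap replicates are assumptions chosen only for the example.

```r
## Minimal sketch (assumptions: AR(1) model with ar = 0.6, 200 bootstrap replicates).
set.seed(1)
w <- rnorm(200)                                    # discrete (Gaussian) white noise
acf(w)                                             # no significant autocorrelation expected

x <- arima.sim(model = list(ar = 0.6), n = 200)    # synthetic series simulated from a model
fit <- ar(x, order.max = 1, aic = FALSE)           # fit an AR(1) to the synthetic series

# Parametric bootstrap: simulate from the fitted model, refit, and collect the estimates.
boot.ar <- replicate(200, {
  xb <- arima.sim(model = list(ar = fit$ar), n = 200, sd = sqrt(fit$var.pred))
  ar(xb, order.max = 1, aic = FALSE)$ar
})
quantile(boot.ar, c(0.025, 0.975))                 # approximate 95% interval for the AR coefficient
```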

Chapter 5

  • Serially correlated: Correlated with earlier values of the same series; in regression, serially correlated errors violate the independence assumption of ordinary least squares, so the usual standard errors can be misleading.
  • Generalized Least Squares (GLS): A fitting procedure essentially based on maximising the likelihood given the autocorrelation in the data; it is implemented in R in the gls() function (within the nlme library). See the sketch after this list.
  • Time series forecast: A forecast from a regression model can be thought of as an expected value conditional on past trends continuing into the future.
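
A minimal sketch (not from the text) of a GLS fit with the gls() function from the nlme library mentioned above. The simulated trend-plus-AR(1)-error data, the corAR1 error structure, and the forecast horizon are assumptions made for the illustration.

```r
## Minimal sketch (assumptions: simulated data, AR(1) error structure, 10-step horizon).
library(nlme)
set.seed(1)
dat <- data.frame(t = 1:100)
dat$y <- 2 + 0.05 * dat$t +
  as.numeric(arima.sim(model = list(ar = 0.6), n = 100))  # linear trend + AR(1) errors

fit <- gls(y ~ t, data = dat, correlation = corAR1(form = ~ t))
summary(fit)                                   # coefficients and the estimated AR(1) parameter

# A forecast here is the expected value assuming the fitted trend continues.
predict(fit, newdata = data.frame(t = 101:110))
```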

Chapter 6

  • Strictly stationary: A model in which the joint distribution of any set of observations is unchanged by a shift in time, so all statistical properties, not just the mean and variance, are time-invariant.
  • Second-order stationary: A model with constant mean and variance whose autocovariance depends only on the lag k (a weaker condition than strict stationarity).
  • Moving average (MA): A process of order q in which the current value is a linear combination of the current and the q most recent white noise terms: \(x_t = w_t + \beta_1 w_{t-1} + \dots + \beta_q w_{t-q}\).
  • ARMA: An autoregressive moving average process, combining an AR(p) part (a regression on the p most recent values of the series) with an MA(q) part (a linear combination of the q most recent white noise terms); see the sketch after this list.
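
A minimal R sketch (not from the text) simulating an MA(1) and an ARMA(1, 1) series and fitting the ARMA model with arima(). The parameter values and series length are assumptions chosen only for illustration.

```r
## Minimal sketch (assumptions: illustrative parameter values, n = 200).
set.seed(1)
x.ma   <- arima.sim(model = list(ma = 0.8), n = 200)            # MA(1): x_t = w_t + 0.8 w_{t-1}
x.arma <- arima.sim(model = list(ar = 0.5, ma = 0.3), n = 200)  # ARMA(1, 1)

acf(x.ma)                                # for an MA(1), the acf cuts off after lag 1
fit <- arima(x.arma, order = c(1, 0, 1)) # fit an ARMA(1, 1) to the simulated series
fit                                      # estimated ar1 and ma1 coefficients
```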