Autocorrelation (also called serial correlation) measures how a time series is related to a lagged version of itself. It helps identify repeating patterns, seasonality, or persistence in data.
Formally, for a time series $x_1, x_2, \dots, x_T$ with mean $\bar{x}$, the autocorrelation at lag $k$ is:

$$\rho_k = \frac{\sum_{t=k+1}^{T} (x_t - \bar{x})(x_{t-k} - \bar{x})}{\sum_{t=1}^{T} (x_t - \bar{x})^2}$$
It takes values between $-1$ and $+1$:
- $\rho_k > 0$: Positive correlation → if $x_t$ is above its mean, $x_{t-k}$ tends to be above its mean too.
- $\rho_k < 0$: Negative correlation → if $x_t$ is above its mean, $x_{t-k}$ tends to be below its mean.
- $\rho_k = 0$: No linear relationship between $x_t$ and its past at lag $k$.
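As a concrete illustration, here is a minimal NumPy sketch of the sample autocorrelation at a given lag, following the formula above. The function name `autocorr` and the AR(1)-style test series are illustrative choices, not part of the original note.

```python
import numpy as np

def autocorr(x, k):
    """Sample autocorrelation of series x at lag k (0 <= k < len(x))."""
    x = np.asarray(x, dtype=float)
    x_mean = x.mean()
    # Numerator: co-movement between the series and its lag-k shifted copy
    num = np.sum((x[k:] - x_mean) * (x[:len(x) - k] - x_mean))
    # Denominator: total squared deviation of the series from its mean
    den = np.sum((x - x_mean) ** 2)
    return num / den

# Quick check on a persistent (AR(1)-like) series: lag-1 autocorrelation is high
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.8 * x[t - 1] + rng.normal()
print(autocorr(x, 1))   # close to 0.8
print(autocorr(x, 10))  # much smaller
```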
Intuition
Autocorrelation tells you how predictable a series is from its past:
- Stock returns: low/no autocorrelation (mostly random).
- Daily temperatures: high autocorrelation at lag 1 (yesterday’s temperature is a good predictor of today’s).
- Strong seasonal effects: autocorrelation spikes at seasonal lags (e.g., 7 days for weekly seasonality).
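To make the seasonal-spike case concrete, the sketch below builds a synthetic daily series with a weekly (period-7) cycle plus noise and inspects its ACF. It assumes statsmodels is available; the series itself is made up purely for the example.

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(42)
days = np.arange(365)
# Synthetic daily series: weekly cycle plus noise
y = 10 * np.sin(2 * np.pi * days / 7) + rng.normal(scale=3, size=days.size)

rho = acf(y, nlags=14)
for lag in (1, 7, 14):
    print(f"lag {lag:>2}: {rho[lag]:+.2f}")
# The autocorrelation peaks at lags 7 and 14, revealing the weekly seasonality.
```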
Why it matters
- Used to detect the patterns covered in Decomposition in Time Series & Stationary Time Series: slowly decaying autocorrelation is a common sign of non-stationarity.
- Helps decide the AR and MA orders in ARIMA.
- Checked via Autocorrelation Function (ACF) and Partial Autocorrelation Function (PACF) plots (ACF, PACF Plots) to select which lags to include.
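A minimal plotting sketch, assuming statsmodels and matplotlib are installed; the simulated AR(2)-style series is only a placeholder for your own data.

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(1)
# Placeholder series: an AR(2)-style process whose ACF decays gradually
x = np.zeros(300)
for t in range(2, 300):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

fig, axes = plt.subplots(1, 2, figsize=(10, 3))
plot_acf(x, lags=30, ax=axes[0])    # ACF: overall persistence / candidate MA order
plot_pacf(x, lags=30, ax=axes[1])   # PACF: cuts off after lag 2, suggesting AR(2)
plt.tight_layout()
plt.show()
```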
Hurst exponents
- The Hurst exponent ($H$) is a measure used to characterize the long-term memory of a time series.
- It helps to determine the presence of autocorrelation or persistence in the data.
- The goal of the Hurst exponent is to provide a single scalar value that helps identify whether a series is
    - random walking ($H \approx 0.5$),
    - trending ($H > 0.5$),
    - mean reverting ($H < 0.5$).
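A quick, rough way to estimate $H$ is the lagged-differences scaling method sketched below; the function name `hurst_exponent`, the lag range, and the synthetic test series are illustrative choices rather than anything from the original note.

```python
import numpy as np

def hurst_exponent(ts, max_lag=100):
    """Estimate H from how the std. dev. of lagged differences scales with the lag:
    std(x[t + lag] - x[t]) ~ lag ** H."""
    ts = np.asarray(ts, dtype=float)
    lags = np.arange(2, max_lag)
    tau = [np.std(ts[lag:] - ts[:-lag]) for lag in lags]
    # Slope of log(tau) versus log(lag) is the Hurst estimate
    H, _ = np.polyfit(np.log(lags), np.log(tau), 1)
    return H

rng = np.random.default_rng(7)
walk = np.cumsum(rng.normal(size=5000))   # random walk
noise = rng.normal(size=5000)             # white noise is strongly mean reverting
print(f"random walk:    H ≈ {hurst_exponent(walk):.2f}")   # close to 0.5
print(f"mean reverting: H ≈ {hurst_exponent(noise):.2f}")  # well below 0.5
```

Note that the estimate depends on the chosen lag range, so in practice it is better treated as a diagnostic of regime (random walk, trending, mean reverting) than as a precise value.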