Time series - Modeling Approaches
Understand the spectrum of time‑series modeling approaches, including classical linear, fractional and time‑varying, vector/exogenous, non‑linear/heteroskedastic, and model‑free machine‑learning methods.
Summary
Types of Time Series Models
Time series models form the foundation of forecasting and analyzing data that changes over time. Understanding when and how to use different models is essential for practitioners. This section covers the main model types, organized from simpler linear approaches to more complex frameworks.
Classical Linear Models
Classical linear models represent the primary tools for time series analysis. Each builds upon the others to handle different data characteristics.
Autoregressive (AR) Models
An autoregressive model expresses the current value as a linear combination of its own past values. Think of it as the series "remembering" its recent history:
$$y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \varepsilon_t$$
Here, $y_t$ depends on $p$ previous values of itself, with coefficients $\phi_1, \phi_2, \ldots, \phi_p$, plus random error $\varepsilon_t$. This is denoted AR($p$).
Why use it? AR models work well when the current value has direct memory of past values—like stock prices or temperature readings, which tend to persist near their recent levels.
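The AR idea can be made concrete with a short NumPy sketch: simulate an AR(2) process with known coefficients, then recover them by ordinary least squares on the lagged values. The coefficient values and random seed here are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: y_t = 0.6*y_{t-1} + 0.2*y_{t-2} + eps_t
phi = np.array([0.6, 0.2])
n = 5000
y = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(2, n):
    y[t] = phi[0] * y[t - 1] + phi[1] * y[t - 2] + eps[t]

# Estimate the coefficients by regressing y_t on y_{t-1} and y_{t-2}
X = np.column_stack([y[1:-1], y[:-2]])   # columns: y_{t-1}, y_{t-2}
phi_hat, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
print(phi_hat)  # roughly [0.6, 0.2]
```

With enough data, the least-squares estimates land close to the true coefficients, which is the sense in which the series "remembers" its recent history.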
Moving-Average (MA) Models
A moving-average model expresses the current value as a linear combination of past random shocks (errors), not past values:
$$y_t = \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \cdots + \theta_q \varepsilon_{t-q}$$
This is denoted MA($q$), where $q$ is the number of past shocks included. The coefficients $\theta_1, \theta_2, \ldots, \theta_q$ weight how much each past shock influences the current value.
Why use it? MA models work well when shocks have temporary impacts that fade quickly—like weather disruptions affecting production for a few periods, then disappearing.
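The "shocks fade quickly" property is visible in the autocorrelations. A minimal NumPy sketch (with an arbitrary $\theta = 0.8$): for an MA(1) process the lag-1 autocorrelation is $\theta/(1+\theta^2)$, and autocorrelations at lags beyond 1 vanish.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an MA(1) process: y_t = eps_t + 0.8*eps_{t-1}
theta = 0.8
eps = rng.standard_normal(100_000)
y = eps[1:] + theta * eps[:-1]

def autocorr(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

# Theory: lag-1 autocorrelation = theta / (1 + theta^2) ~ 0.488,
# while lag-2 and beyond are ~ 0: the shock persists only one period.
print(autocorr(y, 1), autocorr(y, 2))
```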
Differencing and Integration
Real-world data often exhibits trends—a persistent upward or downward movement—making it nonstationary. A stationary series fluctuates around a constant mean with constant variance, which is crucial for classical time series methods to work properly.
Differencing removes trends by computing changes between consecutive periods:
$$\Delta y_t = y_t - y_{t-1}$$
When a series becomes stationary only after differencing $d$ times, it is called integrated of order $d$, or I($d$). A series that is stationary without differencing is I(0), while a series that requires one difference is I(1).
Why care about this? Using non-stationary data directly can produce spurious correlations and unreliable forecasts. Differencing stabilizes the data.
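A quick NumPy illustration of the I(1) case: a random walk is non-stationary, but one difference recovers exactly the stationary shocks that drive it.

```python
import numpy as np

rng = np.random.default_rng(2)

# A random walk y_t = y_{t-1} + eps_t is I(1): it wanders, with no fixed mean
steps = rng.standard_normal(10_000)
walk = np.cumsum(steps)

# First difference: delta_y_t = y_t - y_{t-1}
diff = np.diff(walk)

# Differencing undoes the accumulation and returns the stationary shocks
print(np.allclose(diff, steps[1:]))  # True
```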
Autoregressive Moving-Average (ARMA) Models
ARMA models combine autoregressive and moving-average components:
$$y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}$$
Denoted ARMA($p$, $q$), these models capture both the memory of past values (AR part) and the effect of past shocks (MA part). ARMA models often require fewer coefficients to fit data than AR or MA alone.
Autoregressive Integrated Moving-Average (ARIMA) Models
ARIMA models integrate differencing with ARMA components, denoted ARIMA($p$, $d$, $q$). The process works in stages:
Difference the series $d$ times to achieve stationarity
Apply an ARMA($p$, $q$) model to the differenced data
For example, ARIMA(1,1,1) means: difference once, then fit an ARMA(1, 1) model to the differenced series. ARIMA is extremely flexible and underlies many practical forecasting systems.
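The two-stage recipe can be sketched in NumPy. For simplicity this sketch uses an ARIMA(1,1,0) series (no MA term), which keeps the second stage a plain AR(1) least-squares fit; the coefficient 0.5 is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate ARIMA(1,1,0): the first difference follows AR(1) with phi = 0.5
n = 5000
eps = rng.standard_normal(n)
dy = np.zeros(n)
for t in range(1, n):
    dy[t] = 0.5 * dy[t - 1] + eps[t]
y = np.cumsum(dy)                      # integrate once (d = 1)

# Stage 1: difference once to recover a stationary series
d1 = np.diff(y)

# Stage 2: fit AR(1) to the differenced data by least squares
phi_hat = np.dot(d1[:-1], d1[1:]) / np.dot(d1[:-1], d1[:-1])

# Forecast the next difference, then undo the differencing
forecast = y[-1] + phi_hat * d1[-1]
print(phi_hat)  # roughly 0.5
```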
Fractionally Integrated Models
Classical integration (I($d$)) requires $d$ to be a non-negative integer. However, Autoregressive Fractionally Integrated Moving-Average (ARFIMA) models allow $d$ to be any real number, often between 0 and 1.
<extrainfo>
ARFIMA models capture "long memory"—where shocks have very persistent but gradually fading effects. For example, if a virus spreads through a population, its impact decays slowly over many periods. Standard ARIMA models assume shocks eventually die out; ARFIMA better captures slow decay. However, fractionally integrated models are more specialized and typically covered only in advanced time series courses.
</extrainfo>
Vector Autoregression (VAR) Models
So far, we've considered single time series. Vector autoregression models extend AR concepts to multiple series that may influence each other:
$$\begin{pmatrix} y_{1,t} \\ y_{2,t} \\ \vdots \\ y_{n,t} \end{pmatrix} = \Phi_1 \begin{pmatrix} y_{1,t-1} \\ y_{2,t-1} \\ \vdots \\ y_{n,t-1} \end{pmatrix} + \cdots + \Phi_p \begin{pmatrix} y_{1,t-p} \\ y_{2,t-p} \\ \vdots \\ y_{n,t-p} \end{pmatrix} + \begin{pmatrix} \varepsilon_{1,t} \\ \varepsilon_{2,t} \\ \vdots \\ \varepsilon_{n,t} \end{pmatrix}$$
Here, each series depends on its own past values and the past values of all other series. The matrices $\Phi_1, \ldots, \Phi_p$ capture these cross-dependencies.
Why use it? Economic and financial systems are interconnected. For instance, interest rates, inflation, and unemployment all affect each other. VAR models let you capture these relationships and trace how shocks spread through the system.
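A small NumPy sketch of a bivariate VAR(1): simulate two interdependent series with a known coefficient matrix $\Phi_1$, then recover it by least squares. The matrix entries are arbitrary illustrative values (chosen so the system is stable).

```python
import numpy as np

rng = np.random.default_rng(4)

# Bivariate VAR(1): each series depends on the past of both series
Phi = np.array([[0.5, 0.2],
                [0.1, 0.4]])
n = 20_000
Y = np.zeros((n, 2))
for t in range(1, n):
    Y[t] = Phi @ Y[t - 1] + rng.standard_normal(2)

# Estimate Phi by regressing Y_t on Y_{t-1}:
# lstsq solves Y[:-1] @ B ~ Y[1:], so Phi is the transpose of B
B, *_ = np.linalg.lstsq(Y[:-1], Y[1:], rcond=None)
Phi_hat = B.T
print(np.round(Phi_hat, 2))
```

The off-diagonal entries of `Phi_hat` are exactly the cross-dependencies a univariate AR model cannot see.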
Models with Exogenous Inputs
Often, external variables affect your primary series without being affected by it. Autoregressive models with exogenous inputs (ARX) include these external series:
$$y_t = \phi_1 y_{t-1} + \cdots + \phi_p y_{t-p} + \beta_0 x_t + \beta_1 x_{t-1} + \cdots + \varepsilon_t$$
Here, $x_t$ is an exogenous (external) variable that influences $y_t$ but is determined outside the model. For example, a retailer's sales might depend on its own past sales (AR part) and also on advertising spending (exogenous input).
ARIMAX models combine this idea with differencing and moving-average components for greater flexibility.
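An ARX(1) fit can be sketched in a few lines of NumPy: simulate a series driven by its own past and by an external input (think of `x` as a stand-in for something like ad spend), then estimate both effects jointly. The coefficients 0.6 and 1.5 are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(5)

# ARX(1): y_t = 0.6*y_{t-1} + 1.5*x_t + eps_t, with x determined outside the model
n = 5000
x = rng.standard_normal(n)              # exogenous input
y = np.zeros(n)
eps = 0.5 * rng.standard_normal(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 1.5 * x[t] + eps[t]

# Jointly estimate the AR coefficient and the exogenous effect by least squares
X = np.column_stack([y[:-1], x[1:]])    # columns: y_{t-1}, x_t
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(coef)  # roughly [0.6, 1.5]
```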
Non-Linear and Heteroskedastic Models
Real data often violates the assumptions of classical linear models.
Non-Linear Autoregressive Exogenous (NARX) Models
NARX models extend ARX frameworks to capture non-linear relationships. Instead of a linear combination, they use non-linear functions:
$$y_t = f(y_{t-1}, y_{t-2}, \ldots, x_t, x_{t-1}, \ldots) + \varepsilon_t$$
where $f$ could be a polynomial, neural network, or other non-linear function.
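As a concrete (and deliberately simple) sketch, here $f$ is chosen to be a bounded non-linearity, $f = 0.5\tanh(y_{t-1}) + 0.8 x_t$; the non-linear term is then treated as a hand-built feature in a least-squares fit. In practice $f$ is usually unknown and approximated by a richer function class such as a neural network.

```python
import numpy as np

rng = np.random.default_rng(6)

# NARX sketch with a known non-linear f: y_t = 0.5*tanh(y_{t-1}) + 0.8*x_t + eps_t
n = 5000
x = rng.standard_normal(n)
y = np.zeros(n)
eps = 0.1 * rng.standard_normal(n)
for t in range(1, n):
    y[t] = 0.5 * np.tanh(y[t - 1]) + 0.8 * x[t] + eps[t]

# Fit by least squares on the non-linear feature tanh(y_{t-1}) and the input x_t
X = np.column_stack([np.tanh(y[:-1]), x[1:]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(np.round(coef, 2))
```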
<extrainfo>
Common non-linear transformations (square root, exponential, linear, hyperbolic, parabolic, logistic) illustrate the variety of shapes a relationship between variables can take. NARX models can capture such relationships but are more computationally intensive than linear models.
</extrainfo>
Autoregressive Conditional Heteroskedasticity (ARCH/GARCH) Models
Classical time series models assume constant variance (homoskedasticity)—the scatter around the trend stays roughly the same over time. However, financial returns and other real data often show heteroskedasticity—variance that changes over time, with periods of high and low volatility.
ARCH models (Autoregressive Conditional Heteroskedasticity) model how variance evolves. The variance at time $t$ depends on squared past errors:
$$\text{Var}(\varepsilon_t \mid \text{history}) = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2 + \alpha_2 \varepsilon_{t-2}^2 + \cdots$$
Large past shocks make the variance large today, capturing the intuition that markets are more volatile after big moves.
GARCH models (Generalized ARCH) add another layer by making variance depend on its own past values:
$$\text{Var}(\varepsilon_t \mid \text{history}) = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2 + \cdots + \beta_1 \text{Var}(\varepsilon_{t-1} \mid \text{history}) + \cdots$$
This allows volatility clusters to persist longer and is widely used in finance for modeling stock returns and option pricing.
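The GARCH(1,1) recursion is short enough to sketch directly in NumPy. The parameter values here are arbitrary but satisfy $\alpha_1 + \beta_1 < 1$, which keeps the variance finite; the long-run variance is then $\alpha_0 / (1 - \alpha_1 - \beta_1)$.

```python
import numpy as np

rng = np.random.default_rng(7)

# GARCH(1,1): sigma2_t = a0 + a1*eps_{t-1}^2 + b1*sigma2_{t-1}
a0, a1, b1 = 0.1, 0.1, 0.8           # a1 + b1 < 1 keeps variance finite
n = 100_000
eps = np.zeros(n)
sigma2 = np.zeros(n)
sigma2[0] = a0 / (1 - a1 - b1)       # start at the long-run variance
z = rng.standard_normal(n)
for t in range(1, n):
    sigma2[t] = a0 + a1 * eps[t - 1] ** 2 + b1 * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * z[t]

# Unconditional variance should settle near a0 / (1 - a1 - b1) = 1.0,
# even though the conditional variance clusters into calm and volatile spells
print(eps.var())
```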
Time-Varying Autoregressive Models
Time-varying autoregressive models relax the assumption that coefficients remain constant over time. Instead, coefficients evolve:
$$y_t = \phi_1(t) y_{t-1} + \phi_2(t) y_{t-2} + \cdots + \varepsilon_t$$
where $\phi_1(t), \phi_2(t), \ldots$ change as $t$ advances.
Why use it? Structural breaks and regime changes are common. An economy's behavior before and after a major policy shift differs significantly. A company's operations change as it grows. Time-varying models adapt to these changes.
One practical approach represents the coefficients using basis-function expansions—expressing $\phi_i(t)$ as a weighted sum of smooth functions that capture how coefficients drift over time. This avoids estimating too many parameters while allowing flexibility.
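The basis-function idea can be sketched with the simplest possible basis, $\{1, \tau\}$ where $\tau = t/n$ is rescaled time: a TV-AR(1) coefficient drifting linearly from 0.2 to 0.8 is then just two weights, recovered by regressing $y_t$ on the "basis times lag" features. The drift endpoints are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(8)

# TV-AR(1) with a drifting coefficient: phi(t) moves linearly from 0.2 to 0.8
n = 20_000
tau = np.arange(n) / n               # rescaled time in [0, 1]
phi_t = 0.2 + 0.6 * tau
y = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(1, n):
    y[t] = phi_t[t] * y[t - 1] + eps[t]

# Basis expansion phi(t) = w0*1 + w1*tau turns the time-varying fit into an
# ordinary regression on two features: y_{t-1} and tau * y_{t-1}
X = np.column_stack([y[:-1], tau[1:] * y[:-1]])
w, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
print(np.round(w, 2))  # roughly [0.2, 0.6]
```

Two weights stand in for $n$ separate coefficients, which is exactly the parsimony the basis-function approach buys.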
<extrainfo>
The referenced work by Zhang, Z. G.; Chan, S. C.; and Chen, X. (2013) develops a Kalman filter-based recursive method for tracking how the frequency spectrum of nonstationary signals changes over time. This is an algorithmic approach for handling time-varying dynamics, but specific algorithmic details are typically outside the scope of introductory exams.
</extrainfo>
Model-Free and Machine-Learning Approaches
Wavelet-Transform Methods
<extrainfo>
Wavelet-transform based methods, such as locally stationary wavelets, provide a model-free approach to analyzing time series. Rather than assuming a specific model form (like ARIMA), wavelets decompose a series into components at different frequencies and time scales. This approach is useful for understanding what frequencies drive a series and for handling series with changing frequency content over time. However, wavelet methods are typically covered in specialized time series courses rather than introductory treatments.
</extrainfo>
Hidden Markov Models
<extrainfo>
Hidden Markov models (HMMs) assume an underlying Markov process that evolves unseen (hidden) according to probabilistic rules. The observed data emerges from this hidden state process. For instance, in speech recognition, the hidden states represent phonemes, and the observed data are sound frequencies. HMMs are powerful for capturing regime switches and latent structure but require careful specification and estimation. They appear in specialized applications more than general time series forecasting.
</extrainfo>
Summary and Relationships
The classical linear models form a hierarchy:
AR and MA are building blocks.
ARMA combines them for efficiency.
ARIMA adds differencing to handle trends.
VAR extends to multiple series.
ARX/ARIMAX add external inputs.
ARCH/GARCH model time-varying variance.
NARX and time-varying models capture non-linearities and structural change.
Start with simpler models and add complexity only if the data and your goals demand it. Overly complex models fit noise rather than signal and forecast poorly.
Flashcards
How does an autoregressive model express the current value of a series?
As a linear combination of previous values.
How does a moving‑average model express the current value of a series?
As a linear combination of past random shocks.
What type of series does an integrated model describe?
Series that become stationary after differencing.
What three elements are combined in an autoregressive integrated moving‑average model?
Differencing, autoregressive components, and moving‑average components.
What is unique about the order of integration in an autoregressive fractionally integrated moving‑average model?
It is allowed to be fractional.
What recursive method did Zhang, Chan, and Chen (2013) create for these models?
A Kalman filter‑based recursive method for tracking the time‑varying spectrum of nonstationary signals.
How does a vector autoregression model extend standard autoregressive concepts?
It applies them to multiple series that influence each other.
What is the relationship between external series and the primary series in models with exogenous inputs?
External series affect the primary series but are not influenced by it.
What is the primary function of a non‑linear autoregressive exogenous model?
To capture non‑linear relationships with external inputs.
What does an autoregressive conditional heteroskedasticity model describe?
Changes in variance over time.
What kind of analysis approach do locally stationary wavelets provide?
Model‑free analysis.
What does a hidden Markov model assume about the underlying process?
It assumes an underlying Markov process with unobserved states.
Quiz
Time series - Modeling Approaches Quiz Question 1: What methodological approach did Zhang, Chan, and Chen (2013) introduce for tracking the time‑varying spectrum of nonstationary signals?
- Kalman filter‑based recursive method (correct)
- Wavelet‑based spectral decomposition
- Fourier transform with sliding windows
- Neural network time‑frequency estimator
Time series - Modeling Approaches Quiz Question 2: What does an autoregressive conditional heteroskedasticity (ARCH) model describe?
- Time‑varying volatility (variance) of a series (correct)
- Time‑varying mean of a series
- Fractional integration of a series
- Non‑linear relationships with exogenous variables
Time series - Modeling Approaches Quiz Question 3: Which method provides a model‑free analysis of time series using wavelet transforms?
- Locally stationary wavelet analysis (correct)
- Hidden Markov modeling
- Vector autoregression
- Autoregressive fractionally integrated modeling
Time series - Modeling Approaches Quiz Question 4: What does a moving‑average (MA) model represent in time‑series analysis?
- A linear combination of past random shocks (errors) (correct)
- A linear combination of previous observations
- A differenced version of the series
- A combination of past observations and shocks
Time series - Modeling Approaches Quiz Question 5: How do time‑varying autoregressive (TV‑AR) models differ from standard autoregressive models?
- Their coefficients are allowed to change over time (correct)
- They incorporate moving‑average terms
- They apply fractional differencing to the series
- They model multiple interdependent series simultaneously
Time series - Modeling Approaches Quiz Question 6: Which type of model expands the autoregressive framework to jointly model several interrelated time series?
- Vector autoregression (VAR) models (correct)
- Univariate autoregressive (AR) models
- Moving‑average (MA) models
- State‑space models
Key Concepts
Time Series Models
Autoregressive model
Moving-average model
Integrated model
ARMA model
ARIMA model
ARFIMA model
Time‑varying autoregressive model
Vector autoregression
ARCH model
Probabilistic Models
Hidden Markov model
Definitions
Autoregressive model
A statistical model that expresses a variable as a linear combination of its own previous values.
Moving-average model
A model that represents a variable as a linear combination of past random shock terms.
Integrated model
A transformation that makes a non‑stationary time series stationary by differencing.
ARMA model
A combined autoregressive and moving‑average model for stationary time series.
ARIMA model
An autoregressive integrated moving‑average model that incorporates differencing to handle non‑stationarity.
ARFIMA model
An autoregressive fractionally integrated moving‑average model allowing fractional orders of integration.
Time‑varying autoregressive model
An autoregressive model whose coefficients evolve over time, often via basis‑function expansions.
Vector autoregression
A multivariate model that captures linear interdependencies among multiple time series.
ARCH model
An autoregressive conditional heteroskedasticity model describing time‑varying volatility in a series.
Hidden Markov model
A probabilistic model assuming an underlying Markov process with unobserved (hidden) states.