Moving-average model

In time series analysis, the moving-average model (MA model), also called the moving-average process, is a standard approach for modeling univariate time series.

An MA model expresses the current value of a time series as a linear function of the current and a finite number of past random shocks (error terms). In contrast to an autoregressive model, which regresses the variable on its own past values, the moving-average model relies solely on the dependency structure of the error terms.
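
In standard notation, the moving-average model of order q, written MA(q), takes the following form, where \(\mu\) is the mean of the series, \(\theta_1, \ldots, \theta_q\) are the model parameters, and \(\varepsilon_t, \varepsilon_{t-1}, \ldots\) are white-noise error terms:

\[
X_t = \mu + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \cdots + \theta_q \varepsilon_{t-q}
\]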

Together with the autoregressive (AR) model, the moving-average model is a special case and key component of the more general ARMA and ARIMA models of time series, which have a more complicated stochastic structure. Unlike the AR model, the finite MA model is always stationary, since it is a finite linear combination of white-noise terms whose mean and variance do not depend on time.
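
As a minimal illustrative sketch (not part of the original article), the following Python snippet simulates an MA(1) process with NumPy; the parameter value theta = 0.6, the zero mean, and the series length are arbitrary choices made here for demonstration:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# MA(1) process: X_t = mu + eps_t + theta * eps_{t-1}
mu, theta, n = 0.0, 0.6, 1000          # illustrative values, chosen arbitrarily

eps = rng.standard_normal(n + 1)       # white-noise shocks eps_0 .. eps_n
x = mu + eps[1:] + theta * eps[:-1]    # each X_t mixes the current and previous shock

# A finite MA model is stationary for any theta: its mean and variance
# do not depend on t. Theoretical variance here is (1 + theta**2) * sigma**2
# with sigma = 1, so the sample statistics should be close to these values.
print(x.mean(), x.var(), 1 + theta**2)
```

Because the process is a finite linear combination of white-noise terms, the printed sample mean and variance approximate the theoretical values regardless of the choice of theta, in contrast to an AR process, whose stationarity depends on its coefficients.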

The moving-average model should not be confused with the moving average, a distinct concept despite some similarities.