The typical regression task predicts the value of a target variable based on the values of one or more feature variables. For example, predicting the price of a house from characteristics such as its size, number of rooms, and so on.
But in some cases, we want to predict the value of a variable based on its past values. In our example, we would predict the price of a house based on its previous prices instead of its characteristics.
We call this modeling Time Series Analysis. A Time Series is a collection of observations – values of a variable – ordered in time. The time elapsed between observations can be years, months, days, seconds, or even milliseconds.
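As a concrete illustration, a time series can be represented as a pandas Series indexed by dates. The monthly house prices below are made-up values, shown only to make the structure visible.

```python
# A minimal sketch: representing a time series in pandas.
# The prices and dates are invented for illustration.
import pandas as pd

prices = pd.Series(
    [250_000, 252_500, 251_000, 255_000, 258_000, 260_500],
    index=pd.date_range("2023-01-01", periods=6, freq="MS"),  # month-start frequency
    name="house_price",
)
print(prices)
```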
A simple model for Time Series Analysis is the moving average (MA). This model assumes that the next observation is the average of the most recent past observations; the more formal MA(q) model expresses it as a linear combination of the last q forecast errors.
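Here is a quick sketch of the simple averaging idea on a small made-up price series: the forecast for the next observation is the mean of the last few values, with the window size chosen arbitrarily.

```python
# A minimal sketch of a moving-average forecast: the next value is taken
# as the mean of the last `window` observations. (The MA(q) component of
# ARMA, discussed below, instead uses past forecast errors.)
import pandas as pd

def moving_average_forecast(series: pd.Series, window: int = 3) -> float:
    """Forecast the next observation as the mean of the last `window` values."""
    return float(series.tail(window).mean())

prices = pd.Series([250_000, 252_500, 251_000, 255_000, 258_000, 260_500])
print(moving_average_forecast(prices, window=3))  # mean of the last three values
```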
Another model for Time Series Analysis is the autoregressive model (AR). This model assumes that the next observation depends linearly on the last p values of the time series. In other words, it is a linear regression model applied to the p previous observations.
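The sketch below fits an AR(p) model by ordinary least squares on a toy series, just to make the "linear regression on the p previous observations" idea concrete; the data and the choice p = 2 are arbitrary.

```python
# A minimal sketch of an AR(p) model fitted by ordinary least squares:
# each value is regressed on the p values that precede it.
import numpy as np

def fit_ar(y: np.ndarray, p: int) -> np.ndarray:
    """Return coefficients [c, phi_1, ..., phi_p] of an AR(p) model."""
    # Design matrix: a constant column plus one column per lag.
    X = np.column_stack(
        [np.ones(len(y) - p)] + [y[p - k : len(y) - k] for k in range(1, p + 1)]
    )
    target = y[p:]
    coefs, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coefs

def ar_forecast(y: np.ndarray, coefs: np.ndarray) -> float:
    """One-step-ahead forecast from the fitted AR coefficients."""
    p = len(coefs) - 1
    lags = y[-1 : -p - 1 : -1]  # last p values, most recent first
    return float(coefs[0] + np.dot(coefs[1:], lags))

# Toy usage with invented prices and p = 2.
y = np.array([250_000, 252_500, 251_000, 255_000, 258_000, 260_500], dtype=float)
coefs = fit_ar(y, p=2)
print(ar_forecast(y, coefs))
```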
Finally, by combining these models, we obtain the more general autoregressive moving-average (ARMA) model and the autoregressive integrated moving-average (ARIMA) model.
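As a rough illustration, the statsmodels library can fit an ARIMA model in a few lines. The synthetic random-walk series and the order (p, d, q) = (1, 1, 1) below are arbitrary choices for demonstration, not a recommendation for real data.

```python
# A minimal sketch of fitting an ARIMA model with statsmodels on a
# synthetic monthly series (a random walk with drift standing in for prices).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
values = 250_000 + np.cumsum(rng.normal(loc=500, scale=1_000, size=60))
series = pd.Series(values, index=pd.date_range("2019-01-01", periods=60, freq="MS"))

model = ARIMA(series, order=(1, 1, 1))   # p AR lags, d differences, q MA terms
fitted = model.fit()
print(fitted.forecast(steps=3))          # forecast the next three months
```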