
Forecast Models

ARIMA

ARIMA (AutoRegressive Integrated Moving Average) is a statistical model widely used for stock forecasting. It models time series data with three components: Autoregression (AR), Integration (I), and Moving Average (MA).


The AR component captures the relationship between the current value of a time series and its preceding values. The I component is the number of differences required to make the series stationary (i.e., constant mean and variance over time). The MA component captures the relationship between the current value and past random error terms.


ARIMA models are often used to predict future values of a time series based on patterns seen in historical data. They can also be used to identify underlying trends, seasonality, and other patterns in the data.
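
As a rough illustration, the sketch below fits an ARIMA(1, 1, 1) model to a synthetic price series and forecasts the next five values with the statsmodels library; the random-walk data and the chosen (p, d, q) order are placeholder assumptions, not a recommended setup.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# placeholder price series (a random walk around 100) standing in for real data
prices = 100 + np.cumsum(np.random.normal(0, 1, 250))

# order=(p, d, q): 1 AR lag, 1 difference for stationarity, 1 MA lag (assumed values)
model = ARIMA(prices, order=(1, 1, 1))
fitted = model.fit()

# forecast the next 5 values of the series
print(fitted.forecast(steps=5))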

VAR

Vector autoregression (VAR) is a statistical model used to analyse the dynamic relationships among a set of time series variables, where each variable depends on its own past values as well as on the past values of the other variables in the system. VAR models are commonly used in economics, finance, and other fields to analyse the interdependent behaviour of multiple variables over time. The model predicts the future behaviour of each variable by estimating the relationships and interdependence among them from their past behaviour. VAR models can also be used for forecasting, hypothesis testing and policy analysis.
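
As a minimal sketch, the example below fits a two-variable VAR with the statsmodels library and forecasts both series a few steps ahead; the two synthetic return series and the fixed lag order of 2 are illustrative assumptions.

import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# two placeholder return series standing in for interdependent variables
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "stock_returns": rng.normal(0, 1, 300),
    "fx_returns": rng.normal(0, 1, 300),
})

# fit a VAR with 2 lags: each variable is regressed on the past 2 values of both variables
results = VAR(data).fit(2)

# forecast both variables 5 steps ahead from the last 2 observations
print(results.forecast(data.values[-2:], steps=5))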

GARCH

GARCH (Generalized Autoregressive Conditional Heteroskedasticity) is a statistical model used in finance to predict changes in the volatility of financial assets over time. It is designed to capture the time-varying, clustered volatility seen in financial markets, such as stock prices or exchange rates. The GARCH model extends the traditional autoregressive (AR) framework with a flexible specification of the variance of the error term, allowing the conditional variance to change over time. This makes it particularly useful in financial risk management, where it can be used to calculate risk measures and assess the impact of market volatility on portfolio returns.
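
A minimal sketch of this idea, assuming the Python arch package and a placeholder series of daily returns, fits a GARCH(1, 1) model in which the conditional variance depends on the previous squared shock and the previous variance, then forecasts volatility a few days ahead.

import numpy as np
from arch import arch_model

# placeholder daily returns (in percent) standing in for a real asset
rng = np.random.default_rng(1)
returns = rng.normal(0, 1, 1000)

# GARCH(1, 1): today's variance depends on yesterday's squared shock and yesterday's variance
model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
result = model.fit(disp="off")

# forecast the conditional variance for the next 5 days
forecast = result.forecast(horizon=5)
print(forecast.variance.iloc[-1])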

LSTM

LSTM stands for Long Short-Term Memory. It is a type of artificial neural network that is designed to overcome the limitations of standard recurrent neural networks. LSTM networks are particularly well-suited for tasks that require the processing of sequential data, such as speech recognition, language translation, and video analysis.


The key feature of LSTM networks is the use of memory cells, which allow information to flow through the network over multiple time steps. The memory cells can store and retrieve information from previous time steps, allowing the network to remember long-term patterns in the data.


LSTM networks are composed of four main components: an input gate, a forget gate, an output gate, and a memory cell. These components work together to control the flow of information through the network and allow it to learn from sequential data.
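
As a rough sketch of how these gates fit together in practice, the example below defines a small LSTM in PyTorch that reads a window of 30 past values and predicts the next one; the window length, hidden size, and random input are illustrative assumptions, not a tuned model.

import torch
import torch.nn as nn

class PriceLSTM(nn.Module):
    """Minimal LSTM that maps a window of past values to a single prediction."""
    def __init__(self, hidden_size=32):
        super().__init__()
        # nn.LSTM implements the input, forget and output gates and the memory cell internally
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, sequence length, 1); the memory cell carries information across time steps
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # predict from the final time step

model = PriceLSTM()
window = torch.randn(8, 30, 1)   # batch of 8 illustrative sequences, 30 time steps each
print(model(window).shape)       # torch.Size([8, 1])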

