iPredict

Time-series forecasting software

Time-series Forecasting Methods

IPredict offers a wide selection of time-series forecasting algorithms: regression forecasting, smoothing, and curve or model fitting methods. Alongside all the classical algorithms you will find proprietary and advanced algorithms with superior performance in statistical forecasting. Take a look at the IPredict benchmarks to see the difference in performance of some of these algorithms. To see how these methods are applied to everyday financial forecasting, take a look at our free demo, an example program that applies Wavelet Forecasting to financial data.

All proprietary algorithms (Holt Winter’s Modified Multiple Seasonalities, Haar Denoising, Daubechies Linear Denoising, Daubechies Exponential Denoising, Wavelet Forecasting, Fractal Projection, Active Moving Average) are subject to copyright and all rights are reserved to IPredict.

The most important feature of IPredict is the Excel wizard that allows you to produce a reliable forecast in minutes, without forecasting expertise or statistical knowledge. The wizard automates the time-series forecasting tasks, computes the best parameters for the given time series and lets you choose visually the best approach for your problem. The best fit is selected by ranking the candidate techniques with a user-selected statistical error measure.

Classical Algorithms

Simple Moving Average

The Simple Moving Average smooths past data by arithmetically averaging over a specified period and projects the result forward in time. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.
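As an illustration only (this is not IPredict's implementation; the Python/NumPy code and the simple_moving_average name are our own), a minimal sketch of a simple moving average with a naive flat projection might look like this:

```python
import numpy as np

def simple_moving_average(series, window, horizon):
    """Smooth with an arithmetic moving average and project the
    last smoothed value flat over `horizon` future steps."""
    x = np.asarray(series, dtype=float)
    kernel = np.ones(window) / window
    smoothed = np.convolve(x, kernel, mode="valid")   # means of consecutive windows
    forecast = np.full(horizon, smoothed[-1])         # naive flat projection
    return smoothed, forecast

smoothed, forecast = simple_moving_average([3, 4, 5, 6, 7, 9, 8, 10], window=3, horizon=2)
```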

Geometric Moving Average

The Geometric Moving Average smooths past data by geometrically averaging over a specified period and projects the result forward in time. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

Triangular Moving Average

The Triangular Moving Average is a weighted moving average with weights that form a triangular shape. The projection technique is the same as that of the Simple Moving Average. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

Parabolic Moving Average

The Parabolic Moving Average is a weighted moving average with weights that form a parabolic shape. The projection technique is the same as that of the Simple Moving Average. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

Double Moving Average

The Double Moving Average applies the Simple Moving Average algorithm twice in sequence. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

Exponential Moving Average

The Exponential Moving Average is summarized by the equation:

S_t = α·x_t + (1 − α)·S_{t−1}, with 0 < α ≤ 1

It is a weighted moving average with weights that decrease exponentially going backwards in time. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.
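For illustration, a minimal Python sketch of this recursion (not IPredict's implementation; the function name and the choice of α are ours):

```python
def exponential_moving_average(series, alpha):
    """EMA recursion: S_t = alpha * x_t + (1 - alpha) * S_{t-1}."""
    smoothed = [float(series[0])]                 # seed with the first observation
    for x in series[1:]:
        smoothed.append(alpha * x + (1.0 - alpha) * smoothed[-1])
    return smoothed

ema = exponential_moving_average([10, 12, 11, 13, 14], alpha=0.3)
```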

Double Exponential Moving Average

The Double Exponential Moving Average applies the Exponential Moving Average twice. This is normally considered a smoothing algorithm and has poor forecasting results in most cases. If the equation of the single exponential moving average can be expressed as:

S_t = α·x_t + (1 − α)·S_{t−1}

then the equation of the double exponential moving average can be expressed as:

D_t = α·S_t + (1 − α)·D_{t−1}

Holt’s Double Exponential

Holt’s Double Exponential is similar to the Double Exponential Moving Average. It allows you to specify the two smoothing constants used in the process. It is useful for data that follow a simple linear trend.
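A minimal sketch of Holt's method with explicit level and trend smoothing constants (illustrative Python, not IPredict's code; the initialisation choices are ours):

```python
def holt_linear(series, alpha, beta, horizon):
    """Holt's double exponential smoothing: level smoothed by alpha,
    trend smoothed by beta, linear extrapolation for the forecast."""
    level = float(series[0])
    trend = float(series[1]) - float(series[0])   # crude initial trend
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1.0 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1.0 - beta) * trend
    return [level + h * trend for h in range(1, horizon + 1)]

forecast = holt_linear([10, 12, 13, 15, 16, 18], alpha=0.5, beta=0.3, horizon=3)
```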

Triple Exponential Moving Average

The Triple Exponential Moving Average applies the Exponential Moving Average three times. This is normally considered a smoothing algorithm and has poor forecasting results in most cases. If the equation of the single exponential moving average can be expressed as:

S_t = α·x_t + (1 − α)·S_{t−1}

and the equation of the double exponential moving average can be expressed as:

D_t = α·S_t + (1 − α)·D_{t−1}

then the equation of the triple exponential moving average can be expressed as:

T_t = α·D_t + (1 − α)·T_{t−1}

Holt’s Triple Exponential

Holt’s Triple Exponential is the classical Holt’s forecasting algorithm. It is useful for data that follow a simple linear trend.

Adaptive Response Rate Exponential Smoothing

The Adaptive Exponential Smoothing automatically adjusts the smoothing parameters based on the forecast error. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

Holt Winter’s Additive

The Holt Winter’s Additive method is applicable when the time series contains a seasonal component. This method assumes the time series is composed of a linear trend and a seasonal cycle; it constructs three statistically correlated series (smoothed, seasonal and trend) and projects the identified trend and seasonality forward.
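The textbook additive Holt-Winters recursion, sketched in Python for illustration (IPredict's implementation and parameter handling may differ; the initialisation below assumes at least two full seasons of data):

```python
def holt_winters_additive(series, period, alpha, beta, gamma, horizon):
    """Additive Holt-Winters: smoothed level, linear trend and
    additive seasonal indices, projected forward together."""
    level = sum(series[:period]) / period
    trend = (sum(series[period:2 * period]) - sum(series[:period])) / period ** 2
    season = [x - level for x in series[:period]]

    for t, x in enumerate(series):
        prev_level = level
        s = season[t % period]
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % period] = gamma * (x - level) + (1 - gamma) * s

    n = len(series)
    return [level + h * trend + season[(n + h - 1) % period]
            for h in range(1, horizon + 1)]
```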

Holt Winter’s Multiplicative

Like Holt Winter’s Additive, this method can be applied to a seasonal time series. The model assumes that the components of the time series (smoothed, seasonal and trend) are multiplied together, giving as a result a more ‘active’ time series.

Holt Winter’s Modified Multiple Seasonalities

The Modified Multiple Seasonalities is a proprietary algorithm based on Holt Winter’s algorithm that can take into consideration multiple seasonalities. This model is especially suited for financial time series. Two versions of this algorithm are now available with different computational logic.

Additive Decomposition

Additive Decomposition computes the decomposition of the time series into its components: trend, seasonality, cycle and error. It projects the identified parts into the future and sums the resulting projections to form the forecast. The model is assumed to be additive (that is, all parts are summed up to give the forecast). The model equation is:

y_t = T_t + S_t + C_t + ε_t

where T is the trend, S the seasonality, C the cycle and ε the error.
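A rough sketch of such a decomposition (illustrative Python with a centred moving-average trend; this is not IPredict's algorithm, and the edge handling is only approximate):

```python
import numpy as np

def additive_decomposition(series, period):
    """Split a series into trend, seasonal and residual parts: y = T + S + e."""
    y = np.asarray(series, dtype=float)
    trend = np.convolve(y, np.ones(period) / period, mode="same")   # moving-average trend
    detrended = y - trend
    # Seasonal index: average detrended value at each position in the cycle.
    index = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = index[np.arange(len(y)) % period]
    residual = y - trend - seasonal
    return trend, seasonal, residual
```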

Multiplicative Decomposition

Multiplicative Decomposition, like Additive Decomposition, computes the decomposition of the time series into its components (trend, seasonality, cycle and error) and then projects them into the future. The model is assumed to be multiplicative (that is, all parts are multiplied by each other to give the forecast). The model equation is:

y_t = T_t · S_t · C_t · ε_t

where T is the trend, S the seasonality, C the cycle and ε the error.

Sparse Series Croston’s Exponential

This is a useful algorithm for sparse time series. The model is equivalent to an Exponential Moving Average both in quantities and in time.
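For illustration, a minimal sketch of the classical Croston recursion (not IPredict's implementation; the single smoothing constant alpha and the return convention are our assumptions):

```python
def croston(series, alpha):
    """Croston's method: smooth non-zero demand sizes and the intervals
    between them separately; the per-period forecast is size / interval."""
    demand = interval = None
    periods_since = 1
    for x in series:
        if x > 0:
            if demand is None:                       # first non-zero demand
                demand, interval = float(x), float(periods_since)
            else:
                demand = alpha * x + (1 - alpha) * demand
                interval = alpha * periods_since + (1 - alpha) * interval
            periods_since = 1
        else:
            periods_since += 1
    return demand / interval if demand is not None else 0.0

rate = croston([0, 0, 5, 0, 3, 0, 0, 4], alpha=0.2)   # expected demand per period
```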

Economic Time-series Forecasting

A new kind of algorithm suited to economic time series such as sales, inventories and the like. This algorithm assumes that the data it is fed has some periodicity (such as a yearly periodicity), computes the forecast starting from at least a couple of such periods and projects it into the future.


Curve and Bayesian Model Fitting

Linear Trend / Regression

The Linear Trend fits the time series to a straight line and projects it forward in time. The model equation is:

y_t = a + b·t

where b is commonly referred to as the slope and a as the intercept.
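An equivalent least-squares fit and projection, sketched in NumPy for illustration (IPredict exposes this through the Excel wizard rather than code):

```python
import numpy as np

def linear_trend_forecast(series, horizon):
    """Fit y_t = a + b*t by least squares and extrapolate `horizon` steps."""
    y = np.asarray(series, dtype=float)
    t = np.arange(len(y))
    b, a = np.polyfit(t, y, 1)                       # slope b, intercept a
    future_t = np.arange(len(y), len(y) + horizon)
    return a + b * future_t

print(linear_trend_forecast([2.1, 4.0, 5.9, 8.2, 9.9], horizon=3))
```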

Linear Trend And Additive Seasonality

The Linear Trend and Additive Seasonality model assumes the data is made of a linear trend plus a single additive seasonality. It then fits the equation to the data using a Bayesian fit and projects forward in time. The model equation is:

y_t = a + b·t + A·sin(2π·f·t)

where A is the amplitude of the seasonality, f the frequency, b the linear trend slope and a the linear trend intercept.

Linear Trend And Multiplicative Seasonality

The Linear Trend and Multiplicative Seasonality model assumes the data is made of a linear trend plus a single multiplicative seasonality. It then fits the equation to the data using a Bayesian fit and projects forward in time. The model equation is:

y_t = (a + b·t)·(1 + A·sin(2π·f·t))

where A is the amplitude of the seasonality, f the frequency, b the linear trend slope and a the linear trend intercept.

Linear Trend And Multiple Seasonalities

The Linear Trend and Multiple Seasonalities model assumes the data is made of a linear trend plus multiple additive or multiplicative seasonalities. It then fits the equation to the data using a Bayesian fit and projects forward in time. It is especially useful for financial time series.

Polynomial

The Polynomial algorithm fits a polynomial equation (up to the desired order) to the data using a Bayesian fit and projects forward in time. You must be careful not to overfit the data with a very high-order polynomial. The model equation is:

y_t = a_0 + a_1·t + a_2·t² + … + a_n·tⁿ
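A sketch of the same idea using an ordinary least-squares polynomial fit in place of the Bayesian fit described above (illustrative only):

```python
import numpy as np

def polynomial_forecast(series, degree, horizon):
    """Fit a degree-`degree` polynomial in t and extrapolate it.
    Keep the degree low: high orders overfit and diverge quickly."""
    y = np.asarray(series, dtype=float)
    t = np.arange(len(y))
    coeffs = np.polyfit(t, y, degree)                # least-squares fit
    future_t = np.arange(len(y), len(y) + horizon)
    return np.polyval(coeffs, future_t)
```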

Logarithmic

The Logarithmic algorithm fits a logarithmic equation to the data using a Bayesian fit and projects forward in time. The model equation is:

y_t = a + b·ln(t)

Exponential

The Exponential algorithm fits an exponential equation to the data using a Bayesian fit and projects forward in time. The model equation is:

y_t = a·e^(b·t)


Wavelet Smoothing and Forecasting

Frequency Identification

One of the biggest issues when dealing with algorithms that require seasonality indexes is computing the seasonalities themselves. Competing packages require you to supply these magic numbers, and even the most advanced ones cannot report this simple and intuitive figure: the seasonality of a time series. The Frequency Identification algorithm computes the frequencies contained in the input time series.
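The proprietary algorithm is not published; as a plain illustration of the idea, the dominant frequencies of a series can be ranked from its FFT magnitudes:

```python
import numpy as np

def dominant_frequencies(series, n_freqs=3):
    """Return the strongest (frequency, period) pairs found in a series."""
    y = np.asarray(series, dtype=float)
    y = y - y.mean()                                 # drop the constant component
    magnitude = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0)           # cycles per sample
    strongest = np.argsort(magnitude)[::-1][:n_freqs]
    return [(freqs[i], 1.0 / freqs[i]) for i in strongest if freqs[i] > 0]
```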

Haar Denoising

Removing noise from a time series is always difficult, and common algorithms (averages, exponential averages, etc.) introduce a lag in the data or change the statistical properties of the underlying time series. Haar Denoising removes the noise in the time series using the Haar Wavelet transform and a proprietary algorithm.
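IPredict's denoising rule is proprietary; a generic one-level Haar transform with soft thresholding, shown purely for illustration, works as follows (the threshold value is an assumption):

```python
import numpy as np

def haar_denoise(series, threshold):
    """One-level Haar wavelet denoising with soft thresholding."""
    y = np.asarray(series, dtype=float)
    if len(y) % 2:
        y = y[:-1]                                   # Haar pairs need an even length
    even, odd = y[0::2], y[1::2]
    approx = (even + odd) / np.sqrt(2)               # low-frequency content
    detail = (even - odd) / np.sqrt(2)               # high-frequency, noise-dominated
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(y)                           # inverse Haar transform
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out
```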

Daubechies Linear Denoising

Like Haar Denoising, Daubechies Linear Denoising applies a Daubechies Wavelet transform and a proprietary algorithm to linearly denoise the time series.

Daubechies Exponential Denoising

The denoising in this case decreases exponentially with the wavelet power.

Wavelet Forecasting

This algorithm uses the Daubechies Wavelet transform to produce a forecast with a proprietary algorithm.


Additional Functions and Algorithms

Random Number Generation

Special care must be taken when generating test beds for ideas and models; the stock Rand() Excel function is usually not enough. IPredict provides four statistically independent Uniform Random Number Generators and one Gaussian Generator.
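Outside Excel, the same kind of test bed can be sketched with NumPy's generators (shown only as an illustration; these are not IPredict's generators):

```python
import numpy as np

# Four independently seeded uniform streams plus one Gaussian stream.
streams = [np.random.default_rng(seed) for seed in (1, 2, 3, 4)]
uniforms = [rng.random(5) for rng in streams]             # uniform draws on [0, 1)
gaussian = np.random.default_rng(99).standard_normal(5)   # Gaussian draws
```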

Wavelet Transforms

This is the basis of the Wavelet forecasting algorithm. The wavelet transform underlies many of today’s algorithms in Digital Signal Processing, Quantum Mechanics, Image Processing and Speech Recognition, among other fields. The forward and inverse Daubechies and Haar transforms are included.

Fourier Transforms

These are the classical Fourier Transforms for forecasting. IPredict includes Sine, Cosine and Fast Fourier Transforms.

Hurst Exponent

The estimate of the Hurst Exponent is very important in understanding the properties of a time series and how closely it resembles a random walk (i.e. how predictable the time series is). Three methods are provided, based on the Daubechies, Haar and Rescaled Range algorithms.
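For illustration, a bare-bones rescaled-range estimate of the Hurst exponent (not IPredict's implementation; it assumes a reasonably long series):

```python
import numpy as np

def hurst_rescaled_range(series, min_window=8):
    """Estimate H as the slope of log(R/S) against log(window size)."""
    y = np.asarray(series, dtype=float)
    sizes, rs_values = [], []
    n = min_window
    while n <= len(y) // 2:
        rs = []
        for start in range(0, len(y) - n + 1, n):
            chunk = y[start:start + n]
            dev = np.cumsum(chunk - chunk.mean())
            r = dev.max() - dev.min()                # range of cumulative deviations
            s = chunk.std()
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(n)
            rs_values.append(np.mean(rs))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope        # ~0.5 random walk, >0.5 trending, <0.5 mean reverting
```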


Advanced Algorithms

Fractal Projection

The Fractal Projection algorithm represents a new approach to forecasting. It projects a pattern into the future, stretching it in time or in value as appropriate.

Active Moving Average

The Active Moving Average belongs to the class of Exponential Moving Averages, but its calculation is quadratic in nature and it is much quicker than standard Moving Averages at “following” the original signal.


Kernel Smoothing

Kernel smoothing weights every data point in a time series with weights coming from a generating function called a kernel.
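A minimal Nadaraya-Watson-style sketch with a Gaussian kernel (illustrative Python; the bandwidth parameter is an assumption):

```python
import numpy as np

def kernel_smooth(series, bandwidth):
    """Each point becomes a kernel-weighted average of the whole series."""
    y = np.asarray(series, dtype=float)
    t = np.arange(len(y))
    smoothed = np.empty_like(y)
    for i in range(len(y)):
        u = (t - t[i]) / bandwidth
        weights = np.exp(-0.5 * u ** 2)              # Gaussian kernel K(u)
        smoothed[i] = np.sum(weights * y) / np.sum(weights)
    return smoothed
```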

Gaussian Kernel Smoothing

The Gaussian Kernel Smoothing is a classical Kernel Smoothing algorithm. The kernel function is the following:

K(u) = (1/√(2π))·e^(−u²/2)

Hilbert Kernel Smoothing

The Hilbert Kernel Smoothing function is:

Triangle Kernel Smoothing

The Triangle Kernel Smoothing function is:

K(u) = 1 − |u| for |u| ≤ 1, and 0 otherwise

Epanechnikov Kernel Smoothing

The Epanechnikov Kernel Smoothing function is:

K(u) = (3/4)·(1 − u²) for |u| ≤ 1, and 0 otherwise

Quartic Kernel Smoothing

The Quartic Kernel Smoothing function is:

K(u) = (15/16)·(1 − u²)² for |u| ≤ 1, and 0 otherwise

Triweight Kernel Smoothing

The Triweight Kernel function is:

K(u) = (35/32)·(1 − u²)³ for |u| ≤ 1, and 0 otherwise

Cosine Kernel Smoothing

The Cosine Kernel function is:

K(u) = (π/4)·cos(π·u/2) for |u| ≤ 1, and 0 otherwise

Savitzky-Golay Smoothing

This algorithm is a smoothing filter that fits a polynomial regression of a given degree to the time series within a moving window. The advantage of the Savitzky-Golay filter is that it tends to preserve features of the time series such as local minima and maxima.
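Outside Excel the same filter is widely available; for example, SciPy's savgol_filter (shown here only as a reference point, not as part of IPredict):

```python
import numpy as np
from scipy.signal import savgol_filter

t = np.linspace(0, 4 * np.pi, 200)
noisy = np.sin(t) + 0.3 * np.random.default_rng(0).standard_normal(t.size)

# Fit a local quadratic inside a 21-sample moving window.
smoothed = savgol_filter(noisy, window_length=21, polyorder=2)
```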

Spline Smoothing

Spline smoothing consists of computing the spline that approximates the input data and evaluating it at internal or external points. It is useful for compressing or expanding data before a projection is made. Our library provides three different methods that let you smooth a time series in different contexts, such as filling the zeroes of a time series or projecting on a different scale.
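As a rough illustration of spline smoothing and projection (using SciPy as a stand-in for the library's three methods; extrapolation beyond the data should be used with care):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

t = np.arange(30, dtype=float)
y = 0.5 * t + np.sin(t / 3.0) + 0.2 * np.random.default_rng(1).standard_normal(30)

spline = UnivariateSpline(t, y, s=2.0)        # s controls the smoothing strength
dense = spline(np.linspace(0.0, 29.0, 300))   # evaluate on a finer internal grid
ahead = spline(np.arange(30.0, 35.0))         # project to external points
```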


Digital Signal Processing Algorithms

Fourier Transform

The Fourier Transform is the basis of several forecasting algorithms and engineering computations. We offer the classical Fast Fourier Transform and the Sine and Cosine Fourier Transforms. This is the standard implementation well known in the industry. See this page on the Fast Fourier Transform for further information.

Fourier Power Spectrum

The Fourier Power Spectrum or Power Spectral Density of a signal is a real function that expresses the energy of the signal per single component frequency. This is normally called the spectrum of a signal.
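For illustration, the FFT and the corresponding power spectrum of a test signal in NumPy (the sampling rate and test signal are arbitrary):

```python
import numpy as np

fs = 1000.0                                   # sampling rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.default_rng(2).standard_normal(t.size)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
power = np.abs(spectrum) ** 2 / signal.size   # power per frequency bin
peak_hz = freqs[np.argmax(power[1:]) + 1]     # strongest non-DC component (~50 Hz)
```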

Complex Fourier Transform

This is the general Discrete Fourier Transform: it accepts complex-valued signals as input and returns complex-valued transforms. It is computed with the original (direct) algorithm and is therefore “slow” compared to its “fast” counterpart, but it can handle transforms of arbitrary length and complex-valued input.

Hilbert Transform

The Hilbert Transform is a basic tool used to derive the analytic representation of a signal.

Analytic Signal

The analytic representation of a signal is commonly used in Digital Signal Processing because it gives easier access to many properties of the signal itself. The Analytic Signal can be used to compute the Instantaneous Frequency of a signal and to apply modulation and demodulation (signal envelope) techniques.
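A short illustration using SciPy's Hilbert transform to build the analytic signal and read off the envelope and instantaneous frequency (not IPredict's code):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.cos(2 * np.pi * 80 * t)

analytic = hilbert(signal)                       # signal + j * (Hilbert transform)
envelope = np.abs(analytic)                      # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs    # instantaneous frequency in Hz
```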

Short-Time Fourier Transform

The Short-time Fourier Transform is a time-frequency distribution used to analyze digital signals. It can be used to measure as precisely as possible the fundamental frequency of a signal. The short-time Fourier Transform is computed in practice as a sequence of Fourier Transforms applied to a moving window of the original signal multiplied by one of the many windows available. See this page on Spectral Windows for a list of the available windows.
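A bare-bones sketch of that computation (illustrative Python with a Hann window; frame size and hop are arbitrary choices):

```python
import numpy as np

def stft(signal, frame_size=256, hop=128):
    """Short-time Fourier Transform: FFTs of overlapping, windowed frames."""
    window = np.hanning(frame_size)
    frames = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frames.append(np.fft.rfft(signal[start:start + frame_size] * window))
    return np.array(frames)          # shape: (n_frames, frame_size // 2 + 1)

# The spectrogram described below is simply np.abs(stft(x)) ** 2.
```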

Spectrogram

The Spectrogram is used to analyze the spectral density of a signal. It is computed as the squared magnitude of the Short-time Fourier Transform of a signal.

Wigner-Ville Time-frequency Distribution

The Wigner-Ville Time-frequency Distribution is functionally equivalent to the Spectrogram or Short-time Fourier Transform in that it is used to analyze the evolution of a signal in the time-frequency plane. It has better time and frequency resolution than its counterparts, but at the cost of introducing negative energies in the spectral representation. Computationally it is very similar to a Short-time Fourier Transform. See this page on Digital Signal Processing for further information on the method.

