IPredict offers a wide selection of time-series forecasting, regression
forecasting, smoothing, and curve- or model-fitting algorithms. Alongside all the
classical algorithms you will find proprietary, advanced algorithms with superior
performance in statistical forecasting.

Take a look at IPredict benchmarks to see the difference
in performance of some of these algorithms.

To see how these methods are applied to everyday financial forecasting, take a look
at our free online stock market prediction, or
download the example program that applies Wavelet Forecasting to financial
data.

All proprietary algorithms (Holt Winter's Modified Multiple Seasonalities, Haar Denoising, Daubechies Linear Denoising, Daubechies Exponential Denoising, Wavelet Forecasting, Fractal Projection, Active Moving Average) are subject to copyright and all rights are reserved to IPredict.

The most important feature of IPredict is the Excel wizard that allows you, without forecasting expertise or statistics knowledge, to produce a reliable forecast in minutes. The wizard automates the time-series forecasting tasks, computes the best parameters for the given time series and lets you choose the best approach for your problem visually. The best fit is found by ranking the candidate techniques according to a user-selected statistical error measure.

The Simple Moving Average smooths past data by arithmetically averaging over a specified period and projecting the result forward in time. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.
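
As an illustrative sketch (not IPredict's implementation), the averaging step and the flat projection can be written as:

```python
def simple_moving_average(xs, period):
    """Arithmetic average over a sliding window of `period` points."""
    return [sum(xs[i - period + 1:i + 1]) / period
            for i in range(period - 1, len(xs))]

def sma_forecast(xs, period, horizon):
    """Project the last window average forward: a flat forecast."""
    last = sum(xs[-period:]) / period
    return [last] * horizon
```

The flat projection is exactly why moving averages forecast poorly: every future point receives the same value.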

The Geometric Moving Average smooths past data by geometrically averaging over a specified period and projecting the result forward in time. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

The Triangular Moving Average is a weighted moving average with weights that form a triangular shape. The projection technique is the same as that of the Simple Moving Average. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

The Parabolic Moving Average is a weighted moving average with weights that form a parabolic shape. The projection technique is the same as that of the Simple Moving Average. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

The Double Moving Average applies the Simple Moving Average algorithm twice in sequence. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

The Exponential Moving Average is summarized by the equation:

X'_{t} = αX_{t} + (1-α)X'_{t-1}

It is a weighted moving average with weights that decrease exponentially going
backwards in time. This is normally considered a smoothing algorithm and has poor
forecasting results in most cases.
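
The recurrence can be sketched in a few lines of illustrative Python (seeding the smoother with X'_0 = X_0 is an assumption here; implementations differ):

```python
def exponential_moving_average(xs, alpha):
    """X'_t = alpha * X_t + (1 - alpha) * X'_{t-1}, seeded with X'_0 = X_0."""
    smoothed = [xs[0]]
    for x in xs[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed
```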

The Double Exponential Moving Average applies the Exponential Moving Average twice.
This is normally considered a smoothing algorithm and has poor forecasting results
in most cases.

If the equation of the single exponential moving average is:

X'_{t} = αX_{t} + (1-α)X'_{t-1}

then the equation of the double exponential moving average is:

X''_{t} = αX'_{t} + (1-α)X''_{t-1}

Holt's Double Exponential is similar to the Double Exponential Moving Average, but it allows you to specify the two smoothing constants used in the process. It is useful for data with a simple linear trend.
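
A minimal sketch of Holt's two-constant method (alpha smooths the level, beta the trend; the initialization below is a common textbook convention, not necessarily IPredict's):

```python
def holt_linear(xs, alpha, beta, horizon):
    """Holt's double exponential smoothing: level plus linear trend."""
    level, trend = xs[0], xs[1] - xs[0]
    for x in xs[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]
```

On perfectly linear data the method extrapolates the line exactly: `holt_linear([1, 2, 3, 4, 5], 0.5, 0.5, 2)` continues with 6 and 7.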

The Triple Exponential Moving Average applies the Exponential Moving Average three
times. This is normally considered a smoothing algorithm and has poor forecasting
results in most cases.

If the equation of the single exponential moving average is:

X'_{t} = αX_{t} + (1-α)X'_{t-1}

and the equation of the double exponential moving average is:

X''_{t} = αX'_{t} + (1-α)X''_{t-1}

then the equation of the triple exponential moving average is:

X'''_{t} = αX''_{t} + (1-α)X'''_{t-1}

Holt's Triple Exponential is the classical Holt's forecasting algorithm. It is useful for data with a simple linear trend.

The Adaptive Exponential Smoothing automatically adjusts the smoothing parameters based on the forecast error. This is normally considered a smoothing algorithm and has poor forecasting results in most cases.

The Holt Winter's Additive method is applicable when the time series contains a seasonal component. This method assumes the time series is composed of a linear trend and a seasonal cycle; it constructs three statistically correlated series (smoothed, seasonal and trend) and projects the identified trend and seasonality forward.
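
A hedged sketch of the additive Holt-Winters recurrences (the initialization below is one common convention; the product's exact scheme is not documented here):

```python
def holt_winters_additive(xs, period, alpha, beta, gamma, horizon):
    """Additive Holt-Winters: smoothed level, linear trend, seasonal indices."""
    level = sum(xs[:period]) / period
    trend = (sum(xs[period:2 * period]) - sum(xs[:period])) / period ** 2
    seasonal = [x - level for x in xs[:period]]
    for i in range(period, len(xs)):
        x, s = xs[i], seasonal[i % period]
        prev_level = level
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seasonal[i % period] = gamma * (x - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + seasonal[(len(xs) + h) % period]
            for h in range(horizon)]
```

On a purely seasonal series with no trend, the forecast simply reproduces the seasonal cycle.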

Like Holt Winter's Additive, this method can be applied to a seasonal time series. The model assumes that the components of the time series (smoothed, seasonal and trend) are multiplied together, giving a more 'active' time series as a result.

The Modified Multiple Seasonalities is a proprietary algorithm based on Holt Winter's algorithm that can take multiple seasonalities into account. This model is especially suited to financial time series. Two versions of this algorithm, with different computational logic, are now available.

Additive Decomposition computes the decomposition of the time series into its components:
trend, seasonality, cycle and error. It projects the identified parts into the
future and sums the resulting projections to form the forecast. The model is assumed
to be additive (that is, all parts are summed to give the forecast).

The model equation is:

X'_{t} = T_{t} + S_{t} + C_{t} + ε_{t}

where T is the trend, S the seasonality, C the cycle and ε the error.
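
The components can be estimated with classical techniques; the toy sketch below recovers the trend (centered moving average, odd period assumed) and additive seasonal indices, leaving cycle and error in the remainder:

```python
def additive_decomposition(xs, period):
    """Toy additive decomposition: trend via a centered moving average (odd
    period assumed), seasonal indices via phase averages of the detrended data."""
    n, half = len(xs), period // 2
    trend = [None] * n
    for i in range(half, n - half):
        trend[i] = sum(xs[i - half:i + half + 1]) / period
    seasonal = []
    for k in range(period):
        vals = [xs[i] - trend[i] for i in range(n)
                if trend[i] is not None and i % period == k]
        seasonal.append(sum(vals) / len(vals))
    return trend, seasonal
```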

Multiplicative Decomposition, like Additive Decomposition, computes the decomposition
of the time series into its components (trend, seasonality, cycle and error) and
then projects them into the future. The model is assumed to be multiplicative (that is,
all parts are multiplied together to give the forecast).

The model equation is:

X'_{t} = T_{t} * S_{t} * C_{t} * ε_{t}

where T is the trend, S the seasonality, C the cycle and ε the error.

This is a very useful algorithm for sparse time series. The model is equivalent to an Exponential Moving Average both in quantities and in time.

This is a new kind of algorithm suited to economic time series such as sales and inventories. The algorithm assumes that the data it is fed has some periodicity (such as a yearly periodicity), computes the forecast from at least a couple of such periods, and projects it into the future.

The Linear Trend fits a straight line to the time series and projects it forward
in time.

The model equation is:

X'_{t} = m*t + q
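
For illustration, m and q can be recovered with an ordinary least-squares fit (the textbook closed form; the product's own fitting procedure is not specified for this method):

```python
def fit_linear_trend(xs):
    """Least-squares fit of X'_t = m*t + q with t = 0, 1, 2, ..."""
    n = len(xs)
    ts = range(n)
    t_mean = sum(ts) / n
    x_mean = sum(xs) / n
    m = sum((t - t_mean) * (x - x_mean) for t, x in zip(ts, xs)) \
        / sum((t - t_mean) ** 2 for t in ts)
    q = x_mean - m * t_mean
    return m, q
```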

The Linear Trend and Additive Seasonality model assumes the data is made of a linear
trend plus a single additive seasonality. It fits the equation to the data using
a Bayesian fit and projects forward in time.

The model equation is:

X'_{t} = m*t + A*sin(ωt) + q

The Linear Trend and Multiplicative Seasonality model assumes the data is made of a linear
trend plus a single multiplicative seasonality. It fits the equation to the
data using a Bayesian fit and projects forward in time.

The model equation is:

X'_{t} = m*t * A*sin(ωt) * q

The Linear Trend and Multiple Seasonalities model assumes the data is made of a linear trend plus multiple additive or multiplicative seasonalities. It fits the equation to the data using a Bayesian fit and projects forward in time. It is especially useful for financial time series.

The Polynomial algorithm fits a polynomial equation (up to the desired order) to
the data using a Bayesian fit and projects forward in time. You must be careful
not to overfit the data with a high-order polynomial.

The model equation is:

X'_{t} = A + B*t + C*t^{2} + D*t^{3} + ...
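
A stand-in sketch using NumPy's least-squares polynomial fit (np.polyfit is ordinary least squares, not the Bayesian fit the product uses):

```python
import numpy as np

t = np.arange(10, dtype=float)
xs = 2 + 3 * t + 0.5 * t ** 2          # noiseless quadratic data

coeffs = np.polyfit(t, xs, deg=2)       # highest-order coefficient first
forecast = np.polyval(coeffs, 12)       # project to t = 12
```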

The Logarithmic algorithm fits a logarithmic equation to the data using a Bayesian
fit and projects forward in time.

The model equation is:

X'_{t} = A + B*log(t)

The Exponential algorithm fits an exponential equation to the data using a Bayesian
fit and projects forward in time.

The model equation is:

X'_{t} = A + B*exp(t)

One of the biggest issues when dealing with algorithms that require seasonality indexes is computing the seasonalities. Competing packages require you to compute these magic numbers yourself, and even the most advanced ones cannot report this simple and intuitive figure: the seasonality of a time series. The Frequency Identification algorithm computes the frequencies that are present in the input time series.
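
As an illustration of the idea (not the proprietary algorithm), the dominant period can be recovered from the largest coefficient of a plain discrete Fourier transform:

```python
import cmath
import math

def dominant_period(xs):
    """Return the period (in samples) of the strongest cycle in `xs`."""
    n = len(xs)
    mean = sum(xs) / n
    centered = [x - mean for x in xs]           # remove the DC component
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2 + 1):
        coeff = sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(centered))
        if abs(coeff) > best_power:
            best_k, best_power = k, abs(coeff)
    return n / best_k
```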

Removing noise from a time series is always difficult, and current algorithms (averages, exponential averages, etc.) always introduce a lag in the data or change the statistical properties of the underlying time series. Haar Denoising is able to remove the noise in the time series using the Haar Wavelet transform and a proprietary algorithm.

Like Haar Denoising the Daubechies Linear Denoising applies a Daubechies Wavelet transform and a proprietary algorithm to linearly denoise the time series.

In Daubechies Exponential Denoising, the denoising decreases exponentially with the wavelet power.

Wavelet Forecasting uses the Daubechies Wavelet transform to produce a forecast with a proprietary algorithm.

Special care must be taken when generating test beds for ideas and models. Usually the stock Excel Rand() function is not enough. IPredict provides four statistically independent Uniform Random Number Generators and one Gaussian Generator.

This is the basis of the Wavelet Forecasting algorithm. The wavelet transform underlies many of today's algorithms in Digital Signal Processing, Quantum Mechanics, Image Processing and Speech Recognition, among others. The forward and inverse Daubechies and Haar transforms are included.

These are the classical Fourier Transforms for forecasting. IPredict includes Sine, Cosine and Fast Fourier Transforms.

The estimate of the Hurst Exponent is very important in understanding the properties of a time series and how much it resembles a random walk (i.e., how predictable the time series is). Three methods are provided, based on the Daubechies, Haar and Rescaled Range algorithms.
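
For intuition, the rescaled-range statistic at the heart of the Rescaled Range method can be sketched as follows (a full Hurst estimate regresses log(R/S) against log(n) over many window sizes):

```python
def rescaled_range(xs):
    """R/S statistic: range of cumulative mean deviations over the std dev."""
    n = len(xs)
    mean = sum(xs) / n
    cumulative, devs = 0.0, []
    for x in xs:
        cumulative += x - mean
        devs.append(cumulative)
    r = max(devs) - min(devs)
    s = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return r / s
```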

The Fractal Projection algorithm represents a new approach to forecasting. This algorithm is able to project a pattern into the future, stretching it in time or value as appropriate.

The Active Moving Average belongs to the class of Exponential Moving Averages, but its calculation is quadratic in nature and it is much quicker than standard Moving Averages at "following" the original signal.

The Gaussian Kernel Smoothing is a classical Kernel Smoothing algorithm.

The kernel function is the following:

K(t) = e^{-λ * t^{2}}
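
A direct Nadaraya-Watson style sketch using this kernel (illustrative only; boundary handling and bandwidth selection are simplified):

```python
import math

def gaussian_kernel_smooth(xs, lam):
    """Weighted average of all points with weights K(i - j) = exp(-lam*(i-j)^2)."""
    n = len(xs)
    smoothed = []
    for i in range(n):
        weights = [math.exp(-lam * (i - j) ** 2) for j in range(n)]
        smoothed.append(sum(w * x for w, x in zip(weights, xs)) / sum(weights))
    return smoothed
```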

The Hilbert Kernel Smoothing function is:

K(t) = -π / t

The Triangle Kernel Smoothing function is:

K(t) = 1 - Abs(t)

The Epanechnikov Kernel Smoothing function is:

K(t) = 3/4 * (1 - t^{2})

The Quartic Kernel Smoothing function is:

K(t) = 15/16 * (1 - t^{2})^{2}

The Triweight Kernel function is:

K(t) = 35/32 * (1 - t^{2})^{3}

The Cosine Kernel function is:

K(t) = π/4 * Cos(π/2 * t)

This algorithm is a smoothing filter that essentially applies a polynomial regression of a certain degree to a time series. The advantage of the Savitzky-Golay filter is that it tends to preserve features of the time series such as local minima and maxima.
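
A minimal sketch of the idea: fit a low-order polynomial to each centered window and keep its value at the midpoint (production implementations precompute convolution coefficients; scipy.signal.savgol_filter is a ready-made alternative):

```python
import numpy as np

def savgol_center(xs, window, degree):
    """Savitzky-Golay-style smoothing: per-window polynomial fit evaluated
    at the window center. Endpoints are left unchanged for simplicity."""
    half = window // 2
    xs = np.asarray(xs, dtype=float)
    out = xs.copy()
    t = np.arange(-half, half + 1)
    for i in range(half, len(xs) - half):
        coeffs = np.polyfit(t, xs[i - half:i + half + 1], degree)
        out[i] = np.polyval(coeffs, 0)
    return out
```

Because each window is fitted exactly when the data is itself polynomial of that degree, local extrema of such shapes pass through unchanged.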

Spline smoothing consists of computing the spline that approximates the input data and projecting it to internal or external points. It is useful when compressing or expanding data before a projection is done. Our library provides three different methods that let you smooth a time series in different contexts, such as filling the zeroes of a time series or projecting on a different scale.

The Fourier Transform is the basis of several forecasting algorithms and engineering
computations. We offer the classical Fast Fourier Transform and the Sine and Cosine
Fourier Transforms. This is the standard implementation well known in the industry.

See this page on the Fast Fourier Transform for further information.

The Fourier Power Spectrum or Power Spectral Density of a signal is a real function that expresses the energy of the signal per single component frequency. This is normally called the spectrum of a signal.
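
With a DFT in hand, the spectrum is a one-liner; an illustrative NumPy version:

```python
import numpy as np

def power_spectrum(xs):
    """Energy per frequency bin: squared magnitude of the real-input DFT."""
    return np.abs(np.fft.rfft(xs)) ** 2
```

For a pure sinusoid, the bin with the largest power identifies its frequency.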

This is the general Discrete Fourier Transform, which can take complex-valued signals as input and returns complex-valued transforms. The algorithm is computed using the original formula and is therefore "slow" compared to its "fast" counterpart, but it can compute transforms of arbitrary length and also accepts complex numbers as input.

The Hilbert Transform is a basic tool used to derive the analytic representation of a signal.

The analytic representation of a signal is normally used in Digital Signal Processing since it gives easier access to many properties of the signal itself. The Analytic Signal can be used to compute the Instantaneous Frequency of a signal and to apply modulation and demodulation (signal envelope) techniques.

The Short-time Fourier Transform is a time-frequency distribution used to analyze digital signals. It can be used to measure as precisely as possible the fundamental frequency of a signal. The short-time Fourier Transform is computed in practice as a sequence of Fourier Transforms applied to a moving window of the original signal multiplied by one of the many windows available. See this page on Spectral Windows for a list of the available windows.

The Spectrogram is used to analyze the spectral density of a signal. It is computed as the squared magnitude of the Short-time Fourier Transform of a signal.
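
The sequence-of-windowed-DFTs construction can be sketched as follows (the Hann window is chosen here as an example; any of the available windows could be substituted):

```python
import numpy as np

def spectrogram(xs, window_len, hop):
    """Squared magnitude of a Hann-windowed short-time Fourier transform."""
    win = np.hanning(window_len)
    frames = []
    for start in range(0, len(xs) - window_len + 1, hop):
        segment = np.asarray(xs[start:start + window_len]) * win
        frames.append(np.abs(np.fft.rfft(segment)) ** 2)
    return np.array(frames)   # shape: (n_frames, n_freq_bins)
```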

The Wigner-Ville Time-frequency Distribution is functionally equivalent to the Spectrogram or Short-time Fourier Transform in that it is used to analyze the evolution of a signal in the time-frequency plane. It has better time and frequency resolution than its counterparts, but this comes at the cost of introducing negative energies into the spectral representation. Computationally it is very similar to a Short-time Fourier Transform. See this page on Digital Signal Processing for further information on the method.