Multi-step Time Series Forecasting with an Ensemble of Varied Length Mixture Models

Research output: Contribution to journal › Article › peer-review

Abstract

Many real-world problems require modelling and forecasting of time series, such as weather temperature, electricity demand, stock prices and foreign exchange rates. Often, the task involves predicting over a long-term period, e.g. several weeks or months. Most existing time series models are inherently designed for one-step prediction, that is, predicting one time point ahead. Multi-step or long-term prediction is difficult and challenging due to the lack of information and the accumulation of uncertainty or error. The two main existing approaches, iterative and independent, either apply a one-step model recursively or train a separate, independent model for each prediction horizon; both generally perform poorly in practical applications. In this paper, the varied length mixture (VLM) model, an extension of the self-organizing mixture autoregressive model, is proposed for modelling and forecasting time series over multiple steps. The key idea is to preserve the dependencies between the time points within the prediction horizon. Training data are segmented into various lengths corresponding to the various forecasting horizons, and the VLM model is trained in a self-organizing fashion on these segments so that its component autoregressive models of different horizons capture these dependencies. The VLM model thus forms a probabilistic mixture of these varied length component models. A combination of short and long VLM models, and an ensemble of such combinations, are proposed to further enhance prediction performance. The effectiveness of the proposed methods and their marked improvements over existing methods are demonstrated through experiments on synthetic data, real-world foreign exchange rates and weather temperatures.
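The abstract's core idea, segmenting training data into blocks of varying horizon length so that dependencies within the prediction horizon are modelled jointly rather than one step at a time, can be illustrated with a minimal sketch. This is not the paper's VLM implementation (which uses a self-organizing probabilistic mixture); it is a simplified direct-strategy analogue, with all function names and parameters invented for illustration, where one least-squares autoregressive model is fitted per horizon and each model predicts its whole future block at once:

```python
import numpy as np

def segment(series, order, horizon):
    """Build (lag-vector, future-block) training pairs for one horizon.

    Each target keeps the whole block of `horizon` future points together,
    so within-horizon dependencies are preserved in a single model,
    rather than being predicted recursively one point at a time.
    """
    X, Y = [], []
    for t in range(order, len(series) - horizon + 1):
        X.append(series[t - order:t])       # past `order` values
        Y.append(series[t:t + horizon])     # next `horizon` values
    return np.array(X), np.array(Y)

def fit_direct_models(series, order, max_horizon):
    """Fit one linear least-squares AR model per horizon (direct strategy)."""
    models = {}
    for h in range(1, max_horizon + 1):
        X, Y = segment(series, order, h)
        Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
        W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)  # W maps lags -> block
        models[h] = W
    return models

# Usage: block-forecast 3 steps ahead of a noisy sine wave.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 200)) + 0.05 * rng.standard_normal(200)
models = fit_direct_models(series, order=8, max_horizon=3)
last = np.append(series[-8:], 1.0)   # latest lags plus bias term
forecast = last @ models[3]          # one joint 3-step-ahead forecast
print(forecast.shape)                # (3,)
```

The VLM model goes further by combining such varied-length components into a probabilistic mixture trained in a self-organizing fashion, but the segmentation step sketched above is the part the abstract describes directly.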

Bibliographical metadata

Original language: English
Number of pages: 13
Journal: International Journal of Neural Systems
DOIs
Publication status: Published - 29 Dec 2017
