Modern Machine Learning in Time Series Forecasting
UC Santa Barbara Electronic Theses and Dissertations


Abstract

Because of its high dimensionality, complex dynamics, and irregularity, time series forecasting has been studied by both the statistics and machine learning communities for decades. The massive and ever-growing volume of data created by modern applications poses even more serious challenges to practical forecasting tools. (1) Scalability. Modern forecasters should be able to process large amounts of diverse time series effectively and efficiently. (2) Correlation Awareness. Modern forecasters should be able to take advantage of correlated time series in addition to the history of the current time series. (3) Generalizability. Modern forecasters should be able to generalize to data domains different from those they were trained on. While much effort has been devoted to tackling these issues, forecasting in general remains an open problem. In this dissertation, we propose complementary approaches that target these challenges toward large-scale forecasting.

We begin with novel neural architectures that handle diverse time series for more accurate predictions. Specifically, we present the following methods. Layerwise Recurrent Temporal Convolution Networks (LRTCN) combines the strengths of classic Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs) to process long time series. Convolutional Transformer (ConvTrans) enhances the locality of attention-based forecasting models to further improve performance and breaks the memory bottleneck for long time series. Attention-guided Autoregression (AGA), on the other hand, brings complex autoregressive models and simple regressive models together via a tailored attention mechanism to respond quickly to change points in forecasting.

Next, we present Attention Cross Time Series (ACTS), which draws on correlated time series to guide the current forecast. We study its application to epidemic data and show its effectiveness in predicting cases and deaths during the COVID-19 outbreak across the United States. Finally, we present Domain Adaptation Forecasters (DATS), which adapts a forecaster trained on a data-rich source domain to a data-scarce target domain. Building on the idea of domain-invariant representations from existing domain adaptation approaches, we align a subset of the forecaster's learned features across domains while keeping the remaining features domain-specific, so that domain-dependent forecasts can be made for each domain.
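As a rough illustration of what "enhancing the locality" of attention can look like in code, the sketch below replaces the usual point-wise query/key projections with causal 1-D convolutions, so each attention score reflects a short local window of the series rather than a single time step. This is a minimal PyTorch sketch under assumed names and sizes (CausalConvAttention, d_model, kernel_size); it is not the dissertation's ConvTrans implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalConvAttention(nn.Module):
    """Single-head self-attention whose queries and keys come from causal
    1-D convolutions, so attention scores reflect local context.
    Illustrative sketch only; names and sizes are assumptions."""

    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        self.d_model = d_model
        self.pad = kernel_size - 1          # left padding keeps the conv causal
        self.query_conv = nn.Conv1d(d_model, d_model, kernel_size)
        self.key_conv = nn.Conv1d(d_model, d_model, kernel_size)
        self.value_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, d_model)
        h = x.transpose(1, 2)               # (batch, d_model, time)
        h = F.pad(h, (self.pad, 0))         # pad only on the left (past)
        q = self.query_conv(h).transpose(1, 2)
        k = self.key_conv(h).transpose(1, 2)
        v = self.value_proj(x)
        scores = q @ k.transpose(1, 2) / self.d_model ** 0.5
        # Causal mask: position t may only attend to positions <= t.
        t = x.size(1)
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device), 1)
        scores = scores.masked_fill(mask, float("-inf"))
        return torch.softmax(scores, dim=-1) @ v

# Usage: a batch of 8 series, 96 time steps, 64-dimensional embeddings.
attn = CausalConvAttention(d_model=64, kernel_size=3)
out = attn(torch.randn(8, 96, 64))          # -> (8, 96, 64)
```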
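Similarly, the shared/domain-specific feature split described for DATS can be sketched as a shared encoder whose output is encouraged to match across domains, plus one private encoder per domain. All module names, sizes, and the simple mean-matching alignment penalty below are illustrative assumptions; real domain adaptation methods often use adversarial or MMD-based losses instead.

```python
import torch
import torch.nn as nn

class SplitFeatureForecaster(nn.Module):
    """Toy sketch of a shared/private feature split for cross-domain
    forecasting. Names, sizes, and losses are illustrative assumptions."""

    def __init__(self, input_dim: int, shared_dim: int = 32, private_dim: int = 32):
        super().__init__()
        # Shared encoder: features intended to be aligned across domains.
        self.shared_encoder = nn.GRU(input_dim, shared_dim, batch_first=True)
        # One private encoder per domain keeps domain-specific dynamics.
        self.private_encoders = nn.ModuleDict({
            "source": nn.GRU(input_dim, private_dim, batch_first=True),
            "target": nn.GRU(input_dim, private_dim, batch_first=True),
        })
        self.head = nn.Linear(shared_dim + private_dim, 1)

    def forward(self, x: torch.Tensor, domain: str):
        # x: (batch, time, input_dim); one-step-ahead forecast per domain.
        shared, _ = self.shared_encoder(x)
        private, _ = self.private_encoders[domain](x)
        features = torch.cat([shared[:, -1], private[:, -1]], dim=-1)
        return self.head(features), shared[:, -1]

def alignment_loss(shared_src: torch.Tensor, shared_tgt: torch.Tensor) -> torch.Tensor:
    # Simple stand-in for domain alignment: match mean shared features.
    return (shared_src.mean(dim=0) - shared_tgt.mean(dim=0)).pow(2).sum()
```

In training, one would typically sum the forecasting losses on source and target batches with a weighted alignment penalty between their shared features, so the shared representation stays domain-invariant while each private encoder absorbs domain-specific behavior.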
