eScholarship
Open Access Publications from the University of California

[SoK] The Great GAN Bake Off, An Extensive Systematic Evaluation of Generative Adversarial Network Architectures for Time Series Synthesis

Abstract

There is no standard approach to compare the success of different neural network architectures utilized for time series synthesis. This hinders the evaluation and decision process as to which architecture should be leveraged for an unknown data set. We propose a combination of metrics, which empirically evaluate the performance of neural network architectures trained for time series synthesis. With these measurements we are able to account for temporal correlations, spatial correlations and mode collapse issues within the generated time series.
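The abstract does not reprint the metric definitions, but a temporal-correlation check of the kind it describes can be sketched as follows. The function names and the simple autocorrelation-gap formulation here are illustrative assumptions, not the authors' actual metrics:

```python
import numpy as np

def autocorrelation(series, max_lag):
    """Normalized autocorrelation of a 1-D series for lags 1..max_lag."""
    s = series - series.mean()
    denom = np.dot(s, s)
    return np.array([np.dot(s[:-lag], s[lag:]) / denom
                     for lag in range(1, max_lag + 1)])

def temporal_fidelity(real, synthetic, max_lag=10):
    """Mean absolute gap between the autocorrelation profiles of a real
    and a synthetic series; 0 means identical temporal structure.
    Hypothetical metric for illustration only."""
    return np.mean(np.abs(autocorrelation(real, max_lag)
                          - autocorrelation(synthetic, max_lag)))

rng = np.random.default_rng(0)
t = np.arange(500)
real = np.sin(0.1 * t) + 0.1 * rng.standard_normal(500)
good = np.sin(0.1 * t + 0.2) + 0.1 * rng.standard_normal(500)  # keeps the periodicity
bad = rng.standard_normal(500)  # white noise, no temporal structure

print(temporal_fidelity(real, good, 20) < temporal_fidelity(real, bad, 20))  # → True
```

A synthetic sample that preserves the real series' periodic structure scores a much smaller autocorrelation gap than structureless noise, which is the kind of temporal-correlation failure such a metric is meant to expose.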

We further investigate how different generator and discriminator architectures interact with each other. The considered architectures include recurrent neural networks, temporal convolutional networks and transformer-based networks. So far, transformer-based models have seen limited application in time series synthesis. Hence, we propose a new transformer-based architecture, which is able to synthesise time series. We evaluate the proposed architectures and their combinations in over 500 experiments, amounting to over 2500 computing hours. We provide results for four data sets, one univariate and three multivariate. The data sets vary with regard to length, as well as patterns in temporal and spatial correlations.

We use our metrics to compare the performance of generative adversarial network architectures for time series synthesis. To verify our findings we utilize quantitative and qualitative evaluations. Our results indicate that temporal convolutional networks currently outperform recurrent neural network and transformer-based approaches with regard to fidelity and flexibility of the generated time series data. The temporal convolutional network architecture is also the most stable on a data set prone to mode collapse. The performance of the transformer models strongly depends on the data set characteristics: they struggled to synthesise data sets with high temporal and spatial correlations. Discriminators with recurrent network architectures suffer from vanishing gradients. We also show that the performance of the generative adversarial networks depends more on the discriminator than on the generator.
