eScholarship
Open Access Publications from the University of California

UC Berkeley Electronic Theses and Dissertations

Relaxed Wasserstein, Generative Adversarial Networks, Variational Autoencoders and their applications

Abstract

Statistical divergences play an important role in many data-driven applications. Two notable examples are Distributionally Robust Optimization (DRO) problems and Generative Adversarial Networks (GANs).

In the first section of my dissertation, we propose a novel class of statistical divergences, Relaxed Wasserstein (RW) divergences, which combine the Wasserstein distance with Bregman divergences. We first establish their probabilistic properties, and then, to illustrate their use, we introduce Relaxed Wasserstein GANs (RWGANs) and compare them empirically with several state-of-the-art GANs on image-generation tasks. We show that RWGANs strike a balance between training speed and image quality. We also discuss the potential use of Relaxed Wasserstein divergences to construct ambiguity sets in DRO problems.
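One natural way to combine the two objects is to use a Bregman divergence as the transport cost in the optimal-transport formulation. The following is a hedged sketch in our own notation; the dissertation's exact definition may differ:

```latex
% Bregman divergence generated by a strictly convex, differentiable \varphi
B_{\varphi}(x, y) = \varphi(x) - \varphi(y) - \langle \nabla \varphi(y),\, x - y \rangle

% Transport-style divergence with Bregman cost over couplings of \mu and \nu
\mathcal{W}_{\varphi}(\mu, \nu)
  = \inf_{\pi \in \Pi(\mu, \nu)} \mathbb{E}_{(x, y) \sim \pi}\!\left[ B_{\varphi}(x, y) \right]
```

For the choice $\varphi(x) = \|x\|_2^2$, the Bregman cost $B_{\varphi}(x,y)$ reduces to the squared Euclidean distance, and the expression above recovers the squared 2-Wasserstein distance as a special case.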

In the second section of my dissertation, we apply another type of generative neural network, the Variational AutoEncoder (VAE), to metagenomic binning problems in bioinformatics. Shotgun sequencing produces short reads from the DNA in a sample drawn from a microbial community, which may contain thousands of species of both known and unknown microbes. The short reads are then assembled by connecting overlapping subsequences, forming longer sequences called contigs. Metagenomic binning is the process of grouping contigs according to their genomes of origin. We propose a new network structure called MetaAE, which combines compositional and reference-based information in a nonlinear way. We show that this binning algorithm improves on the performance of state-of-the-art binners by 20% on two independent synthetic datasets.
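A common form of "compositional" information in metagenomic binning is the normalized k-mer frequency vector of each contig (tetranucleotides, k=4, are typical). The sketch below illustrates this feature extraction only; the function name and details are ours, not the dissertation's actual MetaAE pipeline:

```python
from itertools import product

def kmer_composition(contig, k=4):
    """Normalized k-mer frequency vector of a DNA sequence.

    For k=4 this yields a length-256 tetranucleotide profile, a common
    compositional feature for metagenomic binning. Windows containing
    ambiguous bases (e.g. 'N') are skipped.
    """
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    counts = [0] * len(kmers)
    for i in range(len(contig) - k + 1):
        km = contig[i:i + k]
        if km in index:  # ignore windows with non-ACGT characters
            counts[index[km]] += 1
    total = sum(counts)
    return [c / total for c in counts] if total else counts
```

Vectors like these, possibly concatenated with reference-based features such as marker-gene hits, are the kind of per-contig input a binning autoencoder would consume.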
