Open Access Publications from the University of California

UCLA Electronic Theses and Dissertations

Conditional Divergence Triangle for Joint Training of Generator, Energy-based and Inference Models


This paper proposes a conditional version of the Divergence Triangle [1] as a framework for jointly training generator, energy-based, and inference models with label information, where the learning of the three models is integrated into a unified probabilistic formulation. Experiments demonstrate that, within this single framework, we can (1) generate realistic images controlled by fine-grained category labels, (2) obtain meaningful low-dimensional representations of observed data, and (3) classify the labels of unobserved data. At the end of the paper, I also discuss a possible extension of the Conditional Divergence Triangle model for future work.
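To make the roles of the three conditional models concrete, the following is a minimal structural sketch, not the thesis's actual architecture or training objective: the linear maps, the one-hot label encoding, and the surrogate losses are all illustrative assumptions. It shows how a conditional generator p(x | z, y), a conditional energy model E(x, y), and a conditional inference model q(z | x, y) can be evaluated jointly on a labeled example, and how the energy model supports label classification via argmin over classes.

```python
import numpy as np

rng = np.random.default_rng(0)
d_x, d_z, n_cls = 8, 2, 3   # data dim, latent dim, number of label classes

def one_hot(y, n=n_cls):
    v = np.zeros(n)
    v[y] = 1.0
    return v

# Conditional generator p(x | z, y): linear map of [z; y] -> x (assumed form)
W_g = rng.normal(scale=0.1, size=(d_x, d_z + n_cls))
def generate(z, y):
    return W_g @ np.concatenate([z, one_hot(y)])

# Conditional energy-based model E(x, y): one linear energy head per class
W_e = rng.normal(scale=0.1, size=(n_cls, d_x))
def energy(x, y):
    return float(W_e[y] @ x)   # lower energy = more plausible pair (x, y)

# Conditional inference model q(z | x, y): linear map of [x; y] -> z
W_i = rng.normal(scale=0.1, size=(d_z, d_x + n_cls))
def infer(x, y):
    return W_i @ np.concatenate([x, one_hot(y)])

# One joint "triangle" evaluation on a labeled example (x, y):
x, y = rng.normal(size=d_x), 1
z_hat = infer(x, y)                      # inference: x -> z
x_rec = generate(z_hat, y)               # generation: z -> x
recon = float(np.mean((x - x_rec) ** 2))  # generator/inference agreement
e_gap = energy(x_rec, y) - energy(x, y)   # EBM should favor observed data
loss = recon + e_gap                      # surrogate joint objective

# Label classification on new data: pick the class with lowest energy
y_pred = int(np.argmin([energy(x, c) for c in range(n_cls)]))
print(loss, y_pred)
```

In the actual framework, the three models are deep networks trained by minimizing a combination of KL divergences between their joint distributions; the scalar surrogate above only illustrates how a single labeled example flows through all three conditional components.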
