Amortized Inference in Latent Space Energy-Based Prior Model
- Author(s): Zhai, Xufan
- Advisor(s): Wu, Yingnian; et al.
This thesis discusses amortized inference in the latent space energy-based prior model (EBM), where the EBM serves as the prior of a generator network. Sampling from both the prior and the posterior can be done by short-run MCMC; however, MCMC sampling from the posterior can be time-consuming due to the complexity of the posterior distribution. We propose to amortize the posterior MCMC sampling with an inference network. Image experiments show that amortization produces results similar to those of short-run MCMC sampling while being more time-efficient; the generator also shows better stability under amortization.
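Short-run MCMC for an energy-based model is typically a fixed, small number of Langevin steps: noisy gradient descent on the energy. The sketch below is an illustrative NumPy toy, not the thesis implementation; the quadratic energy and all names are hypothetical, chosen so the target density exp(-E) is a standard normal and the sampler's output can be sanity-checked.

```python
import numpy as np

def short_run_langevin(grad_energy, z0, n_steps=500, step=0.1, rng=None):
    """Short-run Langevin MCMC: a fixed number of noisy gradient steps.

    Update rule: z_{t+1} = z_t - (step^2 / 2) * grad_E(z_t) + step * eps,
    with eps ~ N(0, I) drawn fresh at each step.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    z = z0.copy()
    for _ in range(n_steps):
        z = z - 0.5 * step**2 * grad_energy(z) + step * rng.standard_normal(z.shape)
    return z

# Toy energy E(z) = 0.5 * z^2, so exp(-E) is the standard normal density;
# the chain should therefore produce approximately N(0, 1) samples.
grad_E = lambda z: z
samples = short_run_langevin(grad_E, np.zeros(2000))
print(samples.mean(), samples.var())
```

Amortized inference replaces this per-datapoint iterative sampling with a single forward pass of a trained inference network, which is the source of the speedup the abstract reports.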