WHY?

Choosing an appropriate prior is important for VAEs. This paper proposes a two-layered VAE with a flexible VampPrior.

WHAT?

The variational lower bound (ELBO) of the VAE, averaged over the empirical data distribution, can be decomposed as follows.
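In the notation of the paper, with $q(x)$ the empirical data distribution:

$$
\mathbb{E}_{q(x)}\big[\mathcal{L}(x)\big]
= \mathbb{E}_{q(x)}\,\mathbb{E}_{q_\phi(z|x)}\big[\ln p_\theta(x|z)\big]
+ \mathbb{E}_{q(x)}\big[\mathbb{H}\big[q_\phi(z|x)\big]\big]
- \big(\!-\mathbb{E}_{q(z)}\big[\ln p_\lambda(z)\big]\big),
$$

where $q(z) = \mathbb{E}_{q(x)}\big[q_\phi(z|x)\big]$ is the aggregated posterior.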

The first component is the negative reconstruction error, the second is the expected entropy of the variational posterior, and the last is the negative cross-entropy between the aggregated posterior and the prior. Usually the prior is a simple distribution such as a standard Gaussian, but the prior that maximizes the ELBO turns out to be the aggregated posterior itself.
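Concretely, the ELBO-optimal prior is the aggregated posterior over the $N$ training points:

$$
p^{*}_{\lambda}(z) = \frac{1}{N}\sum_{n=1}^{N} q_\phi(z \mid x_n).
$$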

However, using the aggregated posterior directly not only leads to overfitting but is also expensive to compute, since it sums over all training points. So this paper proposes the variational mixture of posteriors prior (VampPrior), which approximates the aggregated posterior with a mixture of variational posteriors conditioned on $K \ll N$ pseudo-inputs. These pseudo-inputs are learned by backpropagation.
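In other words, $p_\lambda(z) = \frac{1}{K}\sum_{k=1}^{K} q_\phi(z \mid u_k)$, where the $u_k$ are free parameters in data space. Below is a minimal PyTorch sketch of this prior's log-density; the class name and the assumption that `encoder` returns the `(mu, logvar)` of a diagonal Gaussian are illustrative, not the authors' reference implementation.

```python
import math

import torch
import torch.nn as nn


class VampPrior(nn.Module):
    """Variational mixture of posteriors prior: p(z) = (1/K) sum_k q(z | u_k)."""

    def __init__(self, encoder: nn.Module, n_pseudo: int, input_dim: int):
        super().__init__()
        # `encoder` maps a batch of inputs to (mu, logvar) of q(z|x).
        self.encoder = encoder
        # Pseudo-inputs u_k live in data space and are trained by
        # backpropagation like any other parameter.
        self.pseudo_inputs = nn.Parameter(0.01 * torch.randn(n_pseudo, input_dim))

    def log_prob(self, z: torch.Tensor) -> torch.Tensor:
        # Evaluate the variational posterior at each pseudo-input.
        mu, logvar = self.encoder(self.pseudo_inputs)          # (K, D) each
        z = z.unsqueeze(1)                                     # (B, 1, D)
        mu, logvar = mu.unsqueeze(0), logvar.unsqueeze(0)      # (1, K, D)
        # Diagonal-Gaussian log-density of z under each mixture component.
        log_comp = -0.5 * (
            logvar + (z - mu).pow(2) / logvar.exp() + math.log(2 * math.pi)
        ).sum(dim=-1)                                          # (B, K)
        # log p(z) = logsumexp_k log q(z | u_k) - log K
        return torch.logsumexp(log_comp, dim=1) - math.log(log_comp.shape[1])
```

Using `logsumexp` over the mixture components keeps the log-density numerically stable even when individual component densities underflow.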

In order to prevent the problem of inactive stochastic units, the paper also proposes a two-layered (hierarchical) VAE, factorized as shown below.
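In the paper's two-layered model, the approximate posterior and the generative model factorize as

$$
q_\phi(z_1 \mid x, z_2)\, q_\psi(z_2 \mid x)
\qquad\text{and}\qquad
p_\theta(x \mid z_1, z_2)\, p_\lambda(z_1 \mid z_2)\, p(z_2),
$$

with the VampPrior used for $p(z_2)$.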


SO?

The HVAE with VampPrior achieved strong results on a variety of datasets (static and dynamic MNIST, OMNIGLOT, Caltech 101 Silhouettes, Frey Faces, and Histopathology patches), not only in test log-likelihood (LL) but also in sample quality, reducing the blurriness typical of the standard VAE.

Tomczak, Jakub M., and Max Welling. “VAE with a VampPrior.” arXiv preprint arXiv:1705.07120 (2017).