Categorical Reparameterization with Gumbel-Softmax


WHY?

The same motivation as the Concrete distribution: enable low-variance, reparameterized gradients through categorical latent variables.

WHAT?

The Gumbel-Softmax (GS) distribution is the same as the Concrete distribution. A GS sample is $y_i = \frac{\exp((\log \pi_i + g_i)/\tau)}{\sum_j \exp((\log \pi_j + g_j)/\tau)}$ with $g_i \sim \text{Gumbel}(0, 1)$, and the GS distribution approaches a one-hot categorical as the temperature $\tau$ goes to 0. However, at nonzero temperature GS samples are not exactly categorical samples, which introduces bias. As the temperature is annealed toward 0 the estimator becomes close to unbiased, but the variance of the gradient increases (a bias-variance trade-off). When the categorical variables cannot simply be replaced with GS variables (i.e., discrete samples are required in the forward pass), the gradient can still be estimated with the Straight-Through (ST) GS estimator, which uses the hard one-hot sample in the forward pass and backpropagates through the soft sample. In the paper, (3) is the score function estimator, (4) is the Straight-Through estimator, and (5) is the GS estimator (a pathwise derivative).
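To make the mechanics concrete, here is a minimal PyTorch sketch of GS sampling and its Straight-Through variant. PyTorch and the function name `gumbel_softmax` are my own illustrative choices, not code from the paper.

```python
import torch
import torch.nn.functional as F

def gumbel_softmax(logits, tau=1.0, hard=False, eps=1e-20):
    """Draw a Gumbel-Softmax sample; optionally use the Straight-Through trick."""
    # Gumbel(0, 1) noise via the inverse CDF of uniform samples
    u = torch.rand_like(logits)
    gumbel = -torch.log(-torch.log(u + eps) + eps)
    # Relaxed one-hot sample: softmax of perturbed logits at temperature tau
    y_soft = F.softmax((logits + gumbel) / tau, dim=-1)
    if not hard:
        return y_soft
    # Straight-Through: discrete one-hot in the forward pass,
    # gradient of the soft sample in the backward pass
    index = y_soft.argmax(dim=-1, keepdim=True)
    y_hard = torch.zeros_like(y_soft).scatter_(-1, index, 1.0)
    return (y_hard - y_soft).detach() + y_soft

logits = torch.log(torch.tensor([[0.1, 0.6, 0.3]]))
print(gumbel_softmax(logits, tau=0.5))              # close to one-hot at low tau
print(gumbel_softmax(logits, tau=0.5, hard=True))   # exact one-hot, ST gradient
```

At low temperatures the soft sample is already close to one-hot but its gradients have high variance; the ST variant is useful when the downstream computation needs genuinely discrete samples.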

So?

GS outperformed the other estimators on MNIST image generation in terms of negative log-likelihood (NLL). ST-GS outperformed previous approaches on semi-supervised classification.

Critic

Nice visualization. I wonder how this can be used in NLP.

Jang, Eric, Shixiang Gu, and Ben Poole. “Categorical reparameterization with gumbel-softmax.” arXiv preprint arXiv:1611.01144 (2016).


