
A Recurrent Latent Variable Model for Sequential Data

Jun 2015

In this paper, we explore the inclusion of latent random variables into the dynamic hidden state of a recurrent neural network (RNN) by combining elements of the variational autoencoder (VAE). We argue that through the use of high-level latent random variables, our variational RNN (VRNN) is able to model the kind of variability observed in highly structured sequential data (such as speech). We empirically evaluate the proposed model against related sequential models on five sequence datasets, four of speech and one of handwriting. Our results show the important role that latent random variables can play in the RNN dynamic hidden state.
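The abstract can be made concrete with a minimal sketch of a single VRNN step: a prior over the latent variable conditioned on the previous hidden state, an approximate posterior that also sees the current observation, a decoder, and a recurrence that consumes the sampled latent variable. This is an illustrative numpy sketch assuming diagonal Gaussian prior and posterior; the weight names, layer sizes, and single-linear-layer parameterizations are hypothetical simplifications, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
H, X, Z = 8, 4, 3  # hidden, observation, and latent sizes (illustrative)

def linear(n_in, n_out):
    # Hypothetical randomly initialized weights; a real model learns these.
    return rng.normal(0, 0.1, (n_out, n_in)), np.zeros(n_out)

W_prior, b_prior = linear(H, 2 * Z)      # prior p(z_t | h_{t-1})
W_enc,   b_enc   = linear(H + X, 2 * Z)  # approx. posterior q(z_t | x_t, h_{t-1})
W_dec,   b_dec   = linear(H + Z, X)      # decoder mean for p(x_t | z_t, h_{t-1})
W_h,     b_h     = linear(H + X + Z, H)  # recurrence h_t = f(h_{t-1}, x_t, z_t)

def vrnn_step(x_t, h_prev):
    # Prior over z_t conditioned only on the previous hidden state.
    mu_p, logvar_p = np.split(W_prior @ h_prev + b_prior, 2)
    # Approximate posterior additionally conditioned on the observation x_t.
    mu_q, logvar_q = np.split(W_enc @ np.concatenate([h_prev, x_t]) + b_enc, 2)
    # Reparameterized sample of the latent variable.
    z_t = mu_q + np.exp(0.5 * logvar_q) * rng.standard_normal(Z)
    # Decoder output (mean of the observation model).
    x_mean = W_dec @ np.concatenate([h_prev, z_t]) + b_dec
    # Deterministic recurrence now also receives the sampled z_t.
    h_t = np.tanh(W_h @ np.concatenate([h_prev, x_t, z_t]) + b_h)
    # Per-step KL(q || p) between the two diagonal Gaussians (regularizer
    # in the sequential ELBO).
    kl = 0.5 * np.sum(logvar_p - logvar_q
                      + (np.exp(logvar_q) + (mu_q - mu_p) ** 2) / np.exp(logvar_p)
                      - 1.0)
    return h_t, z_t, x_mean, kl

# Run the step over a short toy sequence.
h = np.zeros(H)
for t in range(3):
    x = rng.standard_normal(X)  # stand-in for one observation frame
    h, z, x_mean, kl = vrnn_step(x, h)
```

The key difference from a plain RNN is that the transition depends on a stochastic `z_t`, so the hidden-state dynamics themselves become a distribution rather than a deterministic function of the inputs.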


Junyoung Chung, Kyle Kastner, Laurent Dinh, Kratarth Goel, Aaron Courville, Yoshua Bengio. A Recurrent Latent Variable Model for Sequential Data. In: arXiv e-prints, arXiv:1506.02216, 2015.

