Recurrent Batch Normalization

Recurrent Neural Networks
Mar 2016

We propose a reparameterization of LSTM that brings the benefits of batch normalization to recurrent neural networks. Whereas previous works only apply batch normalization to the input-to-hidden transformation of RNNs, we demonstrate that it is both possible and beneficial to batch-normalize the hidden-to-hidden transition, thereby reducing internal covariate shift between time steps. We evaluate our proposal on various sequential problems such as sequence classification, language modeling and question answering. Our empirical results show that our batch-normalized LSTM consistently leads to faster convergence and improved generalization.
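The reparameterization described above can be illustrated with a minimal NumPy sketch of a single BN-LSTM step. This is an assumption-laden illustration, not the authors' implementation: it batch-normalizes the input-to-hidden and hidden-to-hidden transformations separately before summing them, and applies a third normalization to the cell state before the output gate, as the abstract describes. Function names, shapes, and the per-gamma parameterization are hypothetical.

```python
import numpy as np

def batch_norm(x, gamma, eps=1e-5):
    # Normalize each unit over the batch dimension, then rescale by gamma.
    # (The additive shift is assumed folded into the LSTM bias b below.)
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps)

def bn_lstm_step(x, h_prev, c_prev, Wx, Wh, b, gammas):
    # Batch-normalize the input-to-hidden (x @ Wx) and hidden-to-hidden
    # (h_prev @ Wh) transformations separately, then apply standard
    # LSTM gating. gammas = (gamma_x, gamma_h, gamma_c).
    gx, gh, gc = gammas
    gates = batch_norm(x @ Wx, gx) + batch_norm(h_prev @ Wh, gh) + b
    i, f, o, g = np.split(gates, 4, axis=1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    # A third batch normalization is applied to the cell state
    # before it enters the output nonlinearity.
    h = sigmoid(o) * np.tanh(batch_norm(c, gc))
    return h, c
```

At training time, statistics are computed per minibatch as above; at test time one would substitute running averages, and in practice separate statistics are kept per time step for the early steps of a sequence.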

Reference

Tim Cooijmans, Nicolas Ballas, César Laurent, Çağlar Gülçehre, Aaron Courville. Recurrent Batch Normalization. arXiv preprint arXiv:1603.09025, 2016.
