2021-12 Gradient Starvation: A Learning Proclivity in Neural Networks
Pretraining Representations for Data-Efficient Reinforcement Learning
2021-11 Multi-label Iterated Learning for Image Classification with Label Ambiguity
2021-10 Chunked Autoregressive GAN for Conditional Waveform Synthesis
Unifying Likelihood-free Inference with Black-box Sequence Design and Beyond.
2021-09 Overcoming Label Ambiguity with Multi-label Iterated Learning
Sai Rajeswar Mudumba, Pau Rodriguez, Soumye Singhal, David Vazquez and
Aaron Courville
2021-08 Deep Reinforcement Learning at the Edge of the Statistical Precipice
2021-07 Can Subnetwork Structure Be the Key to Out-of-Distribution Generalization?
Continuous Coordination As a Realistic Scenario for Lifelong Learning
2021-06 Haptics-based Curiosity for Sparse-reward Tasks
Sai Rajeswar, Cyril Ibrahim, Nitin Surya, Florian Golemo, David Vazquez,
Aaron Courville and Pedro O. Pinheiro
A Variational Perspective on Diffusion-Based Generative Models and Score Matching
Hierarchical Video Generation for Complex Data.
2021-05 Explicitly Modeling Syntax in Language Models with Incremental Parsing and a Dynamic Oracle.
Understanding by Understanding Not: Modeling Negation in Language Models.
SSW-GAN: Scalable Stage-wise Training of Video GANs
StructFormer: Joint Unsupervised Induction of Dependency and Constituency Structure from Masked Language Modeling
Yikang Shen, Yi Tay, Che Zheng, Dara Bahri, Donald Metzler and
Aaron Courville
Out-of-Distribution Generalization via Risk Extrapolation (REx)
Iterated learning for emergent systematicity in VQA
Learning Task Decomposition with Order-Memory Policy Network
Yuchen Lu, Yikang Shen, Siyuan Zhou,
Aaron Courville, Joshua B. Tenenbaum and Chuang Gan
Systematic generalisation with group invariant predictions
Integrating Categorical Semantics into Unsupervised Domain Translation
Convex Potential Flows: Universal Probability Distributions with Optimal Transport and Convex Optimization
Neural Approximate Sufficient Statistics for Implicit Models
Yanzhi Chen, Dinghuai Zhang, Michael U. Gutmann,
Aaron Courville and Zhanxing Zhu
Data-Efficient Reinforcement Learning with Self-Predictive Representations
2021-04 Touch-based Curiosity for Sparse-Reward Tasks.
Sai Rajeswar, Cyril Ibrahim, Nitin Surya, Florian Golemo, David Vázquez,
Aaron C. Courville and Pedro O. Pinheiro
2021-03 Learning Task Decomposition with Ordered Memory Policy Network.
Yuchen Lu, Yikang Shen, Siyuan Zhou,
Aaron Courville, Joshua B. Tenenbaum and Chuang Gan
Pretraining Reward-Free Representations for Data-Efficient Reinforcement Learning
2021-01 Generative Compositional Augmentations for Scene Graph Prediction
Boris Knyazev, Harm de Vries, Cătălina Cangea, Graham W. Taylor,
Aaron Courville and Eugene Belilovsky
Emergent Communication under Competition.
2020-11 Bijective-Contrastive Estimation
Gradient Starvation: A Learning Proclivity in Neural Networks
Recursive Top-Down Production for Sentence Generation with Latent Trees
Supervised Seeded Iterated Learning for Interactive Language Learning
Pix2Shape: Towards Unsupervised Learning of 3D Scenes from Images Using a View-Based Representation
2020-10 Generative adversarial networks
NU-GAN: High resolution neural upsampling with GAN.
Explicitly Modeling Syntax in Language Models Improves Generalization.
Recursive Top-Down Production for Sentence Generation with Latent Trees.
Integrating Categorical Semantics into Unsupervised Domain Translation
2020-07 Data-Efficient Reinforcement Learning with Momentum Predictive Representations
Countering Language Drift with Seeded Iterated Learning
Yuchen Lu, Soumye Singhal, Florian Strub,
Aaron Courville and Olivier Pietquin
AR-DAE: Towards Unbiased Neural Entropy Gradient Estimation
Generative Graph Perturbations for Scene Graph Prediction.
Boris Knyazev, Harm de Vries, Catalina Cangea, Graham W. Taylor,
Aaron C. Courville and Eugene Belilovsky
A Large-Scale, Open-Domain, Mixed-Interface Dialogue-Based ITS for STEM
2020-05 Graph Density-Aware Losses for Novel Compositions in Scene Graph Generation
Boris Knyazev, Harm de Vries, Catalina Cangea, Graham W. Taylor,
Aaron C. Courville and Eugene Belilovsky
2020-04 On Bonus Based Exploration Methods In The Arcade Learning Environment
Detecting Semantic Anomalies
2020-03 Countering Language Drift with Seeded Iterated Learning
Yuchen Lu, Soumye Singhal, Florian Strub, Olivier Pietquin and
Aaron Courville
2020-02 Solving ODE with Universal Flows: Approximation Theory for Flow-Based Models
Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models.
2020-01 Unsupervised Learning of Dense Visual Representations
Pedro O. O. Pinheiro, Amjad Almahairi, Ryan Benmalek, Florian Golemo and
Aaron C. Courville
What Do Compressed Deep Neural Networks Forget?
2019-12 CLOSURE: Assessing Systematic Generalization of CLEVR Models.
2019-11 Selective Brain Damage: Measuring the Disparate Impact of Model Pruning
Ordered Memory
Yikang Shen, Shawn Tan, Arian Hosseini, Zhouhan Lin, Alessandro Sordoni and
Aaron C. Courville
Deep Generative Modeling of LiDAR Data
2019-10 Icentia11K: An Unsupervised Representation Learning Dataset for Arrhythmia Subtype Discovery.
VideoNavQA: Bridging the Gap between Visual and Embodied Question Answering
Improved Conditional VRNNs for Video Prediction
Batch Weight for Domain Adaptation With Mass Shift
2019-09 Selfish Emergent Communication
Icentia11K: An Unsupervised Representation Learning Dataset for Arrhythmia Subtype Discovery
MelGAN: Generative Adversarial Networks for Conditional Waveform Synthesis
No-Press Diplomacy: Modeling Multi-Agent Gameplay
2019-08 Benchmarking Bonus-Based Exploration Methods on the Arcade Learning Environment
2019-06 Adversarial Computation of Optimal Transport Maps
Investigating Biases in Textual Entailment Datasets.
Note on the bias and variance of variational inference.
2019-05 On the Spectral Bias of Neural Networks
Hierarchical Importance Weighted Autoencoders
Representation Mixing for TTS Synthesis
Brief Report: Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks
Reproducibility in Machine Learning
2019-03 Counterpoint by Convolution
Cheng-Zhi Anna Huang, Tim Cooijmans, Adam Roberts,
Aaron Courville and Douglas Eck
2019-01 Maximum Entropy Generators for Energy-Based Models.
Probability Distillation: A Caveat and Alternatives.
Stochastic Neural Network with Kronecker Flow.
2018-12 Towards Text Generation with Adversarially Learned Neural Outlines
2018-11 Planning in Dynamic Environments with Conditional Autoregressive Models.
Harmonic Recomposition using Conditional Autoregressive Modeling.
Blindfold Baselines for Embodied QA.
2018-10 Sim-to-Real Transfer with Neural-Augmented Robot Simulation
Deep Learning. Das umfassende Handbuch
2018-09 EnGAN: Latent Space MCMC and Maximum Entropy Generators for Energy-based Models
On Difficulties of Probability Distillation
Chin-Wei Huang, Faruk Ahmed, Kundan Kumar, Alexandre Lacoste and
Aaron Courville
W2GAN: Recovering an Optimal Transport Map with a GAN
Jacob Leygonie, Jennifer She, Amjad Almahairi, Sai Rajeswar and
Aaron Courville
Pix2Scene: Learning Implicit 3D Representations from Images
Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks
Systematic Generalization: What Is Required and Can It Be Learned?
Unsupervised one-to-many image translation
Convergence Properties of Deep Neural Networks on Separable Data
Manifold Mixup: Learning Better Representations by Interpolating Hidden States
On the Learning Dynamics of Deep Neural Networks.
Visual Reasoning with Multi-hop Feature Modulation
Florian Strub, Mathieu Seurin, Ethan Perez, Harm de Vries, Jérémie Mary, Philippe Preux,
Aaron C. Courville and Olivier Pietquin
Improving Explorability in Variational Inference with Annealed Variational Objectives
2018-08 Approximate Exploration through State Abstraction.
2018-07 Feature-wise transformations
Neural Autoregressive Flows
Augmented CycleGAN: Learning Many-to-Many Mappings from Unpaired Data
Mutual Information Neural Estimation.
2018-06 On the Spectral Bias of Deep Neural Networks
Learning Distributed Representations from Reviews for Collaborative Filtering
Manifold Mixup: Better Representations by Interpolating Hidden States.
Manifold Mixup: Encouraging Meaningful On-Manifold Interpolation as a Regularizer.
Straight to the Tree: Constituency Parsing with Neural Syntactic Distance
MINE: Mutual Information Neural Estimation
2018-03 Generating Contradictory, Neutral, and Entailing Sentences
2018-02 Hierarchical Adversarially Learned Inference
Mohamed Ishmael Belghazi, Sai Rajeswar, Olivier Mastropietro, Negar Rostamzadeh, Jovana Mitrovic and
Aaron Courville
Bayesian Hypernetworks
David Krueger, Chin-Wei Huang, Riashat Islam, Ryan Turner, Alexandre Lacoste and
Aaron Courville
Neural Language Modeling by Jointly Learning Syntax and Lexicon
Learning Generative Models with Locally Disentangled Latent Factors
Inferring Identity Factors for Grouped Examples
FiLM: Visual Reasoning with a General Conditioning Layer.
2018-01 MINE: Mutual Information Neural Estimation.
HoME: a Household Multimodal Environment.
Publications collected and formatted using Paperoni