Home

Inspiring the development of artificial intelligence for the benefit of all 


Located in the heart of Quebec’s AI ecosystem, Mila is a community of more than 1,200 researchers specializing in machine learning and dedicated to scientific excellence and innovation.

About

Featured

Faculty 

Founded in 1993 by Professor Yoshua Bengio, Mila today brings together over 130 professors affiliated with Université de Montréal, McGill University, Polytechnique Montréal and HEC Montréal. Mila also welcomes professors from Université Laval, Université de Sherbrooke and École de technologie supérieure (ÉTS). 

Browse the online directory


Latest Publications

A logistics provider’s profit maximization facility location problem with random utility maximizing followers
David Pinzon Ulloa
Bernard Gendron
Accelerated Benders Decomposition and Local Branching for Dynamic Maximum Covering Location Problems
Steven Lamontagne
Ribal Atallah
The maximum covering location problem (MCLP) is a key problem in facility location, with many applications and variants. One such variant is the dynamic (or multi-period) MCLP, which considers the installation of facilities across multiple time periods. To the best of our knowledge, no exact solution method has been proposed to tackle large-scale instances of this problem. To that end, in this work, we expand upon the current state-of-the-art branch-and-Benders-cut solution method in the static case by exploring several acceleration techniques. Additionally, we propose a specialised local branching scheme that uses a novel distance metric in its definition of subproblems and features a new method for efficient and exact solving of the subproblems. These methods are then compared through extensive computational experiments, highlighting the strengths of the proposed methodologies.
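For context, the abstract builds on the static maximum covering location problem. The following is a minimal sketch of that static baseline as a small mixed-integer program in Python with PuLP, using toy data; it is illustrative only and is not the paper's dynamic model, Benders decomposition, or local branching scheme.

```python
# Minimal static MCLP sketch (toy data; assumed sets I, J, cover, weights w, budget p).
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

I = range(5)                                             # demand points
J = range(3)                                             # candidate facility sites
w = {i: 1.0 for i in I}                                  # demand weight of each point
cover = {0: [0], 1: [0, 1], 2: [1], 3: [1, 2], 4: [2]}   # sites that can cover each point
p = 2                                                    # number of facilities to open

model = LpProblem("static_MCLP", LpMaximize)
x = {j: LpVariable(f"open_{j}", cat=LpBinary) for j in J}     # open site j?
y = {i: LpVariable(f"covered_{i}", cat=LpBinary) for i in I}  # is point i covered?

model += lpSum(w[i] * y[i] for i in I)                   # maximize covered demand
for i in I:
    model += y[i] <= lpSum(x[j] for j in cover[i])       # covered only if a covering site is open
model += lpSum(x[j] for j in J) <= p                     # facility budget

model.solve()                                            # default bundled CBC solver
print("open sites:", [j for j in J if x[j].value() == 1])
```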
GIST: Generated Inputs Sets Transferability in Deep Learning
Florian Tambon
Giuliano Antoniol
Promoting Exploration in Memory-Augmented Adam using Critical Momenta
Pranshu Malviya
Goncalo Mordido
Aristide Baratin
Reza Babanezhad Harikandeh
Jerry Huang
Razvan Pascanu
Adaptive gradient-based optimizers, particularly Adam, have left their mark in training large-scale deep learning models. The strength of such optimizers is that they exhibit fast convergence while being more robust to hyperparameter choice. However, they often generalize worse than non-adaptive methods. Recent studies have tied this performance gap to flat minima selection: adaptive methods tend to find solutions in sharper basins of the loss landscape, which in turn hurts generalization. To overcome this issue, we propose a new memory-augmented version of Adam that promotes exploration towards flatter minima by using a buffer of critical momentum terms during training. Intuitively, the use of the buffer makes the optimizer overshoot outside the basin of attraction if it is not wide enough. We empirically show that our method improves the performance of several variants of Adam on standard supervised language modelling and image classification tasks.
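To make the mechanism described above concrete, here is a loose NumPy sketch of an Adam-style update whose momentum is augmented with a small buffer of past momentum terms, so the step can overshoot basins that are too narrow. The buffer policy, scaling factor `c`, and all hyperparameters are illustrative assumptions, not the authors' exact algorithm.

```python
# Illustrative sketch only: Adam-like update with a buffer of past momentum terms.
import numpy as np
from collections import deque

def adam_with_momentum_buffer(grad_fn, theta, steps=1000, lr=1e-3,
                              beta1=0.9, beta2=0.999, eps=1e-8,
                              buffer_size=5, c=0.1):
    m = np.zeros_like(theta)            # first-moment estimate
    v = np.zeros_like(theta)            # second-moment estimate
    buffer = deque(maxlen=buffer_size)  # buffer of past momentum terms (assumed policy)
    for t in range(1, steps + 1):
        g = grad_fn(theta)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)    # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)    # bias-corrected second moment
        if buffer:
            # Augment the momentum with buffered terms so the iterate tends to
            # overshoot narrow basins of attraction.
            m_hat = m_hat + c * np.mean(buffer, axis=0)
        buffer.append(m_hat.copy())
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta
```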

AI for Humanity

Socially responsible and beneficial development of AI is a fundamental component of Mila’s mission. As a leader in the field, we wish to contribute to social dialogue and the development of applications that will benefit society.

Learn more
