
Adam M. Oberman

Associate Academic Member
Canada CIFAR AI Chair
Full Professor, McGill University, Department of Mathematics and Statistics
Research Topics
AI Safety
Deep Learning
Generative Models
Machine Learning Theory
Representation Learning

Biography

I am a professor at McGill University, in the Department of Mathematics and Statistics. My research revolves around the application of advanced mathematical techniques to the field of deep learning. My primary areas of expertise include generative modelling, stochastic optimization methods, fairness/bias removal in computer vision, and generalization in reinforcement learning.

Before joining McGill in 2012, I held a tenured faculty position at Simon Fraser University and completed a postdoctoral fellowship at the University of Texas, Austin. I obtained my undergraduate education at the University of Toronto and pursued graduate studies at the University of Chicago. I have also held visiting positions at the University of California, Los Angeles (UCLA) and at the National Institute for Research in Digital Science and Technology (INRIA) in Paris.

My early research encompassed the fields of partial differential equations and scientific computing, where I made significant contributions to areas like numerical optimal transportation, geometric PDEs and stochastic control problems.

I teach two comprehensive theory courses on machine learning, covering topics such as statistical learning theory and kernel theory.

For prospective graduate students interested in working with me, please apply to both Mila – Quebec Artificial Intelligence Institute and the Department of Mathematics and Statistics at McGill. Alternatively, applicants may consider co-supervision opportunities with advisors from the computer science program at McGill or Université de Montréal.

Current Students

Master's Research - McGill University
Independent visiting researcher - University of Technology Sydney
PhD - McGill University
Co-supervisor:
PhD - McGill University
PhD - Université de Montréal
Principal supervisor:
PhD - McGill University

Blog Posts

by Tiago Salvador, Stephanie Cairns and Vikram Voleti

Publications

Improving Continuous Normalizing Flows using a Multi-Resolution Framework
Vikram Voleti
Chris Finlay
Recent work has shown that Continuous Normalizing Flows (CNFs) can serve as generative models of images with exact likelihood calculation and invertible generation/density estimation. In this work we introduce a Multi-Resolution variant of such models (MRCNF), together with a transformation between resolutions that leaves the log-likelihood unchanged. We show that this approach yields comparable likelihood values on various image datasets, with improved performance at higher resolutions, using fewer parameters and only a single GPU.
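For context, a brief sketch of the standard CNF likelihood (the generic change-of-variables picture, not the paper's exact multi-resolution construction): a CNF transports a base sample z(0) ~ p_0 to an image x = z(1) through an ODE dz/dt = f_\theta(z(t), t), and the model log-likelihood is

    \log p(x) = \log p_0(z(0)) - \int_0^1 \operatorname{tr}\!\left( \frac{\partial f_\theta}{\partial z(t)} \right) dt.

A map between resolutions contributes an additional \log \lvert \det J \rvert term under the change of variables; choosing that map so the term vanishes (for instance, a volume-preserving transform) is one way to obtain "no change in the log likelihood" in the sense described above.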
A principled approach for generating adversarial images under non-smooth dissimilarity metrics
Aram-Alexandre Pooladian
Chris J. Finlay
Tim Hoheisel
Deep neural networks perform well on real-world data but are prone to adversarial perturbations: small changes in the input easily lead to misclassification. In this work, we propose an attack methodology not only for cases where the perturbations are measured by …
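As a generic illustration of how small input changes can flip a classifier's prediction (a minimal fast gradient sign sketch in PyTorch, not the method proposed in the paper, which targets non-smooth dissimilarity metrics; model, x and y stand for an assumed trained classifier and a labelled image batch):

    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, x, y, eps=0.03):
        # One-step L-infinity attack: move each pixel by eps in the
        # direction that increases the classification loss.
        x_adv = x.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        loss.backward()
        x_adv = x_adv + eps * x_adv.grad.sign()
        # Clamp back to the valid image range before returning.
        return x_adv.clamp(0.0, 1.0).detach()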
Deep PDE Solvers for Subgrid Modelling and Out-of-Distribution Generalization
Patrick Chatain
Climate and weather modelling (CWM) is an important area where ML models are used for subgrid modelling: making predictions of processes occurring at scales too small to be resolved by standard solution methods (Brasseur & Jacob, 2017). These models are expected to make accurate predictions even on out-of-distribution (OOD) data, and are additionally expected to respect important physical constraints of the ground-truth model (Kashinath et al., 2021). While many specialized ML PDE solvers have been developed, the particular requirements of CWM models have not been addressed so far. The goal of this work is to address them. We propose and develop a novel architecture which matches or exceeds the performance of standard ML models, and which demonstrably succeeds in OOD generalization. The architecture is based on expert knowledge of the structure of PDE solution operators, which permits the model to also obey important physical constraints.
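As a hedged illustration of what structure in a PDE solution operator can look like (a textbook example, not the architecture proposed in the paper): the explicit finite-difference solution operator for the heat equation u_t = u_{xx} updates

    u_i^{n+1} = \lambda u_{i-1}^n + (1 - 2\lambda) u_i^n + \lambda u_{i+1}^n, \qquad \lambda = \frac{\Delta t}{\Delta x^2},

which for \lambda \le 1/2 is a convex combination of neighbouring values and therefore satisfies a discrete maximum principle. Constraining a learned operator to have this kind of structure is one route to the sort of physical constraints mentioned above.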