
Adam M. Oberman

Associate Academic Member
Canada CIFAR AI Chair
Full Professor, McGill University, Department of Mathematics and Statistics


I am a professor at McGill University in the Department of Mathematics and Statistics. My research applies advanced mathematical techniques to deep learning. My primary areas of expertise include generative modelling, stochastic optimization methods, fairness and bias removal in computer vision, and generalization in reinforcement learning.

Before joining McGill in 2012, I held a tenured faculty position at Simon Fraser University and completed a postdoctoral fellowship at the University of Texas at Austin. I obtained my undergraduate education at the University of Toronto and pursued graduate studies at the University of Chicago. I have also held visiting positions at the University of California, Los Angeles (UCLA) and at the National Institute for Research in Digital Science and Technology (INRIA) in Paris.

My early research was in partial differential equations and scientific computing, where I made significant contributions to areas such as numerical optimal transportation, geometric PDEs, and stochastic control problems.

I teach two comprehensive theory courses on machine learning, covering topics such as statistical learning theory and kernel theory.

For prospective graduate students interested in working with me, please apply to both Mila – Quebec Artificial Intelligence Institute and the Department of Mathematics and Statistics at McGill. Alternatively, applicants may consider co-supervision opportunities with advisors from the computer science program at McGill or Université de Montréal.

Current Students

PhD - McGill University
Master's Research - McGill University
Master's Research - McGill University
Postdoctorate - McGill University
PhD - McGill University
Master's Research - McGill University


Multi-Resolution Continuous Normalizing Flows
Vikram Voleti
Chris Finlay
Addressing Sample Inefficiency in Multi-View Representation Learning
Kumar Krishna Agrawal
Arna Ghosh
Deep PDE Solvers for Subgrid Modelling and Out-of-Distribution Generalization
Patrick Chatain
EuclidNets: An Alternative Operation for Efficient Inference of Deep Learning Models
Xinlin Li
Mariana Parazeres
Alireza Ghaffari
Masoud Asgharian
Vahid Nia
A Reproducible and Realistic Evaluation of Partial Domain Adaptation Methods
Tiago Salvador
Unsupervised Domain Adaptation (UDA) aims at classifying unlabeled target images by leveraging labeled source images. In the case of an extreme label shift scenario between the source and target domains, where there are extra source classes not present in the target domain, the UDA problem becomes a harder problem called Partial Domain Adaptation (PDA). While different methods have been developed to solve the PDA problem, most successful algorithms use model selection strategies that rely on target labels to find the best hyper-parameters and/or models along training. These strategies violate the main assumption in PDA: only unlabeled target domain samples are available. In addition, there are also experimental inconsistencies between developed methods - different architectures, hyper-parameter tuning, number of runs - yielding unfair comparisons. The main goal of this work is to provide a realistic evaluation of PDA methods under different model selection strategies and a consistent evaluation protocol. We evaluate 6 state-of-the-art PDA algorithms on 2 different real-world datasets using 7 different model selection strategies. Our two main findings are: (i) without target labels for model selection, the accuracy of the methods decreases by up to 30 percentage points; (ii) only one method and model selection pair performs well on both datasets. Experiments were performed with our PyTorch framework, BenchmarkPDA, which we open source.
A principled approach for generating adversarial images under non-smooth dissimilarity metrics
Aram-Alexandre Pooladian
Chris Finlay
Tim Hoheisel