Christian Gagné

Associate Academic Member
Canada CIFAR AI Chair
Full Professor, Université Laval, Department of Electrical and Computer Engineering
Director, Institute Intelligence and Data (IID)
Research Topics
Computer Vision
Deep Learning
Learning to Program
Medical Machine Learning
Representation Learning

Biography

Christian Gagné has been a professor in the Department of Electrical and Computer Engineering at Université Laval since 2008.

He is the director of the Institute Intelligence and Data (IID), holds a Canada CIFAR AI Chair, and is an associate member of Mila – Quebec Artificial Intelligence Institute.

Gagné is also a member of Université Laval’s Computer Vision and Systems Laboratory (LVSN), as well as its Robotics, Vision and Machine Intelligence Research Centre (CeRVIM) and its Big Data Research Centre (CRDM). He is a member of the REPARTI and UNIQUE strategic clusters of the FRQNT, the VITAM centre of the FRQS, and the International Observatory on the Societal Impacts of AI and Digital Technologies (OBVIA).

Gagné’s research focuses on the development of methods for machine learning and stochastic optimization. In particular, he is interested in deep neural networks, representation learning and transfer, meta-learning, and multi-task learning. He also works on optimization approaches based on probabilistic models and evolutionary algorithms, including black-box optimization and automatic programming. An important part of his work concerns the practical application of these techniques in fields such as computer vision, microscopy, healthcare, energy and transportation.

Current Students

PhD - Université Laval
PhD - Université Laval
Master's Research - Université Laval
PhD - Université Laval
PhD - Université Laval
PhD - Université Laval
PhD - Université Laval

Publications

Deep Active Learning: Unified and Principled Method for Query and Training
Changjian Shui
Fan Zhou
Boyu Wang
In this paper, we propose a unified and principled method for both the querying and training processes in deep batch active learning. We provide theoretical insights by modeling the interactive procedure in active learning as distribution matching, adopting the Wasserstein distance. From this analysis we derive a new training loss, which decomposes into optimizing the deep neural network parameters and the batch query selection through alternating optimization. In addition, the loss for training the deep neural network is naturally formulated as a min-max optimization problem by leveraging the unlabeled data. The proposed principles also reveal an explicit uncertainty-diversity trade-off in the query batch selection. Finally, we evaluate our method on different benchmarks, consistently showing better empirical performance and a more time-efficient query strategy than the baselines.
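
The abstract above frames batch query selection as a trade-off between uncertainty and diversity. The sketch below illustrates that trade-off with a simple greedy heuristic (predictive entropy plus distance to already-selected points); it is not the paper's Wasserstein-based formulation, and the scoring rule, the `alpha` weight, and the data shapes are assumptions made here purely for illustration.

```python
# Illustrative sketch of batch active-learning query selection with an explicit
# uncertainty-diversity trade-off. This is a simplified greedy heuristic, not
# the Wasserstein-based method described in the paper; `alpha` and all names
# below are assumptions for illustration.

import numpy as np


def entropy(probs: np.ndarray) -> np.ndarray:
    """Predictive entropy of each pool point (higher = more uncertain)."""
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)


def select_batch(probs: np.ndarray, feats: np.ndarray, k: int, alpha: float = 0.5) -> list[int]:
    """Greedily pick k pool indices balancing uncertainty and diversity.

    probs: (n, c) predicted class probabilities on the unlabeled pool
    feats: (n, d) feature embeddings of the same pool points
    alpha: weight on uncertainty vs. diversity (assumed hyperparameter)
    """
    unc = entropy(probs)
    unc = unc / (unc.max() + 1e-12)           # normalize to [0, 1]
    selected: list[int] = []
    min_dist = np.full(len(feats), np.inf)    # distance to nearest selected point
    for _ in range(k):
        if selected:
            div = min_dist / (min_dist.max() + 1e-12)
        else:
            div = np.ones(len(feats))         # first pick: uncertainty decides
        score = alpha * unc + (1.0 - alpha) * div
        score[selected] = -np.inf             # never re-select a point
        idx = int(np.argmax(score))
        selected.append(idx)
        # Update each point's distance to its closest selected neighbour.
        d = np.linalg.norm(feats - feats[idx], axis=1)
        min_dist = np.minimum(min_dist, d)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, c, d = 200, 5, 16
    logits = rng.normal(size=(n, c))
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    feats = rng.normal(size=(n, d))
    print(select_batch(probs, feats, k=10))
```

With alpha close to 1 the selection favours the most uncertain points; lowering alpha pushes the batch toward points far from those already chosen, which is the explicit trade-off the abstract refers to.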
Session details: Digital entertainment technologies and arts track papers
Mike Preuss
Session details: Digital entertainment technologies and arts track posters
Mike Preuss