Home

Inspiring the development of artificial intelligence for the benefit of all

A professor talks with his students in a café/lounge.

Located at the heart of Quebec's artificial intelligence (AI) ecosystem, Mila brings together a community of more than 1,200 people specialized in machine learning and dedicated to scientific excellence and innovation.

About

Featured

Faculty

Founded in 1993 by Professor Yoshua Bengio, Mila today brings together more than 140 professors affiliated with Université de Montréal, McGill University, Polytechnique Montréal, and HEC Montréal. The institute also hosts professors from Université Laval, Université de Sherbrooke, École de technologie supérieure (ÉTS), and Concordia University.

Browse the online directory

Photo of Yoshua Bengio

Recent publications

3D Foundation Model-Based Loop Closing for Decentralized Collaborative SLAM
Pierre-Yves Lajoie
Benjamin Ramtoula
Daniele De Martini
Decentralized Collaborative Simultaneous Localization and Mapping (C-SLAM) techniques often struggle to identify map overlaps due to significant viewpoint variations among robots. Motivated by recent advancements in 3D foundation models, which can register images despite large viewpoint differences, we propose a robust loop closing approach that leverages these models to establish inter-robot measurements. In contrast to resource-intensive methods requiring full 3D reconstruction within a centralized map, our approach integrates foundation models into existing SLAM pipelines, yielding scalable and robust multi-robot mapping. Our contributions include: 1) integrating 3D foundation models to reliably estimate relative poses from monocular image pairs within decentralized C-SLAM; 2) introducing robust outlier mitigation techniques critical to the use of these relative poses; and 3) developing specialized pose graph optimization formulations that efficiently resolve scale ambiguities. We evaluate our method against state-of-the-art approaches, demonstrating improvements in localization and mapping accuracy, alongside significant gains in computational and memory efficiency. These results highlight the potential of our approach for deployment in large-scale multi-robot scenarios.
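The outlier mitigation step mentioned in the abstract can be illustrated with a deliberately simplified toy: this is not the paper's method, only a residual-based gating sketch on a 1D pose chain, where a candidate loop closure is kept if it agrees with the odometry prediction within a threshold. The function names and the threshold value are illustrative assumptions.

```python
# Toy sketch of residual-based loop-closure gating in a 1D pose chain.
# Each loop closure (i, j, z) claims x_j - x_i = z; we compare it against
# the odometry prediction and discard gross outliers.

def odometry_poses(odom):
    """Integrate relative odometry steps into absolute 1D poses."""
    poses = [0.0]
    for step in odom:
        poses.append(poses[-1] + step)
    return poses

def filter_loop_closures(odom, closures, threshold=0.5):
    """Keep only closures whose residual against odometry is small."""
    poses = odometry_poses(odom)
    inliers = []
    for i, j, z in closures:
        predicted = poses[j] - poses[i]
        if abs(z - predicted) <= threshold:
            inliers.append((i, j, z))
    return inliers

odom = [1.0, 1.0, 1.0, 1.0]            # four unit odometry steps
closures = [(0, 4, 4.1), (1, 3, 9.0)]  # the second is a gross outlier
inliers = filter_loop_closures(odom, closures)
```

Real systems check consistency between measurements as well as against odometry, and work on SE(3) with scale ambiguity rather than on a line, but the gating idea is the same.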
The role of Large Language Models in IoT security: A systematic review of advances, challenges, and opportunities
Saeid Jamshidi
Negar Shahabi
Amin Nikanjam
Kawser Wazed Nafi
Carol Fung
Wavefunction Flows: Efficient Quantum Simulation of Continuous Flow Models
David Layden
R. Sweke
Vojtěch Havlíček
Anirban Chowdhury
Flow models are a cornerstone of modern machine learning. They are generative models that progressively transform probability distributions according to learned dynamics. Specifically, they learn a continuous-time Markov process that efficiently maps samples from a simple source distribution into samples from a complex target distribution. We show that these models are naturally related to the Schrödinger equation, for an unusual Hamiltonian on continuous variables. Moreover, we prove that the dynamics generated by this Hamiltonian can be efficiently simulated on a quantum computer. Together, these results give a quantum algorithm for preparing coherent encodings (a.k.a., qsamples) for a vast family of probability distributions--namely, those expressible by flow models--by reducing the task to an existing classical learning problem, plus Hamiltonian simulation. For statistical problems defined by flow models, such as mean estimation and property testing, this enables the use of quantum algorithms tailored to qsamples, which may offer advantages over classical algorithms based only on samples from a flow model. More broadly, these results reveal a close connection between state-of-the-art machine learning models, such as flow matching and diffusion models, and one of the main expected capabilities of quantum computers: simulating quantum dynamics.
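The continuous-time transport the abstract describes can be sketched with a minimal classical example: samples from a simple source distribution are pushed through a velocity field by Euler integration of dx/dt = v(x, t). The hand-written drift toward the value 2.0 is a hypothetical stand-in for a learned network and is not taken from the paper.

```python
import random

def velocity(x, t):
    """Hypothetical velocity field: drift toward a target mean of 2.0."""
    return 2.0 - x

def integrate(x0, n_steps=100):
    """Euler-integrate dx/dt = velocity(x, t) from t=0 to t=1."""
    dt = 1.0 / n_steps
    x = x0
    for k in range(n_steps):
        x = x + dt * velocity(x, k * dt)
    return x

random.seed(0)
source = [random.gauss(0.0, 1.0) for _ in range(10_000)]  # simple source
target = [integrate(x) for x in source]                    # pushed-forward samples
mean = sum(target) / len(target)
```

After integration the sample mean contracts from roughly 0 toward 2.0 (to about 2(1 - 0.99^100) with this step size); a trained flow model replaces the hand-written field with a network fit so that the pushed-forward samples match a complex target distribution.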
TGM: a Modular and Efficient Library for Machine Learning on Temporal Graphs
Tran Gia Bao Ngo
Jure Leskovec
Michael M. Bronstein
Matthias Fey
Well-designed open-source software drives progress in Machine Learning (ML) research. While static graph ML enjoys mature frameworks like PyTorch Geometric and DGL, ML for temporal graphs (TG), networks that evolve over time, lacks comparable infrastructure. Existing TG libraries are often tailored to specific architectures, hindering support for diverse models in this rapidly evolving field. Additionally, the divide between continuous- and discrete-time dynamic graph methods (CTDG and DTDG) limits direct comparisons and idea transfer. To address these gaps, we introduce Temporal Graph Modelling (TGM), a research-oriented library for ML on temporal graphs, the first to unify CTDG and DTDG approaches. TGM offers first-class support for dynamic node features, time-granularity conversions, and native handling of link-, node-, and graph-level tasks. Empirically, TGM achieves an average 7.8x speedup across multiple models, datasets, and tasks compared to the widely used DyGLib, and an average 175x speedup on graph discretization relative to available implementations. Beyond efficiency, we show in our experiments how TGM unlocks entirely new research possibilities by enabling dynamic graph property prediction and time-driven training paradigms, opening the door to questions previously impractical to study. TGM is available at https://github.com/tgm-team/tgm

AI for Humanity

The socially responsible and beneficial development of AI is a fundamental part of Mila's mission. As a leader in the field, we aim to contribute to the social dialogue and to the development of applications that will benefit society.

Learn more

A person looks at a starry sky.