Portrait of Jian Tang

Jian Tang

Core Academic Member
Canada CIFAR AI Chair
Associate Professor, HEC Montréal, Department of Decision Sciences
Adjunct Professor, Université de Montréal, Department of Computer Science and Operations Research (DIRO)
Founder, BioGeometry
Research Topics
Computational Biology
Large Language Models (LLM)
AI for Science
Generative Models
Molecular Modeling
Graph Neural Networks

Biography

Jian Tang is an Associate Professor in the Department of Decision Sciences at HEC Montréal. He is also an Adjunct Professor in the Department of Computer Science and Operations Research (DIRO) at the Université de Montréal and a Core Academic Member at Mila – Quebec AI Institute. He holds a Canada CIFAR AI Chair and is the founder of BioGeometry, a startup specializing in generative AI for antibody discovery. His main research areas are deep generative models, graph machine learning, and their applications to drug discovery. He is an international leader in graph machine learning: his representative work on node representation learning, LINE, has been widely recognized and cited more than 5,000 times. He has also done extensive pioneering work on AI for drug discovery, including TorchDrug and TorchProtein, the first open-source machine learning frameworks for drug discovery.

Current Students

PhD - UdeM
Principal supervisor:
PhD - Université de Montréal
PhD - UdeM
Principal supervisor:
PhD - UdeM
PhD - UdeM

Publications

FreqPolicy: Efficient Flow-based Visuomotor Policy via Frequency Consistency
Yifei Su
Ning Liu
Dong Chen
Zhen Zhao
Kun Wu
Meng Li
Zhiyuan Xu
Zhengping Che
Overcoming Long-Context Limitations of State-Space Models via Context-Dependent Sparse Attention
Efficient long-context modeling remains a critical challenge for natural language processing (NLP), as the time complexity of the predominant Transformer architecture scales quadratically with the sequence length. While state-space models (SSMs) offer alternative sub-quadratic solutions, they struggle to capture long-range dependencies effectively. In this work, we focus on analyzing and improving the long-context modeling capabilities of SSMs. We show that the widely used synthetic task, associative recall, which requires a model to recall a value associated with a single key without context, insufficiently represents the complexities of real-world long-context modeling. To address this limitation, we extend the associative recall to a novel synthetic task, \emph{joint recall}, which requires a model to recall the value associated with a key given in a specified context. Theoretically, we prove that SSMs do not have the expressiveness to solve multi-query joint recall in sub-quadratic time complexity. To resolve this issue, we propose a solution based on integrating SSMs with Context-Dependent Sparse Attention (CDSA), which has the expressiveness to solve multi-query joint recall with sub-quadratic computation. To bridge the gap between theoretical analysis and real-world applications, we propose locality-sensitive Hashing Attention with sparse Key Selection (HAX), which instantiates the theoretical solution and is further tailored to natural language domains. Extensive experiments on both synthetic and real-world long-context benchmarks show that HAX consistently outperforms SSM baselines and SSMs integrated with context-independent sparse attention (CISA).
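The joint recall task described in this abstract can be illustrated with toy data. The generator below is a sketch under assumed conventions (the token format and function name are illustrative, not the paper's actual benchmark): a key may be bound to different values in different contexts, so answering a query requires recalling the (context, key) pair jointly.

```python
import random

def make_joint_recall_example(num_contexts=4, keys_per_context=8, vocab=100, seed=0):
    """Toy instance of a joint-recall task: the same key can map to
    different values depending on the context block it appears in, so the
    model must recall (context, key) -> value rather than key -> value."""
    rng = random.Random(seed)
    table = {}     # ground-truth bindings: (context, key) -> value
    sequence = []  # the token stream the model would read
    for c in range(num_contexts):
        sequence.append(("CTX", c))  # context marker token
        for _ in range(keys_per_context):
            k, v = rng.randrange(vocab), rng.randrange(vocab)
            table[(c, k)] = v        # later bindings overwrite earlier ones
            sequence.append((k, v))
    # Query a (context, key) pair seen in the stream; answer is its value.
    context, key = rng.choice(list(table))
    return sequence, (context, key), table[(context, key)]

seq, query, answer = make_joint_recall_example()
```

Plain associative recall corresponds to the special case of a single context; the multi-context version is what the paper proves SSMs cannot solve in sub-quadratic time for multiple queries.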
SEEA-R1: Tree-Structured Reinforcement Fine-Tuning for Self-Evolving Embodied Agents
Wanxin Tian
Shijie Zhang
Kevin Zhang
Xiaowei Chi
Chun-Kai Fan
Junyu Lu
Yulin Luo
Qiang Zhou
Yiming Zhao
Ning Liu
Siyu Lin
Zhiyuan Qin
Xiaozhu Ju
Shanghang Zhang
Bering: joint cell segmentation and annotation for spatial transcriptomics with transferred graph embeddings
Kang Jin
Francesca Viggiani
Claire Callahan
Bruce J. Aronow
Jian Shu
Single-cell spatial transcriptomics such as in-situ hybridization or sequencing technologies can provide subcellular resolution that enables the identification of individual cell identities, locations, and a deep understanding of subcellular mechanisms. However, accurate segmentation and annotation that allows individual cell boundaries to be determined remains a major challenge that limits all the above and downstream insights. Current machine learning methods heavily rely on nuclei or cell body staining, resulting in the significant loss of both transcriptome depth and the limited ability to learn latent representations of spatial colocalization relationships. Here, we propose Bering, a graph deep learning model that leverages transcript colocalization relationships for joint noise-aware cell segmentation and molecular annotation in 2D and 3D spatial transcriptomics data. Graph embeddings for the cell annotation are transferred as a component of multi-modal input for cell segmentation, which is employed to enrich gene relationships throughout the process. To evaluate performance, we benchmarked Bering with state-of-the-art methods and observed significant improvement in cell segmentation accuracies and numbers of detected transcripts across various spatial technologies and tissues. To streamline segmentation processes, we constructed expansive pre-trained models, which yield high segmentation accuracy in new data through transfer learning and self-distillation, demonstrating the generalizability of Bering.
DOLPHIN advances single-cell transcriptomics beyond gene level by leveraging exon and junction reads
Kailu Song
Yumin Zheng
Bowen Zhao
David H. Eidelman
The advent of single-cell sequencing has revolutionized the study of cellular dynamics, providing unprecedented resolution into the molecular states and heterogeneity of individual cells. However, the rich potential of exon-level information and junction reads within single cells remains underutilized. Conventional gene-count methods overlook critical exon and junction data, limiting the quality of cell representation and downstream analyses such as subpopulation identification and alternative splicing detection. We introduce DOLPHIN, a deep learning method that integrates exon-level and junction read data, representing genes as graph structures. These graphs are processed by a variational graph autoencoder to improve cell embeddings. DOLPHIN not only demonstrates superior performance in cell clustering, biomarker discovery, and alternative splicing detection but also provides a distinct capability to detect subtle transcriptomic differences at the exon level that are often masked in gene-level analyses. By examining cellular dynamics with enhanced resolution, DOLPHIN provides new insights into disease mechanisms and potential therapeutic targets.
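The gene-as-graph idea in this abstract can be sketched concretely. The helper below is purely illustrative (its name and data format are assumptions, not DOLPHIN's actual structures): each exon of a gene becomes a node with its read count as a feature, and junction reads become weighted edges between exon pairs, so a junction that skips an exon is visible in the graph.

```python
import numpy as np

def gene_graph(exon_counts, junction_reads):
    """Build a toy exon-junction graph for one gene.
    exon_counts: per-exon read counts (node features).
    junction_reads: {(exon_i, exon_j): read count} (weighted edges)."""
    n = len(exon_counts)
    x = np.asarray(exon_counts, dtype=float).reshape(n, 1)  # node feature matrix
    adj = np.zeros((n, n))
    for (i, j), reads in junction_reads.items():
        adj[i, j] = adj[j, i] = reads  # undirected, read-count-weighted edge
    return x, adj

# e.g. a 3-exon gene where a strong 0->2 junction skips exon 1,
# a pattern a gene-level count would hide (possible alternative splicing)
x, adj = gene_graph([10, 2, 8], {(0, 1): 3, (1, 2): 2, (0, 2): 7})
```

In the paper such graphs are then encoded by a variational graph autoencoder to produce cell embeddings; the point of the sketch is only how exon and junction reads map onto nodes and edges.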
Landscape of Thoughts: Visualizing the Reasoning Process of Large Language Models
Zhanke Zhou
Xiao Feng
Sanmi Koyejo
Bo Han
Efficient Regression-Based Training of Normalizing Flows for Boltzmann Generators
Oscar Davis
Michael Bronstein
Avishek Joey Bose
Simulation-free training frameworks have been at the forefront of the generative modelling revolution in continuous spaces, leading to large-scale diffusion and flow matching models. However, such modern generative models suffer from expensive inference, inhibiting their use in numerous scientific applications like Boltzmann Generators (BGs) for molecular conformations that require fast likelihood evaluation. In this paper, we revisit classical normalizing flows in the context of BGs that offer efficient sampling and likelihoods, but whose training via maximum likelihood is often unstable and computationally challenging. We propose Regression Training of Normalizing Flows (RegFlow), a novel and scalable regression-based training objective that bypasses the numerical instability and computational challenge of conventional maximum likelihood training in favour of a simple
Cosmic Ray Muon Polarization to Facilitate Atmospheric Neutrino Physics
Mingchen Sun
Shihan Zhao
Rui-Xuan Gao
He-Sheng Liu
Aiyu Bai
Atmospheric neutrinos (ATNs) offer a paradigm for understanding neutrino properties, while it is critical to quantify uncertainties in flux modeling. Since ATNs are produced simultaneously with cosmic ray muons, precision measurements of cosmic ray muons, including arrival direction, energy spectra, and spin polarization, will help reduce ATN production uncertainties and facilitate atmospheric neutrino physics. This letter proposes using an array strategy to measure the spin polarization of cosmic ray muons, thereby strengthening the emergent synergies between cosmic ray and atmospheric neutrino physics. Constraints on long-standing atmospheric neutrino flux uncertainties at the percentage level in a few-GeV energy range are achievable within one year using a
Self-Evolving Curriculum for LLM Reasoning
Towards Protein Sequence & Structure Co-Design with Multi-Modal Language Models
Stephen Zhewen Lu
Hongyu Guo
Proteins perform diverse biological functions, governed by the intricate relationship between their sequence and three-dimensional structure. While protein language models (PLMs) have demonstrated remarkable success in functional annotation and structure prediction, their potential for sequence-structure co-design remains underexplored. This limitation arises from pre-training objectives that favor masked token prediction over generative modeling. In this work, we systematically explore sampling strategies to enhance the generative capabilities of PLMs for co-design. Notably, we introduce a ranked iterative decoding with re-masking scheme, enabling PLMs to generate sequences and structures more effectively. Benchmarking ESM3 across multiple scales, we demonstrate that using PLMs effectively at sampling time for co-design tasks can outperform specialized architectures that lack comparable scaling properties. Our work advances the field of computational protein design by equipping PLMs with robust generative capabilities tailored to sequence-structure interdependence.
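The "ranked iterative decoding with re-masking" scheme in this abstract can be sketched generically. The code below is an illustrative version of that family of decoders (not necessarily the paper's exact algorithm, and the schedule and names are assumptions): at each step, the model predicts all masked positions, the highest-confidence predictions are committed, and the rest stay masked for the next round.

```python
import numpy as np

MASK = -1  # sentinel for a masked position

def iterative_remask_decode(predict_fn, length, steps=8):
    """Generic ranked iterative decoding with re-masking.
    predict_fn(tokens) -> (length, vocab) array of per-position
    probabilities; masked positions are marked with MASK."""
    tokens = np.full(length, MASK)
    for step in range(steps, 0, -1):
        probs = predict_fn(tokens)
        preds = probs.argmax(axis=1)   # most likely token per position
        conf = probs.max(axis=1)       # its probability = confidence
        masked = tokens == MASK
        # schedule: after this step, ceil(length * t/steps) tokens committed
        n_keep = int(np.ceil(length * (steps - step + 1) / steps))
        n_commit = max(0, n_keep - int((~masked).sum()))
        # rank positions by confidence; unmasked ones sort last
        order = np.argsort(-np.where(masked, conf, -np.inf))
        tokens[order[:n_commit]] = preds[order[:n_commit]]
    return tokens
```

By the final step the schedule forces every position to be committed, so the output contains no masks; the ranking is what makes early, confident commitments anchor the later, harder ones.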
Design of Ligand-Binding Proteins with Atomic Flow Matching
Junqi Liu
Shaoning Li
Zhi Yang
Origin of Nonlinear Circular Photocurrent in 2D Semiconductor MoS₂
Yanchong Zhao
Fengyu Chen
Jing Liang
Mohammad Saeed Bahramy
Mingwei Yang
Yao Guang
Xiaomei Li
Zheng Wei
Jiaojiao Zhao
Mengzhou Liao
Cheng Shen
Qinqin Wang
Rong Yang
Kenji Watanabe
Takashi Taniguchi
Zhiheng Huang
Dongxia Shi
Kaihui Liu
Zhipei Sun … (see 3 more)
Ji Feng
Luojun Du
Guangyu Zhang