Can AI Read the Minds of Corporate Executives?
Zhenzhen Fan
Ruslan Goyenko
Issam Hadj Laradji
Fred Liu
Chengyu Zhang
Can Workers Meaningfully Consent to Workplace Wellbeing Technologies?
Shreya Chowdhary
Anna Kawakami
Jina Suh
Mary L Gray
Koustuv Saha
A circulating proteome-informed prognostic model of COVID-19 disease activity that relies on routinely available clinical laboratories
Antoine Soulé
Karine Tremblay
Simon Rousseau
Combining Spatial and Temporal Abstraction in Planning for Better Generalization
Mingde Zhao
Harm van Seijen
Romain Laroche
Conditional Flow Matching: Simulation-Free Dynamic Optimal Transport
Alexander Tong
Yanlei Zhang
Kilian FATRAS
Continuous normalizing flows (CNFs) are an attractive generative modeling technique, but they have thus far been held back by limitations in their simulation-based maximum likelihood training. In this paper, we introduce a new technique called conditional flow matching (CFM), a simulation-free training objective for CNFs. CFM features a stable regression objective like that used to train the stochastic flow in diffusion models but enjoys the efficient inference of deterministic flow models. In contrast to both diffusion models and prior CNF training algorithms, our CFM objective does not require the source distribution to be Gaussian or require evaluation of its density. Based on this new objective, we also introduce optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference, as evaluated in our experiments. Training CNFs with CFM improves results on a variety of conditional and unconditional generation tasks such as inferring single cell dynamics, unsupervised image translation, and Schrödinger bridge inference. Code is available at https://github.com/atong01/conditional-flow-matching.
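The regression objective described in this abstract lends itself to a compact illustration. The following is a minimal sketch of the basic conditional flow matching loss with independent source/target pairing (I-CFM), not the OT coupling of OT-CFM; the network architecture, sigma value, and toy data are illustrative assumptions, and the linked repository holds the authors' actual implementation.

```python
# A minimal sketch of conditional flow matching with independent pairing
# (I-CFM), NOT the OT coupling of OT-CFM. The model, sigma, and toy data
# below are illustrative assumptions, not the authors' settings.
import torch
import torch.nn as nn

class VectorField(nn.Module):
    """Small MLP approximating the time-dependent vector field v_theta(t, x)."""
    def __init__(self, dim: int = 2, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, t: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([t, x], dim=-1))

def cfm_loss(model: nn.Module, x0: torch.Tensor, x1: torch.Tensor,
             sigma: float = 0.1) -> torch.Tensor:
    """Regress v_theta(t, x_t) onto the conditional target u_t = x1 - x0 along
    the straight-line path x_t = (1 - t) * x0 + t * x1 (+ small Gaussian noise)."""
    t = torch.rand(x0.shape[0], 1)
    xt = (1 - t) * x0 + t * x1 + sigma * torch.randn_like(x0)
    return ((model(t, xt) - (x1 - x0)) ** 2).mean()

model = VectorField()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(1000):
    x0 = torch.randn(256, 2)          # source samples; need not be Gaussian
    x1 = torch.randn(256, 2) + 3.0    # stand-in target samples
    opt.zero_grad()
    cfm_loss(model, x0, x1).backward()
    opt.step()
```

After training, samples are drawn by integrating dx/dt = v_theta(t, x) from t = 0 to t = 1 with any ODE solver, which is where the efficient deterministic inference noted in the abstract comes from.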
Constant Memory Attentive Neural Processes
Frederick Tung
Hossein Hajimirsadeghi
Mohamed Osama Ahmed
Continually learning representations at scale
Alexandre Galashov
Jovana Mitrovic
Dhruva Tirumala
Yee Whye Teh
Timothy Nguyen
Arslan Chaudhry
Contrast-agnostic deep learning–based registration pipeline: Validation in spinal cord multimodal MRI data
Contrasting Intra-Modal and Ranking Cross-Modal Hard Negatives to Enhance Visio-Linguistic Fine-grained Understanding
Contrastive Positive Unlabeled Learning
Anish Acharya
Sujay Sanghavi
Li Jing
Bhargav Bhushanam
I. Dhillon
Self-supervised pretraining on unlabeled data followed by supervised fine-tuning on labeled data is a popular paradigm for learning from limited labeled examples. We extend this paradigm to the classical positive unlabeled (PU) setting, where the task is to learn a binary classifier given only a few labeled positive samples and (often) a large amount of unlabeled samples (which could be positive or negative). We first propose a simple extension of the standard InfoNCE family of contrastive losses to the PU setting, and show that this learns superior representations compared to existing unsupervised and supervised approaches. We then develop a simple methodology to pseudo-label the unlabeled samples using a new PU-specific clustering scheme; these pseudo-labels can then be used to train the final (positive vs. negative) classifier. Our method handily outperforms state-of-the-art PU methods on several standard PU benchmark datasets, while not requiring a priori knowledge of any class prior (a common assumption in other PU methods). We also provide a simple theoretical analysis that motivates our methods.
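The contrastive extension sketched in this abstract can be illustrated in code. Below is a hedged sketch of one plausible PU-aware InfoNCE variant, in which labeled positives additionally attract one another while unlabeled samples rely only on their own augmented views; the function name, masking scheme, and temperature are assumptions for illustration, not the paper's exact loss.

```python
# A sketch of one plausible PU-aware InfoNCE variant: labeled positives treat
# every other labeled positive (plus their own augmented view) as positives,
# while unlabeled samples fall back to plain instance discrimination. The
# masking scheme here is an assumption, not the paper's exact loss.
import torch
import torch.nn.functional as F

def pu_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor,
                        is_positive: torch.Tensor,
                        temperature: float = 0.5) -> torch.Tensor:
    """z1, z2: (N, d) embeddings of two augmented views of the same batch.
    is_positive: (N,) boolean mask over the few labeled positive samples."""
    n = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)      # (2N, d)
    sim = z @ z.t() / temperature                           # scaled cosine sims
    self_mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(self_mask, -1e9)                  # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Each sample's augmented view is always a positive ...
    idx = torch.arange(n)
    pos_mask = torch.zeros(2 * n, 2 * n, dtype=torch.bool)
    pos_mask[idx, idx + n] = True
    pos_mask[idx + n, idx] = True
    # ... and labeled positives additionally attract one another.
    labeled = torch.cat([is_positive, is_positive])
    pos_mask |= labeled.unsqueeze(0) & labeled.unsqueeze(1) & ~self_mask

    return (-(log_prob * pos_mask).sum(1) / pos_mask.sum(1)).mean()

# Toy usage: 128 samples, 8 of them labeled positive.
z1, z2 = torch.randn(128, 64), torch.randn(128, 64)
labels = torch.zeros(128, dtype=torch.bool)
labels[:8] = True
print(pu_contrastive_loss(z1, z2, labels).item())
```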