Publications

DynGFN: Towards Bayesian Inference of Gene Regulatory Networks with GFlowNets
Lazar Atanackovic
Alexander Tong
Jason Hartford
Leo J Lee
Bo Wang
One of the grand challenges of cell biology is inferring the gene regulatory network (GRN), which describes interactions between genes and their products that control gene expression and cellular function. We can treat this as a causal discovery problem, but with two non-standard challenges: (1) regulatory networks are inherently cyclic, so we should not model a GRN as a directed acyclic graph (DAG), and (2) observations have significant measurement noise, so for typical sample sizes there will always be a large equivalence class of graphs that are likely given the data, and we want methods that capture this uncertainty. Existing methods either focus on challenge (1), identifying cyclic structure from dynamics, or on challenge (2), learning complex Bayesian posteriors over DAGs, but not both. In this paper we leverage the fact that it is possible to estimate the "velocity" of gene expression with RNA velocity techniques to develop an approach that addresses both challenges. Because we have access to velocity information, we can treat the Bayesian structure learning problem as a problem of sparse identification of a dynamical system, capturing cyclic feedback loops through time. Since our objective is to model uncertainty over discrete structures, we leverage Generative Flow Networks (GFlowNets) to estimate the posterior distribution over the combinatorial space of possible sparse dependencies. Our results indicate that our method learns posteriors that better encapsulate the distributions of cyclic structures compared to counterpart state-of-the-art Bayesian structure learning approaches.
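The abstract's core move, treating structure learning as sparse identification of a dynamical system from velocity estimates, can be illustrated with a minimal point-estimate sketch. The 3-gene system below is hypothetical, and the paper's actual method learns a Bayesian posterior over structures with a GFlowNet, which this sketch does not attempt:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-gene system with a cyclic feedback loop
# x0 -> x1 -> x2 -> x0 (a structure a DAG could not represent).
A_true = np.array([[0.0, 0.0, -0.8],
                   [0.9, 0.0, 0.0],
                   [0.0, 0.7, 0.0]])

X = rng.standard_normal((500, 3))                        # observed expression states
V = X @ A_true.T + 0.05 * rng.standard_normal((500, 3))  # noisy velocity estimates

# Sparse identification: least-squares fit of V ~ X A^T, then threshold
# small coefficients to read off the dependency structure.
A_hat, *_ = np.linalg.lstsq(X, V, rcond=None)
A_hat = A_hat.T
structure = (np.abs(A_hat) > 0.3).astype(int)
print(structure)  # recovers the nonzero pattern of A_true
```

This recovers a single graph; replacing the point estimate with a GFlowNet-parameterized distribution over sparsity patterns is what turns it into the Bayesian problem the paper addresses.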
Better Training of GFlowNets with Local Credit and Incomplete Trajectories
Generative Flow Networks or GFlowNets are related to Markov chain Monte Carlo methods (as they sample from a distribution specified by an energy function), reinforcement learning (as they learn a policy to sample composed objects through a sequence of steps), generative models (as they learn to represent and sample from a distribution) and amortized variational methods (as they can be used to learn to approximate and sample from an otherwise intractable posterior, given a prior and a likelihood). They are trained to generate an object…
Improving and generalizing flow-based generative models with minibatch optimal transport
Alexander Tong
Yanlei Zhang
Kilian Fatras
Continuous normalizing flows (CNFs) are an attractive generative modeling technique, but they have been held back by limitations in their simulation-based maximum likelihood training. We introduce the generalized conditional flow matching (CFM) technique, a family of simulation-free training objectives for CNFs. CFM features a stable regression objective like that used to train the stochastic flow in diffusion models but enjoys the efficient inference of deterministic flow models. In contrast to both diffusion models and prior CNF training algorithms, CFM does not require the source distribution to be Gaussian or require evaluation of its density. A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference, as evaluated in our experiments. Furthermore, we show that when the true OT plan is available, our OT-CFM method approximates dynamic OT. Training CNFs with CFM improves results on a variety of conditional and unconditional generation tasks, such as inferring single cell dynamics, unsupervised image translation, and Schrödinger bridge inference.
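The CFM regression objective described above can be sketched in a few lines: sample a time t, interpolate between a source point x0 and a target point x1, and regress a network's velocity prediction onto u_t = x1 − x0. The toy sketch below only constructs the training triples for the independent-coupling (I-CFM) variant; the OT-CFM variant additionally pairs x0 and x1 via a minibatch optimal transport plan, which is omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)

def cfm_training_pairs(x0, x1, sigma=0.0):
    """Sample (t, x_t, u_t) triples for the CFM regression objective.

    x_t interpolates linearly between a source sample x0 and a target
    sample x1; the regression target u_t = x1 - x0 is the velocity of
    that straight-line conditional path.
    """
    n = x0.shape[0]
    t = rng.uniform(size=(n, 1))
    xt = (1 - t) * x0 + t * x1 + sigma * rng.standard_normal(x0.shape)
    ut = x1 - x0
    return t, xt, ut

# toy source (standard Gaussian) and target (shifted Gaussian) minibatches
x0 = rng.standard_normal((256, 2))
x1 = rng.standard_normal((256, 2)) + np.array([4.0, 0.0])
t, xt, ut = cfm_training_pairs(x0, x1)
print(ut.mean(axis=0))  # average velocity points from source toward target
```

A velocity network v(t, x) would then be fit by minimizing the mean squared error between v(t, xt) and ut over such minibatches; no ODE simulation is needed during training.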
A theory of continuous generative flow networks
Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects. A key limitation of GFlowNets until this time has been that they are restricted to discrete spaces. We present a theory for generalized GFlowNets, which encompasses both existing discrete GFlowNets and ones with continuous or hybrid state spaces, and perform experiments with two goals in mind. First, we illustrate critical points of the theory and the importance of various assumptions. Second, we empirically demonstrate how observations about discrete GFlowNets transfer to the continuous case and show strong results compared to non-GFlowNet baselines on several previously studied tasks. This work greatly widens the perspectives for the application of GFlowNets in probabilistic inference and various modeling settings.
Benchmarking Graph Neural Networks
Vijay Prakash Dwivedi
Chaitanya K. Joshi
Thomas Laurent
Xavier Bresson
Graph neural networks (GNNs) have become the standard toolkit for analyzing and learning from data on graphs. As the field grows, it becomes critical to identify key architectures and validate new ideas that generalize to larger, more complex datasets. Unfortunately, it has been increasingly difficult to gauge the effectiveness of new models in the absence of a standardized benchmark with consistent experimental settings. In this paper, we introduce a reproducible GNN benchmarking framework, with the facility for researchers to add new models conveniently for arbitrary datasets. We demonstrate the usefulness of our framework by presenting a principled investigation into the recent Weisfeiler-Lehman GNNs (WL-GNNs) compared to message passing-based graph convolutional networks (GCNs) for a variety of graph tasks, i.e. graph regression/classification and node/link prediction, with medium-scale datasets.
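For context, the message-passing GCN baseline mentioned in the abstract reduces to a symmetrically normalized neighborhood aggregation. A minimal single-layer sketch on a toy triangle graph, with made-up weights:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One message-passing GCN layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W).

    A: (n, n) adjacency matrix, H: (n, d_in) node features,
    W: (d_in, d_out) learnable weights.
    """
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# tiny triangle graph, 2-d input features, 3-d output features
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
H = np.ones((3, 2))
W = np.ones((2, 3))
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 3)
```

WL-GNNs, by contrast, operate on dense edge representations rather than this sparse node-wise aggregation, which is one of the cost/accuracy trade-offs the benchmark investigates.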
A circulating proteome-informed prognostic model of COVID-19 disease activity that relies on routinely available clinical laboratories
Antoine Soulé
Karine Tremblay
Simon Rousseau
Combining Spatial and Temporal Abstraction in Planning for Better Generalization
Mingde Zhao
Harm van Seijen
Romain Laroche
Design and Application of Adaptive Sparse Deep Echo State Network
Cuili Yang
Sheng Yang
Bing Li
Predicting appliance energy consumption in buildings is a time-series forecasting problem that can be solved by an echo state network (ESN). However, due to the randomly initialized input and reservoir weights, some redundant or irrelevant components are inevitably generated in the original ESN. To solve this problem, the adaptive sparse deep echo state network (ASDESN) is proposed, in which information is processed layer by layer. First, a principal component analysis (PCA) layer is inserted to penalize the redundant projections transmitted between sub-reservoirs. Second, a coordinate-descent-based adaptive sparse learning method is proposed to generate sparse output weights. In particular, the designed adaptive threshold strategy enlarges the sparsity of the output weights as network depth increases. Moreover, the echo state property (ESP) of ASDESN is established to ensure its applicability. Experimental results on both a simulated benchmark and a real appliance-energy dataset show that the proposed ASDESN outperforms other ESNs, with higher prediction accuracy and stability.
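As background for ASDESN, a plain single-reservoir ESN can be sketched as follows: random fixed recurrent weights rescaled for the echo state property, plus a ridge-regression readout. The paper's PCA layers and adaptive sparse readout are not included, and the sine-wave task here is a stand-in for the appliance-energy data:

```python
import numpy as np

rng = np.random.default_rng(2)

def run_reservoir(u, n_res=100, spectral_radius=0.9):
    """Drive a random fixed reservoir with input series u; return states.

    Plain ESN update: x_t = tanh(W x_{t-1} + W_in u_t), with W rescaled
    to the given spectral radius so the echo state property holds.
    """
    W = rng.standard_normal((n_res, n_res))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.standard_normal((n_res, 1))
    x = np.zeros(n_res)
    states = []
    for ut in u:
        x = np.tanh(W @ x + (W_in * ut).ravel())
        states.append(x.copy())
    return np.array(states)

# one-step-ahead prediction of a sine wave via a ridge-regression readout
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
S = run_reservoir(u)
ridge = 1e-6
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ y)
pred = S @ W_out
mse = np.mean((pred[200:] - y[200:]) ** 2)  # error after a washout period
print(mse)
```

Only `W_out` is trained; the random reservoir stays fixed, which is what makes ESN training a cheap linear regression and also what produces the redundant components ASDESN prunes.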
Meta-topologies define distinct anatomical classes of brain tumours linked to histology and survival
Julius M Kernbach
Daniel Delev
Georg Neuloh
Hans Clusmann
Simon B. Eickhoff
Victor E Staartjes
Flavio Vasella
Michael Weller
Luca Regli
Carlo Serra
Niklaus Krayenbühl
Kevin Akeret
Biomedical image analysis competitions: The state of current participation practice
Matthias Eisenmann
Annika Reinke
Vivienn Weru
Minu Dietlinde Tizabi
Fabian Isensee
T. Adler
Patrick Godau
Veronika Cheplygina
Michal Kozubek
Sharib Ali
Anubha Gupta
Jan Kybic
Alison Noble
Carlos Ortiz de Solórzano
Samiksha Pachade
Caroline Petitjean
Daniel Sage
Donglai Wei
Elizabeth Wilden
Deepak Alapatt
Vincent Andrearczyk
Ujjwal Baid
Spyridon Bakas
Niranjan Balu
Sophia Bano
Vivek Singh Bawa
Jorge Bernal
Sebastian Bodenstedt
Alessandro Casella
Jinwook Choi
Olivier Commowick
M. Daum
Adrien Depeursinge
Reuben Dorent
J. Egger
H. Eichhorn
Sandy Engelhardt
Melanie Ganz
Gabriel Girard
Lasse Donovan Hansen
Mattias Paul Heinrich
Nicholas Heller
Alessa Hering
Arnaud Huaulmé
Hyunjeong Kim
Bennett Landman
Hongwei Bran Li
Jianning Li
Junfang Ma
Anne L. Martel
Carlos Martín-Isla
Bjoern Menze
Chinedu Innocent Nwoye
Valentin Oreiller
Nicolas Padoy
Sarthak Pati
Kelly Payette
Carole H. Sudre
K. V. Wijnen
Armine Vardazaryan
Tom Kamiel Magda Vercauteren
Martin Wagner
Chuanbo Wang
Moi Hoon Yap
Zeyun Yu
Chuner Yuan
Maximilian Zenk
Aneeq Zia
David Zimmerer
Rina Bao
Chanyeol Choi
Andrew Cohen
Oleh Dzyubachyk
Adrian Galdran
Tianyuan Gan
Tianqi Guo
Pradyumna Gupta
M. Haithami
Edward Ho
Ikbeom Jang
Zhili Li
Zheng Luo
Filip Lux
Sokratis Makrogiannis
Dominikus Muller
Young-Tack Oh
Subeen Pang
Constantin Pape
Gorkem Polat
Charlotte Rosalie Reed
Kanghyun Ryu
Tim Scherr
Vajira L. Thambawita
Haoyu Wang
Xinliang Wang
Kele Xu
H.-I. Yeh
Doyeob Yeo
Yi Yuan
Yan Zeng
Xingwen Zhao
Julian Ronald Abbing
Jannes Adam
Nagesh Adluru
Niklas Agethen
S. Ahmed
Yasmina Al Khalil
Mireia Alenya
Esa J. Alhoniemi
C. An
Talha E Anwar
Tewodros Arega
Netanell Avisdris
D. Aydogan
Yi-Shi Bai
Maria Baldeon Calisto
Berke Doga Basaran
Marcel Beetz
Cheng Bian
Hao-xuan Bian
Kevin Blansit
Louise Bloch
Robert Bohnsack
Sara Bosticardo
J. Breen
Mikael Brudfors
Raphael Brungel
Mariano Cabezas
Alberto Cacciola
Zhiwei Chen
Yucong Chen
Dan Chen
Minjeong Cho
Min-Kook Choi
Chuantao Xie
Dana Cobzas
Jorge Corral Acero
Sujit Kumar Das
Marcela de Oliveira
Hanqiu Deng
Guiming Dong
Lars Doorenbos
Cory Efird
Di Fan
Mehdi Fatan Serj
Alexandre Fenneteau
Lucas Fidon
Patryk Filipiak
René Finzel
Nuno Renato Freitas
C. Friedrich
Mitchell J. Fulton
Finn Gaida
Francesco Galati
Christoforos Galazis
Changna Gan
Zheyao Gao
Sheng Gao
Matej Gazda
Beerend G. A. Gerats
Neil Getty
Adam Gibicar
Ryan J. Gifford
Sajan Gohil
Maria Grammatikopoulou
Daniel Grzech
Orhun Guley
Timo Gunnemann
Chun-Hai Guo
Sylvain Guy
Heonjin Ha
Luyi Han
Ilseok Han
Ali Hatamizadeh
Tianhai He
Ji-Wu Heo
Sebastian Hitziger
SeulGi Hong
Seungbum Hong
Rian Huang
Zi-You Huang
Markus Huellebrand
Stephan Huschauer
M. Hussain
Tomoo Inubushi
Ece Isik Polat
Mojtaba Jafaritadi
Seonghun Jeong
Bailiang Jian
Yu Jiang
Zhifan Jiang
Yu Jin
Smriti Joshi
A. Kadkhodamohammadi
R. A. Kamraoui
Inhak Kang
Jun-Su Kang
Davood Karimi
April Ellahe Khademi
Muhammad Irfan Khan
Suleiman A. Khan
Rishab Khantwal
Kwang-Ju Kim
Timothy Lee Kline
Satoshi Kondo
Elina Kontio
Adrian Krenzer
Artem Kroviakov
Hugo J. Kuijf
Satyadwyoom Kumar
Francesco La Rosa
Abhishek Lad
Doohee Lee
Minho Lee
Chiara Lena
Hao Li
Ling Li
Xingyu Li
F. Liao
Kuan-Ya Liao
Arlindo L. Oliveira
Chaonan Lin
Shanhai Lin
Akis Linardos
M. Linguraru
Han Liu
Tao Liu
Dian Liu
Yanling Liu
Joao Lourencco-Silva
Jing Lu
Jia Lu
Imanol Luengo
Christina Bach Lund
Huan Minh Luu
Yingqi Lv
Uzay Macar
Leon Maechler
Sina Mansour L.
Kenji Marshall
Moona Mazher
Richard McKinley
Alfonso Medela
Felix Meissen
Mingyuan Meng
Dylan Bradley Miller
S. Mirjahanmardi
Arnab Kumar Mishra
Samir Mitha
Hassan Mohy-ud-Din
Tony C. W. Mok
Gowtham Krishnan Murugesan
Enamundram Naga Karthik
Sahil Nalawade
Jakub Nalepa
M. Naser
Ramin Nateghi
Hammad Naveed
Quang-Minh Nguyen
Cuong Nguyen Quoc
Bruno Oliveira
David Owen
Jimut Bahan Pal
Junwen Pan
Wei-Dong Pan
Winnie Pang
Bogyu Park
Vivek G. Pawar
Kamlesh Pawar
Michael Peven
Lena Philipp
Tomasz Pieciak
Szymon S Płotka
Marcel Plutat
Fattane Pourakpour
Domen Prelovznik
K. Punithakumar
Abdul Qayyum
Sandro Queirós
Arman Rahmim
Salar Razavi
Jintao Ren
Mina Rezaei
Jonathan Adam Rico
ZunHyan Rieu
Markus Rink
Johannes Roth
Yusely Ruiz-Gonzalez
Numan Saeed
Anindo Saha
Mostafa M. Sami Salem
Ricardo Sanchez-Matilla
Kurt G Schilling
Weizhen Shao
Zhiqiang Shen
Ruize Shi
Pengcheng Shi
Daniel Sobotka
Théodore Soulier
Bella Specktor Fadida
D. Stoyanov
Timothy Sum Hon Mun
Xiao-Fu Sun
Rong Tao
Franz Thaler
Antoine Théberge
Felix Thielke
Helena R. Torres
K. Wahid
Jiacheng Wang
Yifei Wang
Wei David Wang
Xiong Jun Wang
Jianhui Wen
Ning Wen
Marek Wodziński
Yehong Wu
Fangfang Xia
Tianqi Xiang
Cheng Xiaofei
Lizhang Xu
Tingting Xue
Yu‐Xia Yang
Lingxian Yang
Kai Yao
Huifeng Yao
Amirsaeed Yazdani
Michael Yip
Hwa-Seong Yoo
Fereshteh Yousefirizi
Shu-Fen Yu
Lei Yu
Jonathan Zamora
Ramy A. Zeineldin
Dewen Zeng
Jianpeng Zhang
Bokai Zhang
Jiapeng Zhang
Fangxi Zhang
Huahong Zhang
Zhongchen Zhao
Zixuan Zhao
Jia Zhao
Can Zhao
Qiuyue Zheng
Yuheng Zhi
Ziqi Zhou
Baosheng Zou
Klaus Maier-Hein
Paul F. Jäger
Annette Kopp-Schneider
Lena Maier-Hein