Posterior Sampling of the Initial Conditions of the Universe from Non-linear Large Scale Structures using Score-Based Generative Models
Ronan Legin
Matthew Ho
Pablo Lemos
Shirley Ho
Benjamin Wandelt
Predicting Solar PV Output Based on Hybrid Deep Learning and Physical Models: Case Study of Morocco
Samira Abousaid
Ismail Belhaj
Abdelaziz Berrado
Hicham Bouzekri
Prognosis of critically ill immunocompromised patients with virus-detected acute respiratory failure
Maxime Bertrand
Virginie Lemiale
Emmanuel Canet
François Barbier
Achille Kouatchet
Alexandre Demoule
Kada Klouche
Anne-Sophie Moreau
Laurent Argaud
Florent Wallet
Jean Herlé Raphalen
Djamel Mokart
Fabrice Bruneel
Frédéric Pène
Elie Azoulay
Summary of the Fourth International Workshop on Deep Learning for Testing and Testing for Deep Learning (DeepTest 2023)
Matteo Biagiola
Nicolás Cardozo
Donghwan Shin
Andrea Stocco
Vincenzo Riccio
A cry for help: Early detection of brain injury in newborns
Charles Onu
Samantha Latremouille
Arsenii Gorin
Junhao Wang
Uchenna Ekwochi
P. Ubuane
O. Kehinde
Muhammad A. Salisu
Datonye Briggs
Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting
Kashif Rasul
Arjun Ashok
Andrew Robert Williams
Arian Khorasani
George Adamopoulos
Rishika Bhagwatkar
Marin Bilovs
Hena Ghonia
Nadhir Hassen
Anderson Schneider
Sahil Garg
Yuriy Nevmyvaka
Over the past years, foundation models have caused a paradigm shift in machine learning due to their unprecedented capabilities for zero-shot and few-shot generalization. However, despite the success of foundation models in modalities such as natural language processing and computer vision, the development of foundation models for time series forecasting has lagged behind. We present Lag-Llama, a general-purpose foundation model for univariate probabilistic time series forecasting based on a decoder-only transformer architecture that uses lags as covariates. Lag-Llama is pretrained on a large corpus of diverse time series data from several domains, and demonstrates strong zero-shot generalization capabilities compared to a wide range of forecasting models on downstream datasets across domains. Moreover, when fine-tuned on relatively small fractions of such previously unseen datasets, Lag-Llama achieves state-of-the-art performance, outperforming prior deep learning approaches, emerging as the best general-purpose model on average. Lag-Llama serves as a strong contender to the current state-of-the-art in time series forecasting and paves the way for future advancements in foundation models tailored to time series data.
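As a rough illustration of the "lags as covariates" idea mentioned in the abstract (not the Lag-Llama implementation itself), the minimal sketch below builds lag-feature inputs for a univariate series; the function name make_lag_features and the chosen lag set are assumptions made for the example.

```python
import numpy as np

def make_lag_features(series: np.ndarray, lags: list[int]):
    """Build (inputs, targets): each input row holds the values of the series
    at the given lag offsets, serving as covariates for the next-step target."""
    max_lag = max(lags)
    rows, targets = [], []
    for t in range(max_lag, len(series)):
        rows.append([series[t - l] for l in lags])
        targets.append(series[t])
    return np.asarray(rows), np.asarray(targets)

# Toy usage on a synthetic series with short- and longer-range lags
y = np.sin(np.linspace(0, 20, 200)) + 0.1 * np.random.randn(200)
X, t = make_lag_features(y, lags=[1, 2, 3, 7, 14])
print(X.shape, t.shape)  # (186, 5) (186,)
```

In the paper's setting, such lagged values feed a decoder-only transformer that outputs a predictive distribution rather than a point forecast; the sketch only shows how the lag covariates themselves can be assembled.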
PhyloGFN: Phylogenetic inference with generative flow networks
Ming Yang Zhou
Zichao Yan
Elliot Layne
Nikolay Malkin
Dinghuai Zhang
Moksh J. Jain
Phylogenetics is a branch of computational biology that studies the evolutionary relationships among biological entities. Its long history and numerous applications notwithstanding, inference of phylogenetic trees from sequence data remains challenging: the high complexity of tree space poses a significant obstacle for the current combinatorial and probabilistic techniques. In this paper, we adopt the framework of generative flow networks (GFlowNets) to tackle two core problems in phylogenetics: parsimony-based and Bayesian phylogenetic inference. Because GFlowNets are well-suited for sampling complex combinatorial structures, they are a natural choice for exploring and sampling from the multimodal posterior distribution over tree topologies and evolutionary distances. We demonstrate that our amortized posterior sampler, PhyloGFN, produces diverse and high-quality evolutionary hypotheses on real benchmark datasets. PhyloGFN is competitive with prior works in marginal likelihood estimation and achieves a closer fit to the target distribution than state-of-the-art variational inference methods. Our code is available at https://github.com/zmy1116/phylogfn.
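To give a flavour of how a tree topology can be sampled by a sequence of construction steps, the sketch below repeatedly joins two subtrees with probability proportional to a score. This is only an illustration of the sequential-sampling idea, assuming a placeholder pair_score; in PhyloGFN the analogous scoring is done by a learned GFlowNet policy, and branch lengths and sequence data are handled as well.

```python
import math
import random

def sample_topology(leaves, pair_score):
    """Sequentially build a tree by joining two subtrees at a time,
    choosing each pair with probability proportional to exp(pair_score)."""
    forest = [(name,) for name in leaves]  # each subtree as a nested tuple
    while len(forest) > 1:
        pairs = [(i, j) for i in range(len(forest)) for j in range(i + 1, len(forest))]
        weights = [math.exp(pair_score(forest[i], forest[j])) for i, j in pairs]
        i, j = random.choices(pairs, weights=weights)[0]
        merged = (forest[i], forest[j])
        forest = [s for k, s in enumerate(forest) if k not in (i, j)] + [merged]
    return forest[0]

# Uniform scores give an unbiased random join order over four taxa
print(sample_topology(["A", "B", "C", "D"], pair_score=lambda a, b: 0.0))
```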
AAPM Medical Physics Practice Guideline 14.a: Yttrium‐90 microsphere radioembolization
Nathan C. Busse
Muthana S. A. L. Al‐Ghazi
Nadine Abi‐Jaoudeh
Diane Alvarez
Ahmet S. Ayan
Erli Chen
Michael D. Chuong
William A. Dezarn
Stephen A. Graves
Robert F. Hobbs
Mary Ellen Jafari
S. Peter Kim
Nichole M. Maughan
Andrew M. Polemi
Jennifer R. Stickel
Explainable Attention for Few-shot Learning and Beyond
Bahareh Nikpour
A general framework for the practical disintegration of PAC-Bayesian bounds
Paul Viallard
Amaury Habrard
Emilie Morvant
Language-Guided Reinforcement Learning for Hard Attention in Few-Shot Learning
Bahareh Nikpour
Attention mechanisms have demonstrated significant potential in enhancing learning models by identifying key portions of input data, particularly in scenarios with limited training samples. Inspired by human perception, we propose that focusing on essential data segments, rather than the entire dataset, can improve the accuracy and reliability of the learning models. However, identifying these critical data segments, or "hard attention finding," is challenging, especially in few-shot learning, due to the scarcity of training data and the complexity of model parameters. To address this, we introduce LaHA, a novel framework that leverages language-guided deep reinforcement learning to identify and utilize informative data regions, thereby improving both interpretability and performance. Extensive experiments on benchmark datasets validate the effectiveness of LaHA.
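As a loose illustration of hard attention (passing only a few informative regions to the downstream model instead of the whole input), the sketch below scores fixed-size image patches and keeps the top-k. The score_fn here is a stand-in for the learned, language-guided reinforcement-learning policy described in the abstract; the function and parameter names are assumptions for the example.

```python
import numpy as np

def select_hard_attention_patches(image: np.ndarray, patch: int, k: int, score_fn):
    """Score non-overlapping patches and return the top-k patch positions
    as 'hard attention' regions (top-left (row, col) coordinates)."""
    h, w = image.shape[:2]
    scored = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            scored.append((score_fn(image[y:y + patch, x:x + patch]), (y, x)))
    scored.sort(key=lambda s: s[0], reverse=True)
    return [pos for _, pos in scored[:k]]

# Toy usage: patch variance as a crude proxy for an informativeness policy
img = np.random.rand(64, 64)
print(select_hard_attention_patches(img, patch=16, k=3, score_fn=np.var))
```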