
Julien Pallage

Research Master's - Polytechnique

Principal supervisor

Research topics
Optimization
Dynamical Systems

Publications

Sliced-Wasserstein Distance-based Data Selection
We propose a new unsupervised anomaly detection method based on the sliced-Wasserstein distance for training data selection in machine learning approaches. Our filtering technique is interesting for decision-making pipelines deploying machine learning models in critical sectors, e.g., power systems, as it offers a conservative data selection and an optimal transport interpretation. To ensure the scalability of our method, we provide two efficient approximations. The first approximation processes reduced-cardinality representations of the datasets concurrently. The second makes use of a computationally light Euclidean distance approximation. Additionally, we open the first dataset showcasing localized critical peak rebate demand response in a northern climate. We present the filtering patterns of our method on synthetic datasets and numerically benchmark our method for training data selection. Finally, we employ our method as part of a first forecasting benchmark for our open-source dataset.
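As context for this abstract, the core quantity is the sliced-Wasserstein distance between two empirical samples, obtained by averaging one-dimensional Wasserstein distances over random projection directions. The sketch below is a minimal Monte Carlo illustration of that quantity, assuming equal sample sizes and the order-2 cost; the function name, the number of projections, and the Gaussian direction sampling are illustrative choices, not the paper's filtering procedure or its scalable approximations.

```python
# Minimal sketch of a Monte Carlo sliced-Wasserstein (SW_2) estimate between two
# empirical samples X and Y of shape (n, d). Illustrative only: equal sample
# sizes are assumed, and this is not the paper's filtering method.
import numpy as np

def sliced_wasserstein(X, Y, n_projections=100, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Random directions drawn uniformly on the unit sphere.
    thetas = rng.normal(size=(n_projections, d))
    thetas /= np.linalg.norm(thetas, axis=1, keepdims=True)
    # Project both samples onto each direction to get 1D marginals.
    X_proj, Y_proj = X @ thetas.T, Y @ thetas.T
    # For equal-size 1D samples, the optimal transport plan matches sorted values.
    diffs = np.sort(X_proj, axis=0) - np.sort(Y_proj, axis=0)
    return np.sqrt(np.mean(diffs ** 2))
```

Averaging many such one-dimensional problems is what keeps the distance cheap to estimate compared with the full multivariate Wasserstein distance.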
Sliced-Wasserstein-based Anomaly Detection and Open Dataset for Localized Critical Peak Rebates
Bertrand Scherrer
Salma Naccache
Christophe Bélanger
In this work, we present a new unsupervised anomaly (outlier) detection (AD) method using the sliced-Wasserstein metric. This filtering technique is conceptually interesting for MLOps pipelines deploying machine learning models in critical sectors, e.g., energy, as it offers a conservative data selection. Additionally, we open the first dataset showcasing localized critical peak rebate demand response in a northern climate. We demonstrate the capabilities of our method on synthetic datasets as well as standard AD datasets and use it to build a first benchmark for our open-source localized critical peak rebate dataset.
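For illustration only, one simple way to turn such a distance into a conservative filter is to score candidate batches against a trusted reference sample and drop the ones whose distance exceeds a quantile threshold. The sketch below reuses the `sliced_wasserstein` sketch above; the batching scheme, quantile level, and threshold rule are assumptions rather than the authors' selection rule.

```python
# Illustrative batch-filtering loop built on the sliced_wasserstein sketch above.
# The quantile threshold and batch-vs-reference scoring are assumed choices,
# not the paper's anomaly detection rule.
import numpy as np

def filter_batches(batches, reference, q=0.9, **sw_kwargs):
    scores = np.array([sliced_wasserstein(b, reference, **sw_kwargs) for b in batches])
    threshold = np.quantile(scores, q)          # keep only the closest batches
    kept = [b for b, s in zip(batches, scores) if s <= threshold]
    return kept, scores
```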
Wasserstein Distributionally Robust Shallow Convex Neural Networks
In this work, we propose Wasserstein distributionally robust shallow convex neural networks (WaDiRo-SCNNs) to provide reliable nonlinear predictions when subject to adverse and corrupted datasets. Our approach is based on a new convex training program for…
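For readers unfamiliar with the setting, a generic Wasserstein distributionally robust training problem takes the form below, where $\hat{\mathbb{P}}_N$ is the empirical distribution of the $N$ training samples, $W$ a Wasserstein distance, $\varepsilon$ the radius of the ambiguity set, and $\ell$ the per-sample loss. This is the standard formulation only, not the specific convex training program derived for WaDiRo-SCNNs.

\[
\min_{\theta} \; \sup_{\mathbb{Q} \,:\, W\!\left(\mathbb{Q}, \hat{\mathbb{P}}_N\right) \le \varepsilon} \; \mathbb{E}_{\xi \sim \mathbb{Q}}\!\left[\ell(\theta; \xi)\right]
\]

The inner supremum hedges against every distribution within Wasserstein distance $\varepsilon$ of the data, which is what yields robustness to adverse or corrupted samples.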
Online Dynamic Submodular Optimization
We propose new algorithms with provable performance for online binary optimization subject to general constraints and in dynamic settings. We consider the subset of problems in which the objective function is submodular. We propose the online submodular greedy algorithm (OSGA) which solves to optimality an approximation of the previous round loss function to avoid the NP-hardness of the original problem. We extend OSGA to a generic approximation function. We show that OSGA has a dynamic regret bound similar to the tightest bounds in online convex optimization with respect to the time horizon and the cumulative round optimum variation. For instances where no approximation exists or a computationally simpler implementation is desired, we design the online submodular projected gradient descent (OSPGD) by leveraging the Lovász extension. We obtain a regret bound akin to that of conventional online gradient descent (OGD). Finally, we numerically test our algorithms in two power system applications: fast-timescale demand response and real-time distribution network reconfiguration.
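As a concrete illustration of the OSPGD building block, the Lovász extension of a submodular set function admits a subgradient that can be computed by sorting the coordinates of the current fractional point and taking marginal gains; the sketch below pairs that computation with a projected subgradient step on the unit box. The function names, step size, and thresholded rounding are illustrative assumptions, not the authors' algorithm or its regret guarantees.

```python
# Sketch of a projected subgradient step on the Lovász extension of a submodular
# set function f, in the spirit of OSPGD. Step size, projection, and rounding
# are illustrative choices only.
import numpy as np

def lovasz_subgradient(f, x):
    """Subgradient of the Lovász extension of f at a point x in [0, 1]^n."""
    order = np.argsort(-x)            # coordinates sorted by decreasing value
    g = np.zeros(len(x))
    S, prev = set(), f(frozenset())
    for i in order:
        S.add(i)
        val = f(frozenset(S))
        g[i] = val - prev             # marginal gain of adding coordinate i
        prev = val
    return g

def ospgd_step(f, x, eta=0.1):
    """One online round: subgradient descent, then projection onto [0, 1]^n."""
    return np.clip(x - eta * lovasz_subgradient(f, x), 0.0, 1.0)

# Toy round on a small submodular loss f(S) = sqrt(|S|) over three binary variables.
f = lambda S: np.sqrt(len(S))
x = ospgd_step(f, np.full(3, 0.5))
decision = (x >= 0.5).astype(int)     # round the fractional point to a binary action
```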