Publications

SAGE: Smart home Agent with Grounded Execution
Dmitriy Rivkin
Francois Hogan
Amal Feriani
Adam Sigal
Steve Liu
Spatial Distribution Modeling of Pistacia atlantica using Artificial Neural Network in Khohir National Park
Tymour Rostani Shahraji
Reza Akhavan
Reza Ebrahimi Atani
Tuning Minimum-Norm regularization parameters for optimal MEG connectivity estimation
Elisabetta Vallarino
Ana Sofia Hincapié
Richard M. Leahy
Annalisa Pascarella
Alberto Sorrentino
Sara Sommariva
The regularization parameter of the Minimum Norm Estimate of neural activity impacts connectivity estimation. We study empirically the optimal parameter for connectivity estimation using realistic synthetic datasets. We find that the optimal parameter for connectivity estimation is systematically smaller than the optimal parameter for source imaging; different connectivity metrics yield the same result. Code and data are available open source.
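The role of the regularization parameter in a minimum-norm estimate can be illustrated with a toy Tikhonov-regularized inverse problem. This is a generic sketch of the minimum-norm formula, not the authors' code; the lead field, data, and variable names are all illustrative.

```python
import numpy as np

def minimum_norm_estimate(L, y, lam):
    """Tikhonov-regularized minimum-norm source estimate.

    L   : (n_sensors, n_sources) lead field matrix
    y   : (n_sensors,) measured sensor data
    lam : regularization parameter (larger -> stronger shrinkage)
    """
    n = L.shape[0]
    # x_hat = L^T (L L^T + lam I)^{-1} y
    K = L @ L.T + lam * np.eye(n)
    return L.T @ np.linalg.solve(K, y)

rng = np.random.default_rng(0)
L = rng.standard_normal((10, 50))          # underdetermined: 10 sensors, 50 sources
x_true = np.zeros(50)
x_true[3] = 1.0
y = L @ x_true

x_small = minimum_norm_estimate(L, y, 1e-6)
x_large = minimum_norm_estimate(L, y, 10.0)
# heavier regularization shrinks the estimate toward zero,
# which is why the optimal parameter depends on the downstream task
assert np.linalg.norm(x_large) < np.linalg.norm(x_small)
```

Sweeping `lam` and scoring the result with a task-specific metric (localization error vs. a connectivity metric) is the kind of comparison the paper performs on realistic synthetic data.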
Adaptive Resolution Residual Networks
We introduce Adaptive Resolution Residual Networks (ARRNs), a form of neural operator that enables the creation of networks for signal-based tasks that can be rediscretized to suit any signal resolution. ARRNs are composed of a chain of Laplacian residuals that each contain ordinary layers, which do not need to be rediscretizable for the whole network to be rediscretizable. ARRNs require fewer Laplacian residuals for exact evaluation on lower-resolution signals, which greatly reduces computational cost. ARRNs also implement Laplacian dropout, which encourages networks to become robust to low-bandwidth signals. ARRNs can thus be trained once at high resolution and then be rediscretized on the fly at a suitable resolution with great robustness.
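The band-splitting idea behind Laplacian residuals can be illustrated with a toy 1-D Laplacian decomposition: each stage peels off a detail band, and the bands sum back to the original signal, so coarse bands alone suffice for a band-limited input. The filter and structure here are a generic sketch, not the paper's architecture.

```python
import numpy as np

def lowpass(x):
    # simple 3-tap smoothing kernel standing in for a band-limiting filter
    k = np.array([0.25, 0.5, 0.25])
    return np.convolve(x, k, mode="same")

def laplacian_bands(x, levels=3):
    """Split a 1-D signal into detail bands plus a coarse residue."""
    bands = []
    cur = x
    for _ in range(levels):
        smooth = lowpass(cur)
        bands.append(cur - smooth)   # detail at this scale
        cur = smooth
    bands.append(cur)                # coarsest band
    return bands

x = np.random.default_rng(1).standard_normal(64)
bands = laplacian_bands(x)
# the decomposition is exact: summing all bands reconstructs the signal
assert np.allclose(sum(bands), x)
```

In this picture, a low-bandwidth input has near-zero fine-detail bands, so the stages processing them can be skipped; that is the intuition behind evaluating fewer residuals at lower resolutions.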
Criticality of resting-state EEG predicts perturbational complexity and level of consciousness during anesthesia.
Charlotte Maschke
Jordan O'Byrne
Michele Angelo Colombo
Melanie Boly
Olivia Gosseries
Steven Laureys
Mario Rosanova
Stefanie Blain-Moraes
Consciousness has been proposed to be supported by electrophysiological patterns poised at criticality, a dynamical regime which exhibits adaptive computational properties, maximally complex patterns and divergent sensitivity to perturbation. Here, we investigated dynamical properties of the resting-state electroencephalogram of healthy subjects undergoing general anesthesia with propofol, xenon or ketamine. We then studied the relation of these dynamic properties with the perturbational complexity index (PCI), which has shown remarkably high sensitivity in detecting consciousness independent of behavior. All participants were unresponsive under anesthesia, while consciousness was retained only during ketamine anesthesia (in the form of vivid dreams), enabling an experimental dissociation between unresponsiveness and unconsciousness. We estimated (i) avalanche criticality, (ii) chaoticity, and (iii) criticality-related measures, and found that states of unconsciousness were characterized by a distancing from both the edge of activity propagation and the edge of chaos. We were then able to predict individual subjects’ PCI (i.e., PCImax) with a mean absolute error below 7%. Our results establish a firm link between the PCI and criticality and provide further evidence for the role of criticality in the emergence of consciousness.
Deep PDE Solvers for Subgrid Modelling and Out-of-Distribution Generalization
Adam Oberman
Generative Learning of Continuous Data by Tensor Networks
Alex Meiburg
Jian Hua Chen
Raphaelle Tihon
Alejandro Perdomo-ortiz
Physics-Informed Transformer Networks
Physics-informed neural networks (PINNs) have been recognized as a viable alternative to conventional numerical solvers for Partial Differential Equations (PDEs). The main appeal of PINNs is that since they directly enforce the PDE equation, one does not require access to costly ground truth solutions for training the model. However, a key challenge is their limited generalization across varied initial conditions. Addressing this, our study presents a novel Physics-Informed Transformer (PIT) model for learning the solution operator for PDEs. Using the attention mechanism, PIT learns to leverage the relationships between its initial condition and query points, resulting in a significant improvement in generalization. Moreover, in contrast to existing physics-informed networks, our model is invariant to the discretization of the input domain, providing great flexibility in problem specification and training. We validated our proposed method on the 1D Burgers’ and the 2D Heat equations, demonstrating notable improvement over standard PINN models for operator learning with negligible computational overhead.
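The physics-informed loss idea — penalize the PDE residual instead of comparing against ground-truth solutions — can be sketched with finite differences on the heat equation. This is a minimal illustration (PINNs use automatic differentiation on a network, not finite differences on a grid); all names are illustrative.

```python
import numpy as np

def heat_residual(u, x, t, alpha):
    """Finite-difference residual of the heat equation u_t = alpha * u_xx.

    A physics-informed loss penalizes this residual; no ground-truth
    solution data is needed to evaluate it.
    """
    U = u(x[None, :], t[:, None])          # (n_t, n_x) grid of values
    dt, dx = t[1] - t[0], x[1] - x[0]
    u_t = np.gradient(U, dt, axis=0)
    u_xx = np.gradient(np.gradient(U, dx, axis=1), dx, axis=1)
    return u_t - alpha * u_xx

alpha, k = 0.1, 2 * np.pi
# exact solution of the heat equation on [0, 1] with sinusoidal initial data
exact = lambda x, t: np.exp(-alpha * k**2 * t) * np.sin(k * x)

x = np.linspace(0.0, 1.0, 201)
t = np.linspace(0.0, 1.0, 201)
res = heat_residual(exact, x, t, alpha)
loss = np.mean(res[1:-1, 1:-1] ** 2)       # interior points only
# the exact solution nearly zeroes the physics residual
assert loss < 1e-2
```

A PINN minimizes exactly this kind of residual over sampled collocation points, with derivatives taken through the network rather than a grid.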
Root phosphatase activity is coordinated with the root conservation gradient across a phosphorus gradient in a lowland tropical forest
Xavier Guilbeault-Mayers
Soil phosphorus (P) is a growth-limiting nutrient in tropical ecosystems, driving diverse P-acquisition strategies among plants. In particular, mining for inorganic P through phosphomonoesterase (PME) activity is essential, given the substantial proportion of organic P in soils. Yet the relationship between PME activity and other P-acquisition root traits remains unclear. We measured root PME activity and commonly measured root traits, including root diameter, specific root length (SRL), root tissue density (RTD), and nitrogen concentration ([N]), in 18 co-occurring trees across soils with varying P availability to better understand trees' responses to P supply. Root [N] and RTD were inversely related, and that axis was related to soil P supply. Indeed, both traits correlated, positively and negatively respectively, with PME activity, which responded strongly to P supply. Conversely, root diameter was inversely related to SRL, but this axis was not related to P supply, suggesting that limiting similarity influenced variation along the diameter-SRL axis, explaining high local trait diversity. Meanwhile, environmental filtering tended to impact trait values along the root [N]-RTD axis. Overall, P availability indicator traits like PME activity and root hairs only tended to be associated with these axes, highlighting limitations of these axes in describing convergent adaptations at local sites.
Causal Fair Metric: Bridging Causality, Individual Fairness, and Adversarial Robustness
Ahmad-reza Ehyaei
Samira Samadi
Despite the essential need for comprehensive considerations in responsible AI, factors like robustness, fairness, and causality are often studied in isolation. Adversarial perturbation, used to identify vulnerabilities in models, and individual fairness, aiming for equitable treatment of similar individuals despite initial differences, both depend on metrics to generate comparable input data instances. Previous attempts to define such joint metrics often lack general assumptions about data or structural causal models and were unable to reflect counterfactual proximity. To address this, our paper introduces a causal fair metric formulated based on causal structures encompassing sensitive attributes and protected causal perturbation. To enhance the practicality of our metric, we propose metric learning as a method for metric estimation and deployment in real-world problems in the absence of structural causal models. We also demonstrate the application of our novel metric in classifiers. Empirical evaluation of real-world and synthetic datasets illustrates the effectiveness of our proposed metric in achieving an accurate classifier with fairness, resilience to adversarial perturbations, and a nuanced understanding of causal relationships.
FETA: Fairness Enforced Verifying, Training, and Predicting Algorithms for Neural Networks
Aishwarya Sivaraman
TorchProbe: Fuzzing Dynamic Deep Learning Compilers
Qidong Su
Gennady G. Pekhimenko
Static and dynamic computational graphs represent two distinct approaches to constructing deep learning frameworks. The former prioritizes compiler-based optimizations, while the latter focuses on programmability and user-friendliness. The recent release of PyTorch 2.0, which supports compiling arbitrary deep learning programs in Python, signifies a new direction in the evolution of deep learning infrastructure to incorporate compiler techniques in a more dynamic manner and support more dynamic language features like dynamic control flows and closures. Given PyTorch's seamless integration with Python, its compiler aims to support arbitrary deep learning code written in Python. However, the inherent dynamism of Python poses challenges to the completeness and robustness of the compiler. While recent research has introduced fuzzing to test deep learning compilers, there is still a lack of comprehensive analysis on how to test dynamic features. To address this issue, we propose several code transformations to generate test cases involving dynamic features. These transformations preserve the program's semantics, ensuring that any discrepancy between the transformed and original programs indicates the presence of a bug. Through our approach, we have successfully identified twenty previously unknown bugs in the PyTorch compiler and its underlying tensor compiler Triton.
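The differential-testing principle behind semantics-preserving transformations can be sketched in plain Python, without the PyTorch compiler. The transformation below wraps a function in a closure (a dynamic language feature) while keeping behavior identical; in a real fuzzer the transformed program would be compiled and compared against eager execution, and any mismatch would flag a compiler bug. All names here are illustrative.

```python
import random

def original(x):
    # toy "model" computation standing in for a deep learning program
    return x * 2 + 1

def closure_transform(fn):
    """Semantics-preserving transformation: wrap the function in a closure.

    This exercises a dynamic feature (captured variables) while leaving the
    program's input/output behavior unchanged.
    """
    captured = fn
    def wrapped(x):
        return captured(x)
    return wrapped

transformed = closure_transform(original)
rng = random.Random(0)
for _ in range(100):
    x = rng.randint(-1000, 1000)
    # any discrepancy would indicate a bug in whatever compiled `wrapped`
    assert original(x) == transformed(x)
```

Because the oracle is equivalence with the untransformed program, no hand-written expected outputs are needed — the same property that lets the paper's approach scale to arbitrary generated test cases.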