AI Perspectives for Policymakers
Co-led by Mila and CIFAR, this program connects policymakers with leading AI researchers through a combination of open consultations and policy feasibility-testing exercises. The next session will take place on October 9 and 10.
Hugo Larochelle Appointed Scientific Director of Mila
An adjunct professor at Université de Montréal and former head of Google's AI research lab in Montréal, Hugo Larochelle is a pioneer of deep learning and one of the most respected researchers in Canada.
Mila is hosting its first quantum computing hackathon on November 21: a unique day to explore quantum prototyping and AI, collaborate on the Quandela and IBM platforms, and learn, exchange, and network in a stimulating environment at the heart of Quebec's AI and quantum ecosystem.
A new initiative to strengthen ties between the research community, partners, and AI experts across Quebec and Canada through in-person meetings and events focused on AI adoption in industry.
Accurate modeling of physical systems governed by partial differential equations is a central challenge in scientific computing. In oceanography, high-resolution current data are critical for coastal management, environmental monitoring, and maritime safety. However, available satellite products, such as Copernicus data for sea water velocity at ~0.08 degrees spatial resolution and global ocean models, often lack the spatial granularity required for detailed local analyses. In this work, we (a) introduce a supervised deep learning framework based on neural operators for solving PDEs and providing arbitrary-resolution solutions, and (b) propose downscaling models with an application to Copernicus ocean current data. Additionally, our method can model surrogate PDEs and predict solutions at arbitrary resolution, regardless of the input resolution. We evaluated our model on real-world Copernicus ocean current data and synthetic Navier-Stokes simulation datasets.
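A note on the mechanism behind arbitrary-resolution prediction: Fourier-type neural operators act on spectral coefficients, so their output can be resampled onto any grid independently of the training resolution. The following is a minimal, hypothetical NumPy sketch of that resampling step alone (spectral zero-padding of a periodic field); it is not the paper's trained operator, and the test signal is invented.

```python
import numpy as np

def spectral_upsample(field, new_len):
    """Upsample a 1-D periodic signal by zero-padding its Fourier spectrum.

    This is the resolution-independence trick used by Fourier-based
    neural operators: coefficients live in frequency space, so the
    inverse transform can be taken on a grid of any size.
    """
    n = len(field)
    spec = np.fft.fft(field)
    padded = np.zeros(new_len, dtype=complex)
    half = n // 2
    padded[:half] = spec[:half]      # keep low positive frequencies
    padded[-half:] = spec[-half:]    # keep matching negative frequencies
    # rescale so amplitudes are preserved under the longer inverse FFT
    return np.real(np.fft.ifft(padded)) * (new_len / n)

# coarse samples of a smooth periodic stand-in for a "current" field
x_coarse = np.linspace(0, 2 * np.pi, 16, endpoint=False)
coarse = np.sin(x_coarse) + 0.5 * np.cos(2 * x_coarse)

fine = spectral_upsample(coarse, 64)
x_fine = np.linspace(0, 2 * np.pi, 64, endpoint=False)
exact = np.sin(x_fine) + 0.5 * np.cos(2 * x_fine)
print(np.max(np.abs(fine - exact)))  # tiny for band-limited inputs
```

For band-limited signals this resampling is exact up to floating-point error, which is why evaluation resolution can be decoupled from input resolution.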
Pandemics like COVID-19 have illuminated the significant disparities in the performance of national healthcare systems (NHCSs) during rapidly evolving crises. The challenge of comparing NHCS performance has been a difficult topic in the literature. To address this gap, our study introduces a bi-criteria longitudinal algorithm that merges fuzzy clustering with Data Envelopment Analysis (DEA). This new approach provides a comprehensive and dynamic assessment of NHCS performance and efficiency during the early phase of the pandemic. By categorizing each NHCS as an efficient performer, inefficient performer, efficient underperformer, or inefficient underperformer, our analysis vividly represents performance dynamics, clearly identifying the top and bottom performers within each cluster of countries. Our methodology offers valuable insights for performance evaluation and benchmarking, with significant implications for enhancing pandemic response strategies. The study's findings are discussed from theoretical and practical perspectives, offering guidance for future health system assessments and policy-making.
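For readers unfamiliar with the DEA building block the study combines with fuzzy clustering: the standard input-oriented CCR model scores each decision-making unit (DMU) by the smallest factor theta by which its inputs could be shrunk while a convex-cone combination of peers still matches its outputs. A minimal sketch with `scipy.optimize.linprog` follows; the four "systems" and their numbers are invented for illustration and this is not the paper's bi-criteria longitudinal algorithm.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR DEA efficiency of DMU k.

    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs).
    Solves: min theta  s.t.  sum_j lam_j x_j <= theta * x_k,
                              sum_j lam_j y_j >= y_k,  lam >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                           # input constraints
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                           # output constraints
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

# toy data: 4 systems, 1 input (resources), 1 output (coverage)
X = np.array([[2.0], [4.0], [3.0], [5.0]])
Y = np.array([[2.0], [4.0], [1.5], [2.5]])
scores = [ccr_efficiency(X, Y, k) for k in range(4)]
print(np.round(scores, 3))  # units 0 and 1 are efficient (score 1.0)
```

Scores of 1.0 mark the efficient frontier; scores below 1.0 quantify how far a unit could proportionally reduce inputs, which is the efficiency axis of the study's four-way performer/underperformer categorization.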
Mastery of Key Performance Indicators (KPIs) in the realm of photovoltaic solar power plants is pivotal for evaluating their effectiveness and fine-tuning their operational efficiency. The assessment of these plants' performance has consistently stood as a focal point in scientific research. Nevertheless, the investigation into the process of selecting a framework for classifying KPIs, particularly through their categorization based on criteria, sub-criteria, or aspects, has been relatively limited in research. This article addresses this gap by conducting a comprehensive literature review on various KPIs and, drawing upon both literature and practical experience, formulating a set of criteria to serve as the foundation for a Multi-Criteria Decision Analysis (MCDA) method. This intricate taxonomic framework enhances the understanding of infrastructure performance for stakeholders in the solar industry. By streamlining decision-making, it simplifies the selection of KPIs tailored to specific requirements, thus mitigating the complexity arising from the abundance of KPIs in the literature. As a result, decision-makers can make well-informed choices regarding the monitoring and evaluation framework that best suits the performance goals of their solar plant.
2024-05-16
2024 4th International Conference on Innovative Research in Applied Science, Engineering and Technology (IRASET) (published)
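To make the MCDA step concrete: once criteria and weights are fixed, a standard method such as TOPSIS ranks alternatives by their closeness to an ideal point. The sketch below is a generic TOPSIS implementation, not the paper's specific method or criteria; the three candidate KPI frameworks, criteria names, and weights are hypothetical.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix: (alternatives, criteria) raw scores;
    benefit[j] is True if higher is better on criterion j.
    Returns closeness coefficients in [0, 1] (higher = better).
    """
    M = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize columns
    V = M * weights                               # weighted normalized scores
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)     # distance to ideal
    d_neg = np.linalg.norm(V - anti, axis=1)      # distance to anti-ideal
    return d_neg / (d_pos + d_neg)

# hypothetical: 3 candidate KPI frameworks scored on 3 criteria
# (data relevance: higher better; reporting cost: lower better;
#  stakeholder clarity: higher better)
scores = np.array([[7.0, 3.0, 8.0],
                   [9.0, 6.0, 6.0],
                   [5.0, 2.0, 7.0]])
weights = np.array([0.5, 0.2, 0.3])
benefit = np.array([True, False, True])

cc = topsis(scores, weights, benefit)
print(np.argsort(-cc))  # indices of frameworks, best first
```

The closeness coefficient gives decision-makers a single comparable score per framework, which is the kind of streamlined selection the abstract describes.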
Efficiently solving a vehicle routing problem (VRP) in a practical runtime is a critical challenge for delivery management companies. This paper explores both a theoretical and experimental connection between the Capacitated Vehicle Routing Problem (CVRP) and Constrained Centroid-Based Clustering (CCBC). Reducing a CVRP to a CCBC amounts to a transition from exponential to polynomial complexity using commonly known clustering algorithms, e.g., K-means. First, we conduct an exploratory analysis to highlight the existence of such a relationship between the two problems through illustrative small-size examples, and simultaneously deduce some mathematically related formulations and properties. Second, the paper proposes a CCBC-based approach endowed with some enhancements. The proposed framework consists of three stages. In the first step, a constrained centroid-based clustering algorithm generates feasible clusters of customers. This methodology incorporates three enhancement tools to achieve near-optimal clusters, namely a multi-start procedure for initial centroids, a customer assignment metric, and a self-adjustment mechanism for choosing the number of clusters. In the second step, a traveling salesman problem (TSP) solver is used to optimize the order of customers within each cluster. Finally, we introduce a process relying on a route cutting and relinking procedure, which calls upon solving an integer linear programming model to further improve the obtained routes. This step is inspired by the ruin-and-recreate algorithm. This approach extends the classical cluster-first, route-second method and provides near-optimal solutions on well-known benchmark instances in terms of solution quality and computational runtime, offering a milestone in solving VRP.
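The classical cluster-first, route-second idea that the paper extends can be sketched in a few lines: assign customers to capacity-constrained centroids, then order each cluster with a TSP heuristic. The toy below uses greedy capacitated assignment and nearest-neighbour tours; it omits the paper's enhancements (multi-start, assignment metric, self-adjusting cluster count, route cutting and relinking), and all coordinates, demands, and capacities are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def cluster_first_route_second(coords, demands, capacity, n_veh, iters=20):
    """Toy cluster-first, route-second CVRP heuristic.

    Stage 1: capacity-constrained K-means-style clustering.
    Stage 2: nearest-neighbour TSP ordering inside each cluster.
    """
    n = len(coords)
    centroids = coords[rng.choice(n, n_veh, replace=False)]
    assign = np.full(n, -1)
    for _ in range(iters):
        load = np.zeros(n_veh)
        assign = np.full(n, -1)
        dists = np.linalg.norm(coords[:, None] - centroids, axis=2)
        # serve customers closest to any centroid first, to reduce bad overflows
        for i in np.argsort(dists.min(axis=1)):
            # nearest centroid that still has spare capacity
            for c in np.argsort(dists[i]):
                if load[c] + demands[i] <= capacity:
                    assign[i] = c
                    load[c] += demands[i]
                    break
        centroids = np.array([coords[assign == c].mean(axis=0)
                              if (assign == c).any() else centroids[c]
                              for c in range(n_veh)])
    depot = np.zeros(2)
    routes = []
    for c in range(n_veh):
        members = list(np.where(assign == c)[0])
        route, cur = [], depot
        while members:  # nearest-neighbour TSP within the cluster
            j = min(members, key=lambda m: np.linalg.norm(coords[m] - cur))
            route.append(j)
            cur = coords[j]
            members.remove(j)
        routes.append(route)
    return routes

coords = rng.uniform(-10, 10, size=(12, 2))
demands = rng.integers(1, 4, size=12)
routes = cluster_first_route_second(coords, demands, capacity=15, n_veh=3)
print(routes)
```

Each stage is polynomial, which illustrates the complexity argument in the abstract: the exponential routing search is traded for clustering plus small per-cluster tours, at the cost of optimality guarantees.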