Multiple-model coding scheme for electrical signal compression
Corentin Presvôts
Michel Kieffer
Thibault Prevost
Patrick Panciatici
Zuxing Li
Negative Language Transfer Identification in the English Writing of Chinese and Farsi Native Speakers
Mohammad Karimiabdolmaleki
Leticia Farias Wanderley
Mohsen Rezazadeh
Carrie Demmans Epp
Neural Kinematic Bases for Fluids
Yibo Liu
Paul Kry
Kenny Erleben
Sune Darkner
Teseo Schneider
Not All Data Are Unlearned Equally
Aravind Krishnan
Marius Mosbach
Machine unlearning is concerned with the task of removing knowledge learned from particular data points from a trained model. In the context of large language models (LLMs), unlearning has recently received increased attention, particularly for removing knowledge about named entities from models for privacy purposes. While various approaches have been proposed to address the unlearning problem, most existing approaches treat all data points to be unlearned equally, i.e., unlearning that Montreal is a city in Canada is treated exactly the same as unlearning the phone number of the first author of this paper. In this work, we show that this "all data is equal" assumption does not hold for LLM unlearning. We study how the success of unlearning depends on the frequency of the knowledge we want to unlearn in the pre-training data of a model and find that frequency strongly affects unlearning: more frequent knowledge is harder to unlearn. Additionally, we uncover a misalignment between probability-based and generation-based evaluations of unlearning and show that this problem worsens as models become larger. Overall, our experiments highlight the need for better evaluation practices and novel methods for LLM unlearning that take the training data of models into account.
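The misalignment the abstract points to, between probability-based and generation-based evaluations, is easy to illustrate. The sketch below is not the paper's code; the model name, prompt, and target fact are placeholder assumptions. It scores the same fact both ways with a HuggingFace causal LM: once by the log-likelihood the model assigns to the target continuation, and once by checking whether greedy decoding still reproduces it.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder stand-in; the paper studies larger LLMs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "Montreal is a city in"
target = " Canada"  # hypothetical fact we would want the model to unlearn

# Probability-based evaluation: log-likelihood the model assigns to the
# target continuation; successful unlearning should drive this down.
ids = tokenizer(prompt + target, return_tensors="pt").input_ids
prompt_len = tokenizer(prompt, return_tensors="pt").input_ids.shape[1]
with torch.no_grad():
    logits = model(ids).logits
log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
positions = torch.arange(prompt_len - 1, ids.shape[1] - 1)
target_ll = log_probs[positions, ids[0, 1:][positions]].sum().item()
print(f"target log-likelihood: {target_ll:.3f}")

# Generation-based evaluation: does greedy decoding still *produce* the
# fact when prompted? The two metrics can disagree, which is the
# misalignment the abstract highlights.
out = model.generate(tokenizer(prompt, return_tensors="pt").input_ids,
                     max_new_tokens=5, do_sample=False)
completion = tokenizer.decode(out[0], skip_special_tokens=True)
print("generated:", completion)
print("fact still generated:", "Canada" in completion)

A model can assign a low probability to the target continuation yet still emit the fact under greedy decoding (or vice versa), which is why relying on only one of the two metrics can overstate unlearning success.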
Online Interior-point Methods for Time-varying Equality-constrained Optimization
Jean-Luc Lupien
Iman Shames
Performance Smells in ML and Non-ML Python Projects: A Comparative Study
François Belias
Leuson Da Silva
Cyrine Zid
Prism: Dynamic and Flexible Benchmarking of LLMs Code Generation with Monte Carlo Tree Search
Vahid Majdinasab
Amin Nikanjam
Progressive Multi-Source Domain Adaptation for Personalized Facial Expression Recognition
Muhammad Osama Zeeshan
Alessandro Lameiras Koerich
Eric Granger
Scaling Language-Free Visual Representation Learning
David Fan
Shengbang Tong
Jiachen Zhu
Koustuv Sinha
Zhuang Liu
Xinlei Chen
Nicolas Ballas
Yann LeCun
Amir Bar
Saining Xie
Semantic Commit: Helping Users Update Intent Specifications for AI Memory at Scale
Priyan Vaithilingam
Munyeong Kim
Frida-Cecilia Acosta-Parenteau
Daniel Lee
Amine Mhedhbi
Elena L. Glassman