
Nizar Islah

PhD - Université de Montréal
Research Topics
Computational Neuroscience
Continual Learning
Deep Learning
Representation Learning

Publications

Revisiting Replay and Gradient Alignment for Continual Pre-Training of Large Language Models
Istabrak Abbes
Matthew D Riemer
Tsuguchika Tabaru
Hiroaki Kingetsu
A. Chandar
Learning to combine top-down context and feed-forward representations under ambiguity with apical and basal dendrites
Guillaume Etter
Busra Tugce Gurbuz
One of the hallmark features of neocortical anatomy is the presence of extensive top-down projections into primary sensory areas, with many impinging on the distal apical dendrites of pyramidal neurons. While it is known that they exert a modulatory effect, altering the gain of responses, their functional role remains an active area of research. It is hypothesized that these top-down projections carry contextual information that can help animals resolve ambiguities in sensory data. One proposed mechanism of contextual integration is a non-linear integration of distinct input streams at the apical and basal dendrites of pyramidal neurons. Computationally, however, it has yet to be demonstrated how such an architecture could leverage distinct compartments for flexible contextual integration and sensory processing when both sensory and context signals can be unreliable. Here, we implement an augmented deep neural network with distinct apical and basal compartments that integrates a) contextual information from top-down projections to apical compartments, and b) sensory representations driven by bottom-up projections to basal compartments, via a biophysically inspired rule. In addition, we develop a new multi-scenario contextual integration task using a generative image modeling approach. Beyond generalizing previous contextual integration tasks, it better captures the diversity of scenarios where neither contextual nor sensory information is fully reliable. To solve this task, the model learns to select among integration strategies. We find that our model outperforms those without the "apical prior" when contextual information contradicts sensory input. Altogether, this suggests that the apical prior and biophysically inspired integration rule could be key components for handling the ambiguities that animals encounter in the diverse contexts of the real world.
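The core idea of the abstract — a feed-forward (basal) drive whose gain is modulated by a top-down (apical) context signal — can be illustrated with a toy layer. All names and the specific integration rule below are illustrative assumptions for exposition, not the paper's actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TwoCompartmentLayer:
    """Toy layer with separate basal (sensory) and apical (context) inputs.

    The basal compartment drives the response; the apical compartment
    modulates its gain, loosely mirroring the modulatory role of top-down
    projections described in the abstract. This is a hypothetical sketch,
    not the paper's model or learning rule.
    """

    def __init__(self, n_in, n_ctx, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_basal = rng.normal(0.0, 0.1, (n_out, n_in))
        self.W_apical = rng.normal(0.0, 0.1, (n_out, n_ctx))

    def forward(self, sensory, context):
        basal = self.W_basal @ sensory        # bottom-up feed-forward drive
        apical = self.W_apical @ context      # top-down contextual input
        gain = 1.0 + sigmoid(apical)          # apical input scales the gain
        return np.maximum(0.0, gain * basal)  # rectified, gain-modulated response

layer = TwoCompartmentLayer(n_in=4, n_ctx=3, n_out=2)
out = layer.forward(np.ones(4), np.ones(3))
```

Because the apical input only rescales the basal drive (multiplicatively), context alone cannot create a response where there is no sensory evidence — one simple way to capture the "modulatory" character of top-down projections.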
GitChameleon 2.0: Evaluating AI Code Generation Against Python Library Version Incompatibilities
The rapid evolution of software libraries poses a considerable hurdle for code generation, necessitating continuous adaptation to frequent version updates while preserving backward compatibility. While existing code evolution benchmarks provide valuable insights, they typically lack execution-based evaluation for generating code compliant with specific library versions. To address this, we introduce GitChameleon 2.0, a novel, meticulously curated dataset comprising 328 Python code completion problems, each conditioned on specific library versions and accompanied by executable unit tests. GitChameleon 2.0 rigorously evaluates the capacity of contemporary large language models (LLMs), LLM-powered agents, code assistants, and RAG systems to perform version-conditioned code generation that demonstrates functional accuracy through execution. Our extensive evaluations indicate that state-of-the-art systems encounter significant challenges with this task, with enterprise models achieving baseline success rates in the 48-51% range, underscoring the intricacy of the problem. By offering an execution-based benchmark that emphasizes the dynamic nature of code libraries, GitChameleon 2.0 enables a clearer understanding of this challenge and helps guide the development of more adaptable and dependable AI code generation methods. We make the dataset and evaluation code publicly available at https://github.com/mrcabbage972/GitChameleonBenchmark.
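The execution-based evaluation the abstract describes — running a model's completion against the problem's unit tests and counting it correct only if the tests pass — can be sketched minimally. The harness below is an illustrative assumption, not the benchmark's actual runner; the real setup would additionally pin the problem's library version in an isolated environment (e.g. a per-problem virtualenv):

```python
import subprocess
import sys
import tempfile

def passes_tests(candidate_code: str, test_code: str, timeout: float = 10.0) -> bool:
    """Run candidate code plus its unit test in a fresh interpreter.

    Returns True only if the combined program exits cleanly, i.e. all
    assertions in the test pass. Hypothetical sketch of execution-based
    scoring; version conditioning is out of scope here.
    """
    program = candidate_code + "\n\n" + test_code
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(program)
        path = f.name
    try:
        result = subprocess.run(
            [sys.executable, path],
            capture_output=True,  # keep the child's output off our stdout
            timeout=timeout,      # guard against non-terminating completions
        )
        return result.returncode == 0
    except subprocess.TimeoutExpired:
        return False

# Toy example: a "generated" solution and a unit test for it.
solution = "def add(a, b):\n    return a + b\n"
test = "assert add(2, 3) == 5\n"
ok = passes_tests(solution, test)
```

Running in a subprocess rather than `exec()` keeps a crashing or hanging completion from taking down the evaluator, which matters when scoring many untrusted model outputs.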
GitChameleon: Unmasking the Version-Switching Capabilities of Code Generation Models
Eilif Benjamin Muller
Terry Yue Zhuo
Massimo Caccia