
Amal Zouaq

Associate Academic Member
Full Professor, Polytechnique Montréal, Department of Computer Engineering and Software Engineering
Associate Professor, University of Ottawa, School of Electrical Engineering and Computer Science

Biography

Amal Zouaq is a full professor in the Computer Engineering and Software Engineering Department at Polytechnique Montréal, and holds a Fonds de recherche du Québec - Santé (FRQS) Dual Chair in Artificial Intelligence and Digital Health.

She is also an IVADO professor, a member of the Computational Linguistics in Québec consortium (CLIQ-AI) and an adjunct professor at the University of Ottawa.

Zouaq’s research interests include AI, natural language processing and the Semantic Web. She is the director of the LAMA-WeST lab, which conducts research on all aspects of natural language processing and AI, including knowledge extraction from unstructured knowledge sources, ontology learning and alignment, knowledge base learning and completion, natural language generation, and the analysis of the capabilities of large pre-trained language models, as well as their exploitation and adaptation for NLP tasks.

She also serves on program committees for many conferences and journals in the fields of knowledge and data engineering, natural language processing, data mining and the Semantic Web.


Publications

Assessing the Generalization Capabilities of Neural Machine Translation Models for SPARQL Query Generation
Samuel Reyd
SORBET: A Siamese Network for Ontology Embeddings Using a Distance-Based Regression Loss and BERT
Francis Gosselin
SORBETmatcher results for OAEI 2023.
Francis Gosselin
Local Structure Matters Most: Perturbation Study in NLU
Louis Clouâtre
Prasanna Parthasarathi
Recent research analyzing the sensitivity of natural language understanding models to word-order perturbations has shown that neural models are surprisingly insensitive to the order of words. In this paper, we investigate this phenomenon by developing order-altering perturbations on the order of words, subwords, and characters to analyze their effect on neural models’ performance on language understanding tasks. We experiment with measuring the impact of perturbations to the local neighborhood of characters and the global position of characters in the perturbed texts, and observe that perturbation functions found in prior literature affect only the global ordering while the local ordering remains relatively unperturbed. We empirically show that neural models, regardless of their inductive biases, pretraining scheme, or choice of tokenization, mostly rely on the local structure of text to build understanding and make limited use of the global structure.
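To illustrate the distinction the abstract draws between local and global structure, here is a minimal sketch of two toy perturbation functions: one that disturbs only the local neighborhood of characters within each word, and one that shuffles whole words, destroying global order while leaving local character structure intact. These are hypothetical examples for intuition only, not the perturbation functions used in the paper.

```python
import random

def local_perturb(text, seed=0):
    """Swap one pair of adjacent characters inside each word: local
    neighborhoods are disturbed, global word order is preserved.
    (Illustrative only -- not the paper's perturbation functions.)"""
    rng = random.Random(seed)
    words = []
    for w in text.split():
        chars = list(w)
        if len(chars) > 1:
            i = rng.randrange(len(chars) - 1)
            chars[i], chars[i + 1] = chars[i + 1], chars[i]
        words.append("".join(chars))
    return " ".join(words)

def global_perturb(text, seed=0):
    """Shuffle whole words: global order is destroyed, while each
    word's local character structure stays intact."""
    rng = random.Random(seed)
    words = text.split()
    rng.shuffle(words)
    return " ".join(words)

sentence = "neural models rely on local structure"
print(local_perturb(sentence))
print(global_perturb(sentence))
```

The paper's finding, in these terms, is that model performance degrades far more under perturbations of the first kind than the second.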