Compositional Generalization in Grounded Language Learning via Induced Model Sparsity
Access rights
openAccess
publishedVersion
A4 Article in conference proceedings
This publication is imported from Aalto University research portal.
Authors
Spilsbury, S.; Ilin, A.
Date
2022
Language
en
Pages
13
Series
NAACL 2022 - 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Student Research Workshop, pp. 143-155
Abstract
We provide a study of how induced model sparsity can help achieve compositional generalization and better sample efficiency in grounded language learning problems. We consider simple language-conditioned navigation problems in a grid world environment with disentangled observations. We show that standard neural architectures do not always yield compositional generalization. To address this, we design an agent that contains a goal identification module that encourages sparse correlations between words in the instruction and attributes of objects, composing them together to find the goal. The output of the goal identification module is the input to a value iteration network planner. Our agent maintains a high level of performance on goals containing novel combinations of properties even when learning from a handful of demonstrations. We examine the internal representations of our agent and find the correct correspondences between words in its dictionary and attributes in the environment.
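The following is a minimal sketch, not the authors' released code, of the mechanism the abstract describes: a goal identification module that scores sparse word-attribute correlations over disentangled grid observations and composes them into a goal map that a value-iteration-style planner could consume. The class name, embedding size, L1 sparsity penalty, and the max/product composition are illustrative assumptions.

```python
# Sketch of a sparse word-attribute goal identification module (PyTorch).
# All names, shapes, and the regularizer are assumptions for illustration.
import torch
import torch.nn as nn


class GoalIdentification(nn.Module):
    def __init__(self, vocab_size: int, num_attributes: int, dim: int = 32):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, dim)
        self.attr_embed = nn.Embedding(num_attributes, dim)

    def forward(self, instruction: torch.Tensor, grid_attrs: torch.Tensor):
        """
        instruction: (B, L) word indices of the instruction
        grid_attrs:  (B, H, W, K) attribute indices per cell (disentangled
                     observations: K attribute slots for the object in each cell)
        returns:     (B, H, W) goal map, plus the word-attribute correlation
                     tensor used for the sparsity penalty
        """
        words = self.word_embed(instruction)               # (B, L, D)
        attrs = self.attr_embed(grid_attrs)                # (B, H, W, K, D)

        # Word-attribute correlation scores; sparsity can be induced by
        # penalizing this tensor (one possible choice among several).
        corr = torch.sigmoid(
            torch.einsum("bld,bhwkd->blhwk", words, attrs)
        )

        # A cell matches a word if any attribute slot matches (max over K);
        # the instruction's words are composed conjunctively (product over L).
        per_word = corr.amax(dim=-1)                       # (B, L, H, W)
        goal_map = per_word.prod(dim=1)                    # (B, H, W)
        return goal_map, corr


def sparsity_loss(corr: torch.Tensor, weight: float = 1e-3) -> torch.Tensor:
    # Illustrative L1-style regularizer pushing most correlations toward zero.
    return weight * corr.abs().mean()


# Toy usage with made-up shapes: the resulting goal map would be the input
# to a value iteration network planner.
model = GoalIdentification(vocab_size=20, num_attributes=10)
instr = torch.randint(0, 20, (1, 3))
grid = torch.randint(0, 10, (1, 8, 8, 2))
goal_map, corr = model(instr, grid)                        # goal_map: (1, 8, 8)
```

In the paper, the sparsity mechanism and the composition of word-attribute matches are part of the agent's own architecture and training objective; the L1 term and the max/product composition above are stand-ins chosen for brevity.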
Description
Funding Information: We thank Yonatan Bisk for his valuable feedback and suggestions on this work. We also acknowledge the computational resources provided by the Aalto Science-IT project and the support within the Academy of Finland Flagship programme: Finnish Center for Artificial Intelligence (FCAI). Publisher Copyright: © 2022 Association for Computational Linguistics.
Citation
Spilsbury, S & Ilin, A 2022, Compositional Generalization in Grounded Language Learning via Induced Model Sparsity. in NAACL 2022 - 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Student Research Workshop. Association for Computational Linguistics, pp. 143-155, Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Seattle, Washington, United States, 10/07/2022. https://doi.org/10.18653/v1/2022.naacl-srw.19