Code and data for the TACL paper "Exploring Continual Learning of Compositional Generalization in NLI".

We introduce the Continual Compositional Generalization in Inference (C2Gen NLI) challenge, in which a model continuously acquires knowledge of constituent primitive inference tasks as a basis for compositional inference. We explore how continual learning affects compositional generalization in NLI by designing a continual learning setup for compositional NLI inference tasks.
Each instance in the dataset has the following format:

{
    "verdical_label": "negative",
    "sick_label": "neutral",
    "sent1": "A man fails to make a snowball",
    "sent2": "A man plays with a ball",
    "mid_sent": "A man makes a snowball",
    "label": "neutral"
}
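An instance like the one above can be loaded with Python's standard json module; a minimal sketch (the interpretation of the fields in the comments is our reading of the sample, not an official schema):

```python
import json

# Example instance copied verbatim from the sample above
# (note the key is spelled "verdical_label" in the data).
record = json.loads("""
{
    "verdical_label": "negative",
    "sick_label": "neutral",
    "sent1": "A man fails to make a snowball",
    "sent2": "A man plays with a ball",
    "mid_sent": "A man makes a snowball",
    "label": "neutral"
}
""")

# "label" is presumably the final compositional inference label between
# sent1 and sent2, while "verdical_label" and "sick_label" label the
# primitive veridical and SICK-style inference steps via "mid_sent".
premise, hypothesis = record["sent1"], record["sent2"]
print(record["label"])  # neutral
```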
Preliminary: split the data as you require from the provided dataset.

Environment: Python 3.7, PyTorch 1.7.1.

Run: the training script is provided in run.sh.
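The "split the data as you require" step could be sketched as below; this is only one possible split, with arbitrary fractions and seed, over records already loaded as a list of dicts (the helper name `split_dataset` is ours, not part of the release):

```python
import random

def split_dataset(records, dev_frac=0.1, test_frac=0.1, seed=0):
    """Shuffle a list of instances and split it into train/dev/test."""
    records = list(records)
    random.Random(seed).shuffle(records)
    n = len(records)
    n_test = int(n * test_frac)
    n_dev = int(n * dev_frac)
    test = records[:n_test]
    dev = records[n_test:n_test + n_dev]
    train = records[n_test + n_dev:]
    return train, dev, test

# Toy usage with dummy records; replace with the provided dataset.
data = [{"label": "neutral", "id": i} for i in range(100)]
train, dev, test = split_dataset(data)
print(len(train), len(dev), len(test))  # 80 10 10
```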
Acknowledgement: The code for the continual learning strategies comes from VisCOLL.

Please cite our paper if you use this dataset.
@article{fu2024exploring,
  title={Exploring continual learning of compositional generalization in NLI},
  author={Fu, Xiyan and Frank, Anette},
  journal={Transactions of the Association for Computational Linguistics},
  volume={12},
  pages={912--932},
  year={2024},
  publisher={MIT Press}
}