Linguistic Representations in Multi-task Neural Networks for Ellipsis Resolution

Ola Rønning, Daniel Hardt, Anders Søgaard

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

Sluicing resolution is the task of identifying the antecedent to a question ellipsis. Antecedents are often sentential constituents, and previous work has therefore relied on syntactic parsing, together with complex linguistic features. A recent model instead used partial parsing as an auxiliary task in sequential neural network architectures to inject syntactic information. We explore the linguistic information being brought to bear by such networks, both by defining subsets of the data exhibiting relevant linguistic characteristics, and by examining the internal representations of the network. Both perspectives provide evidence for substantial linguistic knowledge being deployed by the neural networks.
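The abstract refers to a multi-task sequential neural architecture in which partial parsing is trained as an auxiliary task to inject syntactic information into an antecedent-identification model. As a rough illustration only (not the authors' implementation), the following PyTorch sketch shows the general idea: a shared bidirectional LSTM encoder feeding two per-token classification heads, one for antecedent tagging and one for auxiliary chunk (partial-parse) tagging. All dimensions, label sets, and names below are assumptions made for the sketch.

# Minimal sketch of a multi-task sequence tagger with a shared encoder,
# a main head for antecedent tagging and an auxiliary head for chunking.
# Hyperparameters and label inventories are illustrative assumptions.
import torch
import torch.nn as nn

class MultiTaskSluiceTagger(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128,
                 n_antecedent_tags=3,   # e.g. B/I/O over antecedent spans (assumption)
                 n_chunk_tags=23):      # e.g. a CoNLL-style chunk label set (assumption)
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Shared bidirectional LSTM encoder used by both tasks.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Main task: per-token antecedent tagging for sluicing resolution.
        self.antecedent_head = nn.Linear(2 * hidden_dim, n_antecedent_tags)
        # Auxiliary task: per-token partial-parse (chunk) tagging.
        self.chunk_head = nn.Linear(2 * hidden_dim, n_chunk_tags)

    def forward(self, token_ids):
        states, _ = self.encoder(self.embed(token_ids))
        return self.antecedent_head(states), self.chunk_head(states)

# Toy usage: in training, losses from the two heads would be mixed or alternated.
model = MultiTaskSluiceTagger(vocab_size=10000)
tokens = torch.randint(1, 10000, (2, 12))           # batch of 2 sentences, 12 tokens each
antecedent_logits, chunk_logits = model(tokens)
print(antecedent_logits.shape, chunk_logits.shape)  # (2, 12, 3) and (2, 12, 23)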
Language: English
Title of host publication: Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
Editors: Tal Linzen, Grzegorz Chrupała, Afra Alishahi
Number of pages: 8
Place of Publication: Brussels
Publisher: Association for Computational Linguistics
Date: 2018
Pages: 66-73
State: Published - 2018
Event: 2018 Conference on Empirical Methods in Natural Language Processing, Square Meeting Center, Brussels, Belgium
Duration: 31 Oct 2018 - 4 Nov 2018
http://emnlp2018.org/

Conference

Conference: 2018 Conference on Empirical Methods in Natural Language Processing
Location: Square Meeting Center
Country: Belgium
City: Brussels
Period: 31/10/2018 - 04/11/2018
Internet address: http://emnlp2018.org/

Cite this

Rønning, O., Hardt, D., & Søgaard, A. (2018). Linguistic Representations in Multi-task Neural Networks for Ellipsis Resolution. In T. Linzen, G. Chrupała, & A. Alishahi (Eds.), Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP (pp. 66-73). Brussels: Association for Computational Linguistics.
@inproceedings{c02eca9a86d54a72abaf7b228d81ea62,
title = "Linguistic Representations in Multi-task Neural Networks for Ellipsis Resolution",
abstract = "Sluicing resolution is the task of identifying the antecedent to a question ellipsis. Antecedents are often sentential constituents, and previous work has therefore relied on syntactic parsing, together with complex linguistic features. A recent model instead used partial parsing as an auxiliary task in sequential neural network architectures to inject syntactic information. We explore the linguistic information being brought to bear by such networks, both by defining subsets of the data exhibiting relevant linguistic characteristics, and by examining the internal representations of the network. Both perspectives provide evidence for substantial linguistic knowledge being deployed by the neural networks.",
author = "Ola R{\o}nning and Daniel Hardt and Anders S{\o}gaard",
year = "2018",
language = "English",
pages = "66--73",
editor = "Tal Linzen and Grzegorz Chrupała and Afra Alishahi",
booktitle = "Proceedings of the 2018 EMNLP Workshop BlackboxNLP",
publisher = "Association for Computational Linguistics",
address = "United States",

}
