Correlating Neural and Symbolic Representations of Language

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Analysis methods which enable us to better understand the representations and functioning of neural models of language are increasingly needed as deep learning becomes the dominant approach in NLP. Here we present two methods based on Representational Similarity Analysis (RSA) and Tree Kernels (TK) which allow us to directly quantify how strongly the information encoded in neural activation patterns corresponds to information represented by symbolic structures such as syntax trees. We first validate our methods on the case of a simple synthetic language for arithmetic expressions with clearly defined syntax and semantics, and show that they exhibit the expected pattern of results. We then apply our methods to correlate neural representations of English sentences with their constituency parse trees.
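The RSA recipe sketched in the abstract can be made concrete: compute pairwise similarities among a set of sentences in neural space (e.g., cosine similarity over activation vectors) and in symbolic space (via a tree kernel over parse trees), then correlate the two similarity structures. The Python sketch below illustrates this general recipe only; the nested-tuple tree format, the toy subtree-counting kernel, and the function names are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def subtrees(tree):
    # Yield every complete subtree of a tree given as nested tuples,
    # e.g. ("S", ("NP", "dogs"), ("VP", "bark")).
    yield tree
    if isinstance(tree, tuple):
        for child in tree[1:]:
            yield from subtrees(child)

def tree_kernel(t1, t2):
    # Toy tree kernel: count pairs of identical complete subtrees.
    # (A deliberately simple stand-in for the richer TK variants in the paper.)
    bag = list(subtrees(t1))
    return sum(bag.count(s) for s in subtrees(t2))

def rsa_score(activations, trees):
    # `activations`: (N, D) array of sentence activation vectors.
    # `trees`: list of N parse trees as nested tuples.
    n = len(trees)
    # Neural similarities: 1 - cosine distance over all sentence pairs,
    # in the same (i < j, row-major) pair order that pdist produces.
    neural_sim = 1.0 - pdist(activations, metric="cosine")
    # Symbolic similarities: normalized tree kernel over the same pairs.
    symbolic_sim = np.array([
        tree_kernel(trees[i], trees[j])
        / np.sqrt(tree_kernel(trees[i], trees[i]) * tree_kernel(trees[j], trees[j]))
        for i in range(n) for j in range(i + 1, n)
    ])
    # The RSA score is the rank correlation of the two similarity vectors.
    rho, _ = spearmanr(neural_sim, symbolic_sim)
    return rho

A high rank correlation means that sentences with similar activation patterns also tend to have structurally similar parse trees, which is the kind of correspondence between neural and symbolic representations the paper quantifies.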
Original language: English
Title of host publication: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Place of Publication: Florence, Italy
Publisher: Association for Computational Linguistics
Pages: 2952-2962
Number of pages: 11
Publication status: Published - 1 Jul 2019


Cite this

Chrupala, G., & Alishahi, A. (2019). Correlating Neural and Symbolic Representations of Language. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (pp. 2952-2962). Florence, Italy: Association for Computational Linguistics.
Chrupala, Grzegorz ; Alishahi, Afra. / Correlating Neural and Symbolic Representations of Language. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy : Association for Computational Linguistics, 2019. pp. 2952-2962
@inproceedings{983dae5e844d4d839401db72f2f5f91e,
title = "Correlating Neural and Symbolic Representations of Language",
abstract = "Analysis methods which enable us to better understand the representations and functioning of neural models of language are increasingly needed as deep learning becomes the dominant approach in NLP. Here we present two methods based on Representational Similarity Analysis (RSA) and Tree Kernels (TK) which allow us to directly quantify how strongly the information encoded in neural activation patterns corresponds to information represented by symbolic structures such as syntax trees. We first validate our methods on the case of a simple synthetic language for arithmetic expressions with clearly defined syntax and semantics, and show that they exhibit the expected pattern of results. We then apply our methods to correlate neural representations of English sentences with their constituency parse trees.",
author = "Grzegorz Chrupala and Afra Alishahi",
year = "2019",
month = "7",
day = "1",
language = "English",
pages = "2952--2962",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
publisher = "Association for Computational Linguistics",
address = "Florence, Italy",
}

Chrupala, G & Alishahi, A 2019, Correlating Neural and Symbolic Representations of Language. in Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Association for Computational Linguistics, Florence, Italy, pp. 2952-2962.

Correlating Neural and Symbolic Representations of Language. / Chrupala, Grzegorz; Alishahi, Afra.

Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy : Association for Computational Linguistics, 2019. p. 2952-2962.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

TY - GEN

T1 - Correlating Neural and Symbolic Representations of Language

AU - Chrupala, Grzegorz

AU - Alishahi, Afra

PY - 2019/7/1

Y1 - 2019/7/1

N2 - Analysis methods which enable us to better understand the representations and functioning of neural models of language are increasingly needed as deep learning becomes the dominant approach in NLP. Here we present two methods based on Representational Similarity Analysis (RSA) and Tree Kernels (TK) which allow us to directly quantify how strongly the information encoded in neural activation patterns corresponds to information represented by symbolic structures such as syntax trees. We first validate our methods on the case of a simple synthetic language for arithmetic expressions with clearly defined syntax and semantics, and show that they exhibit the expected pattern of results. We then apply our methods to correlate neural representations of English sentences with their constituency parse trees.

AB - Analysis methods which enable us to better understand the representations and functioning of neural models of language are increasingly needed as deep learning becomes the dominant approach in NLP. Here we present two methods based on Representational Similarity Analysis (RSA) and Tree Kernels (TK) which allow us to directly quantify how strongly the information encoded in neural activation patterns corresponds to information represented by symbolic structures such as syntax trees. We first validate our methods on the case of a simple synthetic language for arithmetic expressions with clearly defined syntax and semantics, and show that they exhibit the expected pattern of results. We then apply our methods to correlate neural representations of English sentences with their constituency parse trees.

M3 - Conference contribution

SP - 2952

EP - 2962

BT - Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics

PB - Association for Computational Linguistics

CY - Florence, Italy

ER -

Chrupala G, Alishahi A. Correlating Neural and Symbolic Representations of Language. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Florence, Italy: Association for Computational Linguistics. 2019. p. 2952-2962