Neural data-to-text generation: A comparison between pipeline and end-to-end architectures

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

    Abstract

    Traditionally, most data-to-text applications have been designed with a modular pipeline architecture, in which non-linguistic input data is converted into natural language through several intermediate transformations. By contrast, recent neural models for data-to-text generation have been proposed as end-to-end approaches, in which the non-linguistic input is rendered in natural language with far fewer explicit intermediate representations in between. This study introduces a systematic comparison between neural pipeline and end-to-end data-to-text approaches for generating text from RDF triples. Both architectures were implemented using encoder-decoder Gated Recurrent Units (GRU) and Transformer models, two state-of-the-art deep learning methods. Automatic and human evaluations, together with a qualitative analysis, suggest that having explicit intermediate steps in the generation process results in better texts than those generated by end-to-end approaches. Moreover, the pipeline models generalize better to unseen inputs. Data and code are publicly available.
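
    To make the contrast concrete, the following is a minimal, hypothetical Python sketch of the two setups for a single set of RDF triples. The step and function names (order, lexicalize, realize, end_to_end_generate) and the example triple are illustrative assumptions for this sketch, not the authors' implementation; in the paper, each step (or the whole mapping) is learned by a GRU or Transformer encoder-decoder rather than hard-coded.

    # Hypothetical sketch: pipeline vs. end-to-end data-to-text from RDF triples.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Triple:
        subject: str
        predicate: str
        object: str

    def linearize(triples: List[Triple]) -> str:
        # Flatten triples into the token sequence an encoder-decoder would consume.
        return " ".join(f"<S> {t.subject} <P> {t.predicate} <O> {t.object}" for t in triples)

    # Pipeline: each step produces an explicit intermediate representation.
    def order(triples: List[Triple]) -> List[Triple]:
        # Decide the order in which triples are expressed (here: input order).
        return list(triples)

    def lexicalize(t: Triple) -> str:
        # Map a triple to a sentence draft; a neural module would predict this.
        templates = {
            "birthPlace": "{s} was born in {o}.",
            "occupation": "{s} works as a {o}.",
        }
        return templates.get(t.predicate, "{s} {p} {o}.").format(
            s=t.subject.replace("_", " "), p=t.predicate, o=t.object.replace("_", " ")
        )

    def realize(sentences: List[str]) -> str:
        # Surface realization: join the sentence drafts into the final text.
        return " ".join(sentences)

    def pipeline_generate(triples: List[Triple]) -> str:
        return realize([lexicalize(t) for t in order(triples)])

    # End-to-end: a single model maps the linearized triples straight to text.
    def end_to_end_generate(triples: List[Triple]) -> str:
        source = linearize(triples)
        # A GRU or Transformer seq2seq model would decode the output text here;
        # this placeholder only shows the interface.
        return f"<decoded text for: {source}>"

    if __name__ == "__main__":
        data = [Triple("Alan_Bean", "birthPlace", "Wheeler,_Texas")]
        print(pipeline_generate(data))   # Alan Bean was born in Wheeler, Texas.
        print(end_to_end_generate(data))
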
    Original language: English
    Title of host publication: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
    Place of Publication: Hong Kong, China
    Publisher: Association for Computational Linguistics
    Pages: 552-562
    Number of pages: 11
    Publication status: Published - 1 Nov 2019
    Event: 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing - Asia World Expo, Hong Kong, China
    Duration: 3 Nov 2019 - 7 Nov 2019
    https://www.emnlp-ijcnlp2019.org/

    Conference

    Conference: 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing
    Abbreviated title: EMNLP-IJCNLP
    Country/Territory: China
    City: Hong Kong
    Period: 3/11/19 - 7/11/19
    Internet address: https://www.emnlp-ijcnlp2019.org/
