JASs: Joint Attention Strategies for Paraphrase Generation

Isaac K.E. Ampomah*, Sally McClean, Zhiwei Lin, Glenn Hawe

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)


Neural attention-based sequence-to-sequence (seq2seq) models have achieved remarkable performance on NLP tasks such as image caption generation, paraphrase generation, and machine translation. The underlying framework for these models is usually a deep neural architecture comprising multi-layer encoder and decoder sub-networks. The performance of the decoding sub-network depends heavily on how well it extracts the relevant source-side contextual information. Conventional approaches consider only the outputs of the last encoding layer when computing the source contexts via a neural attention mechanism. Because of how information flows across the time-steps within each encoder layer, as well as from layer to layer, there is no guarantee that the information needed to build the source context is stored in the final encoding layer. These approaches also fail to fully capture the structural composition of natural language. To address these limitations, this paper presents several new strategies for generating the contextual feature vector jointly across all the encoding layers. The proposed strategies consistently outperform the conventional approaches to neural attention computation on the task of paraphrase generation.
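As a rough illustration of the idea in the abstract (a minimal sketch, not the paper's exact JAS formulation), the snippet below contrasts attending over only the last encoder layer with computing the source context jointly over the time-steps of all encoder layers; the function names and the plain dot-product scoring are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def joint_layer_attention(decoder_state, encoder_layers):
    """decoder_state: (d,); encoder_layers: list of (T, d) arrays, one per layer.
    Attends over the time-steps of every encoder layer at once, rather than
    only the final layer's outputs."""
    H = np.concatenate(encoder_layers, axis=0)   # (L*T, d): stack all layers
    scores = H @ decoder_state                   # dot-product attention scores
    weights = softmax(scores)                    # normalise over layers * steps
    return weights @ H                           # (d,) joint source context

# Tiny usage example: 2 encoder layers, 3 time-steps, hidden dim 4
rng = np.random.default_rng(0)
layers = [rng.standard_normal((3, 4)) for _ in range(2)]
ctx = joint_layer_attention(rng.standard_normal(4), layers)
print(ctx.shape)  # (4,)
```

A conventional decoder would call the same routine with `[layers[-1]]` only; the joint variant lets the attention weights select information from whichever layer retained it.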

Original language: English
Title of host publication: Natural Language Processing and Information Systems - 24th International Conference on Applications of Natural Language to Information Systems, NLDB 2019, Proceedings
Editors: Elisabeth Métais, Farid Meziane, Sunil Vadera, Vijayan Sugumaran, Mohamad Saraee
Publisher: Springer Verlag
Number of pages: 13
ISBN (Print): 9783030232801
Publication status: Published - 2019
Event: 24th International Conference on Application of Natural Language to Information Systems, NLDB 2019 - Salford, United Kingdom
Duration: 26 Jun 2019 – 28 Jun 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11608 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 24th International Conference on Application of Natural Language to Information Systems, NLDB 2019
Country: United Kingdom

Bibliographical note

Publisher Copyright:
© 2019, Springer Nature Switzerland AG.



Keywords

  • Multi-layer encoder-decoder
  • Neural attention
  • Source context

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)


