Knowledge transfer in neural language models

Peter John Hampton*, Hui Wang, Zhiwei Lin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


The complexity and depth of Information Extraction becomes increasingly apparent as time goes on. Heuristic, stochastic and, more recently, neural models have proved challenging to scale into and out of various domains. In this paper we discuss the limitations of current approaches and explore whether transferring human knowledge into a neural language model could improve performance in a deep learning setting. We approach this by constructing gazetteers from existing public resources. We demonstrate that by leveraging existing knowledge we can increase performance and train such networks faster. We argue a case for further research into leveraging pre-existing domain knowledge and engineering resources to train neural models.
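The abstract describes constructing gazetteers from public resources and feeding that knowledge into a neural model. A minimal sketch of one common way to do this (not necessarily the paper's exact method) is to encode gazetteer membership as extra binary features per token, which are then concatenated to the token's word embedding before the tagger. The gazetteer entries and function names below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch: gazetteer-membership features for a neural
# sequence tagger. Entries and names here are illustrative only.

GAZETTEER = {
    "PER": {"ada lovelace", "alan turing"},
    "LOC": {"belfast", "cambridge"},
}

def gazetteer_features(tokens):
    """Return one binary feature per gazetteer type for each token:
    1.0 if the lower-cased token occurs as a word in any entry of
    that type, else 0.0. Feature columns follow sorted type order."""
    types = sorted(GAZETTEER)  # e.g. ["LOC", "PER"]
    feats = []
    for tok in tokens:
        t = tok.lower()
        feats.append([
            1.0 if any(t in entry.split() for entry in GAZETTEER[g]) else 0.0
            for g in types
        ])
    return feats

tokens = ["Alan", "Turing", "lectured", "in", "Cambridge"]
features = gazetteer_features(tokens)
# In a full model, each row would be concatenated to the token's
# word embedding before the recurrent/convolutional tagging layers.
```

In this setup the network does not have to rediscover entity membership from scratch, which is one plausible mechanism for the faster training the abstract reports.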

Original language: English
Title of host publication: Artificial Intelligence XXXIV - 37th SGAI International Conference on Artificial Intelligence, AI 2017, Proceedings
Editors: Miltos Petridis, Max Bramer
Publisher: Springer Verlag
Number of pages: 6
ISBN (Print): 9783319710778
Publication status: Published - Dec 2017
Externally published: Yes
Event: 37th SGAI International Conference on Artificial Intelligence, AI 2017 - Cambridge, United Kingdom
Duration: 12 Dec 2017 to 14 Dec 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10630 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 37th SGAI International Conference on Artificial Intelligence, AI 2017
Country: United Kingdom

Bibliographical note

Funding Information:
This work is partially supported by the EPSRC (Grant REF: EP/P031668/1).

Publisher Copyright:
© Springer International Publishing AG 2017.



Keywords

  • Information extraction
  • Named entity recognition

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
