
Switching GRUs on a deep sequence-to-sequence neural model: an application to Natural Language Generation

ROBERTI, MARCO
2017/2018

Abstract

In the last few years, many methods have focused on a Recurrent Neural Network (RNN) approach to the generation of natural language from tabular information. The most widely used word-based sequence-to-sequence neural methods resort to a preprocessing step called delexicalization (and its inverse, relexicalization) to deal with the rare word problem. In this work, we present a character-level sequence-to-sequence model with an attention mechanism which requires neither delexicalization, tokenization, nor even lowercasing. Moreover, our model leverages two major features: it switches between a generation and a copying mechanism when required, showing the ability to directly copy input facts into the output, and it is trained in such a way that significantly reduces recall issues, i.e. the possible omission of input information, a well-known problem for Recurrent Neural Networks.
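The switch between generating a character and copying one from the input can be illustrated with a pointer-generator-style mixture. The sketch below is an illustration under assumptions, not the thesis's actual implementation: a scalar switch probability `p_switch` (a hypothetical name) blends a softmax distribution over the character vocabulary with a copy distribution obtained by scattering the attention weights onto the vocabulary ids of the input characters.

```python
import numpy as np

def mixed_output_distribution(p_switch, gen_dist, attention, input_char_ids, vocab_size):
    """Blend a generation distribution with an attention-induced copy distribution.

    p_switch       -- probability of generating rather than copying, in [0, 1]
    gen_dist       -- softmax over the character vocabulary, shape (vocab_size,)
    attention      -- attention weights over input positions, summing to 1
    input_char_ids -- vocabulary id of the character at each input position
    """
    copy_dist = np.zeros(vocab_size)
    # Scatter-add attention mass onto the ids of the input characters,
    # so repeated characters accumulate their attention weights.
    np.add.at(copy_dist, input_char_ids, attention)
    return p_switch * gen_dist + (1.0 - p_switch) * copy_dist

# Toy example: a 5-character vocabulary and a 2-character input.
out = mixed_output_distribution(
    p_switch=0.6,
    gen_dist=np.array([0.5, 0.2, 0.1, 0.1, 0.1]),
    attention=np.array([0.7, 0.3]),
    input_char_ids=np.array([2, 4]),
    vocab_size=5,
)
# The result is itself a valid probability distribution.
```

Because both input distributions sum to 1, any convex combination of them also sums to 1, so the mixed output remains a proper distribution regardless of the switch value.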

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14240/54992