Paper: Sequence to Sequence Learning with Neural Networks

From Google.

1. Before this, DNNs could not be used to map sequences to sequences. The paper proposes a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure.

Use an LSTM to map the input sequence to a vector of fixed dimensionality, then use another LSTM to decode the target sequence from that vector.

input sequence -----> encoder (LSTM) -----> fixed-size vector -----> decoder (LSTM) -----> target sequence
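A minimal sketch of this encoder-decoder pipeline, written with PyTorch (an assumption; the paper does not prescribe a framework). The class names, layer sizes, and toy data below are illustrative, not taken from the paper.

```python
# Minimal encoder-decoder sketch: the encoder LSTM compresses the input
# sequence into a fixed-size state; the decoder LSTM generates the target
# sequence conditioned on that state. Names/sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_size)
        self.lstm = nn.LSTM(emb_size, hidden_size, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids
        _, (h, c) = self.lstm(self.embed(src))
        return h, c  # fixed-size summary of the whole input sequence

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_size)
        self.lstm = nn.LSTM(emb_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, tgt, state):
        # tgt: (batch, tgt_len) previous target tokens (teacher forcing)
        output, state = self.lstm(self.embed(tgt), state)
        return self.out(output), state  # per-step vocabulary logits

# Toy usage: encode a source batch, decode conditioned on the encoder state.
enc = Encoder(vocab_size=1000, emb_size=32, hidden_size=64)
dec = Decoder(vocab_size=1000, emb_size=32, hidden_size=64)
src = torch.randint(0, 1000, (2, 7))   # two source sequences of length 7
tgt = torch.randint(0, 1000, (2, 5))   # two target prefixes of length 5
logits, _ = dec(tgt, enc(src))
print(logits.shape)  # torch.Size([2, 5, 1000])
```

The key design point is that the only link between the two LSTMs is the fixed-size hidden state, which is what lets the model handle variable-length inputs and outputs.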

Evaluated on a machine translation task (WMT'14 English-to-French).

Limitation: Despite their flexibility and power, DNNs can only be applied to problems whose inputs and targets can be sensibly encoded with vectors of fixed dimensionality.

Sequential problems: speech recognition and machine translation, where the sequence lengths are not known in advance.

Before: DNNs require that the dimensionality of the inputs and outputs is known and fixed. 

Problem addressed: mapping sequences to sequences of potentially different, unknown lengths.

 
