= Introduction =

= Model =

=== Theory of Recurrent Neural Networks ===

=== RNN Architecture by Graves, 2013 ===

=== Long Short-Term Memory Recurrent Neural Network (LSTM) ===

=== Input and Output Data Transformation ===

= Training and Results =

=== Training Method ===

=== Scoring Method ===

=== Results ===

=== Some Developments of LSTM ===

=== Open Questions ===

= Source =

Sutskever, I., Vinyals, O. & Le, Q. V. Sequence to sequence learning with neural networks. In Proc. Advances in Neural Information Processing Systems 27, 3104–3112 (2014).

<references />