https://deeplearningcourses.com/c/deep-learning-advanced-nlp

Bidirectional Recurrent Neural Networks
https://maxwell.ict.griffith.edu.au/spl/publications/papers/ieeesp97_schuster.pdf

Translation Modeling with Bidirectional Recurrent Neural Networks
http://emnlp2014.org/papers/pdf/EMNLP2014003.pdf

Sequence to Sequence Learning with Neural Networks
https://arxiv.org/abs/1409.3215

A Neural Conversational Model
https://arxiv.org/abs/1506.05869v3

Neural Machine Translation by Jointly Learning to Align and Translate (Attention)
https://arxiv.org/abs/1409.0473

Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems (Simplified Attention)
https://arxiv.org/abs/1512.08756

Memory Networks
https://arxiv.org/abs/1410.3916

Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks
http://arxiv.org/abs/1502.05698

End-To-End Memory Networks
http://arxiv.org/abs/1503.08895

Ask Me Anything: Dynamic Memory Networks for Natural Language Processing
https://arxiv.org/abs/1506.07285

WaveNet
https://deepmind.com/blog/wavenet-generative-model-raw-audio/

Tacotron
https://google.github.io/tacotron/

Tacotron 2
https://research.googleblog.com/2017/12/tacotron-2-generating-human-like-speech.html

An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
https://arxiv.org/abs/1803.01271
(just released March 2018!)

Relational recurrent neural networks
https://arxiv.org/abs/1806.01822