An attempt at replicating "Neural Turing Machines" (by Alex Graves, Greg Wayne, and Ivo Danihelka) in Keras.
Repeat Copy NTM Memory Use During the Copy Task
Associative Recall (in progress)
Priority Sort (in progress)
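The figure above shows how the NTM uses its memory during the copy task. For intuition, here is a minimal numpy sketch of the content-based addressing that drives such memory reads, following Graves et al. (2014); it is illustrative only and not necessarily how this repository implements it (the function and variable names are assumptions):

```python
# Minimal numpy sketch of NTM content-based addressing (Graves et al., 2014);
# illustrative only, not this repository's Keras implementation.
import numpy as np

def content_addressing(memory, key, beta):
    """Return read weights over memory slots and the resulting read vector.

    memory: (N, M) matrix of N memory slots of width M.
    key:    (M,) key vector emitted by the controller.
    beta:   positive scalar key strength that sharpens the focus.
    """
    # Cosine similarity between the key and every memory slot.
    sim = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    # Softmax over slots, sharpened by beta.
    w = np.exp(beta * sim)
    w /= w.sum()
    # The read vector is the weighted sum of the memory rows.
    return w, w @ memory

memory = np.random.randn(128, 20)                 # 128 slots of width 20
key = memory[7] + 0.1 * np.random.randn(20)       # noisy copy of slot 7
weights, read = content_addressing(memory, key, beta=5.0)
print(weights.argmax())                           # most likely 7
```

The full NTM combines these content-based weights with location-based shifting and interpolation before reading and writing; the sketch covers only the content-addressing step.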
To train the repeat copy task with LSTM (a data sketch for this task follows the commands below):
$ python learning_repeat_copy_lstm.py
To train an associative recall task with LSTM:
$ python learning_associative_recall_lstm.py
To train a priority sort task with LSTM:
$ python learning_priority_sort_lstm.py
To train all three tasks one after another with LSTM:
$ python learning_algorithm_lstm.py
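For context, the repeat copy task feeds the network a short sequence of random bit vectors plus a repeat count and asks it to reproduce the sequence that many times. Below is a minimal sketch of how such a training batch could be built; the function name and the encoding of the repeat count are assumptions, and the actual learning_repeat_copy_lstm.py script may construct batches differently:

```python
# Hypothetical sketch of repeat-copy batch generation; the repository's
# actual data pipeline may differ.
import numpy as np

def repeat_copy_batch(batch_size=32, seq_len=5, bits=8, repeats=3):
    """Build one batch for the repeat copy task.

    Input:  seq_len random binary vectors, followed by a repeat-count channel.
    Target: the same sequence repeated `repeats` times.
    """
    seq = np.random.randint(0, 2, size=(batch_size, seq_len, bits))
    # One extra time step and one extra channel carry the repeat count.
    inputs = np.zeros((batch_size, seq_len + 1, bits + 1))
    inputs[:, :seq_len, :bits] = seq
    inputs[:, seq_len, bits] = repeats
    targets = np.tile(seq, (1, repeats, 1))
    return inputs, targets

x, y = repeat_copy_batch()
print(x.shape, y.shape)  # (32, 6, 9) (32, 15, 8)
```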
- carpedm20/NTM-tensorflow: https://github.com/carpedm20/NTM-tensorflow
- shawntan/neural-turing-machines: https://github.com/shawntan/neural-turing-machines
- snipsco/ntm-lasagne: https://github.com/snipsco/ntm-lasagne
- Training the NTM to learn repeat copy.
- Training the NTM to learn associative recall.
- Training the NTM to learn dynamic N-grams.
- Training the NTM to learn priority sort.
- Applying the NTM to other natural language processing tasks such as neural language modeling.
Zhibin Quan / @SigmaQuan