A tutorial given by Chris Dyer, Yoav Goldberg, and Graham Neubig at EMNLP 2016 in Austin. The tutorial covers the basics of neural networks for NLP, and how to implement a variety of networks simply and efficiently in the DyNet toolkit.
Slides, part 1: Basics
- Computation graphs and their construction
- Neural networks in DyNet (see the first sketch after this list)
- Recurrent neural networks
- Minibatching (see the second sketch after this list)
- Adding new differentiable functions
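
As a taste of the first two items, here is a minimal sketch of building a computation graph and training a small network with DyNet's Python bindings; the XOR data, the 2-8-1 layer sizes, and the training schedule are illustrative choices, not an example from the tutorial itself.

```python
import random
import dynet as dy

model = dy.Model()
trainer = dy.SimpleSGDTrainer(model)

# parameters of a 2-8-1 multilayer perceptron
pW = model.add_parameters((8, 2))
pb = model.add_parameters((8,))
pV = model.add_parameters((1, 8))

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR

for epoch in range(500):
    random.shuffle(data)
    for (x1, x2), y in data:
        dy.renew_cg()                    # start a fresh graph for each example
        W, b, V = dy.parameter(pW), dy.parameter(pb), dy.parameter(pV)
        x = dy.inputVector([x1, x2])
        h = dy.tanh(W * x + b)           # ordinary Python operators build the graph
        y_hat = dy.logistic(V * h)
        loss = dy.binary_log_loss(y_hat, dy.scalarInput(y))
        loss.forward()                   # execute the graph
        loss.backward()                  # backpropagate through it
        trainer.update()                 # take an SGD step
```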
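
For the minibatching item, a sketch of DyNet's batched operations (`lookup_batch`, `pickneglogsoftmax_batch`, `sum_batches`), which let a single graph score a whole minibatch at once; the one-word classifier and the size constants here are hypothetical.

```python
import dynet as dy

model = dy.Model()
trainer = dy.AdamTrainer(model)

VOCAB, DIM, CLASSES = 1000, 64, 5              # hypothetical sizes
E = model.add_lookup_parameters((VOCAB, DIM))  # word embeddings
pW = model.add_parameters((CLASSES, DIM))

def batch_loss(word_ids, labels):
    dy.renew_cg()
    W = dy.parameter(pW)
    x = dy.lookup_batch(E, word_ids)           # one (DIM,) expression with a batch dimension
    scores = W * x                             # the batch dimension is carried along
    losses = dy.pickneglogsoftmax_batch(scores, labels)
    return dy.sum_batches(losses)              # reduce over the batch to a scalar

loss = batch_loss([4, 17, 256, 9], [0, 3, 1, 4])   # a toy batch of four words
loss.forward()
loss.backward()
trainer.update()
```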
Slides, part 2: Case studies in NLP
- Tagging with bidirectional RNNs and character-based embeddings (see the sketch after this list)
- Transition-based dependency parsing
- Structured prediction meets deep learning
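
In the spirit of the first case study, a sketch of a word-level BiLSTM tagger in DyNet's Python bindings; character-based embeddings are omitted for brevity, and the vocabulary and dimension constants are placeholders, not the tutorial's settings.

```python
import dynet as dy

model = dy.Model()
VOCAB, TAGS, WDIM, HDIM = 10000, 17, 128, 64    # placeholder sizes
E = model.add_lookup_parameters((VOCAB, WDIM))  # word embeddings
fwd = dy.LSTMBuilder(1, WDIM, HDIM, model)      # left-to-right LSTM
bwd = dy.LSTMBuilder(1, WDIM, HDIM, model)      # right-to-left LSTM
pW = model.add_parameters((TAGS, 2 * HDIM))

def sentence_loss(word_ids, tag_ids):
    dy.renew_cg()
    W = dy.parameter(pW)
    embs = [E[i] for i in word_ids]
    # run the two LSTMs over the sentence and pair up their outputs per word
    f_outs = fwd.initial_state().transduce(embs)
    b_outs = list(reversed(bwd.initial_state().transduce(list(reversed(embs)))))
    losses = [dy.pickneglogsoftmax(W * dy.concatenate([f, b]), t)
              for f, b, t in zip(f_outs, b_outs, tag_ids)]
    return dy.esum(losses)                      # summed tagging loss for the sentence
```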