Topics include neural machine translation, parsing, syntax, and more. For now, the papers are sorted by conference; selected papers include summaries.
- Lee et al. (2017) Recurrent Additive Networks
- Yang et al. (2017) Towards Bidirectional Hierarchical Representations for Attention-Based Neural Machine Translation
- Gu et al. (2017) Trainable Greedy Decoding for Neural Machine Translation
- Gehring et al. (2017) A Convolutional Encoder Model for Neural Machine Translation
- Yu et al. (2017) The Neural Noisy Channel
- Dozat & Manning (2017) Deep Biaffine Attention for Neural Dependency Parsing
- Bowman et al. (2016) Generating Sentences from a Continuous Space
- Dyer et al. (2016) Recurrent Neural Network Grammars [bib]
- Kiperwasser & Goldberg (2016) Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations [bib]
- Kalchbrenner et al. (2016) Neural Machine Translation in Linear Time
- Lee et al. (2017) Fully Character-Level Neural Machine Translation without Explicit Segmentation