- Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context
- XLNet: Generalized Autoregressive Pretraining for Language Understanding
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
- Language Models are Unsupervised Multitask Learners
- Neural Approaches to Conversational AI: Question Answering, Task-Oriented Dialogues and Social Chatbots
- Improving Language Understanding by Generative Pre-Training
- Emotion-Cause Pair Extraction: A New Task to Emotion Analysis in Texts
- A Neural Conversational Model
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Bridging the Gap between Training and Inference for Neural Machine Translation