CS224n is an NLP (Deep Learning) course at Stanford. The course is open, and you'll find everything on the course website. Gotta learn this course and start my NLP journey. The notes are amazing, the course is amazing, let's get started.
Books:
- Foundations of Statistical Natural Language Processing - Chris Manning, Hinrich Schütze
- Speech and Language Processing (3rd edition in making) - Dan Jurafsky, James H. Martin
- Natural Language Processing with Python | NLTK Essentials
Natural language processing (NLP) is one of the most important technologies of the information age, and understanding complex language utterances is a crucial part of artificial intelligence. Applications of NLP are everywhere because people communicate almost everything in language: web search, advertisement, emails, customer service, language translation, radiology reports, etc.

There is a large variety of underlying tasks and machine learning models behind NLP applications. Recently, deep learning approaches have achieved very high performance across many different NLP tasks. These approaches can solve tasks with single end-to-end models and do not require traditional, task-specific feature engineering.

In this winter quarter course, students learn to implement, train, debug, visualize, and invent their own neural network models. The course provides a thorough introduction to cutting-edge research in deep learning applied to NLP. On the model side, it covers word vector representations, window-based neural networks, recurrent neural networks, long short-term memory (LSTM) models, recursive neural networks, convolutional neural networks, as well as some recent models involving a memory component. Through lectures and programming assignments, students learn the necessary engineering tricks for making neural networks work on practical problems.
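Word vector representations are the first big idea in the course: related words should point in similar directions in vector space, which we can check with cosine similarity. Here's a minimal sketch using toy hand-made vectors (the values are hypothetical for illustration; real word vectors from word2vec or GloVe are learned from large corpora):

```python
import numpy as np

# Toy 4-dimensional "word vectors" (hypothetical values, just for illustration;
# real embeddings are learned, e.g., by word2vec or GloVe).
vectors = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.2, 0.9, 0.8]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 = more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_royal = cosine_similarity(vectors["king"], vectors["queen"])
sim_fruit = cosine_similarity(vectors["king"], vectors["apple"])

# Semantically related words score higher than unrelated ones.
print(sim_royal > sim_fruit)  # → True
```

The same similarity computation underlies the famous analogy demos (king - man + woman ≈ queen) covered in the Word Vectors lectures.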
The Winter 2017 lectures are available here on YouTube.
- Introduction to NLP and Deep Learning
- Word Vectors
- Neural Networks
- Backpropagation and Project Advice
- Introduction to TensorFlow
- Dependency Parsing
- Recurrent Neural Networks and Language Models
- Vanishing Gradients, Fancy RNNs
- Machine Translation, Seq2Seq and Attention
- Advanced Attention
- Transformer Networks and CNNs
- Coreference Resolution
- Tree Recursive Neural Networks and Constituency Parsing
- Advanced Architectures and Memory Networks
- Reinforcement Learning for NLP Guest Lecture
- Semi-supervised Learning for NLP
- Future of NLP Models, Multi-task Learning and QA Systems
The notes and slides are available on the course website - here
Exams - 2018 Winter Midterm, Solution, 2017 Winter Midterm, Solution, 2017 Winter Practice Midterm 1, Solution, 2017 Winter Practice Midterm 2, Solution
- Udacity NLP Nanodegree | NLP - Dan Jurafsky, Christopher Manning | NLTK with Python 3 for Natural Language Processing | NLTK | How to solve 90% of NLP problems: a step-by-step guide | Natural Language Processing is Fun! | A Practitioner's Guide to NLP - I
FINAL Project | Past Projects
As a part of this course, I did this project, " ".