Comp 545 - Natural Language Processing
This project implements character-level RNN and LSTM language models that learn to predict the next character in a sequence, given the previous characters. The models were trained on Shakespeare and Sherlock Holmes texts, enabling them to generate text that mimics the style and context of the source material.
- Implement a character sequence data loader to process text data into trainable sequences.
- Develop a Character RNN and LSTM using PyTorch to learn and predict character sequences.
- Explore the models' ability to generate coherent and contextually relevant text as they train, then compare their results.
- Evaluate the impact of different parameters (like temperature and top_k filtering) on the diversity and quality of the generated text.
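The data loader and model objectives above can be sketched in PyTorch roughly as follows. This is a minimal illustration, not the project's actual code: the names (`CharDataset`, `CharRNN`) and hyperparameters (sequence length, embedding and hidden sizes) are assumptions for the example.

```python
import torch
import torch.nn as nn

class CharDataset(torch.utils.data.Dataset):
    """Slices a raw text corpus into fixed-length character index sequences."""

    def __init__(self, text, seq_len=64):
        self.chars = sorted(set(text))
        self.stoi = {c: i for i, c in enumerate(self.chars)}
        self.data = torch.tensor([self.stoi[c] for c in text], dtype=torch.long)
        self.seq_len = seq_len

    def __len__(self):
        return len(self.data) - self.seq_len

    def __getitem__(self, idx):
        # Input is a window of characters; the target is the same window
        # shifted one position ahead (predict the next character).
        x = self.data[idx : idx + self.seq_len]
        y = self.data[idx + 1 : idx + self.seq_len + 1]
        return x, y

class CharRNN(nn.Module):
    """Embedding -> single RNN layer -> linear projection to vocabulary logits.
    Swapping nn.RNN for nn.LSTM gives the LSTM variant of the model."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, hidden=None):
        out, hidden = self.rnn(self.embed(x), hidden)
        return self.fc(out), hidden
```

The logits at each time step are scored against the shifted targets with cross-entropy loss during training.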
- Programming Language: Python
- Frameworks/Libraries: PyTorch, NumPy, Hugging Face Datasets, Matplotlib
- Tools: Kaggle
- Data Preprocessing: Custom data loader to convert raw text into sequences of characters for the models.
- Model Architecture: Built and trained character-level RNN and LSTM models from scratch.
- Text Generation: Implemented functions to generate text by sampling the model's predictions, incorporating techniques like temperature scaling and top-K filtering to control diversity.
- Parameter Tuning: Explored the effects of various hyperparameters on the model's text generation capabilities.
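As an illustration of the sampling techniques listed above, here is a minimal sketch of temperature scaling and top-K filtering applied to a vector of next-character logits. The function name `sample_next` is hypothetical, not taken from the project code.

```python
import torch
import torch.nn.functional as F

def sample_next(logits, temperature=1.0, top_k=None):
    """Sample one character index from next-character logits.

    Temperature rescales the logits before the softmax: values below 1
    make sampling more conservative, values above 1 more diverse.
    top_k restricts sampling to the k most likely characters.
    """
    logits = logits / temperature
    if top_k is not None:
        # Mask out every logit below the k-th largest one.
        kth = torch.topk(logits, top_k).values[-1]
        logits = torch.where(
            logits < kth, torch.full_like(logits, float("-inf")), logits
        )
    probs = F.softmax(logits, dim=-1)
    return torch.multinomial(probs, num_samples=1).item()
```

During generation, each sampled character is fed back into the model (together with the hidden state) to produce the logits for the following step.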
RNN Result Sample
Un I all the stain that that is side is I the slay as near the have the manded youm be in sear. It the but that
the man ther the the but the blears and the dear the splen have brist the may be all hear
LSTM Result Sample
ร:
"Yound the fack me for a more his a will had be a chere away I conger."
"We colled the prom which a last the wore will be at dear and the brow and his me her and a pars,
some your his the a look
Considering all the samples and loss curves, I observed that higher values for all parameters introduce nonsense into text that is already hard to understand, whereas lower-to-medium values produced better results. Comparing the RNN and the LSTM, I can conclude that the LSTM produced more coherent text that made sense in some parts of the sequence, as opposed to the RNN.
- Clone the repository.
- Install required packages from `requirements.txt`.
- Execute the script `code.py` locally or in notebook form on Kaggle to train the model and see text generation in action.