Simple feedforward networks were implemented from scratch in NumPy, including the forward and backward pass of each layer block. The feedforward network's performance was compared to an SVM on a synthetic dataset (this time using PyTorch). Configurable feedforward nets were then applied to image classification on the MNIST dataset, and their performance was recorded across a wide array of hyperparameters such as network depth, number of neurons per layer, learning rate, and weight decay. Lastly, batch normalization was implemented in PyTorch and its effect on network performance was explored.
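The from-scratch layer idea above can be sketched roughly as follows — a minimal fully connected layer with a forward and backward pass. The class and method names here are hypothetical illustrations, not the original exercise code:

```python
import numpy as np

class Dense:
    """Minimal fully connected layer: forward caches the input,
    backward returns the gradient w.r.t. the input and stores
    the parameter gradients."""

    def __init__(self, n_in, n_out, rng=None):
        rng = rng or np.random.default_rng(0)
        # He-style initialization for the weights
        self.W = rng.standard_normal((n_in, n_out)) * np.sqrt(2.0 / n_in)
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                      # cache input for the backward pass
        return x @ self.W + self.b

    def backward(self, grad_out):
        self.dW = self.x.T @ grad_out   # gradient w.r.t. weights
        self.db = grad_out.sum(axis=0)  # gradient w.r.t. bias
        return grad_out @ self.W.T      # gradient w.r.t. layer input
```

Stacking such blocks and chaining their `backward` calls in reverse order implements backpropagation for the whole network.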
Implements a convolutional network on the MNIST dataset for image classification.
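A network of the kind described could look roughly like the sketch below: a small PyTorch CNN for 28x28 MNIST digits. The architecture (channel counts, two conv/pool stages) is a hypothetical illustration, not the original exercise code:

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Minimal convolutional classifier for 1x28x28 MNIST images."""

    def __init__(self, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, n_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))  # logits, one per class
```

Training would then pair this module with `nn.CrossEntropyLoss` and an optimizer such as `torch.optim.Adam` in a standard loop.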
This exercise explored the basics of NLP. First, a vocabulary was constructed
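Vocabulary construction of this kind can be sketched as below: count token frequencies over a corpus and assign integer ids, reserving an id for unknown tokens. The helper name, frequency threshold, and `<unk>` convention are assumptions for illustration, not the original exercise code:

```python
from collections import Counter

def build_vocab(corpus, min_freq=1):
    """Map each token occurring at least `min_freq` times to an
    integer id; id 0 is reserved for unknown tokens."""
    counts = Counter(tok for sentence in corpus for tok in sentence.split())
    vocab = {"<unk>": 0}
    for tok, freq in counts.most_common():  # most frequent tokens first
        if freq >= min_freq:
            vocab[tok] = len(vocab)
    return vocab
```

Text can then be encoded as id sequences via `vocab.get(tok, vocab["<unk>"])`.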