a.
- Run the code of the 2-layer neural network for multi-class classification from
- https://stackabuse.com/creating-a-neural-network-from-scratch-in-python-multi-class-classification/
- Go through the code and understand each line.
- Build the artificial dataset as shown in the blog.
- Take 70% of the data for training and 30% for testing.
- Build a two-layer neural network from scratch, where zo is the output.
a) You will have two layers.
i) One hidden layer (use 4 neurons)
ii) One output layer (use three output neurons, one per class)
b) Use the sigmoid function as the activation function in the hidden layer
c) Use softmax for the output layer
d) Use negative log likelihood loss
e) Derive the derivative for softmax
f) Calculate the derivatives for back propagation.
- Write a training module to train the neural network for 2000 epochs.
- Now classify the test data using the trained neural network.
a) During testing, you will do only the forward pass.
b) Argmax the forward pass output
c) Report your accuracy.
- Draw the data points for the training data and also plot the class boundary in a 3D plot.
- Draw the data points for the test data and also plot the class boundary in a 3D plot.
- You cannot use any built-in deep learning functions
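The steps above can be sketched roughly as follows. This is a minimal NumPy sketch, not the blog's exact code: the three-Gaussian-blob dataset, initialization scale, and learning rate are assumptions, and the 3D plotting steps are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Artificial dataset: three Gaussian blobs (a stand-in for the blog's dataset)
n = 300
centers = [(0.0, 0.0), (3.0, 3.0), (0.0, 3.0)]
X = np.vstack([rng.normal(c, 0.5, size=(n, 2)) for c in centers])
y = np.repeat(np.arange(3), n)
Y = np.eye(3)[y]                          # one-hot labels

# 70% train / 30% test split
idx = rng.permutation(len(X))
split = int(0.7 * len(X))
tr, te = idx[:split], idx[split:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))   # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

# Two layers: hidden layer with 4 sigmoid neurons, softmax output layer
Wh = rng.normal(0.0, 0.1, (2, 4)); bh = np.zeros(4)
Wo = rng.normal(0.0, 0.1, (4, 3)); bo = np.zeros(3)
lr = 0.5

for epoch in range(2000):
    # forward pass: ao = softmax(zo), with zo = ah @ Wo + bo
    ah = sigmoid(X[tr] @ Wh + bh)
    ao = softmax(ah @ Wo + bo)
    # negative log-likelihood loss L = -sum(Y * log(ao));
    # the combined softmax + NLL derivative is dL/dzo = ao - Y
    dzo = (ao - Y[tr]) / len(tr)
    dWo = ah.T @ dzo
    dbo = dzo.sum(axis=0)
    dzh = (dzo @ Wo.T) * ah * (1.0 - ah)           # sigmoid derivative
    dWh = X[tr].T @ dzh
    dbh = dzh.sum(axis=0)
    Wo -= lr * dWo; bo -= lr * dbo
    Wh -= lr * dWh; bh -= lr * dbh

# test: forward pass only, then argmax over the class probabilities
pred = softmax(sigmoid(X[te] @ Wh + bh) @ Wo + bo).argmax(axis=1)
acc = float((pred == y[te]).mean())
print(f"test accuracy: {acc:.3f}")
```

The `dzo = ao - Y` line is the result of the softmax-derivative exercise in e): differentiating the NLL loss through the softmax collapses to that simple difference.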
b. (Same as a., but the input is a 3x32x32 image, the output is 10 classes, and it is a three-layer neural network.)
1) Download and read the training data from CIFAR-10 (10 classes).
2) Now, read one image and reshape it to a row vector x. x is our input data, with dimension 1x3072.
3) x's class label would be one-hot encoded.
4) Use two hidden layers and one output layer.
5) Build a three-layer neural network from scratch
a) You will have three layers.
i) First hidden layer (use 1000 neurons)
ii) Second hidden layer (use 100 neurons)
iii) Output layer (10 output neurons)
b) So, the output probability is:
ao = softmax ( Wo * sigmoid ( Wh2 * sigmoid(Wh1 * x + bh1) + bh2 ) + bo )
c) where x is the input vector; Wh1 and Wh2 are the weight matrices for the hidden layers; Wo is the weight matrix for the output layer; and bh1, bh2, and bo are the bias vectors for the corresponding layers.
d) Use the sum of cross entropy as your loss function. ao is the output vector.
e) Derive and calculate the derivatives for back propagation for the three-layer neural network.
6) Write a training module to train the neural network for 10 epochs.
7) Report your accuracy on the test set
8) Show confusion matrix of your test prediction
9) You cannot use any built-in deep learning functions
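A minimal sketch of the forward and backward passes for this three-layer network. Random rows stand in for the real CIFAR-10 data, and the batch size, step count, learning rate, and initialization scales are assumptions; real code would loop over the full dataset for 10 epochs and then compute test accuracy and the confusion matrix.

```python
import numpy as np

rng = np.random.default_rng(1)

# Random rows standing in for flattened CIFAR-10 images (real code reads the dataset)
B = 16
X = rng.normal(0.0, 1.0, (B, 3072))     # each image reshaped to a 1x3072 row vector
y = rng.integers(0, 10, B)
Y = np.eye(10)[y]                       # one-hot class labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Three layers: 3072 -> 1000 -> 100 -> 10
Wh1 = rng.normal(0.0, 0.01, (3072, 1000)); bh1 = np.zeros(1000)
Wh2 = rng.normal(0.0, 0.1,  (1000, 100));  bh2 = np.zeros(100)
Wo  = rng.normal(0.0, 0.1,  (100, 10));    bo  = np.zeros(10)
lr = 0.1

losses = []
for step in range(15):
    # forward: ao = softmax(Wo * sigmoid(Wh2 * sigmoid(Wh1*x + bh1) + bh2) + bo)
    a1 = sigmoid(X @ Wh1 + bh1)
    a2 = sigmoid(a1 @ Wh2 + bh2)
    ao = softmax(a2 @ Wo + bo)
    losses.append(-np.sum(Y * np.log(ao + 1e-12)))   # sum of cross entropy
    # backward: chain rule, layer by layer
    dzo  = ao - Y                                    # softmax + cross-entropy derivative
    dWo  = a2.T @ dzo;  dbo  = dzo.sum(axis=0)
    dz2  = (dzo @ Wo.T)  * a2 * (1.0 - a2)           # through the second sigmoid
    dWh2 = a1.T @ dz2;  dbh2 = dz2.sum(axis=0)
    dz1  = (dz2 @ Wh2.T) * a1 * (1.0 - a1)           # through the first sigmoid
    dWh1 = X.T @ dz1;   dbh1 = dz1.sum(axis=0)
    Wo  -= (lr / B) * dWo;  bo  -= (lr / B) * dbo
    Wh2 -= (lr / B) * dWh2; bh2 -= (lr / B) * dbh2
    Wh1 -= (lr / B) * dWh1; bh1 -= (lr / B) * dbh1

print(f"loss: {losses[0]:.2f} -> {losses[-1]:.2f}")
```

Each backward step multiplies the incoming gradient by the layer's weight matrix transpose and the local sigmoid derivative a*(1-a), which is exactly the derivation asked for in 5e).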
c. (use built-in functions)
1) Download and read the training data from CIFAR-10 (10 classes).
2) Same as assignment b.
3) Write a forward(X) function to implement the forward calculation.
4) Use forward function to pass the input data to calculate the probabilities.
5) Calculate the loss using a criterion function with the cross-entropy option.
6) Using the loss, use autograd and backward() to calculate the derivatives automatically.
7) Write a training module to train the neural network for 10 epochs.
8) Report your accuracy on the test set
9) Show confusion matrix of your test prediction
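A sketch of part c, assuming PyTorch (the mention of autograd, backward(), and a criterion function suggests it, but the assignment does not name a framework). Random tensors stand in for the actual CIFAR-10 data, and the accuracy and confusion-matrix reporting is omitted.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Random tensors standing in for CIFAR-10 (real code would use torchvision.datasets.CIFAR10)
X = torch.randn(64, 3072)               # 64 flattened 3x32x32 images
y = torch.randint(0, 10, (64,))         # integer class labels

class ThreeLayerNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.h1 = nn.Linear(3072, 1000)
        self.h2 = nn.Linear(1000, 100)
        self.out = nn.Linear(100, 10)

    def forward(self, x):
        # same architecture as part b; autograd now handles the derivatives
        x = torch.sigmoid(self.h1(x))
        x = torch.sigmoid(self.h2(x))
        return self.out(x)              # raw scores; CrossEntropyLoss applies softmax

model = ThreeLayerNet()
criterion = nn.CrossEntropyLoss()       # criterion with the cross-entropy option
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(10):
    optimizer.zero_grad()
    logits = model(X)                   # forward(X)
    loss = criterion(logits, y)
    loss.backward()                     # autograd computes all derivatives
    optimizer.step()

pred = model(X).argmax(dim=1)           # predictions for accuracy / confusion matrix
print(f"final loss: {loss.item():.3f}")
```

Note that the hand-derived gradients from part b are replaced entirely by loss.backward(): autograd records the forward computation and differentiates it automatically.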