lightaime / cs231n
cs231n assignments solved by https://ghli.org
I was not able to understand the significance of this statement. Any help is appreciated. 🍰
from cs231n.classifiers import Softmax
This Softmax class does not contain predict and train functions in the code.
the fundamental idea is just like: (a - b)^2 = a^2 + b^2 - 2ab
A = np.sum(np.square(X), axis=1)  # shape (500,)
B = np.transpose(np.sum(np.square(X_train), axis=1))  # transpose is a no-op on a 1-D array
two_AB = 2 * np.dot(X, self.X_train.T)  # shape (500, 5000)
dists = np.sqrt(A + B - two_AB)
how could it be broadcasted together with shapes (500,) (1,5000) ?
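One answer, as a sketch: a (500,) row-like array and a (1, 5000) array cannot broadcast, because (500,) is treated as (1, 500) and 500 ≠ 5000. The usual fix is to reshape the test-side sum into a column vector so it broadcasts against the train-side row. The shapes below are smaller stand-ins for the 500 × 5000 case in the snippet:

```python
import numpy as np

# Stand-in data: 500 test points, 5000 train points, small feature dim.
X = np.random.randn(500, 32)
X_train = np.random.randn(5000, 32)

# Reshape A into a column so broadcasting produces the full matrix:
# (500, 1) + (5000,) + (500, 5000) -> (500, 5000)
A = np.sum(np.square(X), axis=1).reshape(-1, 1)   # (500, 1)
B = np.sum(np.square(X_train), axis=1)            # (5000,)
two_AB = 2 * np.dot(X, X_train.T)                 # (500, 5000)

# Clamp tiny negative values caused by floating-point error before sqrt.
dists = np.sqrt(np.maximum(A + B - two_AB, 0))
print(dists.shape)  # (500, 5000)
```

Note that `np.transpose` on a 1-D array returns it unchanged; only a 2-D `(1, N)` or `(N, 1)` array actually has a distinct transpose.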
In my opinion, at line 266 the index into ar_cache should be self.num_layers - 1 rather than self.num_layers, because the for loop you use at line 255 excludes self.num_layers - 1. Accordingly, a minor change would be needed at line 294 as well.
Since you use the output-layer index (self.num_layers) consistently, your code works correctly. However, for notational consistency, it should be self.num_layers - 1.
Your code helped me a lot. Thanks so much!
Happy Spring Festival!
What is the meaning of f = lambda W: net.loss(X, y, reg=0.05)[0]? I am confused because I do not understand how net.loss acts on different W.
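A sketch of what is likely going on (names assumed from the assignment's conventions): the lambda's W argument is only a placeholder, because net.loss reads the weights from the shared net.params dict. The numerical gradient checker perturbs that same array in place, so each call to f sees a different W even though the lambda body never uses its argument:

```python
import numpy as np

# Simplified version of the assignment's numerical gradient routine
# (assumed behavior): perturb x in place, re-evaluate f, restore x.
def eval_numerical_gradient(f, x, h=1e-5):
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=['multi_index'])
    while not it.finished:
        ix = it.multi_index
        old = x[ix]
        x[ix] = old + h      # perturb the shared array in place ...
        fxph = f(x)          # ... so the loss sees the change
        x[ix] = old - h
        fxmh = f(x)
        x[ix] = old          # restore the original value
        grad[ix] = (fxph - fxmh) / (2 * h)
        it.iternext()
    return grad

# Toy stand-in for net.loss: it reads a shared parameter array,
# just as net.loss reads net.params['W1'] etc.
params = {'W': np.array([1.0, 2.0])}
loss = lambda: np.sum(params['W'] ** 2)
f = lambda W: loss()  # W is ignored, exactly like in the notebook

g = eval_numerical_gradient(f, params['W'])
print(g)  # ≈ [2., 4.], the gradient of sum(W**2)
```

So f responds to different W through the shared storage, not through its argument.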
And another one: at line 143,
for i in xrange(num_test):
should range be used here instead? (xrange exists only in Python 2.)
Assignment 1 Q2 Linear SVM
In assignment1/cs231n/classifiers/linear_svm.py
line 79: margins = np.maximum(0, scores - correct_class_scores + 1)
It seems line 79 keeps the same-class term (j == y_i), whose margin is exactly 1, so the total loss is inflated by a constant 1 * num_train before averaging. That may be harmless for training, but strictly speaking it is not correct.
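The standard fix is to zero out the j == y_i entries after computing the margin matrix. A minimal sketch (variable names assumed to match the assignment's linear_svm.py):

```python
import numpy as np

def svm_loss_vectorized(W, X, y, reg=0.0):
    """Multiclass SVM loss with the same-class margins removed."""
    num_train = X.shape[0]
    scores = X.dot(W)                                    # (N, C)
    correct = scores[np.arange(num_train), y][:, None]   # (N, 1)
    margins = np.maximum(0, scores - correct + 1)
    margins[np.arange(num_train), y] = 0  # drop the j == y_i term (margin = 1)
    return np.sum(margins) / num_train + reg * np.sum(W * W)

# Toy check: perfectly separated examples should give exactly 0 loss.
W = np.eye(3) * 10
X = np.eye(3)
y = np.array([0, 1, 2])
print(svm_loss_vectorized(W, X, y))  # 0.0
```

Without the zeroing line, the same call would report a loss of exactly 1.0 on these data: the constant same-class contribution.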
In the "Forward pass: compute scores" section of cs231n/assignment1/two_layer_net.ipynb, where does correct_scores come from? How can it be guaranteed that a randomly generated weight matrix, after the forward computation, produces scores that essentially match correct_scores?
Hey, I've been using your code to compare against my own, and you've done an amazing job; your clean code helped me work through the assignments.
In the dropout forward pass of assignment two I noticed you have
mask = (np.random.rand(*x.shape) >= p) / (1-p)
This makes the fraction of dropped units match your p. But according to the class notes, p is the probability of keeping a unit active (higher = less dropout),
so I believe the line should read:
mask = (np.random.rand(*x.shape) < p) / p
I'm mentioning this because I hope others continue to use your code to learn and get through the class as I have 👍
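For reference, here is a minimal sketch of inverted dropout under the convention the class notes use, where p is the probability of keeping a unit (function and parameter names here are illustrative, not the assignment's exact API). Dividing by p at train time keeps the expected activation the same at train and test time, so the test-time pass is just the identity:

```python
import numpy as np

def dropout_forward(x, p, mode='train'):
    """Inverted dropout. p = probability of KEEPING a unit active."""
    if mode == 'train':
        # Keep each unit with probability p, then rescale by 1/p so that
        # E[output] == input and no scaling is needed at test time.
        mask = (np.random.rand(*x.shape) < p) / p
        return x * mask
    return x  # test time: identity

np.random.seed(0)
x = np.ones((1000, 100))
out = dropout_forward(x, p=0.8)
# The mean stays close to 1.0 because of the 1/p rescaling,
# even though roughly 20% of the units were zeroed.
print(out.mean())
```

With the `>= p` version quoted above, p would instead act as the drop probability, which contradicts the notes' convention.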