- A decision tree is a tree in which each internal (branch) node represents a choice among a number of alternatives, and each leaf node represents a decision (a class label).
- Random Forest, trained with the "bagging" method, builds multiple decision trees on randomly sampled subsets of the training set and merges their predictions, which corrects for a single decision tree's habit of overfitting.
- data : breast cancer prediction, https://archive.ics.uci.edu/ml/datasets/breast+cancer+wisconsin+%28original%29
- code reference : https://github.com/tiepvupsu/DecisionTreeID3
- The decision tree code is in the tree1 file and the Random Forest code is in the tree2 file.
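The ID3 algorithm in the referenced repository picks which attribute to split on by information gain: parent entropy minus the weighted entropy of the child partitions. A minimal pure-Python sketch of that criterion (the function names and toy data are illustrative, not taken from the repo):

```python
from math import log2

def entropy(labels):
    # Shannon entropy of a list of class labels, the impurity measure ID3 uses.
    total = len(labels)
    probs = [labels.count(c) / total for c in set(labels)]
    return -sum(p * log2(p) for p in probs)

def information_gain(rows, labels, attr):
    # Gain from splitting on column `attr`: parent entropy minus the
    # weighted average entropy of each child partition.
    parent = entropy(labels)
    children = {}
    for row, label in zip(rows, labels):
        children.setdefault(row[attr], []).append(label)
    weighted = sum(len(ls) / len(labels) * entropy(ls)
                   for ls in children.values())
    return parent - weighted

# Toy example: an attribute that separates the classes perfectly
# yields the maximum possible gain (the full parent entropy).
rows = [[0], [0], [1], [1]]
labels = [0, 0, 1, 1]
gain = information_gain(rows, labels, 0)
```

ID3 evaluates this gain for every remaining attribute at a node and splits on the one with the highest value.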
- An SVM (Support Vector Machine) constructs a hyperplane (or set of hyperplanes) chosen to have the largest distance to the nearest training data point of any class (the maximum margin).
- SVM is broadly divided into hard margin (no misclassified cases are allowed), soft margin (misclassified cases are allowed but penalized), and kernel (non-linear decision boundary).
- code reference : Support Vector Machines Succinctly by Alexandre Kowalczyk (foreword by Daniel Jebaraj)
- This implementation supports both the soft-margin and kernel variants.
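To illustrate the soft-margin case, here is a minimal pure-Python sketch that trains a linear SVM by subgradient descent on the hinge loss. The training routine, learning rate, and toy data are assumptions made for this example, not code from the referenced book:

```python
def train_svm(X, y, lam=0.01, epochs=200, lr=0.1):
    # Soft-margin linear SVM: minimize (lam/2)*||w||^2 plus the hinge
    # loss max(0, 1 - y*(w.x + b)) averaged over the training points.
    dim = len(X[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:
                # Point is misclassified or inside the margin:
                # step toward the correct side, plus regularization.
                w = [wj - lr * (lam * wj - yi * xj)
                     for wj, xj in zip(w, xi)]
                b += lr * yi
            else:
                # Correctly classified with margin: only shrink w.
                w = [wj - lr * lam * wj for wj in w]
    return w, b

def predict(w, b, x):
    # Classify by the sign of the decision function w.x + b.
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy linearly separable data with labels in {+1, -1}.
X = [[2.0, 2.0], [1.0, 3.0], [-2.0, -1.0], [-1.0, -2.0]]
y = [1, 1, -1, -1]
w, b = train_svm(X, y)
```

The `lam` parameter plays the role of the soft-margin penalty trade-off: a larger value tolerates more margin violations in exchange for a smaller `||w||`.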
- deep learning : DNN (deep neural network), CNN (convolutional neural network), RNN (recurrent neural network)
- code reference : https://github.com/WegraLee/deep-learning-from-scratch-2