1.) Data Preprocessing Techniques (handling missing values, label encoding, one-hot encoding, feature scaling, splitting data into train and test sets)
2.) Simple Linear Regression Techniques (data allocation, splitting data into train and test sets, fitting the linear regression, plotting training and test predictions)
3.) Multiple Linear Regression (including multiple independent variables to build a multiple linear regression model that predicts the response variable, using the BACKWARD ELIMINATION method)
3.1) Automated backward elimination for Multiple Linear Regression (with p-values only, and with p-values plus adjusted R-squared)
4.) Polynomial Linear Regression (building a bluff detector to check whether a new employee is bluffing about the salary mentioned from his last employer, techniques for a higher-resolution and smoother curve, an imaginary grid of position levels)
5.) Support Vector Regression --- using SVR with feature scaling (required by the nature of the dataset), techniques for a higher-resolution and smoother curve, inverse transformation to recover results on the original scale (i.e., without feature scaling)
6.) Decision Tree Regression ---- (splitting the feature space into leaves, recommending the best split, showing how a decision tree on a single feature can turn out poorly)
7.) Random Forest Regression ---- (building an RFR model and tuning its performance by increasing the number of decision trees)
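The preprocessing and simple-linear-regression steps listed above can be sketched as follows. This is a minimal illustration assuming scikit-learn; the toy experience-vs-salary numbers are invented for the example and are not from the repository's datasets.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression

# Toy dataset (hypothetical): years of experience -> salary
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0], [7.0], [8.0]])
y = np.array([40_000, 45_000, 50_000, 60_000, 65_000, 70_000, 80_000, 85_000])

# Split data into train and test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Feature scaling (not strictly required for plain linear regression,
# but shown because it is one of the listed preprocessing techniques)
sc = StandardScaler()
X_train_sc = sc.fit_transform(X_train)
X_test_sc = sc.transform(X_test)

# Fit the simple linear regression model and predict on the test set
reg = LinearRegression()
reg.fit(X_train_sc, y_train)
y_pred = reg.predict(X_test_sc)
```

The same split-scale-fit-predict sequence underlies the later regression items as well; only the model object changes.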
The same techniques, implemented in R:

1.) Data Preprocessing Techniques (handling missing values, factor encoding, feature scaling, splitting data into train and test sets)
2.) Simple Linear Regression Techniques (data allocation, splitting data into train and test sets, fitting the linear regression, plotting training and test predictions)
3.) Multiple_Linear_Regression (including multiple independent variables to build a multiple linear regression model that predicts the response variable, using the BACKWARD ELIMINATION method)
3.1) Automated backward elimination Multiple_Linear_Regression code
4.) Polynomial_Linear_Regression in R (building a bluff detector to check whether a new employee is bluffing about the salary mentioned from his last employer, techniques for a higher-resolution and smoother curve, an imaginary grid of position levels)
5.) Support_Vector_Regression --- using SVR with the eps-regression type (library used: e1071), inverse transformation to recover results on the original scale (i.e., without feature scaling)
6.) Decision_Tree_Regression ---- (splitting the feature space into leaves, recommending the best split, showing how a decision tree on a single feature can turn out poorly, high-resolution tuning, printing the leaves)
7.) Random Forest Regression ---- (building an RFR model and tuning its performance by increasing the number of decision trees)
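The SVR item above pairs feature scaling with an inverse transformation to report predictions in the original units. Although the list describes the R implementation (e1071's eps-regression), the same idea can be sketched in Python with scikit-learn; the toy position-level salary data and the 6.5 query level are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler

# Toy position-level -> salary data (hypothetical)
X = np.arange(1, 11, dtype=float).reshape(-1, 1)
y = np.array([45_000, 50_000, 60_000, 80_000, 110_000,
              150_000, 200_000, 300_000, 500_000, 1_000_000], dtype=float)

# SVR is sensitive to feature scale, so scale both X and y
sc_X, sc_y = StandardScaler(), StandardScaler()
X_sc = sc_X.fit_transform(X)
y_sc = sc_y.fit_transform(y.reshape(-1, 1)).ravel()

# Epsilon-insensitive (eps) regression is SVR's default loss
svr = SVR(kernel="rbf")
svr.fit(X_sc, y_sc)

# Predict for level 6.5, then inverse-transform back to original units
level = sc_X.transform([[6.5]])
pred_scaled = svr.predict(level)
salary = sc_y.inverse_transform(pred_scaled.reshape(-1, 1))[0, 0]
```

Without the final `inverse_transform`, the prediction would be in standardized units and meaningless as a salary figure, which is exactly the pitfall the inverse-transformation step in the list addresses.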