kingfengji / mgbdt
This is the official clone for the implementation of the NIPS'18 paper Multi-Layered Gradient Boosting Decision Trees (mGBDT).
I feel that code written for Python 3.5 would likely be compatible with other Python 3 versions — are you sure a build under 3.5 is necessary?
How can we adapt it to a regression problem?
If it's possible, I would also like to compute the AUC score, something like auc_score(y_test, classifier.forward(X_test)).
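For a binary classifier, the AUC can be computed from the scores produced by the forward pass. A minimal sketch with scikit-learn's roc_auc_score, using a hypothetical stand-in model — the real mGBDT net exposes a forward(X) method, but its exact output shape may differ:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

class DummyModel:
    """Hypothetical stand-in for a trained model's forward pass."""
    def forward(self, X):
        # Score each sample by the sum of its features (illustration only).
        return X.sum(axis=1)

X_test = np.array([[0.1, 0.2], [0.8, 0.9], [0.4, 0.4], [0.9, 0.7]])
y_test = np.array([0, 1, 0, 1])

model = DummyModel()
scores = model.forward(X_test)        # scores for the positive class
auc = roc_auc_score(y_test, scores)   # note: y_true first, scores second
print(auc)                            # -> 1.0 (positives rank above negatives)
```

Note the argument order: the true labels come first and the predicted scores second, and the scores must be computed on X_test, not on y_test.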
Hi,
I want to run the uci_year and uci_adult demos, but I can't find the get_data.sh file mentioned in the README. Could you please upload it, or describe the data format so I can prepare the data myself?
I also noticed the code reads a features file, which is missing from the repository as well.
/opt/conda/lib/python3.7/site-packages/joblib/parallel.py in &lt;genexpr&gt;(.0)
    254         with parallel_backend(self._backend, n_jobs=self._n_jobs):
    255             return [func(*args, **kwargs)
--> 256                     for func, args, kwargs in self.items]
    257
    258     def __len__(self):

/kaggle/working/mGBDT/lib/mgbdt/model/online_xgb.py in fit_increment(self, X, y, num_boost_round, params)
     13         for k, v in extra_params.items():
     14             params[k] = v
---> 15         params.pop("n_estimators")
     16
     17         if callable(self.objective):

KeyError: 'n_estimators'
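The traceback suggests that fit_increment unconditionally pops "n_estimators" from the params dict, which raises KeyError whenever that key is absent (e.g. with an xgboost version whose get_xgb_params() no longer includes it). A sketch of the usual fix — pop with a default so a missing key becomes a no-op; the helper name merge_params is illustrative, not part of the repo:

```python
def merge_params(params, extra_params):
    """Merge extra_params into params and drop 'n_estimators'.

    Mirrors the parameter handling in mgbdt/model/online_xgb.py's
    fit_increment, but uses pop(key, None) so a missing key no
    longer raises KeyError.
    """
    params = dict(params)  # avoid mutating the caller's dict
    for k, v in extra_params.items():
        params[k] = v
    params.pop("n_estimators", None)  # no-op if the key is absent
    return params

# A params dict without 'n_estimators' no longer crashes:
print(merge_params({"max_depth": 3}, {"eta": 0.1}))
# -> {'max_depth': 3, 'eta': 0.1}
```

The one-line change in online_xgb.py would be replacing params.pop("n_estimators") with params.pop("n_estimators", None).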
Hello, I ran the yeast.py demo and got a mean accuracy of about 60% — is that expected?
@kingfengji Thanks for making the code available. I believe multi-layered decision trees are a very elegant and powerful approach! I applied your model to the Boston housing dataset but wasn't able to outperform a baseline xgboost model.
To compare your approach to several alternatives, I ran a small benchmark study using the following approaches, where all models share the same hyper-parameters.
I am using PyTorch's L1Loss for model training and the MAE for evaluation, where all models are trained in serial mode. Results are as follows.
In particular, I observe the following:
1. I had to switch from MSELoss to L1Loss, since the loss exploded after some iterations, even after normalizing X. Should we, similar to neural networks, also scale y to avoid exploding gradients? Which PyTorch loss would you recommend here?
2. Besides the forward mapping F, can we also track how well the target propagation is working by evaluating the reconstruction loss of the inverse mapping G?
3. For F and G, you are currently using the MSELoss for the xgboost models. Do you have some experience with modifying this loss?
To reproduce the results, you can use the attached notebook.
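Regarding the exploding loss on regression targets: standardizing y is a common remedy, analogous to target scaling in neural-network regression. A minimal sketch with scikit-learn's StandardScaler, independent of the mGBDT code itself:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Raw regression targets on a large scale (column vector, as sklearn expects).
y = np.array([[100.0], [200.0], [300.0], [400.0]])

scaler = StandardScaler()
y_scaled = scaler.fit_transform(y)   # zero-mean, unit-variance targets

# ... train on y_scaled, then map predictions back to the original scale:
y_back = scaler.inverse_transform(y_scaled)

print(y_scaled.mean(), y_scaled.std())  # -> ~0.0 and ~1.0
```

Training on y_scaled keeps the gradients the inverse mappings propagate in a bounded range; predictions are mapped back with inverse_transform for evaluation on the original scale.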
@kingfengji I would highly appreciate your feedback. Many thanks.