
benchmarks's Introduction

mlpack: a fast, header-only machine learning library


Download: current stable version (4.3.0)

mlpack is an intuitive, fast, and flexible header-only C++ machine learning library with bindings to other languages. It is meant to be a machine learning analog to LAPACK, and aims to implement a wide array of machine learning methods and functions as a "swiss army knife" for machine learning researchers.

mlpack's lightweight C++ implementation makes it ideal for deployment, and it can also be used for interactive prototyping via C++ notebooks (these can be seen in action on mlpack's homepage).

In addition to its powerful C++ interface, mlpack also provides command-line programs, Python bindings, Julia bindings, Go bindings, and R bindings.


mlpack uses an open governance model and is fiscally sponsored by NumFOCUS. Consider making a tax-deductible donation to help the project pay for developer time, professional services, travel, workshops, and a variety of other needs.


0. Contents

  1. Citation details
  2. Dependencies
  3. Installing and using mlpack in C++
  4. Building mlpack bindings to other languages
    1. Command-line programs
    2. Python bindings
    3. R bindings
    4. Julia bindings
    5. Go bindings
  5. Building mlpack's test suite
  6. Further resources

1. Citation details

If you use mlpack in your research or software, please cite mlpack using the citation below (given in BibTeX format):

@article{mlpack2023,
    title     = {mlpack 4: a fast, header-only C++ machine learning library},
    author    = {Ryan R. Curtin and Marcus Edel and Omar Shrit and 
                 Shubham Agrawal and Suryoday Basak and James J. Balamuta and 
                 Ryan Birmingham and Kartik Dutt and Dirk Eddelbuettel and 
                 Rishabh Garg and Shikhar Jaiswal and Aakash Kaushik and 
                 Sangyeon Kim and Anjishnu Mukherjee and Nanubala Gnana Sai and 
                 Nippun Sharma and Yashwant Singh Parihar and Roshan Swain and 
                 Conrad Sanderson},
    journal   = {Journal of Open Source Software},
    volume    = {8},
    number    = {82},
    pages     = {5026},
    year      = {2023},
    doi       = {10.21105/joss.05026},
    url       = {https://doi.org/10.21105/joss.05026}
}

Citations are beneficial for the growth and improvement of mlpack.

2. Dependencies

mlpack requires the following additional dependencies:

  • a C++14 compiler
  • Armadillo >= 9.800
  • ensmallen >= 2.10.0
  • cereal >= 1.1.2

If the STB library headers are available, image loading support will be available.
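
For instance (a minimal sketch; the filename test_image.png is hypothetical), an image can then be loaded directly into an Armadillo matrix:

#include <mlpack.hpp>

int main()
{
  // Each column of 'image' will hold one flattened image.
  arma::Mat<unsigned char> image;
  mlpack::data::ImageInfo info;

  // The final 'fatal' parameter is false, so a failed load returns false
  // instead of throwing an exception.
  if (mlpack::data::Load("test_image.png", image, info, false))
    std::cout << "Loaded a " << info.Width() << "x" << info.Height()
              << " image." << std::endl;

  return 0;
}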

If you are compiling Armadillo by hand, ensure that LAPACK and BLAS are enabled.

3. Installing and using mlpack in C++

See also the C++ quickstart.

Since mlpack is a header-only library, installing just the headers for use in a C++ application is trivial.

From the root of the sources, configure and install in the standard CMake way:

mkdir build && cd build/
cmake ..
sudo make install

If the cmake .. command fails due to unavailable dependencies, consider either using the -DDOWNLOAD_DEPENDENCIES=ON option as detailed in the following subsection, or ensure that mlpack's dependencies are installed, e.g. using the system package manager. For example, on Debian and Ubuntu, all relevant dependencies can be installed with sudo apt-get install libarmadillo-dev libensmallen-dev libcereal-dev libstb-dev g++ cmake.

Alternatively, since CMake v3.14.0 the cmake command can create the build folder itself, and so the above commands can be rewritten as follows:

cmake -S . -B build
sudo cmake --build build --target install

During configuration, CMake adjusts the file mlpack/config.hpp using the details of the local system. This file can be modified by hand as necessary before or after installation.

3.1. Additional build options

You can add a few arguments to the cmake command to control the behavior of the configuration and build process. Some options are given below:

  • -DDOWNLOAD_DEPENDENCIES=ON will automatically download mlpack's dependencies (ensmallen, Armadillo, and cereal). Installing Armadillo this way is not recommended and it is better to use your system package manager when possible (see below).
  • -DCMAKE_INSTALL_PREFIX=/install/root/ will set the root of the install directory to /install/root when make install is run.
  • -DDEBUG=ON will enable debugging symbols in any compiled bindings or tests.

There are also options to enable building bindings to each language that mlpack supports; those are detailed in the following sections.

Once the headers are installed with make install, using mlpack in an application requires nothing more than including it:

#include <mlpack.hpp>

and when you link, be sure to link against Armadillo. If your example program is my_program.cpp, your compiler is GCC, and you would like to compile with OpenMP support (recommended) and optimizations, compile like this:

g++ -O3 -std=c++14 -o my_program my_program.cpp -larmadillo -fopenmp
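
For reference, here is a hedged sketch of what my_program.cpp might contain; it clusters randomly generated data with mlpack's k-means implementation:

#include <mlpack.hpp>

int main()
{
  // 1000 random points in 10 dimensions; mlpack uses column-major data,
  // so each column is one point.
  arma::mat data(10, 1000, arma::fill::randu);

  // Cluster the data into 5 groups.
  arma::Row<size_t> assignments;
  arma::mat centroids;
  mlpack::KMeans<> kmeans;
  kmeans.Cluster(data, 5, assignments, centroids);

  std::cout << "Centroid of the first cluster:" << std::endl
            << centroids.col(0).t();
  return 0;
}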

Note that if you want to serialize (save or load) neural networks, you should add #define MLPACK_ENABLE_ANN_SERIALIZATION before including <mlpack.hpp>. If you don't define MLPACK_ENABLE_ANN_SERIALIZATION and your code serializes a neural network, a compilation error will occur.
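
As a sketch, assuming a small arbitrary network, serialization with the define in place looks like this:

#define MLPACK_ENABLE_ANN_SERIALIZATION
#include <mlpack.hpp>

int main()
{
  // A tiny feedforward network; the architecture here is arbitrary.
  mlpack::FFN<> network;
  network.Add<mlpack::Linear>(10);
  network.Add<mlpack::ReLU>();
  network.Add<mlpack::Linear>(1);

  // Without the define above, this call would not compile.
  mlpack::data::Save("network.bin", "network", network);
  return 0;
}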

See the C++ quickstart and the examples repository for some examples of mlpack applications in C++, with corresponding Makefiles.

3.1.a. Linking with autodownloaded Armadillo

When the autodownloader is used to download Armadillo (-DDOWNLOAD_DEPENDENCIES=ON), the Armadillo runtime library is not built and Armadillo must be used in header-only mode. The autodownloader also does not download dependencies of Armadillo such as OpenBLAS. For this reason, it is recommended to instead install Armadillo using your system package manager, which will also install the dependencies of Armadillo. For example, on Ubuntu and Debian systems, Armadillo can be installed with

sudo apt-get install libarmadillo-dev

and other package managers such as dnf, brew, and pacman also have Armadillo packages available.

If the autodownloader is used to provide Armadillo, mlpack programs cannot be linked with -larmadillo. Instead, you must link directly with the dependencies of Armadillo. For example, on a system that has OpenBLAS available, compilation can be done like this:

g++ -O3 -std=c++14 -o my_program my_program.cpp -lopenblas -fopenmp

See the Armadillo documentation for more information on linking Armadillo programs.

3.2. Reducing compile time

mlpack is a template-heavy library, and if care is not used, compilation time of a project can be increased greatly. Fortunately, there are a number of ways to reduce compilation time:

  • Include individual headers, like <mlpack/methods/decision_tree.hpp>, if you are only using one component, instead of <mlpack.hpp>. This reduces the amount of work the compiler has to do.

  • Only use the MLPACK_ENABLE_ANN_SERIALIZATION definition if you are serializing neural networks in your code. When this define is enabled, compilation time will increase significantly, as the compiler must generate code for every possible type of layer. (The large amount of extra compilation overhead is why this is not enabled by default.)

  • If you are using mlpack in multiple .cpp files, consider using extern templates so that the compiler only instantiates each template once; add an explicit template instantiation for each mlpack template type you want to use in a .cpp file, and then use extern definitions elsewhere to let the compiler know it exists in a different file.
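
A sketch of that pattern, using mlpack's decision tree as an example (the file names are hypothetical):

// decision_tree_instantiation.cpp: compiled once; the template is
// explicitly instantiated in this translation unit.
#include <mlpack/methods/decision_tree.hpp>
template class mlpack::DecisionTree<>;

// any_other_file.cpp: declare that the instantiation lives elsewhere, so
// the compiler does not instantiate it again here.
#include <mlpack/methods/decision_tree.hpp>
extern template class mlpack::DecisionTree<>;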

Other strategies exist too, such as precompiled headers, compiler options, and ccache.

4. Building mlpack bindings to other languages

mlpack is not just a header-only library: it also comes with bindings to a number of other languages; this allows flexible use of mlpack's efficient implementations from languages that aren't C++.

In general, you should not need to build these by hand---they should be provided by either your system package manager or your language's package manager.

Building the bindings for a particular language is done by calling cmake with different options; each example below shows how to configure an individual set of bindings, but it is of course possible to combine the options and build bindings for many languages at once.

4.i. Command-line programs

See also the command-line quickstart.

The command-line programs have no extra dependencies. The set of programs that will be compiled is detailed and documented on the command-line program documentation page.

From the root of the mlpack sources, run the following commands to build and install the command-line bindings:

mkdir build && cd build/
cmake -DBUILD_CLI_PROGRAMS=ON ../
make
sudo make install

You can use make -j<N>, where N is the number of cores on your machine, to build in parallel; e.g., make -j4 will use 4 cores to build.

4.ii. Python bindings

See also the Python quickstart.

mlpack's Python bindings are available on PyPI and conda-forge, and can be installed with either pip install mlpack or conda install -c conda-forge mlpack. These sources are recommended, as building the Python bindings by hand can be complex.

With that in mind, if you would still like to manually build the mlpack Python bindings, first make sure that the following Python packages are installed:

  • setuptools
  • wheel
  • cython >= 0.24
  • numpy
  • pandas >= 0.15.0

Now, from the root of the mlpack sources, run the following commands to build and install the Python bindings:

mkdir build && cd build/
cmake -DBUILD_PYTHON_BINDINGS=ON ../
make
sudo make install

You can use make -j<N>, where N is the number of cores on your machine, to build in parallel; e.g., make -j4 will use 4 cores to build. You can also specify a custom Python interpreter with the CMake option -DPYTHON_EXECUTABLE=/path/to/python.

4.iii. R bindings

See also the R quickstart.

mlpack's R bindings are available as the R package mlpack on CRAN. You can install the package by running install.packages('mlpack'), and this is the recommended way of getting mlpack in R.

If you still wish to build the R bindings by hand, first make sure the following dependencies are installed:

  • R >= 4.0
  • Rcpp >= 0.12.12
  • RcppArmadillo >= 0.9.800.0
  • RcppEnsmallen >= 0.2.10.0
  • roxygen2
  • testthat
  • pkgbuild

These can be installed with install.packages() inside of your R environment. Once the dependencies are available, you can configure mlpack and build the R bindings by running the following commands from the root of the mlpack sources:

mkdir build && cd build/
cmake -DBUILD_R_BINDINGS=ON ../
make
sudo make install

You may need to specify the location of the R program in the cmake command with the option -DR_EXECUTABLE=/path/to/R.

Once the build is complete, a tarball can be found under the build directory in src/mlpack/bindings/R/, and that can be installed into your R environment with a command like install.packages('mlpack_3.4.3.tar.gz', repos=NULL, type='source').

4.iv. Julia bindings

See also the Julia quickstart.

mlpack's Julia bindings are available by installing the mlpack.jl package using Pkg.add("mlpack"). The process of building, packaging, and distributing mlpack's Julia bindings is very nontrivial, so it is recommended to simply use the version available via Pkg; but if you want to build the bindings by hand anyway, you can configure and build them by running the following commands from the root of the mlpack sources:

mkdir build && cd build/
cmake -DBUILD_JULIA_BINDINGS=ON ../
make

If CMake cannot find your Julia installation, you can add -DJULIA_EXECUTABLE=/path/to/julia to the CMake configuration step.

Note that the make install step is not done above, since the Julia binding build system was not meant to be installed directly. Instead, to use hand-built bindings (for instance, to test them), one option is to start Julia with JULIA_PROJECT set as an environment variable:

cd build/src/mlpack/bindings/julia/mlpack/
JULIA_PROJECT=$PWD julia

and then using mlpack should work.

4.v. Go bindings

See also the Go quickstart.

To build mlpack's Go bindings, ensure that Go >= 1.11.0 is installed, and that the Gonum package is available. You can use go get to install mlpack for Go:

go get -u -d mlpack.org/v1/mlpack
cd ${GOPATH}/src/mlpack.org/v1/mlpack
make install

The process of building the Go bindings by hand is a little tedious, so following the steps above is recommended. However, if you wish to build the Go bindings by hand anyway, you can do this by running the following commands from the root of the mlpack sources:

mkdir build && cd build/
cmake -DBUILD_GO_BINDINGS=ON ../
make
sudo make install

5. Building mlpack's test suite

mlpack contains an extensive test suite that exercises every part of the codebase. It is easy to build and run the tests with CMake and CTest, as below:

mkdir build && cd build/
cmake -DBUILD_TESTS=ON ../
make
ctest .

If you want to test the bindings, too, you will have to adapt the CMake configuration command to turn on the language bindings that you want to test---see the previous sections for details.

6. Further resources

More documentation is available for both users and developers.


To learn about the development goals of mlpack in the short- and medium-term future, see the vision document.

If you have problems, find a bug, or need help, you can try visiting the mlpack help page, or mlpack on GitHub. Alternatively, mlpack help can be found on Matrix at #mlpack; see also the community page.


benchmarks's Issues

Some DTC benchmarks are missing from the webpage.

According to config.yaml, mlpack and shogun seem to have configuration for the DTC task. However, the benchmark webpage seems to have only scikit's results.

Is it possible that the current data is outdated? How can we update to the newest results?


Avoid setting up an explicit python path on shogun

In the Shogun install file, when the user is using, e.g., a virtualenv, this kind of CMake configuration will override the user's configuration for the Python environment.

Is it really necessary? In my case, the system-wide Python didn't have numpy, but my active virtualenv had it, and it was not being found. By removing the definitions of the PYTHON_INCLUDE_DIR, PYTHON_EXECUTABLE:FILEPATH, and PYTHON_PACKAGES_PATH variables, it worked as expected.

Include version information in database

Ideally, a user who is looking at the benchmark results should be able to determine what version of each library was used for benchmarking. This might be as easy as grabbing the version information from the configuration, or it might be a little more difficult---for the 'performance over time' view, we may want to also update that to get the version at the time of each run and display it.

I'll handle this when I have time (unless someone else gets to it first).

wine data downloading is defective

Because of the wine_qual data, the wine data won't be downloaded automatically:

wine_qual*.csv mlpack.org/datasets/wine_qual.tar.gz
wine*.csv mlpack.org/datasets/wine.tar.gz

... because the glob matches the wine_qual data already. Remove the star?

Collect better error messages from failures

Sometimes failures can give us results like:

Traceback (most recent call last):
  File "/home/jenkins/workspace/pull-requests-benchmarks-benchmark/tests/benchmark_decision_tree.py", line 261, in test_RunMetrics
    self.assertTrue(result["Runtime"] > 0)
TypeError: 'int' object is not subscriptable

with stdout

[FATAL] Could not execute command:
orkspace/pull-requests-benchmarks-benchmark/libraries/bin/mlpack_decision_tree',
'-t', 'datasets/iris_train.csv', '-T', 'datasets/iris_test.csv', '-v', '-p',
'mlpack_dct_predict.csv'] The error is: Command
orkspace/pull-requests-benchmarks-benchmark/libraries/bin/mlpack_decision_tree',
'-t', 'datasets/iris_train.csv', '-T', 'datasets/iris_test.csv', '-v', '-p',
'mlpack_dct_predict.csv']' returned non-zero exit status -6

It would probably be helpful if we could provide the actual stdout or stderr output from the subprocess calls.

KMeans Weka failing

There doesn't seem to be any data on the site for any of Weka's KMEANS results. Is there an easy fix? I'd love to compare it!

Build fails and exits on Fedora 28 without python3.3

The build fails and stops on a vanilla Fedora 28 install:

> git clone https://github.com/mlpack/benchmarks.git
> cd benchmarks
> make
which: no python3.3 in (/usr/share/Modules/bin:/usr/lib64/ccache:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin)

Everything works after deleting the python3.3 line. The makefile is written as if it expects that python3.3 might not be installed, but then it fails and stops if python3.3 is not installed. This seems like a fairly major problem. I don't work with makefiles myself, so I can't really offer any suggestions.

MLP for MATLAB

I see that there is perceptron.m in the MATLAB folder, and I want to implement the MLP, i.e., mlp_forward.m and mlp_backward.m, for MATLAB. Is that fine? Or has it already been implemented?

Non-uniformity in the benchmarking scripts of Shogun

So, I was trying to write a benchmarking script for the Gaussian Process Classifier of Shogun and was looking through other benchmarking scripts to get an idea. In this process I stumbled upon a non-uniformity in the way the benchmarking scripts were written. I am new to this project, so pardon me if this question is naive.
Question 1) The decision_tree.py script throws an error if fewer than two datasets are given, but lda.py allows us to give one dataset. The first mandates that we perform prediction with the decision tree, but not with LDA. Was this a conscious choice? If so, why? Will this not matter in benchmarking?

Question 2) Shouldn't this line be if (len(self.dataset) > 1): ?

Include ELKI in the benchmarks

ELKI has much faster clustering implementations than Weka.
In particular, it includes some fast k-means variants.
It would be interesting to see how it competes in your benchmarks.

It's Java, but it can easily be launched from the command line. The GUI will assist in choosing the command-line parameters.

With -time it will output some timing information, and -resulthandler DiscardResultHandler will prevent writing the result to stdout. A basic KMeans command line looks like this:

java -jar elki-bundle-0.7.1.jar cli \
-dbc.in waveform.csv \
-algorithm clustering.kmeans.KMeansCompare -kmeans.k 2 \
-time -resulthandler DiscardResultHandler

and the output then is:

de.lmu.ifi.dbs.elki.datasource.FileBasedDatabaseConnection.load: 86 ms
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.initialization: de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.initialization.RandomlyChosenInitialMeans@6debcae2
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 667233.4919999999
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 10000
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 266349.30295619805
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 20001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 263882.95153705264
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 30001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 263825.90708396287
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 40001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 263814.94259269274
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 50001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 263812.85525821644
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 60001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 263811.528195029
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 70001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 263810.6052650743
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 80001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.variance-sum: 263810.4244548899
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.distance-computations: 90001
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.iterations: 8
de.lmu.ifi.dbs.elki.algorithm.clustering.kmeans.KMeansCompare.runtime: 83 ms

Of course it will not scale as well as a C++ application, but it will be much more competitive than Weka. I would appreciate it if you included it in your benchmarks. As you can see, it took about as long to cluster as it took to read the input file...

ELKI has several k-means variants. Sort, Compare, Elkan and Hamerly are the faster versions. Lloyd is usually not competitive with these.

Expanding the benchmark coverage of this repository for all the toolkits

@zoq @rcurtin I feel that this is a fantastic project where people can find out which ML toolkits are better for certain algorithms, and where the toolkits can improve themselves. So, I have been doing some work on my own that might be useful for this project. I have made a google sheet of the data I have been collecting in this regard. This google sheet contains:

  • the names of various machine learning and statistical analysis algorithms supported by the toolkits benchmarked in this repository
  • which libraries they are found in and which they aren't
  • the API classes or functions that correspond to the algorithms
  • which algorithms have benchmarks, and for which libraries those benchmarks are written.

I have so far covered all the algorithms provided by scikit-learn and mlpack, and I am in the process of adding all the algorithms provided by Shogun to this list. This is a work in progress; I am going to add more algorithms in the future and hopefully complete it.

In this regard, I have some questions:
a) Is the aim of this project limited to benchmarking the algorithms supported by mlpack? If not, I feel that having a sheet like this one would help. (I got the idea of consolidating all this in a google sheet after I saw one on TensorFlow's GitHub when they were making TensorFlow 2.0 and had to list all the API classes that needed some specific change.)
b) Also, would it be possible for contributors from mlpack to contribute to this sheet as well? I can give edit access. Currently, around 166 algorithms are already listed, with many more not yet covered, and I haven't yet gone through all the library APIs. Would appreciate the help :)

Benchmarks sometimes show negative execution time

When I try to benchmark some libraries, I sometimes get negative timing values. For instance, if I run make run BLOCK=shogun,scikit METHODBLOCK=KMEANS, I get this result:

       mlpack  matlab    scikit  mlpy  shogun  weka  elki  milk  dlibml
wine        -       -  0.000575     -      -2     -     -     -       -
iris        -       -  0.000505     -      -2     -     -     -       -

However, if I try other METHODBLOCKs, I get "correct" results. I searched GitHub issues for solutions but found nothing. What could be causing this? Did I miss configuring something?
I am running this on Debian 9.5, with Python 3.

Please include scikit-learn in the prerequisites

I thought I had installed all the prerequisites, but when running make setup:

[...]
byte-compiling ..//lib/python3.5/site-packages/mlpy/da.py to da.cpython-35.pyc
writing byte-compilation script '/tmp/tmp0e99ak1r.py'
/home/gut/venv/bin/python3 -OO /tmp/tmp0e99ak1r.py
removing /tmp/tmp0e99ak1r.py
running install_egg_info
Writing ..//lib/python3.5/site-packages/mlpy-3.5.0-py3.5.egg-info
Partial import of sklearn during the build process.
Traceback (most recent call last):
  File "setup.py", line 149, in get_scipy_status
    import scipy
ImportError: No module named 'scipy'
Traceback (most recent call last):
  File "setup.py", line 270, in <module>
    setup_package()
  File "setup.py", line 260, in setup_package
    .format(scipy_req_str, instructions))
ImportError: Scientific Python (SciPy) is not installed.
scikit-learn requires SciPy >= 0.9.
Installation instructions are available on the scikit-learn website: http://scikit-learn.org/stable/install.html

Partial import of sklearn during the build process.
Traceback (most recent call last):
  File "setup.py", line 149, in get_scipy_status
    import scipy
ImportError: No module named 'scipy'
Traceback (most recent call last):
  File "setup.py", line 270, in <module>
    setup_package()
  File "setup.py", line 260, in setup_package
    .format(scipy_req_str, instructions))
ImportError: Scientific Python (SciPy) is not installed.
scikit-learn requires SciPy >= 0.9.
Installation instructions are available on the scikit-learn website: http://scikit-learn.org/stable/install.html

Error installing scikit-learn!
Makefile:188: recipe for target '.setup' failed
make: *** [.setup] Error 1

A pip install scipy scikit-learn cython seems to solve this kind of issue. SWIG also needed to be installed (but it's not a Python module, so I sudo apt install'ed it).

Audit benchmarking configuration scripts

We need to take a look through the default configuration and ensure that each benchmarking algorithm has more than one library and a feasible (and sensible) set of datasets and options to go with it.

I will handle this one as I am able to.

This is in reference to mlpack/mlpack#1350.

No conversion possible and could not execute command error on default datasets for LSH

Hello,

I encountered an issue to which I do not know the solution, so I hope someone will be able to help me here.
I am trying to run the default benchmark on the LSH method of the mlpack library, and I get a No conversion possible error followed by Could not execute command. I tried running the default benchmarks on other methods as well, and I get similar results.

Am I missing something?

Full output of running the default benchmark on the LSH method of the mlpack library:

/usr/local/bin/python3 benchmark/run_benchmark.py -c config.yaml -b mlpack -l False -u False -m LSH --f "" --n False
[INFO ] CPU Model:  Intel(R) Core(TM) i5-5300U CPU @ 2.30GHz
[INFO ] Distribution: debian jessie/sid
[INFO ] Platform: x86_64
[INFO ] Memory: 7.50390625 GB
[INFO ] CPU Cores: 4
[INFO ] Method: LSH
[INFO ] Options: -k 3 -s 42
[INFO ] Library: mlpack
[INFO ] Dataset: wine
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: cloud
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: wine
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: isolet
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: corel-histogram
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: covtype
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: 1000000-10-randu
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: mnist
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: Twitter
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']
[INFO ] Dataset: tinyImages100k
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_lsh', '-r', '-v', '-k', '3', '-s', '42']

                    mlpack 
wine               failure 
cloud              failure 
isolet             failure 
corel-histogram    failure 
covtype            failure 
1000000-10-randu   failure 
mnist              failure 
Twitter            failure 
tinyImages100k     failure

Installation problem with mlpy

I get this error when I run make run:

/usr/bin/python3 run.py -c test.yaml -s "" -o "INFO"
run.py:35: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  base_param = list(yaml.load_all(stream))[0]
run.py:38: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  driver_param = list(yaml.load_all(stream))[0]
run.py:47: YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  for method in method_config:
[INFO] Script: methods/mlpy/pca.py
Traceback (most recent call last):
  File "run.py", line 115, in <module>
    run(args.config, args.lib, args.methods, args.loglevel)
  File "run.py", line 70, in run
    module = Loader.ImportModuleFromPath(values["script"])
  File "/home/ahmed/Desktop/benchmarks/util/util.py", line 54, in ImportModuleFromPath
    module = imp.load_module(modName, *tup)
  File "/usr/lib/python3.6/imp.py", line 235, in load_module
    return load_source(name, filename, file)
  File "/usr/lib/python3.6/imp.py", line 172, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 684, in _load
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "methods/mlpy/pca.py", line 18, in <module>
    import mlpy
ModuleNotFoundError: No module named 'mlpy'
Makefile:159: recipe for target '.run' failed
make: *** [.run] Error 1

Also, whenever I try to install mlpy using pip3, I get this error message:

(base) ahmed@ahmed-Lenovo-ideapad-520-15IKB:~/Desktop/benchmarks$ pip3 install mlpy
Collecting mlpy
  Using cached https://files.pythonhosted.org/packages/93/3c/be8ccff2aa3e5ce3b922cff026aadc62d3a671168e42616b1e0b8eccba12/mlpy-0.1.0.tar.gz
Collecting numpy>=1.6.2 (from mlpy)
  Using cached https://files.pythonhosted.org/packages/07/08/a549ba8b061005bb629b76adc000f3caaaf881028b963c2e18f811c6edc1/numpy-1.18.2-cp36-cp36m-manylinux1_x86_64.whl
Collecting scipy>=0.11 (from mlpy)
  Using cached https://files.pythonhosted.org/packages/dc/29/162476fd44203116e7980cfbd9352eef9db37c49445d1fec35509022f6aa/scipy-1.4.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting matplotlib (from mlpy)
  Downloading https://files.pythonhosted.org/packages/93/4b/52da6b1523d5139d04e02d9e26ceda6146b48f2a4e5d2abfdf1c7bac8c40/matplotlib-3.2.1-cp36-cp36m-manylinux1_x86_64.whl (12.4MB)
    100% |████████████████████████████████| 12.4MB 145kB/s 
Collecting scikit-learn (from mlpy)
  Using cached https://files.pythonhosted.org/packages/5e/d8/312e03adf4c78663e17d802fe2440072376fee46cada1404f1727ed77a32/scikit_learn-0.22.2.post1-cp36-cp36m-manylinux1_x86_64.whl
Collecting six>=1.9.0 (from mlpy)
  Using cached https://files.pythonhosted.org/packages/65/eb/1f97cb97bfc2390a276969c6fae16075da282f5058082d4cb10c6c5c1dba/six-1.14.0-py2.py3-none-any.whl
Collecting pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 (from matplotlib->mlpy)
  Using cached https://files.pythonhosted.org/packages/5d/bc/1e58593167fade7b544bfe9502a26dc860940a79ab306e651e7f13be68c2/pyparsing-2.4.6-py2.py3-none-any.whl
Collecting cycler>=0.10 (from matplotlib->mlpy)
  Using cached https://files.pythonhosted.org/packages/f7/d2/e07d3ebb2bd7af696440ce7e754c59dd546ffe1bbe732c8ab68b9c834e61/cycler-0.10.0-py2.py3-none-any.whl
Collecting python-dateutil>=2.1 (from matplotlib->mlpy)
  Using cached https://files.pythonhosted.org/packages/d4/70/d60450c3dd48ef87586924207ae8907090de0b306af2bce5d134d78615cb/python_dateutil-2.8.1-py2.py3-none-any.whl
Collecting kiwisolver>=1.0.1 (from matplotlib->mlpy)
  Using cached https://files.pythonhosted.org/packages/f8/a1/5742b56282449b1c0968197f63eae486eca2c35dcd334bab75ad524e0de1/kiwisolver-1.1.0-cp36-cp36m-manylinux1_x86_64.whl
Collecting joblib>=0.11 (from scikit-learn->mlpy)
  Using cached https://files.pythonhosted.org/packages/28/5c/cf6a2b65a321c4a209efcdf64c2689efae2cb62661f8f6f4bb28547cf1bf/joblib-0.14.1-py2.py3-none-any.whl
Collecting setuptools (from kiwisolver>=1.0.1->matplotlib->mlpy)
  Using cached https://files.pythonhosted.org/packages/70/b8/b23170ddda9f07c3444d49accde49f2b92f97bb2f2ebc312618ef12e4bd6/setuptools-46.0.0-py3-none-any.whl
Building wheels for collected packages: mlpy
  Running setup.py bdist_wheel for mlpy ... error
  Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-6tdvjugp/mlpy/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmpik6xwqw9pip-wheel- --python-tag cp36:
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.linux-x86_64-3.6
  creating build/lib.linux-x86_64-3.6/mlpy
  copying mlpy/__init__.py -> build/lib.linux-x86_64-3.6/mlpy
  running egg_info
  writing mlpy.egg-info/PKG-INFO
  writing dependency_links to mlpy.egg-info/dependency_links.txt
  writing requirements to mlpy.egg-info/requires.txt
  writing top-level names to mlpy.egg-info/top_level.txt
  reading manifest file 'mlpy.egg-info/SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  warning: no previously-included files matching '*.sdf' found under directory 'mlpy/libs'
  warning: no previously-included files matching '*.sln' found under directory 'mlpy/libs'
  warning: no previously-included files matching '*.suo' found under directory 'mlpy/libs'
  warning: no previously-included files matching '*.exp' found under directory 'mlpy/libs'
  warning: no previously-included files matching '*.ilk' found under directory 'mlpy/libs'
  warning: no previously-included files matching '*.lib' found under directory 'mlpy/libs'
  warning: no previously-included files matching '*pdb' found under directory 'mlpy/libs'
  warning: no previously-included files matching '__pycache__' found under directory '*'
  no previously-included directories found matching 'mlpy/libs/classifier/classifier'
  no previously-included directories found matching 'mlpy/libs/hmmc/hmmc'
  no previously-included directories found matching 'docs/build'
  no previously-included directories found matching 'docs/generated'
  writing manifest file 'mlpy.egg-info/SOURCES.txt'
  creating build/lib.linux-x86_64-3.6/mlpy/agents
  copying mlpy/agents/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/agents
  copying mlpy/agents/fsm.py -> build/lib.linux-x86_64-3.6/mlpy/agents
  copying mlpy/agents/modules.py -> build/lib.linux-x86_64-3.6/mlpy/agents
  copying mlpy/agents/world.py -> build/lib.linux-x86_64-3.6/mlpy/agents
  creating build/lib.linux-x86_64-3.6/mlpy/auxiliary
  copying mlpy/auxiliary/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
  copying mlpy/auxiliary/array.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
  copying mlpy/auxiliary/datasets.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
  copying mlpy/auxiliary/datastructs.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
  copying mlpy/auxiliary/io.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
  copying mlpy/auxiliary/misc.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
  copying mlpy/auxiliary/plotting.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
  creating build/lib.linux-x86_64-3.6/mlpy/cluster
  copying mlpy/cluster/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/cluster
  copying mlpy/cluster/vq.py -> build/lib.linux-x86_64-3.6/mlpy/cluster
  creating build/lib.linux-x86_64-3.6/mlpy/constants
  copying mlpy/constants/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/constants
  copying mlpy/constants/_constants.py -> build/lib.linux-x86_64-3.6/mlpy/constants
  creating build/lib.linux-x86_64-3.6/mlpy/environments
  copying mlpy/environments/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/environments
  copying mlpy/environments/gridworld.py -> build/lib.linux-x86_64-3.6/mlpy/environments
  copying mlpy/environments/nao.py -> build/lib.linux-x86_64-3.6/mlpy/environments
  creating build/lib.linux-x86_64-3.6/mlpy/environments/webots
  creating build/lib.linux-x86_64-3.6/mlpy/environments/webots/controllers
  creating build/lib.linux-x86_64-3.6/mlpy/environments/webots/controllers/serverc
  copying mlpy/environments/webots/controllers/serverc/serverc.py -> build/lib.linux-x86_64-3.6/mlpy/environments/webots/controllers/serverc
  creating build/lib.linux-x86_64-3.6/mlpy/experiments
  copying mlpy/experiments/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/experiments
  copying mlpy/experiments/task.py -> build/lib.linux-x86_64-3.6/mlpy/experiments
  creating build/lib.linux-x86_64-3.6/mlpy/knowledgerep
  copying mlpy/knowledgerep/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep
  creating build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
  copying mlpy/knowledgerep/cbr/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
  copying mlpy/knowledgerep/cbr/engine.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
  copying mlpy/knowledgerep/cbr/features.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
  copying mlpy/knowledgerep/cbr/methods.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
  copying mlpy/knowledgerep/cbr/similarity.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
  creating build/lib.linux-x86_64-3.6/mlpy/learners
  copying mlpy/learners/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/learners
  creating build/lib.linux-x86_64-3.6/mlpy/learners/offline
  copying mlpy/learners/offline/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/learners/offline
  copying mlpy/learners/offline/irl.py -> build/lib.linux-x86_64-3.6/mlpy/learners/offline
  creating build/lib.linux-x86_64-3.6/mlpy/learners/online
  copying mlpy/learners/online/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/learners/online
  copying mlpy/learners/online/rl.py -> build/lib.linux-x86_64-3.6/mlpy/learners/online
  creating build/lib.linux-x86_64-3.6/mlpy/libs
  copying mlpy/libs/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/libs
  copying mlpy/libs/classifier.pyd -> build/lib.linux-x86_64-3.6/mlpy/libs
  copying mlpy/libs/hmmc.pyd -> build/lib.linux-x86_64-3.6/mlpy/libs
  copying mlpy/libs/noconflict.py -> build/lib.linux-x86_64-3.6/mlpy/libs
  creating build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/array_helper.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/array_helper.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/c45tree.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/c45tree.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/classifier.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/classifier.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/classifier_module.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/classifier_module.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/coord.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/coord.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/random.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  copying mlpy/libs/classifier/random.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
  creating build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
  copying mlpy/libs/hmmc/hmm.c -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
  copying mlpy/libs/hmmc/hmm.h -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
  copying mlpy/libs/hmmc/hmmc_module.c -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
  copying mlpy/libs/hmmc/hmmc_module.h -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
  creating build/lib.linux-x86_64-3.6/mlpy/mdp
  copying mlpy/mdp/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
  copying mlpy/mdp/discrete.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
  copying mlpy/mdp/distrib.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
  copying mlpy/mdp/stateaction.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
  creating build/lib.linux-x86_64-3.6/mlpy/mdp/continuous
  copying mlpy/mdp/continuous/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/mdp/continuous
  copying mlpy/mdp/continuous/casml.py -> build/lib.linux-x86_64-3.6/mlpy/mdp/continuous
  creating build/lib.linux-x86_64-3.6/mlpy/modules
  copying mlpy/modules/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/modules
  copying mlpy/modules/patterns.py -> build/lib.linux-x86_64-3.6/mlpy/modules
  creating build/lib.linux-x86_64-3.6/mlpy/optimize
  copying mlpy/optimize/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/optimize
  copying mlpy/optimize/algorithms.py -> build/lib.linux-x86_64-3.6/mlpy/optimize
  copying mlpy/optimize/utils.py -> build/lib.linux-x86_64-3.6/mlpy/optimize
  creating build/lib.linux-x86_64-3.6/mlpy/planners
  copying mlpy/planners/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/planners
  copying mlpy/planners/discrete.py -> build/lib.linux-x86_64-3.6/mlpy/planners
  creating build/lib.linux-x86_64-3.6/mlpy/planners/explorers
  copying mlpy/planners/explorers/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/planners/explorers
  copying mlpy/planners/explorers/discrete.py -> build/lib.linux-x86_64-3.6/mlpy/planners/explorers
  creating build/lib.linux-x86_64-3.6/mlpy/search
  copying mlpy/search/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/search
  copying mlpy/search/informed.py -> build/lib.linux-x86_64-3.6/mlpy/search
  creating build/lib.linux-x86_64-3.6/mlpy/stats
  copying mlpy/stats/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/stats
  copying mlpy/stats/_conditional.py -> build/lib.linux-x86_64-3.6/mlpy/stats
  copying mlpy/stats/_discrete.py -> build/lib.linux-x86_64-3.6/mlpy/stats
  copying mlpy/stats/_multivariate.py -> build/lib.linux-x86_64-3.6/mlpy/stats
  copying mlpy/stats/_stats.py -> build/lib.linux-x86_64-3.6/mlpy/stats
  creating build/lib.linux-x86_64-3.6/mlpy/stats/dbn
  copying mlpy/stats/dbn/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/stats/dbn
  copying mlpy/stats/dbn/hmm.py -> build/lib.linux-x86_64-3.6/mlpy/stats/dbn
  creating build/lib.linux-x86_64-3.6/mlpy/stats/models
  copying mlpy/stats/models/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/stats/models
  copying mlpy/stats/models/_basic.py -> build/lib.linux-x86_64-3.6/mlpy/stats/models
  copying mlpy/stats/models/mixture.py -> build/lib.linux-x86_64-3.6/mlpy/stats/models
  creating build/lib.linux-x86_64-3.6/mlpy/tools
  copying mlpy/tools/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/tools
  copying mlpy/tools/configuration.py -> build/lib.linux-x86_64-3.6/mlpy/tools
  copying mlpy/tools/log.py -> build/lib.linux-x86_64-3.6/mlpy/tools
  copying mlpy/tools/misc.py -> build/lib.linux-x86_64-3.6/mlpy/tools
  running build_ext
  building 'classifier' extension
  creating build/temp.linux-x86_64-3.6
  creating build/temp.linux-x86_64-3.6/mlpy
  creating build/temp.linux-x86_64-3.6/mlpy/libs
  creating build/temp.linux-x86_64-3.6/mlpy/libs/classifier
  x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/home/ahmed/.local/lib/python3.6/site-packages/numpy/core/include -I/usr/include/python3.6m -c mlpy/libs/classifier/classifier_module.cc -o build/temp.linux-x86_64-3.6/mlpy/libs/classifier/classifier_module.o
  mlpy/libs/classifier/classifier_module.cc: In function ‘PyObject* initclassifier()’:
  mlpy/libs/classifier/classifier_module.cc:33:7: error: ‘Py_InitModule3’ was not declared in this scope
     m = Py_InitModule3("classifier", ClassifierMethods, "Classification module");
         ^~~~~~~~~~~~~~
  mlpy/libs/classifier/classifier_module.cc:33:7: note: suggested alternative: ‘Py_Initialize’
     m = Py_InitModule3("classifier", ClassifierMethods, "Classification module");
         ^~~~~~~~~~~~~~
         Py_Initialize
  mlpy/libs/classifier/classifier_module.cc:34:18: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
     if (m == NULL) return;
                    ^~~~~~
  mlpy/libs/classifier/classifier_module.cc:37:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
      return;
      ^~~~~~
  mlpy/libs/classifier/classifier_module.cc:42:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
      return;
      ^~~~~~
  mlpy/libs/classifier/classifier_module.cc:48:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
      return;
      ^~~~~~
  mlpy/libs/classifier/classifier_module.cc:53:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
      return;
      ^~~~~~
  mlpy/libs/classifier/classifier_module.cc:58:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
      return;
      ^~~~~~
  mlpy/libs/classifier/classifier_module.cc:63:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
      return;
      ^~~~~~
  mlpy/libs/classifier/classifier_module.cc:68:2: warning: control reaches end of non-void function [-Wreturn-type]
    }
    ^
  error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
  
  ----------------------------------------
  Failed building wheel for mlpy
  Running setup.py clean for mlpy
Failed to build mlpy
Installing collected packages: numpy, scipy, pyparsing, six, cycler, python-dateutil, setuptools, kiwisolver, matplotlib, joblib, scikit-learn, mlpy
  Running setup.py install for mlpy ... error
    Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-6tdvjugp/mlpy/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-jwzm9k94-record/install-record.txt --single-version-externally-managed --compile --user --prefix=:
    running install
    running build
    running build_py
    creating build
    creating build/lib.linux-x86_64-3.6
    creating build/lib.linux-x86_64-3.6/mlpy
    copying mlpy/__init__.py -> build/lib.linux-x86_64-3.6/mlpy
    running egg_info
    writing mlpy.egg-info/PKG-INFO
    writing dependency_links to mlpy.egg-info/dependency_links.txt
    writing requirements to mlpy.egg-info/requires.txt
    writing top-level names to mlpy.egg-info/top_level.txt
    reading manifest file 'mlpy.egg-info/SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    warning: no previously-included files matching '*.sdf' found under directory 'mlpy/libs'
    warning: no previously-included files matching '*.sln' found under directory 'mlpy/libs'
    warning: no previously-included files matching '*.suo' found under directory 'mlpy/libs'
    warning: no previously-included files matching '*.exp' found under directory 'mlpy/libs'
    warning: no previously-included files matching '*.ilk' found under directory 'mlpy/libs'
    warning: no previously-included files matching '*.lib' found under directory 'mlpy/libs'
    warning: no previously-included files matching '*pdb' found under directory 'mlpy/libs'
    warning: no previously-included files matching '__pycache__' found under directory '*'
    no previously-included directories found matching 'mlpy/libs/classifier/classifier'
    no previously-included directories found matching 'mlpy/libs/hmmc/hmmc'
    no previously-included directories found matching 'docs/build'
    no previously-included directories found matching 'docs/generated'
    writing manifest file 'mlpy.egg-info/SOURCES.txt'
    creating build/lib.linux-x86_64-3.6/mlpy/agents
    copying mlpy/agents/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/agents
    copying mlpy/agents/fsm.py -> build/lib.linux-x86_64-3.6/mlpy/agents
    copying mlpy/agents/modules.py -> build/lib.linux-x86_64-3.6/mlpy/agents
    copying mlpy/agents/world.py -> build/lib.linux-x86_64-3.6/mlpy/agents
    creating build/lib.linux-x86_64-3.6/mlpy/auxiliary
    copying mlpy/auxiliary/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
    copying mlpy/auxiliary/array.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
    copying mlpy/auxiliary/datasets.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
    copying mlpy/auxiliary/datastructs.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
    copying mlpy/auxiliary/io.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
    copying mlpy/auxiliary/misc.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
    copying mlpy/auxiliary/plotting.py -> build/lib.linux-x86_64-3.6/mlpy/auxiliary
    creating build/lib.linux-x86_64-3.6/mlpy/cluster
    copying mlpy/cluster/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/cluster
    copying mlpy/cluster/vq.py -> build/lib.linux-x86_64-3.6/mlpy/cluster
    creating build/lib.linux-x86_64-3.6/mlpy/constants
    copying mlpy/constants/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/constants
    copying mlpy/constants/_constants.py -> build/lib.linux-x86_64-3.6/mlpy/constants
    creating build/lib.linux-x86_64-3.6/mlpy/environments
    copying mlpy/environments/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/environments
    copying mlpy/environments/gridworld.py -> build/lib.linux-x86_64-3.6/mlpy/environments
    copying mlpy/environments/nao.py -> build/lib.linux-x86_64-3.6/mlpy/environments
    creating build/lib.linux-x86_64-3.6/mlpy/environments/webots
    creating build/lib.linux-x86_64-3.6/mlpy/environments/webots/controllers
    creating build/lib.linux-x86_64-3.6/mlpy/environments/webots/controllers/serverc
    copying mlpy/environments/webots/controllers/serverc/serverc.py -> build/lib.linux-x86_64-3.6/mlpy/environments/webots/controllers/serverc
    creating build/lib.linux-x86_64-3.6/mlpy/experiments
    copying mlpy/experiments/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/experiments
    copying mlpy/experiments/task.py -> build/lib.linux-x86_64-3.6/mlpy/experiments
    creating build/lib.linux-x86_64-3.6/mlpy/knowledgerep
    copying mlpy/knowledgerep/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep
    creating build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
    copying mlpy/knowledgerep/cbr/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
    copying mlpy/knowledgerep/cbr/engine.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
    copying mlpy/knowledgerep/cbr/features.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
    copying mlpy/knowledgerep/cbr/methods.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
    copying mlpy/knowledgerep/cbr/similarity.py -> build/lib.linux-x86_64-3.6/mlpy/knowledgerep/cbr
    creating build/lib.linux-x86_64-3.6/mlpy/learners
    copying mlpy/learners/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/learners
    creating build/lib.linux-x86_64-3.6/mlpy/learners/offline
    copying mlpy/learners/offline/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/learners/offline
    copying mlpy/learners/offline/irl.py -> build/lib.linux-x86_64-3.6/mlpy/learners/offline
    creating build/lib.linux-x86_64-3.6/mlpy/learners/online
    copying mlpy/learners/online/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/learners/online
    copying mlpy/learners/online/rl.py -> build/lib.linux-x86_64-3.6/mlpy/learners/online
    creating build/lib.linux-x86_64-3.6/mlpy/libs
    copying mlpy/libs/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/libs
    copying mlpy/libs/classifier.pyd -> build/lib.linux-x86_64-3.6/mlpy/libs
    copying mlpy/libs/hmmc.pyd -> build/lib.linux-x86_64-3.6/mlpy/libs
    copying mlpy/libs/noconflict.py -> build/lib.linux-x86_64-3.6/mlpy/libs
    creating build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/array_helper.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/array_helper.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/c45tree.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/c45tree.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/classifier.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/classifier.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/classifier_module.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/classifier_module.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/coord.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/coord.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/random.cc -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    copying mlpy/libs/classifier/random.h -> build/lib.linux-x86_64-3.6/mlpy/libs/classifier
    creating build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
    copying mlpy/libs/hmmc/hmm.c -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
    copying mlpy/libs/hmmc/hmm.h -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
    copying mlpy/libs/hmmc/hmmc_module.c -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
    copying mlpy/libs/hmmc/hmmc_module.h -> build/lib.linux-x86_64-3.6/mlpy/libs/hmmc
    creating build/lib.linux-x86_64-3.6/mlpy/mdp
    copying mlpy/mdp/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
    copying mlpy/mdp/discrete.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
    copying mlpy/mdp/distrib.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
    copying mlpy/mdp/stateaction.py -> build/lib.linux-x86_64-3.6/mlpy/mdp
    creating build/lib.linux-x86_64-3.6/mlpy/mdp/continuous
    copying mlpy/mdp/continuous/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/mdp/continuous
    copying mlpy/mdp/continuous/casml.py -> build/lib.linux-x86_64-3.6/mlpy/mdp/continuous
    creating build/lib.linux-x86_64-3.6/mlpy/modules
    copying mlpy/modules/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/modules
    copying mlpy/modules/patterns.py -> build/lib.linux-x86_64-3.6/mlpy/modules
    creating build/lib.linux-x86_64-3.6/mlpy/optimize
    copying mlpy/optimize/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/optimize
    copying mlpy/optimize/algorithms.py -> build/lib.linux-x86_64-3.6/mlpy/optimize
    copying mlpy/optimize/utils.py -> build/lib.linux-x86_64-3.6/mlpy/optimize
    creating build/lib.linux-x86_64-3.6/mlpy/planners
    copying mlpy/planners/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/planners
    copying mlpy/planners/discrete.py -> build/lib.linux-x86_64-3.6/mlpy/planners
    creating build/lib.linux-x86_64-3.6/mlpy/planners/explorers
    copying mlpy/planners/explorers/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/planners/explorers
    copying mlpy/planners/explorers/discrete.py -> build/lib.linux-x86_64-3.6/mlpy/planners/explorers
    creating build/lib.linux-x86_64-3.6/mlpy/search
    copying mlpy/search/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/search
    copying mlpy/search/informed.py -> build/lib.linux-x86_64-3.6/mlpy/search
    creating build/lib.linux-x86_64-3.6/mlpy/stats
    copying mlpy/stats/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/stats
    copying mlpy/stats/_conditional.py -> build/lib.linux-x86_64-3.6/mlpy/stats
    copying mlpy/stats/_discrete.py -> build/lib.linux-x86_64-3.6/mlpy/stats
    copying mlpy/stats/_multivariate.py -> build/lib.linux-x86_64-3.6/mlpy/stats
    copying mlpy/stats/_stats.py -> build/lib.linux-x86_64-3.6/mlpy/stats
    creating build/lib.linux-x86_64-3.6/mlpy/stats/dbn
    copying mlpy/stats/dbn/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/stats/dbn
    copying mlpy/stats/dbn/hmm.py -> build/lib.linux-x86_64-3.6/mlpy/stats/dbn
    creating build/lib.linux-x86_64-3.6/mlpy/stats/models
    copying mlpy/stats/models/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/stats/models
    copying mlpy/stats/models/_basic.py -> build/lib.linux-x86_64-3.6/mlpy/stats/models
    copying mlpy/stats/models/mixture.py -> build/lib.linux-x86_64-3.6/mlpy/stats/models
    creating build/lib.linux-x86_64-3.6/mlpy/tools
    copying mlpy/tools/__init__.py -> build/lib.linux-x86_64-3.6/mlpy/tools
    copying mlpy/tools/configuration.py -> build/lib.linux-x86_64-3.6/mlpy/tools
    copying mlpy/tools/log.py -> build/lib.linux-x86_64-3.6/mlpy/tools
    copying mlpy/tools/misc.py -> build/lib.linux-x86_64-3.6/mlpy/tools
    running build_ext
    building 'classifier' extension
    creating build/temp.linux-x86_64-3.6
    creating build/temp.linux-x86_64-3.6/mlpy
    creating build/temp.linux-x86_64-3.6/mlpy/libs
    creating build/temp.linux-x86_64-3.6/mlpy/libs/classifier
    x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/home/ahmed/.local/lib/python3.6/site-packages/numpy/core/include -I/usr/include/python3.6m -c mlpy/libs/classifier/classifier_module.cc -o build/temp.linux-x86_64-3.6/mlpy/libs/classifier/classifier_module.o
    mlpy/libs/classifier/classifier_module.cc: In function ‘PyObject* initclassifier()’:
    mlpy/libs/classifier/classifier_module.cc:33:7: error: ‘Py_InitModule3’ was not declared in this scope
       m = Py_InitModule3("classifier", ClassifierMethods, "Classification module");
           ^~~~~~~~~~~~~~
    mlpy/libs/classifier/classifier_module.cc:33:7: note: suggested alternative: ‘Py_Initialize’
       m = Py_InitModule3("classifier", ClassifierMethods, "Classification module");
           ^~~~~~~~~~~~~~
           Py_Initialize
    mlpy/libs/classifier/classifier_module.cc:34:18: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
       if (m == NULL) return;
                      ^~~~~~
    mlpy/libs/classifier/classifier_module.cc:37:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
        return;
        ^~~~~~
    mlpy/libs/classifier/classifier_module.cc:42:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
        return;
        ^~~~~~
    mlpy/libs/classifier/classifier_module.cc:48:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
        return;
        ^~~~~~
    mlpy/libs/classifier/classifier_module.cc:53:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
        return;
        ^~~~~~
    mlpy/libs/classifier/classifier_module.cc:58:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
        return;
        ^~~~~~
    mlpy/libs/classifier/classifier_module.cc:63:4: error: return-statement with no value, in function returning ‘PyObject* {aka _object*}’ [-fpermissive]
        return;
        ^~~~~~
    mlpy/libs/classifier/classifier_module.cc:68:2: warning: control reaches end of non-void function [-Wreturn-type]
      }
      ^
    error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
    
    ----------------------------------------
Command "/usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-6tdvjugp/mlpy/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-jwzm9k94-record/install-record.txt --single-version-externally-managed --compile --user --prefix=" failed with error code 1 in /tmp/pip-build-6tdvjugp/mlpy/



```
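
For reference, the failure above is not machine-specific: Py_InitModule3 belongs to the Python 2 C API and was removed in Python 3, where extension modules are initialized with PyModule_Create() inside a PyMODINIT_FUNC PyInit_&lt;name&gt;() function. A minimal, hypothetical guard that an install script could use to fail early instead of dying mid-compile:

```python
# Hypothetical pre-flight check for an install script: mlpy's C extension
# modules call Py_InitModule3, which exists only in the Python 2 C API,
# so any Python 3 interpreter fails to compile them as shown above.
import sys

if sys.version_info[0] >= 3:
    sys.exit("mlpy's extensions target the Python 2 C API; build with "
             "Python 2.7, or port the modules to PyModule_Create().")
```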

A contributing doc is necessary.

Hello,
I am currently working on mlpack_decision_tree.py and plan to finish it as soon as possible. But when I look at methods/mlpack/decision_tree_stump.py and the other benchmark files, I find that the RunMemory logic in methods/mlpack/decision_tree_stump.py differs from that in the other files.
I am confused about the return values of the functions RunMemory and RunMetrics. Can someone explain them?
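
As far as can be inferred from the test suite and the result tables elsewhere in this document, the convention seems to be that RunMetrics and RunMemory return a dict mapping metric names to values on success and a plain integer error code on failure, so callers have to check the type first. A hypothetical sketch of that shape, with illustrative names and numbers:

```python
# Hypothetical sketch of the apparent Run* convention; the function name
# and metric keys are illustrative, inferred from result["Runtime"] in
# tests/benchmark_ann.py and the integer codes in the result tables.
def run_metrics_sketch(fail=False):
    if fail:
        return -1                          # plain integer error code on failure
    return {"Runtime": 0.42, "ACC": 0.97}  # dict of metrics on success

result = run_metrics_sketch()
if isinstance(result, dict):               # callers must check the type
    print(result["Runtime"])
else:
    print("run failed with code", result)
```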

MRPT ANN test occasionally fails

I've noticed that the ANN test for MRPT fails intermittently:

$ env # filtered for only the necessary stuff
LD_LIBRARY_PATH=libraries/lib/
VALGRIND_BIN=/usr/bin/valgrind
ANN_PATH=methods/ann/
FLANN_PATH=methods/flann/
MS_PRINT_BIN=/usr/bin/ms_print
MATLABPATH=/home/ryan/src/benchmarks-rc/methods/matlab
DEBUGBINPATH=libraries/mlpack/build-debug/bin/
PYTHONPATH=libraries/lib/python3.5/site-packages:libraries/lib/python3.5/dist-packages
JAVAPATH=libraries/share/
MATLAB_BIN=/opt/matlab/bin/
BINPATH=libraries/bin/
{ ryan@slake }{ ~/src/benchmarks-rc } $ python3.5 tests/benchmark_ann.py
.[INFO ] Perform Approximate Nearest Neighbours.
[INFO ] Loading dataset
..[INFO ] Perform Approximate Nearest Neighbours.
[INFO ] Loading dataset
E.[INFO ] Perform Approximate Nearest Neighbours.
[INFO ] Loading dataset
..[INFO ] Perform ANN.
[INFO ] Loading dataset
.
======================================================================
ERROR: test_RunMetrics (__main__.ANN_MRPT_TEST)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "tests/benchmark_ann.py", line 78, in test_RunMetrics
    self.assertTrue(result["Runtime"] > 0)
TypeError: 'int' object is not subscriptable

----------------------------------------------------------------------
Ran 8 tests in 8.966s

FAILED (errors=1)
{ ryan@slake }{ ~/src/benchmarks-rc } $ python3.5 tests/benchmark_ann.py
.[INFO ] Perform Approximate Nearest Neighbours.
[INFO ] Loading dataset
..[INFO ] Perform Approximate Nearest Neighbours.
[INFO ] Loading dataset
..[INFO ] Perform Approximate Nearest Neighbours.
[INFO ] Loading dataset
..[INFO ] Perform ANN.
[INFO ] Loading dataset
.
----------------------------------------------------------------------
Ran 8 tests in 6.116s

OK

@Iron-Stark: could you look into this one please? Maybe there is some issue with the implementation. We can see the issue is also happening on the benchmark nodes: http://masterblaster.mlpack.org/job/benchmark%20-%20checkout%20-%20all%20nodes/label=savannah/90/testReport/

All the information I gave above (the environment variables especially) should be helpful enough for you to reproduce the issue; let me know if I can help out with that.
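
The traceback above points at the likely cause: on the failing runs, RunMetrics evidently returns an integer error code instead of a dict, so result["Runtime"] raises TypeError: 'int' object is not subscriptable. A hypothetical, standalone tightening of the assertion that would surface the error code instead of the confusing TypeError:

```python
import unittest

class ANNMRPTSketch(unittest.TestCase):
    """Hypothetical stand-in for the flaky check in tests/benchmark_ann.py."""

    def run_metrics(self):
        # Stand-in for the real MRPT RunMetrics call; on the intermittent
        # failure path it would return an integer code instead of a dict.
        return {"Runtime": 1.0}

    def test_run_metrics(self):
        result = self.run_metrics()
        # Checking the type first produces a readable assertion message
        # whenever an error code comes back instead of a metrics dict.
        self.assertIsInstance(result, dict,
                              "RunMetrics failed with code %r" % (result,))
        self.assertGreater(result["Runtime"], 0)

if __name__ == "__main__":
    unittest.main()
```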

Difficulties of setting up local benchmark environment

Hi,

I've been trying to set up the benchmarks on my laptop, but unfortunately I hit several errors and ended up running the install scripts one by one.

Currently, I'm trying to install mlpy and shogun, using mlpy_install.sh and shogun_install.sh. I get the following errors.

May I ask what the spec of an ideal environment is? For example, which Python version should I use?

can't run benchmarks

Hi,
whenever I try to run make run BLOCK=mlpack METHODBLOCK=KMEANS or any other benchmark,
I get the following errors:

/usr/bin/python3 benchmark/run_benchmark.py -c config.yaml -b mlpack -l False -u False -m KMEANS --f "" --n False -r "" -p ""
[WARN ] No module named simplejson
[INFO ] CPU Model: Intel(R) Core(TM) i5-5250U CPU @ 1.60GHz
[INFO ] Distribution: Ubuntu 16.04
[INFO ] Platform: x86_64
[INFO ] Memory: 7.7080078125 GB
[INFO ] CPU Cores: 4
[INFO ] Method: KMEANS
[INFO ] Options: -c 5
[INFO ] Library: mlpack
[INFO ] Dataset: cloud
[FATAL] No conversion possible.
[FATAL] Could not execute command: ['mlpack_kmeans', '-h']
[FATAL] Could not execute command: ['mlpack_kmeans', '-i', 'datasets/cloud.csv', '-I',
'-o', 'output.csv', '-v', '-c', '5']

        mlpack  matlab  scikit  mlpy  shogun  weka
cloud   -2      -       -       -     -       -

Can anyone please suggest how to solve this?
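
Those two FATAL lines mean the runner could not even execute mlpack_kmeans -h, i.e. the mlpack command-line programs are not on the PATH the runner sees (compare the BINPATH=libraries/bin/ setting in the environment dump further up). A small, hypothetical pre-flight check:

```python
# Hypothetical pre-flight check: the "Could not execute command" messages
# above appear whenever the mlpack CLI binaries cannot be found, so it is
# worth confirming their location before launching the full suite.
import shutil

binary = shutil.which("mlpack_kmeans")
if binary is None:
    print("mlpack_kmeans not found; add the directory containing the "
          "mlpack binaries (e.g. libraries/bin/) to PATH.")
else:
    print("found", binary)
```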

Weka not being called correctly

After installing everything, I noticed that the downloaded Weka version was 3.8.1 (libraries/weka/README).

When I run the suite as make run BLOCK=weka METHODBLOCK=KMEANS, I get the following error:

[FATAL] Can't parse the data: wrong format

A deeper check narrowed it down to this explicit call:

$ java -classpath /home/gut/benchmarks/libraries/share//weka.jar:methods/weka KMeans -i d -c 6
This program performs K-Means clustering on the given dataset.

Required options:
-c [int]         Number of clusters to find.
-i [string]      Input dataset to perform clustering on.

Options:
-m [int]         Maximum number of iterations before K-Means
                 terminates. Default value 1000.
-s [int]         Random seed.

It didn't work because the regex \n .*?total_time: (?P<total_time>.*?)s.*?\n didn't match. (In fact, Weka didn't run at all; it just exited after printing the "help" output shown above.)

Did I miss something?
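
That matches the output shown: when the wrapper exits with only its help text, there is no total_time: line for the suite's regex to find, which produces the "Can't parse the data: wrong format" error. A hypothetical illustration of the parse, using the pattern quoted above:

```python
# Hypothetical illustration of the timing parse: when the Weka wrapper
# prints only its help text, re.search() returns None and the suite
# reports "Can't parse the data: wrong format".
import re

pattern = r"\n .*?total_time: (?P<total_time>.*?)s.*?\n"  # regex quoted above

good = "\n kmeans finished.\n total_time: 0.172063s\n"    # made-up output
bad = "This program performs K-Means clustering on the given dataset.\n"

for output in (good, bad):
    match = re.search(pattern, output)
    print(match.group("total_time") if match else "no match -> parse error")
```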

Allow printing of all the metrics

As of now, the benchmarks only write timings to the output as the programs execute, but ideally the output should include all the metrics (timings, accuracy, etc.) that are already measured in each method's benchmarking script.
Currently the bash output looks like this (it only shows timings):

            mlpack    shogun
iris        0.008097  0.008097
ionosphere  0.172063  0.172063

But I think it should be something like this (it need not match exactly):

Time        Mlpack    Shogun
iris        0.008097  0.008097
ionosphere  0.172063  0.172063

Accuracy    Mlpack    Shogun
iris        96.6667   96.6667
ionosphere  97.3245   97.3245
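
A hypothetical sketch of how the runner could print such per-metric blocks from a nested results structure (the numbers are the illustrative ones from the mock-up above, not real measurements):

```python
# Hypothetical sketch of the proposed per-metric output; the results dict
# and its numbers are illustrative, taken from the mock-up in this issue.
results = {
    "Time":     {"iris": (0.008097, 0.008097), "ionosphere": (0.172063, 0.172063)},
    "Accuracy": {"iris": (96.6667, 96.6667),   "ionosphere": (97.3245, 97.3245)},
}

for metric, rows in results.items():
    print("%-12s %-10s %-10s" % (metric, "Mlpack", "Shogun"))
    for dataset, (mlpack_val, shogun_val) in rows.items():
        print("%-12s %-10s %-10s" % (dataset, mlpack_val, shogun_val))
    print()
```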

Enhancements to benchmarking

Hi,

I was thinking of adding some features and simplifying the benchmarking tool into a single-line command-line interface.

For example:

  1. mlpack-benchmark -ann -mlpack -github=#1123123
  2. mlpack-benchmark -ann -shogun -github=#522323
  3. mlpack-benchmark -ann -mlpack

The following should be the parameters to this system:

  1. The algorithm (-ann, -kmeans, -cnn)
  2. The ML library (-mlpack, -shogun, -scikit)
  3. GitHub commit (optional)
  4. Local path to uncommitted library (-local=/some/path)
  5. Which data sets to train on (-datasets=wine,iris)

The output should be the time it took to train on each data set, the error rate, and more specific output on the algorithm itself (MSE, average time per epoch, etc.).

This enhancement should also allow users to specify which commit of the library they want to benchmark. Alternatively, they could point it at a local checkout to easily test uncommitted changes. Moreover, the program would automatically download and build the source of the library if required.
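
A hypothetical argparse sketch of the proposed interface; none of these flags exist yet, they simply mirror the parameter list above:

```python
# Hypothetical sketch of the proposed single-command interface; every flag
# mirrors the parameter list in this proposal rather than any real tool.
import argparse

parser = argparse.ArgumentParser(prog="mlpack-benchmark")
parser.add_argument("--algorithm", choices=["ann", "kmeans", "cnn"], required=True)
parser.add_argument("--library", choices=["mlpack", "shogun", "scikit"], required=True)
parser.add_argument("--github", help="library commit to check out and benchmark")
parser.add_argument("--local", help="path to an uncommitted local checkout")
parser.add_argument("--datasets", default="wine,iris",
                    help="comma-separated data sets to train on")

args = parser.parse_args(["--algorithm", "ann", "--library", "mlpack"])
print(args)
```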
