
Linear-Model-Evaluation - Matlab-Deep-Inverse-Optimization

Framework to pass parameters between Matlab and Python in order to run Deep Inverse Optimization, which is not available in the Matlab Optimization Toolbox. The Deep Inverse Optimization code can be found here (https://github.com/tankconcordia/deep_inv_opt); it is described as an efficient way to tune parameters, which is necessary to evaluate and find correlations between empirical and optimized forward models in Neuroscience.

This repository can pass any parameter from Matlab to the Python code described in the Deep Inverse Optimization repository.
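The parameter hand-off from Matlab to Python can be pictured as follows. This is a hypothetical sketch, not this repository's code: it assumes Matlab invokes a Python script via a system call such as `system('python run_opt.py 40 20')`, passing the parameters as command-line strings; the script name and argument layout are illustrative.

```python
def parse_matlab_args(argv):
    """Convert the string arguments passed from Matlab (e.g. via sys.argv[1:])
    into typed values usable by the Python optimization code."""
    n_points = int(argv[0])  # number of random input/target points (N)
    n_iters = int(argv[1])   # training iterations for deep_inv_opt
    return n_points, n_iters

# Example: the strings "40" and "20" stand in for the arguments Matlab passes.
n_points, n_iters = parse_matlab_args(["40", "20"])
print(n_points, n_iters)  # 40 20
```

Any exchange mechanism (command-line arguments, a temporary file, or the MATLAB Engine API) would follow the same pattern: serialize on the Matlab side, parse into typed values on the Python side.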

Requirements

  • Miniconda 3 or Anaconda 3
  • Matlab >= R2019a
  • Python 3.7
  • Pytorch
  • deep_inv_opt
  • numpy
  • matplotlib

Follow these steps before running the main Matlab code:

1. Install Anaconda or Miniconda 3 following the instructions documented in this link (https://docs.conda.io/projects/conda/en/latest/user-guide/install/windows.html). For Unix or Mac, follow these instructions (https://docs.conda.io/projects/conda/en/latest/user-guide/install/linux.html) or (https://docs.conda.io/projects/conda/en/latest/user-guide/install/macos.html).

2. From the Anaconda prompt on Windows or Unix, install the required packages from the requirements.txt file included in this repository with the following pip command:

   pip install -r requirements.txt

(This will take some time; make sure you have 5 GB available on your hard drive for the Conda installation.)

3. After the main requirements are installed, install Deep Inverse Optimization (https://github.com/tankconcordia/deep_inv_opt). Download the repository and set up the code as part of the environment, as suggested in that repository:

   python setup.py develop

(Run this in the deep_inv_opt main directory, where the file setup.py is located.)

Once the Deep Inverse Optimization repository is set up in Anaconda, you can go to the Matlab prompt and run the main example, containing a linear model optimization, with the following command from the directory where you have placed the .m files from this repository:

   matlab_interface_python(40,20);

The previous command creates an input vector of N=40 random points and N=40 target points drawn from a uniform distribution between 0 and 1. You can modify the model in Matlab; after you run the command, it proceeds with 20 training iterations of deep_inv_opt. The final output in the Matlab command prompt will look like this:

     inverse_parametric_linprog[0001]: loss=0.101617 weights=[2.8000]
     inverse_parametric_linprog[0002]: loss=0.081102 weights=[2.9568]
     inverse_parametric_linprog[0003]: loss=0.078752 weights=[3.0901]
     inverse_parametric_linprog[0004]: loss=0.062718 weights=[3.1193]
     inverse_parametric_linprog[0005]: loss=0.062531 weights=[3.1179]
     inverse_parametric_linprog[0006]: loss=0.061715 weights=[3.1177]
     inverse_parametric_linprog[0007]: loss=0.061681 weights=[3.1177]
     inverse_parametric_linprog[0008]: loss=0.061229 weights=[3.1177]
     inverse_parametric_linprog[0009]: loss=0.061196 weights=[3.1177]
     inverse_parametric_linprog[0010]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0011]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0012]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0013]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0014]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0015]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0016]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0017]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0018]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0019]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[0020]: loss=0.061193 weights=[3.1177]
     inverse_parametric_linprog[done]: loss=0.061193 weights=[3.1177]
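The data described above can be sketched on the Python side as follows. This is a minimal illustration of what `matlab_interface_python(40, 20)` is described as producing, not the repository's actual code: N = 40 input points and N = 40 target points drawn uniformly from [0, 1), with 20 training iterations then run by deep_inv_opt.

```python
import numpy as np

N = 40        # number of input and target points
n_iters = 20  # deep_inv_opt training iterations (consumed downstream)

rng = np.random.default_rng(0)        # fixed seed for reproducibility
inputs = rng.uniform(0.0, 1.0, size=N)   # random input vector
targets = rng.uniform(0.0, 1.0, size=N)  # empirical target set-points

print(inputs.shape, targets.shape)  # (40,) (40,)
```

In the actual pipeline these vectors are generated in Matlab and handed to the Python code, which feeds them to `inverse_parametric_linprog`; the per-iteration loss and weight values are then printed back as shown above.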

For this particular model, we want to perform an inverse optimization on the parameter w of a Bayesian model, described in the denominator of the following equation:

This Equation can be re-written as:

And we can define the ratio as u.

Now, if we define a vector of optimal empirical set-points or targets that the model should follow, we can optimize the model described in the first equation with the following assignment.

And for the sake of the optimization, we can define the matrix of equalities of our current model as follows:

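The equality constraints referred to above enter the linear program as a matrix equation A_eq x = b_eq. Since the model's actual matrix appears as an image in the original README, the example below is purely illustrative: a made-up 2-variable system showing how equality constraints are encoded and checked.

```python
import numpy as np

# Hypothetical equality constraints on x = (x1, x2); NOT the model's matrix.
A_eq = np.array([[1.0, 1.0],
                 [1.0, -1.0]])   # rows: x1 + x2 = 1 and x1 - x2 = 0
b_eq = np.array([1.0, 0.0])

# With a square, invertible A_eq we can solve the system directly.
x = np.linalg.solve(A_eq, b_eq)
print(x)  # [0.5 0.5]
```

In the general linear-programming case the solver receives A_eq and b_eq as constraints rather than solving them directly, and deep_inv_opt tunes the weights so the resulting solutions track the targets.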
The final plot and the output of this particular example are shown in the image below:

results linear model inverse optimization

The blue points are the values modified by w to follow the targets, i.e. the larger black points drawn from the random distributions. The loss decreases from 0.1764 to 0.06868.

Contributors

dependabot[bot], meiyor
