MLP and Generalized RBF Network

University project developed within the scope of the Optimization Methods for Machine Learning course.

The goal of this project was to implement neural networks to solve a regression problem: reconstructing a two-dimensional function on the region [−2, 2] × [−1, 1], given only a data set of 300 randomly sampled points. An image of the original function was provided.

The project consists of the following three questions:

  • Question 1 - Full minimization

In this part, two shallow feedforward neural networks (FNNs with one hidden layer) were implemented: an MLP and a generalized RBF network. The hyper-parameters $\pi$ of the network were chosen by means of a heuristic procedure, and the parameters $\omega$ were set by minimizing the regularized training error: $$ E(\omega; \pi) = \frac{1}{2P} \sum_{p=1}^P \left(f(x^p) - y^p\right)^2 + \rho \lVert \omega \rVert^2 $$
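A minimal NumPy sketch of this objective for a one-hidden-layer tanh MLP (the architecture, the flat parameter packing, and all names below are illustrative assumptions, not the project's exact code):

```python
import numpy as np

def mlp_forward(X, W, b, v):
    """Shallow MLP: f(x) = v^T tanh(W x + b).
    X: (P, d) inputs, W: (N, d) hidden weights, b: (N,) biases, v: (N,) output weights."""
    return np.tanh(X @ W.T + b) @ v

def regularized_error(omega, X, y, N, rho):
    """E(omega; pi) = 1/(2P) * sum_p (f(x^p) - y^p)^2 + rho * ||omega||^2,
    with all parameters packed into the flat vector omega."""
    P, d = X.shape
    W = omega[:N * d].reshape(N, d)
    b = omega[N * d:N * d + N]
    v = omega[N * d + N:]
    r = mlp_forward(X, W, b, v) - y
    return r @ r / (2 * P) + rho * (omega @ omega)
```

Full minimization then means optimizing this function jointly over every entry of $\omega$, for example with a quasi-Newton routine such as `scipy.optimize.minimize`.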

  • Question 2 - Two block methods

As in the previous part, two learning machines were considered: the MLP and the RBF network. Now the hyper-parameters are fixed to the best values found with the grid search (Extreme Learning), and the parameters are divided into two blocks, as sketched below.
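With the hidden-layer block fixed, the regularized error is a convex quadratic in the output weights, so that block can be computed exactly by solving a linear system. A sketch for the Gaussian RBF case (the feature definition and names are assumptions; the project may select centers differently):

```python
import numpy as np

def rbf_features(X, centers, sigma):
    """Gaussian RBF features: phi_j(x) = exp(-||x - c_j||^2 / sigma^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / sigma**2)

def fit_output_weights(X, y, centers, sigma, rho):
    """With centers and sigma fixed, setting the gradient of E to zero in the
    output weights v gives the normal equations
    (Phi^T Phi / P + 2*rho*I) v = Phi^T y / P, solved exactly."""
    P = X.shape[0]
    Phi = rbf_features(X, centers, sigma)
    A = Phi.T @ Phi / P + 2 * rho * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y / P)
```

For the MLP, the analogous step fixes randomly drawn hidden-layer weights (the Extreme Learning idea) and solves the same kind of least-squares problem for the output layer.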

  • Question 3 - Decomposition method

This time, only the MLP is considered, with the parameters divided into two blocks as in the previous question. In this case, however, no block is kept fixed: the method alternates, finding the optimal value of one block while holding the other one fixed.
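A sketch of such a two-block decomposition for the MLP (tanh units, a closed-form solve for the convex block, and a few L-BFGS steps for the nonconvex block are assumptions; the project may use different subproblem solvers):

```python
import numpy as np
from scipy.optimize import minimize

def hidden(X, W, b):
    """Hidden-layer output of a shallow tanh MLP."""
    return np.tanh(X @ W.T + b)

def two_block_decomposition(X, y, N, rho, outer_iters=20, seed=0):
    P, d = X.shape
    rng = np.random.default_rng(seed)
    W, b = rng.standard_normal((N, d)), rng.standard_normal(N)
    v = np.zeros(N)
    for _ in range(outer_iters):
        # Block 1: with (W, b) fixed, v solves a convex least-squares problem.
        H = hidden(X, W, b)
        v = np.linalg.solve(H.T @ H / P + 2 * rho * np.eye(N), H.T @ y / P)

        # Block 2: with v fixed, take a few quasi-Newton steps on (W, b).
        def err(theta):
            Wt, bt = theta[:N * d].reshape(N, d), theta[N * d:]
            r = hidden(X, Wt, bt) @ v - y
            return r @ r / (2 * P) + rho * (theta @ theta + v @ v)

        theta0 = np.concatenate([W.ravel(), b])
        theta = minimize(err, theta0, method="L-BFGS-B",
                         options={"maxiter": 30}).x
        W, b = theta[:N * d].reshape(N, d), theta[N * d:]
    return W, b, v
```

Each outer iteration cannot increase the objective, since every block update either minimizes or at least decreases $E$ with the other block held fixed.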

A more detailed description of the work done and of the results can be found in report.pdf.

Code organization

The code is organized according to the professor's rules: a separate folder had to be created for each question, containing two files (example for question 1.1):

  • a file called run_11_DaJoLi.py that contains the code executed to solve the specific question
  • a file with the complete code, including all the functions called from the run_11_DaJoLi.py file.

Short summary of results

[summary table image]

where the hyper-parameters are:

  • N - the number of neurons in the hidden layer
  • σ - the spread in the activation function
  • ρ - the regularization parameter.

Image of the best prediction

[best prediction image]

Image of the original function

[original function image]
