
rsnns's People

Contributors

cbergmeir, sshlien, vspinu

rsnns's Issues

Activation functions: ReLU

Hi. I am making use of Elman neural networks.

I want to use the ReLU function in my networks, but there is no ReLU activation function in RSNNS.

Which options could I use if I want a ReLU activation function?

If it is not available, could I open a pull request adding one? (I don't know how to build and test my C++ ReLU code against the package, or how the dependencies fit together.)

Please let me know; I would really appreciate your help.
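As a quick check, here is a sketch that lists the activation functions the bundled SNNS kernel actually exposes (there is no ReLU among them), using RSNNS's low-level getSnnsRFunctionTable(); the column layout of the returned table is an assumption:

library(RSNNS)

# List the SNNS kernel's built-in functions and keep only the activation
# functions; the first column is assumed to hold the function name.
tab <- getSnnsRFunctionTable()
tab[grepl("^Act_", tab[[1]]), ]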

RBF sites/links handling

krui_getSiteName chokes when sites are used within the network, at least with RadialBasisLearning, due to incomplete link handling. A fix for learn_f.cpp:3724 follows:

if (NetInitialize || LearnFuncHasChanged) {
    FOR_ALL_UNITS(unit_ptr) {
        if (UNIT_HAS_SITES(unit_ptr)) {
            /* also reset value_a on links reached through sites */
            FOR_ALL_SITES_AND_LINKS(unit_ptr, site_ptr, link_ptr) {
                link_ptr->value_a = 0.0;
            }
        } else {
            FOR_ALL_LINKS(unit_ptr, link_ptr) {
                link_ptr->value_a = 0.0;
            }
        }
    }
}

Also, the following pointer operations in kr_mem.cpp look suspicious. Should post-increments be used instead?

tmp_ptr = ++link_array;
tmp_ptr = ++site_array;

R crashes upon running mlp()

Hi,

Can you look into the following issue?

When I run mlp() I get the error "terminate called after throwing an instance of 'Rcpp::not_compatible'", followed by a crash of R.

Thx.
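Not a fix for the crash itself, but as a sketch: Rcpp "not compatible" exceptions are often triggered by non-numeric input (e.g. a data frame containing factor or character columns), so it may be worth coercing everything to numeric matrices before the call. The data below is only illustrative:

library(RSNNS)

# Illustrative data; mlp() expects numeric inputs and targets.
x <- data.matrix(iris[, 1:4])           # coerce to a numeric matrix
y <- decodeClassLabels(iris$Species)    # factor labels -> 0/1 target matrix
model <- mlp(x, y, size = 5, maxit = 50)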

Different results using Windows or Linux

Hello,

I would like to know why the results differ between Windows and Linux.

library(RSNNS)

set.seed(1)

data = matrix(rnorm(300), ncol = 3)
model = mlp(x = data[1:95, 1:2], y = data[1:95, 3], size = 5)
pred = predict(model, data[96:100, 1:2])
pred
# Results using Windows:
#           [,1]
# 96  0.02506411
# 97  0.01911207
# 98  0.02082419
# 99  0.01711116
# 100 0.01743222

# Results using Linux:
#           [,1]
# 96  0.02582934
# 97  0.01976289
# 98  0.02150856
# 99  0.01771399
# 100 0.01804006

The R versions and the versions of package RSNNS are the same.

Best regards,
Marius
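One thing worth checking, as a sketch: besides R's RNG, the SNNS kernel keeps its own seed, which setSnnsRSeedValue() controls (in newer RSNNS versions this may be deprecated in favour of set.seed() seeding the kernel as well). Fixing both makes runs reproducible on one machine, but bit-identical numbers across Windows and Linux are still not guaranteed, since the compiled kernel and math libraries differ:

library(RSNNS)

set.seed(1)             # R-side RNG (data generation)
setSnnsRSeedValue(1)    # SNNS kernel RNG (weight initialization)

data <- matrix(rnorm(300), ncol = 3)
model <- mlp(x = data[1:95, 1:2], y = data[1:95, 3], size = 5)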

Where is the old `loadNet()`?

I'm using the SNNS function loadNet(netfile) to recover network state. I can't find any equivalent in RSNNS.
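A possible direction, strictly as a sketch: the low-level SnnsR interface wraps the SNNS kernel (krui) calls, so the old loadNet may be reachable there. The method name and its handling below are assumptions, not a documented high-level API:

library(RSNNS)

# Assumed low-level wrapper of krui_loadNet(); verify against your RSNNS version.
snnsObject <- SnnsRObjectFactory()
snnsObject$loadNet("mynet.net")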

Loading *.pat files

Using the data <- readPatFile(filename) function takes very long. For example, loading a file with 45,000 records takes 1-2 seconds with the SNNS loadPattern(filename) function, but about 30 minutes with RSNNS. Why?

Is it possible to use the RSNNS library to perform RBF interpolation when the target output is longer than the input?

There are 4 columns in "inputs": "fitted", which holds a value for each (surf_x, surf_y, surf_z) coordinate, and surf_x, surf_y, surf_z themselves. The rows of "outputs" likewise represent coordinates in 3D space.

The goal is to use RBF interpolation to find the "fitted" value corresponding to the coordinates in "outputs", where all values of surf_x in "inputs" can be found in column [,1] of "outputs".

library(RSNNS)

> dim(inputs)
[1] 2312    4

> head(inputs)
      fitted    surf_x    surf_y   surf_z
1 -2.0260418 -64.20550 -84.79310 219.4273
2  1.2358981 -67.00667 -79.65610 227.9487
3 -1.3667838 -62.88993 -86.83528 218.1500
4  0.1172459 -66.91177 -79.44738 227.0095
5  1.1278194 -67.30763 -80.56429 226.5518
6 -2.7412495 -62.88353 -85.40290 218.0784

> dim(outputs)
[1] 6971    3

> head(outputs)
        [,1]    [,2]    [,3]
[1,] -38.000 -41.794 180.858
[2,] -39.542 -50.167 175.664
[3,] -33.440 -56.959 173.340
[4,] -17.804 -60.643 181.561
[5,]  -8.156 -50.190 201.004
[6,] -23.019 -36.171 196.698

rbf_res <- RSNNS::rbf(x = inputs, y = outputs)

Error in checkInput(x, y) : nrows of 'x' and 'y' must match
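A sketch of what may be intended here: rbf() needs one target row per input row, so the model would be trained as fitted ~ (surf_x, surf_y, surf_z) on "inputs", and the 6971 coordinates in "outputs" would then go to predict(); the size value below is only a placeholder:

library(RSNNS)

x_train <- inputs[, c("surf_x", "surf_y", "surf_z")]   # 2312 x 3 coordinates
y_train <- inputs[, "fitted"]                          # 2312 target values

rbf_res <- rbf(x = x_train, y = y_train, size = 40)    # size: placeholder, to be tuned
fitted_at_outputs <- predict(rbf_res, outputs)         # 6971 x 1 interpolated values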

A question about installing RSNNS

Hello cbergmeir,

When I install the RSNNS package, an error occurs: "make: *** [RSNNS.so] Error 1  ERROR: compilation failed for package 'RSNNS'". I see that RSNNS is compiled with clang, but I use gcc to compile it. Could this be the reason? I am looking forward to your reply. Thanks!

Suggestion of an init method

Dear Christoph,

I would like to suggest a feature for your great package that is sorely missing in the R version: an initialization method that copies the weights of an already trained network, which would allow a network to be trained iteratively with new examples. Or, even better, an input parameter in the high-level functions (mlp, elman, ...) to pass an already existing network to update.

Please tell me if I am simply missing this functionality; I could not find a way to achieve this when browsing the RSNNS R documentation.

Best regards,
Nicolas

How does the bias work in an mlp network?

I noticed that biases are not applied in the conventional manner in a multilayer perceptron model. It is not clear how the feedforward step applies the bias between a layer and its neurons.

Please, could you explain how they work?
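A sketch of where the bias lives, with assumptions flagged: in SNNS there is no separate bias unit; each unit stores its own bias, which the activation function treats as a threshold (e.g. Act_Logistic computes 1 / (1 + exp(-(net - bias)))). extractNetInfo() exposes these values, though the exact column names may differ by version:

library(RSNNS)

model <- mlp(iris[, 1:4], decodeClassLabels(iris$Species), size = 5, maxit = 50)

info <- extractNetInfo(model)
# Per-unit definitions; the bias column (assumed name: unitBias) holds the
# threshold each unit's activation function subtracts from its net input.
head(info$unitDefinitions)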

Doubt about the forecast with RSNNS

Hi,
I use the RSNNS package, and today I noticed something strange: it has always worked, but today the forecast looks very odd.

Do you know what might be going on? I am writing an article using this package.

require(RSNNS)

a <- jordan(scale(mtcars[, 5]), mtcars[, 1], size = 8, learnFuncParams = c(0.1),
            maxit = 1000, linOut = TRUE)
b <- predict(a, data.frame(scale(mtcars[, 5])))
plot(mtcars[, 1], type = "l", col = "blue", ylim = c(1, 40))
lines(b, type = "l", col = "black")
summary(a)
weightMatrix(a)

a <- mlp(mtcars[, 5], mtcars[, 1], size = 10, learnFuncParams = c(0.1),
         maxit = 1000, linOut = TRUE)
b <- predict(a, data.frame(mtcars[, 5]))
plot(mtcars[, 1], type = "l", col = "blue", ylim = c(1, 40))
lines(b, type = "l", col = "black")

Unable to access repository for package installation in R

Hello,

I am trying to install a package in SparkR and am getting the error:

unable to access index for repository http://mran.revolutionanalytics.com/snapshot/2015-05-01/src/contrib

The version of R I'm using is 3.2.0, and it throws this error for every new package installation. I also tried downloading the package zip file and loading it from there, but that did not work either; I get the error:

‘randomForest’ is not a valid installed package.

Is there any solution?
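Not RSNNS-specific, but as a sketch: pointing install.packages() at a current CRAN mirror instead of the dated MRAN snapshot usually gets around the missing index:

# Use the standard CRAN cloud mirror instead of the 2015 MRAN snapshot.
options(repos = c(CRAN = "https://cloud.r-project.org"))
install.packages("RSNNS")
install.packages("randomForest")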

Jordan segfault when used with more than one layer

When using, e.g., size=c(2,2), the function exits with a segfault. It would be better to report that Jordan networks with more than one hidden layer don't make much sense, as their feedback comes directly from the output layer.

library(RSNNS)

data(snnsData)
inputs <- snnsData$laser_1000.pat[,inputColumns(snnsData$laser_1000.pat)]
outputs <- snnsData$laser_1000.pat[,outputColumns(snnsData$laser_1000.pat)]

patterns <- splitForTrainingAndTest(inputs, outputs, ratio=0.15)

modelJordan <- jordan(patterns$inputsTrain, patterns$targetsTrain,
                      size = c(2, 2), learnFuncParams = c(0.1), maxit = 100,
                      inputsTest = patterns$inputsTest,
                      targetsTest = patterns$targetsTest, linOut = FALSE)

How to show the values of hidden nodes in an mlp model

Dear developer,

How can I obtain the values of the hidden nodes in an mlp model?

For keras, I can shorten the model up to a layer called 'dense_1' and make my predictions on 'ph.test':
layer_name <- 'dense_1';
chk_model <- keras_model(inputs=model$input, outputs=get_layer(model, layer_name)$output);
chko <- predict(chk_model, ph.test);

Thanks in advance,

Ad Denissen.
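RSNNS has no documented equivalent of truncating the model at a layer, so one workaround is to recompute the hidden activations from the extracted weights and biases. The sketch below assumes the default logistic hidden activation, that extractNetInfo() returns fullWeightMatrix (rows = source units, columns = target units) and unitDefinitions with unitName/unitBias columns, and that SNNS subtracts the bias as a threshold; all of these should be verified on your model:

library(RSNNS)

model <- mlp(iris[, 1:4], decodeClassLabels(iris$Species), size = 5, maxit = 50)

info <- extractNetInfo(model)
W    <- info$fullWeightMatrix            # assumed: rows = from-unit, cols = to-unit
defs <- info$unitDefinitions             # assumed columns: unitName, unitBias

in_id  <- grep("^Input",  defs$unitName)
hid_id <- grep("^Hidden", defs$unitName)

x    <- as.matrix(iris[, 1:4])
net  <- x %*% W[in_id, hid_id]                                  # net input to hidden units
bias <- matrix(defs$unitBias[hid_id], nrow(x), length(hid_id), byrow = TRUE)
hidden_act <- 1 / (1 + exp(-(net - bias)))                      # Act_Logistic, bias as threshold
head(hidden_act)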

predict method for pruned networks

Hi again,

I had a quick question about the predict method for pruned neural networks. I'm getting some weird results when I predict values with new data from a pruned mlp network. Here's some code to illustrate the problem.

A plot of the pruned network...

library(RSNNS)
# devtools::install_github('fawda123/NeuralNetTools')
library(NeuralNetTools)
x <- neuraldat[, c('X1', 'X2', 'X3')]
y <- neuraldat[, 'Y1']

# pruned model using code from the RSNNS pruning demo
pruneFuncParams <- list(max_pr_error_increase = 10.0, pr_accepted_error = 1.0, 
  no_of_pr_retrain_cycles = 1000, min_error_to_stop = 0.01, init_matrix_value = 1e-6, 
  input_pruning = TRUE, hidden_pruning = TRUE)
mod <- mlp(x, y, size = 5, pruneFunc = "OptimalBrainSurgeon", 
  pruneFuncParams = pruneFuncParams)

# plot shows that only X3 has weighted connections
plotnet(mod)

Now, when I give the pruned model some new data with a constant value for X3, why would the predicted response change? I would expect changing values for X1 and X2 to have no effect on model predictions because the connection weights are zero, whereas changing values for X3 would produce a response for Y because the connection weights are non-zero. Am I thinking about this correctly?

# ranges
# apply(x, 2, range)

# number of observations for the prediction data
n <- 10

# holding X3 constant gets a response
newdat <- data.frame(X1 = seq(0, 1, length = n), X2 = seq(0, 1, length = n), X3 = 0.5)
predict(mod, newdata = newdat)

# holding all but X3 constant returns a single value for the response

newdat <- data.frame(X1 = 0.5, X2 = 0.5, X3 = seq(0, 1, length = n))
predict(mod, newdata = newdat)

Any idea why this is happening?

Thanks,

Marcus

plotting function

Hi there,

I'm the author of the NeuralNetTools package. I noticed your todo list for the next version of RSNNS links to an old gist of mine for plotting nnet objects. The CRAN release for NeuralNetTools has a plotnet method for mlp models and I just pushed a feature to the devel version to plot pruned mlp models. Thought you might be interested... I'll probably submit the latest release to CRAN pretty soon, so you could include it as a dependency in a future version of RSNNS.

Keep up the great work!

-Marcus

Explanation for a command

Why is learnFuncParams = c(10, 0.1) used in the mlp command when learnFunc = "BackpropBatch"? Please reply ASAP.
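For context, a sketch of one way to read those numbers (worth verifying against the SNNS manual for your version): learnFuncParams is passed positionally to the SNNS learning function, and BackpropBatch takes the same two parameters as Std_Backpropagation, the learning rate and d_max. c(10, 0.1) would then mean a learning rate of 10 with d_max = 0.1, the large rate compensating for the gradient being summed over the whole batch:

library(RSNNS)

# Same call shape as in the question; the data here is only illustrative.
model <- mlp(iris[, 1:4], decodeClassLabels(iris$Species), size = 5,
             learnFunc = "BackpropBatch",
             learnFuncParams = c(10, 0.1),   # assumed meaning: c(learning rate, d_max)
             maxit = 200)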

How to implement a time-delay neural network using RSNNS?

I do not see where time-delay neural networks are documented. Could you please point me to where the TDNN is documented in your package documentation? If it is not, could you briefly include an example of how to do this with the SNNS code?

I have read an article about the SNNS package from 2012 (the authors' original article is linked here). They mention on page 2 that several network architectures are implemented in SNNS, including time-delay neural networks. I believe the source code is located here. Unfortunately, when I look at the R package documentation, I do not see this mentioned in the package API.
