
Using nnet for prediction, am I doing it right?

I'm still pretty new to R and AI / ML techniques. I would like to use a neural net for prediction, and since I'm new I would just like to see if this is how it should be done.

As a test case, I'm predicting values of sin(), based on the 2 previous values. For training I create a data frame with y = sin(x), x1 = sin(x-1), x2 = sin(x-2), then use the formula y ~ x1 + x2.

It seems to work, but I am just wondering if this is the right way to do it, or if there is a more idiomatic way.

This is the code:

require(quantmod) #for Lag()
require(nnet)
x <- seq(0, 20, 0.1)
y <- sin(x)
te <- data.frame(y, Lag(y), Lag(y,2))
names(te) <- c("y", "x1", "x2")
p <- nnet(y ~ x1 + x2, data=te, linout=TRUE, size=10)
ps <- predict(p, x1=y)
plot(y, type="l")
lines(ps, col=2)

Thanks

[edit]

Is this better for the predict call?

t2 <- data.frame(sin(x), Lag(sin(x)))
names(t2) <- c("x1", "x2")
vv <- predict(p, t2)
plot(vv)

I guess I'd like to see that the nnet is actually working by looking at its predictions (which should approximate a sine wave).
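
Something like this minimal sketch is what I have in mind (assuming p is the nnet fitted above, and that predict() wants a data frame whose columns use the same names and lag structure as the training frame):

#Sketch: new data with the same column names and lags as the training frame
nd <- data.frame(Lag(y), Lag(y, 2))
names(nd) <- c("x1", "x2")
ps2 <- predict(p, newdata=nd)  #rows whose lags are NA may come back as NA
plot(y, type="l")              #true sine wave
lines(ps2, col=2)              #predictions should roughly track it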

asked Oct 12 '11 by dizzy



1 Answer

I really like the caret package, as it provides a nice, unified interface to a variety of models, such as nnet. Furthermore, it automatically tunes hyperparameters (such as size and decay) using cross-validation or bootstrap re-sampling. The downside is that all this re-sampling takes some time.

#Load Packages
require(quantmod) #for Lag()
require(nnet)
require(caret)

#Make toy dataset
y <- sin(seq(0, 20, 0.1))
te <- data.frame(y, x1=Lag(y), x2=Lag(y,2))
names(te) <- c("y", "x1", "x2")

#Fit model
model <- train(y ~ x1 + x2, te, method='nnet', linout=TRUE, trace = FALSE,
                #Grid of tuning parameters to try:
                tuneGrid=expand.grid(.size=c(1,5,10),.decay=c(0,0.001,0.1))) 
ps <- predict(model, te)

#Examine results
model
plot(y)
lines(ps, col=2)

It also predicts on the proper scale, so you can directly compare results. If you are interested in neural networks, you should also take a look at the neuralnet and RSNNS packages. caret can currently tune nnet and neuralnet models, but does not yet have an interface for RSNNS.

/edit: caret now has an interface for RSNNS. It turns out that if you email the package maintainer and ask for a model to be added to caret, he'll usually do it!
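
For example, a minimal sketch of pointing the same train() call at RSNNS (assuming the RSNNS package is installed; 'mlp' is caret's method name for RSNNS's multilayer perceptron and size its tuning parameter):

require(RSNNS)
model2 <- train(y ~ x1 + x2, te, method='mlp',
                na.action=na.omit,  #drop the NA rows created by Lag()
                tuneGrid=expand.grid(.size=c(1,5,10)))
predict(model2, te)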

/edit: caret also now supports Bayesian regularization for feed-forward neural networks from the brnn package. Furthermore, caret now also makes it much easier to specify your own custom models, to interface with any neural network package you like!
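
And a sketch of the brnn route under the same assumptions (brnn installed; neurons is the tuning parameter caret exposes for that model):

require(brnn)
model3 <- train(y ~ x1 + x2, te, method='brnn',
                na.action=na.omit,  #again, drop the NA lag rows
                tuneGrid=expand.grid(.neurons=c(1,2,3)))
predict(model3, te)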

answered Oct 11 '22 by Zach