
PyBrain: How can I put specific weights in a neural network?

I am trying to recreate a neural network from given facts. It has 3 inputs, a hidden layer, and an output. My problem is that the weights are also given, so I don't need to train.

I was thinking maybe I could save the training of a neural network with a similar structure and change the values accordingly. Do you think that will work? Any other ideas? Thanks.

Neural Network Code:

    from pybrain.structure import FeedForwardNetwork, LinearLayer, SigmoidLayer, FullConnection
    from pybrain.supervised.trainers import BackpropTrainer

    net = FeedForwardNetwork()
    inp = LinearLayer(3)      # 3 inputs
    h1 = SigmoidLayer(1)      # 1 hidden sigmoid unit
    outp = LinearLayer(1)     # 1 output

    # add modules
    net.addOutputModule(outp)
    net.addInputModule(inp)
    net.addModule(h1)

    # create connections
    net.addConnection(FullConnection(inp, h1))
    net.addConnection(FullConnection(h1, outp))

    # finish up
    net.sortModules()

    # ds is a SupervisedDataSet prepared elsewhere
    trainer = BackpropTrainer(net, ds)
    trainer.trainUntilConvergence()

Save-and-load code from "How to save and recover PyBrain training?":

# Using NetworkWriter

from pybrain.tools.shortcuts import buildNetwork
from pybrain.tools.xml.networkwriter import NetworkWriter
from pybrain.tools.xml.networkreader import NetworkReader

net = buildNetwork(2,4,1)

# write the network (including its current weights) to XML, then load it back
NetworkWriter.writeToFile(net, 'filename.xml')
net = NetworkReader.readFrom('filename.xml')
asked Mar 03 '12 by IordanouGiannis


1 Answer

I was curious how reading an already-trained network (with the XML tool) is done, because that means the network weights can somehow be set. In the NetworkReader documentation I found that you can set the parameters with _setParameters().

However, the leading underscore marks it as a private method, which could potentially have side effects. Also keep in mind that the weight vector must have the same length as that of the originally constructed network.

Example

>>> import numpy
>>> from pybrain.tools.shortcuts import buildNetwork
>>> net = buildNetwork(2,3,1)
>>> net.params
array([...some random values...])
>>> len(net.params)
13
>>> new_params = numpy.array([1.0]*13)
>>> net._setParameters(new_params)
>>> net.params
array([1.0, ..., 1.0])

Another important thing is to put the values in the right order. For the example above the layout is:

[ 1., 1., 1., 1., 1., 1.,    1., 1., 1.,    1.,          1., 1., 1. ]
  input->hidden0             hidden0->out   bias->out    bias->hidden0

To determine which weights belong to which connections between layers, try this:

# net is our neural network from previous example
for c in [connection for connections in net.connections.values() for connection in connections]:
    print("{} -> {} => {}".format(c.inmod.name, c.outmod.name, c.params))

Anyway, I still don't know the exact order of the weights between layers...
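
If the goal from the question (a fixed 3-1-1 network with given weights) is simply to get known weights into the network, one way to sidestep the global ordering question is to set each connection's parameters separately: FullConnection is itself a ParameterContainer, so it also has _setParameters(). A minimal sketch, with made-up numbers standing in for the given weights (I haven't verified this against every PyBrain version):

from pybrain.structure import FeedForwardNetwork, LinearLayer, SigmoidLayer, FullConnection

net = FeedForwardNetwork()
inp = LinearLayer(3)
h1 = SigmoidLayer(1)
outp = LinearLayer(1)

net.addInputModule(inp)
net.addModule(h1)
net.addOutputModule(outp)

in_to_h = FullConnection(inp, h1)     # keep references to the connections
h_to_out = FullConnection(h1, outp)
net.addConnection(in_to_h)
net.addConnection(h_to_out)
net.sortModules()

# Set the given weights connection by connection; with a single hidden
# unit the three input->hidden weights simply follow the input units.
in_to_h._setParameters([0.5, -1.2, 0.8])   # made-up values for the 3 input weights
h_to_out._setParameters([2.0])             # made-up hidden->output weight

print(net.activate([1.0, 0.0, 1.0]))       # evaluate with the fixed weights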

answered Nov 04 '22 by sjudǝʊ