How to get weights from tensorflow fully_connected

I'm trying to extract the weights from a model after training it. Here's a toy example

import tensorflow as tf
import numpy as np

X_ = tf.placeholder(tf.float64, [None, 5], name="Input")
Y_ = tf.placeholder(tf.float64, [None, 1], name="Output")

X = ...
Y = ...
with tf.name_scope("LogReg"):
    pred = fully_connected(X_, 1, activation_fn=tf.nn.sigmoid)
    loss = tf.losses.mean_squared_error(labels=Y_, predictions=pred)
    training_ops = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(200):
        sess.run(training_ops, feed_dict={
            X_: X,
            Y_: Y
        })
        if (i + 1) % 100 == 0:
            print("Accuracy: ", sess.run(accuracy, feed_dict={
                X_: X,
                Y_: Y
            }))

# Get weights of *pred* here

I've looked at "Get weights from tensorflow model" and at the docs but can't find a way to retrieve the value of the weights.

So in the toy example case, suppose that X_ has shape (1000, 5), how can I get the 5 values in the 1-layer weights after training?

asked Apr 01 '17 by dbokers

People also ask

How do you visualize model weights in TensorFlow?

To visualize the weights, you can use a tf.image_summary() op to transform a convolutional filter (or a slice of a filter) into a summary proto, write them to a log using a tf.train.SummaryWriter, and visualize the log using TensorBoard.

How do you get weights in TF layers dense?

Build the layer as an object, e.g. obj = Dense(...). Once you have a handle to this layer object, you can use all of its functionality. For obtaining the weights, just use obj.trainable_weights; this returns a list of all the trainable variables found in that layer's scope.



1 Answer

There are some issues in your code that need to be fixed:

1- You need to use variable_scope instead of name_scope at the following line (see the sketch below this snippet for the practical difference; the TensorFlow documentation covers it in more detail):

with tf.name_scope("LogReg"):
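
For reference, here is a minimal sketch (TF 1.x; the ns/vs scope names are just for illustration) of the practical difference: variables created through tf.get_variable, which fully_connected uses internally, ignore name_scope but do pick up a variable_scope prefix:

import tensorflow as tf

with tf.name_scope("ns"):
    a = tf.get_variable("w1", shape=[1], dtype=tf.float64)   # name_scope is ignored by get_variable

with tf.variable_scope("vs"):
    b = tf.get_variable("w2", shape=[1], dtype=tf.float64)   # variable_scope prefixes the name

print(a.name)  # "w1:0"    -> no "ns/" prefix
print(b.name)  # "vs/w2:0" -> prefixed, so it can be located by name later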

2- To be able to retrieve a variable later in the code, you need to know its name. So assign a name to the variable of interest (if you don't supply one, a default name will be assigned, but then you need to figure out what it is!):

pred = tf.contrib.layers.fully_connected(X_, 1, activation_fn=tf.nn.sigmoid, scope='fc1')
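
Once the variable has a predictable name, it can also be fetched straight from the default graph instead of searching through a list. A minimal sketch, assuming the LogReg/fc1 scope names used in this answer (the :0 suffix denotes the variable's output tensor):

graph = tf.get_default_graph()
fc1_weights = graph.get_tensor_by_name('LogReg/fc1/weights:0')   # tensor holding the (5, 1) weight matrix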

Now let's see how the above fixes help us get a variable's value. Each layer has two types of variables: weights and biases. In the following code snippet (a modified version of yours), I will only show how to retrieve the weights of the fully connected layer:

import tensorflow as tf
import numpy as np

X_ = tf.placeholder(tf.float64, [None, 5], name="Input")
Y_ = tf.placeholder(tf.float64, [None, 1], name="Output")

X = np.random.randint(1,10,[10,5])
Y = np.random.randint(0,2,[10,1])

with tf.variable_scope("LogReg"):
    pred = tf.contrib.layers.fully_connected(X_, 1, activation_fn=tf.nn.sigmoid, scope='fc1')
    loss = tf.losses.mean_squared_error(labels=Y_, predictions=pred)
    training_ops = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

with tf.Session() as sess:

    # collect every variable in the graph, then look up the one we want by name
    all_vars = tf.global_variables()
    def get_var(name):
        for i in range(len(all_vars)):
            if all_vars[i].name.startswith(name):
                return all_vars[i]
        return None
    fc1_var = get_var('LogReg/fc1/weights')

    sess.run(tf.global_variables_initializer())    
    for i in range(200):
        # run the training op and fetch the current weight values in one call
        _, fc1_var_np = sess.run([training_ops, fc1_var], feed_dict={
        X_: X,
        Y_: Y 
        })
        print(fc1_var_np)
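
As an alternative to the manual search above (not part of the original answer, just a sketch assuming the same LogReg/fc1 scope), the weights can also be pulled from the trainable-variables collection, filtered by scope:

fc1_var = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                            scope='LogReg/fc1/weights')[0]   # the layer's (5, 1) weight variable

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(fc1_var))   # prints the current weight values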
answered Oct 22 '22 by Ali