I am looking for a way to quickly change a graph within an interactive Jupyter session in order to test different structures. Initially I wanted to simply delete existing variables and recreate them with a different initializer, but that does not seem to be possible [1].
I then found [2] and am now attempting to simply discard and recreate the default graph, but this does not seem to work either. This is what I do:
a. Start a session
import tensorflow as tf
import math
sess = tf.InteractiveSession()
b. Create a variable in the default graph
IMAGE_PIXELS = 32 * 32
HIDDEN1 = 200
BATCH_SIZE = 100
NUM_POINTS = 30
images_placeholder = tf.placeholder(tf.float32, shape=(BATCH_SIZE, IMAGE_PIXELS))
points_placeholder = tf.placeholder(tf.float32, shape=(BATCH_SIZE, NUM_POINTS))
# Hidden 1
with tf.name_scope('hidden1'):
    weights_init = tf.truncated_normal([IMAGE_PIXELS, HIDDEN1], stddev=1.0 / math.sqrt(float(IMAGE_PIXELS)))
    weights = tf.Variable(weights_init, name='weights')
    biases_init = tf.zeros([HIDDEN1])
    biases = tf.Variable(biases_init, name='biases')
    hidden1 = tf.nn.relu(tf.matmul(images_placeholder, weights) + biases)
c. Use the variable
# Add the variable initializer Op.
init = tf.initialize_all_variables()
# Run the Op to initialize the variables.
sess.run(init)
d. Reset the graph
tf.reset_default_graph()
e. Recreate the variable
with tf.name_scope('hidden1'):
    weights = tf.get_variable(name='weights', shape=[IMAGE_PIXELS, HIDDEN1],
                              initializer=tf.contrib.layers.xavier_initializer())
    biases_init = tf.zeros([HIDDEN1])
    biases = tf.Variable(biases_init, name='biases')
    hidden1 = tf.nn.relu(tf.matmul(images_placeholder, weights) + biases)
However, I get an exception (see below). So my question is: is it possible to reset/remove the graph and recreate it as before? If so, how?
Appreciate any pointers.
TIA,
ValueError Traceback (most recent call last)
<ipython-input-5-e98a82c45473> in <module>()
5 biases_init = tf.zeros([HIDDEN1])
6 biases = tf.Variable(biases_init, name='biases')
----> 7 hidden1 = tf.nn.relu(tf.matmul(images_placeholder, weights) + biases)
8
/home/hmf/my_py3/lib/python3.4/site-packages/tensorflow/python/ops/math_ops.py in matmul(a, b, transpose_a, transpose_b, a_is_sparse, b_is_sparse, name)
1323 A `Tensor` of the same type as `a`.
1324 """
-> 1325 with ops.op_scope([a, b], name, "MatMul") as name:
1326 a = ops.convert_to_tensor(a, name="a")
1327 b = ops.convert_to_tensor(b, name="b")
/usr/lib/python3.4/contextlib.py in __enter__(self)
57 def __enter__(self):
58 try:
---> 59 return next(self.gen)
60 except StopIteration:
61 raise RuntimeError("generator didn't yield") from None
/home/hmf/my_py3/lib/python3.4/site-packages/tensorflow/python/framework/ops.py in op_scope(values, name, default_name)
4014 ValueError: if neither `name` nor `default_name` is provided.
4015 """
-> 4016 g = _get_graph_from_inputs(values)
4017 n = default_name if name is None else name
4018 if n is None:
/home/hmf/my_py3/lib/python3.4/site-packages/tensorflow/python/framework/ops.py in _get_graph_from_inputs(op_input_list, graph)
3812 graph = graph_element.graph
3813 elif original_graph_element is not None:
-> 3814 _assert_same_graph(original_graph_element, graph_element)
3815 elif graph_element.graph is not graph:
3816 raise ValueError(
/home/hmf/my_py3/lib/python3.4/site-packages/tensorflow/python/framework/ops.py in _assert_same_graph(original_item, item)
3757 if original_item.graph is not item.graph:
3758 raise ValueError(
-> 3759 "%s must be from the same graph as %s." % (item, original_item))
3760
3761
ValueError: Tensor("weights:0", shape=(1024, 200), dtype=float32_ref) must be from the same graph as Tensor("Placeholder:0", shape=(100, 1024), dtype=float32).
When you reset the default graph, you do not remove the previously created Tensors. Calling tf.reset_default_graph() simply creates a new graph and sets it as the default.
Here is an example to illustrate:
x = tf.constant(1)
print(tf.get_default_graph() == x.graph)  # prints True
tf.reset_default_graph()
print(tf.get_default_graph() == x.graph)  # prints False
The error you got indicates that two tensors must come from the same graph, which means you are still mixing tensors from the previous graph with tensors from the current default graph.
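For instance, in your setup the placeholders were created before the reset while weights was created after it; you can verify this in the notebook (a quick check, assuming the names from steps b and e are still bound):

print(images_placeholder.graph is tf.get_default_graph())  # False: built before the reset
print(weights.graph is tf.get_default_graph())             # True: built after the reset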
The easy fix is to recreate the two placeholders images_placeholder and points_placeholder after the reset, so that every op you build afterwards lives in the new graph.
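Something along these lines should work (just a sketch, assuming the constants IMAGE_PIXELS, HIDDEN1, BATCH_SIZE and NUM_POINTS from your step b are still defined; note that the old InteractiveSession is closed before the reset and a fresh one opened afterwards):

# Close the session bound to the old graph, then reset.
sess.close()
tf.reset_default_graph()
sess = tf.InteractiveSession()

# Recreate the placeholders so they live in the new default graph.
images_placeholder = tf.placeholder(tf.float32, shape=(BATCH_SIZE, IMAGE_PIXELS))
points_placeholder = tf.placeholder(tf.float32, shape=(BATCH_SIZE, NUM_POINTS))

# Rebuild the layer with the new initializer.
with tf.name_scope('hidden1'):
    weights = tf.get_variable(name='weights', shape=[IMAGE_PIXELS, HIDDEN1],
                              initializer=tf.contrib.layers.xavier_initializer())
    biases = tf.Variable(tf.zeros([HIDDEN1]), name='biases')
    hidden1 = tf.nn.relu(tf.matmul(images_placeholder, weights) + biases)

# Initialize the variables in the new graph.
sess.run(tf.initialize_all_variables())

Once everything, placeholders included, is rebuilt after the reset, all tensors belong to the same graph and the matmul no longer complains.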