How do I generate a random vector in TensorFlow and maintain it for further use?

I am trying to generate a random vector and use it twice. However, when I use it the second time, the generator produces a new random vector that is not identical to the first. Here is code demonstrating the problem:

import numpy as np
import tensorflow as tf

# Two random tensors created with the same seed
rand_var_1 = tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0)
rand_var_2 = tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0)

# Op 1
z1 = tf.add(rand_var_1, rand_var_2)

# Op 2
z2 = tf.add(rand_var_1, rand_var_2)

init = tf.initialize_all_variables()

with tf.Session() as sess:
    sess.run(init)
    z1_op = sess.run(z1)
    z2_op = sess.run(z2)
    print(z1_op,z2_op)

I want z1_op and z2_op to be equal, but they are not: I think this is because the random_uniform op gets evaluated twice. Is there a way to achieve this in TensorFlow (without resorting to NumPy)?

(My use case is more complicated, but this is the distilled question.)

Asked Jan 19 '16 by Srikiran

People also ask

How do you generate random tensors?

To create a random tensor with a specific shape, use the torch.rand() function with the shape passed as an argument. torch.rand() returns a tensor of the specified shape filled with random values.
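For example, a minimal sketch (this uses PyTorch, which is separate from the TensorFlow code in the question):

import torch

# torch.rand takes the desired shape and returns floats drawn uniformly from [0, 1).
t = torch.rand(2, 3)   # a 2x3 tensor of random values
print(t.shape)         # torch.Size([2, 3])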

What is tf.random.normal?

tf.random.normal() outputs a tensor of the given shape filled with values of the given dtype drawn from a normal distribution.
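For example (a sketch using the TF 2.x-style tf.random.normal name shown above):

import tensorflow as tf

# A 2x3 float32 tensor drawn from a normal distribution with the given mean and stddev.
x = tf.random.normal([2, 3], mean=0.0, stddev=1.0, dtype=tf.float32)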

What does tf.random.uniform do?

tf.random.uniform() outputs a tensor of the given shape filled with values drawn from a uniform distribution in the range minval to maxval, where the lower bound is inclusive but the upper bound is not.
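For example (a sketch mirroring the question's parameters, written with the TF 2.x-style name):

import tensorflow as tf

# Five random integers in [0, 10): minval is inclusive, maxval is exclusive.
# For integer dtypes, maxval must be given explicitly.
r = tf.random.uniform([5], minval=0, maxval=10, dtype=tf.int32, seed=0)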

What does tf.random.categorical do?

tf.random.categorical() draws samples from a categorical distribution.
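For example, a minimal sketch (assuming TF 2.x eager execution):

import tensorflow as tf

# Unnormalized log-probabilities for one batch row over three classes.
logits = tf.math.log([[0.5, 0.3, 0.2]])

# Draw 5 class indices from the distribution; the result has shape [1, 5].
samples = tf.random.categorical(logits, num_samples=5)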


1 Answer

The current version of your code will randomly generate a new value for rand_var_1 and rand_var_2 on each call to sess.run() (although since you set the seed to 0, they will have the same value within a single call to sess.run()).
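To see this behavior in isolation, here is a minimal sketch (TF 1.x graph mode, matching the question's code; not part of the original answer) in which successive sess.run() calls on the same op return different draws:

import tensorflow as tf

rand = tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0)

with tf.Session() as sess:
    print(sess.run(rand))  # one draw
    print(sess.run(rand))  # a new draw on the next call, even with the seed set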

If you want to retain the value of a randomly-generated tensor for later use, you should assign it to a tf.Variable:

rand_var_1 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
rand_var_2 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))

# Or, alternatively:
rand_var_1 = tf.Variable(tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0))
rand_var_2 = tf.Variable(rand_var_1.initialized_value())

# Or, alternatively:
rand_t = tf.random_uniform([5], 0, 10, dtype=tf.int32, seed=0)
rand_var_1 = tf.Variable(rand_t)
rand_var_2 = tf.Variable(rand_t)

...then tf.initialize_all_variables() will have the desired effect:

# Op 1
z1 = tf.add(rand_var_1, rand_var_2)

# Op 2
z2 = tf.add(rand_var_1, rand_var_2)

init = tf.initialize_all_variables()

with tf.Session() as sess:
    sess.run(init)        # Random numbers generated here and cached.
    z1_op = sess.run(z1)  # Reuses cached values for rand_var_1, rand_var_2.
    z2_op = sess.run(z2)  # Reuses cached values for rand_var_1, rand_var_2.
    print(z1_op, z2_op)   # Will print two identical vectors.
Answered Oct 28 '22 by mrry