 

Tensorflow graph size

Tags:

tensorflow

I am wondering if there is an easy way to check the size of (i.e., the memory needed by) a TensorFlow graph before running a TensorFlow session.

I am looking for something that lets me keep changing the system parameters that define the graph and see how big (in memory) the graph becomes accordingly.

Mohamed Abdelhafez asked Apr 24 '26 16:04


1 Answer

I have done something similar when I wanted to count the number of parameters in my model.

import numpy as np
import tensorflow as tf

total_params = 0
for v in tf.compat.v1.global_variables():  # tf.all_variables() in older TF releases
    total_params += np.prod(v.get_shape().as_list())
print(total_params)

Now total_params contains the sum of the products of the dimensions of all the variables in your graph. If every variable has dtype tf.float32, you can multiply total_params by 4 to get the number of bytes consumed by all of the variables. This, however, is only a lower bound, and there will be some additional overhead. Also, computing the gradients requires a lot of extra memory, since the activations at each point in the model must be stored for the backward pass.
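To avoid assuming every variable is float32, a minimal sketch of the same idea can use each variable's own dtype to compute its byte size (the helper name `variable_bytes` and the example variables are illustrative, not from the original answer):

```python
import numpy as np
import tensorflow as tf

def variable_bytes(variables):
    """Sum the in-memory size of the given variables,
    using each variable's own dtype instead of assuming float32."""
    total = 0
    for v in variables:
        n = int(np.prod(v.shape.as_list()))
        total += n * v.dtype.size  # DType.size is bytes per element
    return total

w = tf.Variable(tf.zeros([100, 50]))               # float32: 100*50*4 = 20000 bytes
b = tf.Variable(tf.zeros([50], dtype=tf.float64))  # float64: 50*8 = 400 bytes
print(variable_bytes([w, b]))
```

As in the answer above, this counts only the variables themselves; gradient buffers and stored activations add further memory on top of it.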

chasep255 answered Apr 27 '26 19:04


