
TF2 add report_tensor_allocations_upon_oom to RunOptions

I'm getting this message:

Hint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info.

How do I do that in TensorFlow 2.3?

Over the past few days this turned out to be a surprisingly frustrating issue. There appears to be no working example of how to do this in TF2.

asked Oct 04 '20 by mafu

1 Answer

This is still a long way from an allocated tensor list, but a start for TF2:

TensorFlow 2.4.1 contains the tf.config.experimental.get_memory_usage method, which returns the current number of bytes used on the GPU. Comparing this value across different points in time can shed some light on which tensors take up VRAM. It seems to be pretty accurate.

By the way, the latest nightly build contains the tf.config.experimental.get_memory_info method instead; it seems they had a change of heart. This one reports both the current and the peak memory used.
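For builds that ship get_memory_info, a minimal sketch of the equivalent check (assuming, as in later stable releases, that it returns a dict with "current" and "peak" keys):

import tensorflow as tf

# Assumes a TF build where get_memory_info is available (nightly at the time of writing);
# it returns a dict of byte counts for the given device.
info = tf.config.experimental.get_memory_info("GPU:0")
print(info["current"])  # bytes currently allocated
print(info["peak"])     # peak bytes allocated so far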

Example code on TF 2.4.1:

import tensorflow as tf

# Baseline: nothing allocated on the GPU yet
print(tf.config.experimental.get_memory_usage("GPU:0"))  # 0

# Allocate ~1 MB (1 * 1024 * 256 float32 values * 4 bytes each)
tensor_1_mb = tf.zeros((1, 1024, 256), dtype=tf.float32)
print(tf.config.experimental.get_memory_usage("GPU:0"))  # 1050112

# Allocate another ~2 MB
tensor_2_mb = tf.zeros((2, 1024, 256), dtype=tf.float32)
print(tf.config.experimental.get_memory_usage("GPU:0"))  # 3147264

# Dropping the reference frees the first tensor's memory
tensor_1_mb = None
print(tf.config.experimental.get_memory_usage("GPU:0"))  # 2098688

# Freeing the second tensor brings usage back down to a small baseline overhead
tensor_2_mb = None
print(tf.config.experimental.get_memory_usage("GPU:0"))  # 1536
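
To make the compare-two-readings idea reusable, here is a small sketch of a helper; track_gpu_memory is not a TensorFlow API, just a hypothetical wrapper around get_memory_usage:

import contextlib
import tensorflow as tf

@contextlib.contextmanager
def track_gpu_memory(device="GPU:0"):
    # Hypothetical helper: report how many bytes the enclosed block added on the device
    before = tf.config.experimental.get_memory_usage(device)
    yield
    after = tf.config.experimental.get_memory_usage(device)
    print(f"{device}: {after - before} bytes allocated inside the block")

# Example: attribute VRAM to a specific allocation
with track_gpu_memory():
    big_tensor = tf.zeros((4, 1024, 256), dtype=tf.float32)  # ~4 MB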
answered Nov 06 '22 by Synthesis