I've seen several questions about GPU memory with TensorFlow, but I've installed it on a Pine64 with no GPU support.
That means I'm running it with very limited resources (CPU and RAM only), and TensorFlow seems to want them all, completely freezing my machine.
Is there a way to limit the amount of processing power and memory allocated to TensorFlow? Something similar to bazel's own --local_resources flag?
This will create a session that runs one op at a time, with only one thread per op (TensorFlow 1.x):

    import tensorflow as tf

    # One op at a time (inter_op) and one thread per op (intra_op).
    sess = tf.Session(config=tf.ConfigProto(
        inter_op_parallelism_threads=1,
        intra_op_parallelism_threads=1))
I'm not sure about limiting memory; it seems to be allocated on demand. TensorFlow once froze my machine when a network wanted 100 GB of RAM, so my solution was to design networks that need less RAM.
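As an OS-level workaround (not a TensorFlow API), you can cap the address space of the Python process on Linux with the standard resource module before importing TensorFlow. Allocations past the cap then fail with a MemoryError (or a crash) instead of freezing the whole machine. This is only a sketch; the 2 GB figure is an illustrative value for a small board like the Pine64:

    import resource

    # Cap the process's total address space at 2 GB (illustrative value).
    # Must run before TensorFlow allocates memory; Linux only.
    soft_limit = 2 * 1024 ** 3
    resource.setrlimit(resource.RLIMIT_AS, (soft_limit, resource.RLIM_INFINITY))

    import tensorflow as tf  # allocations beyond the cap now fail fast

Note that if the limit is too tight, importing TensorFlow itself may fail, so you may need to tune the value for your workload.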
For TensorFlow 2.x this has been answered in the following thread:
In TensorFlow 2.x there is no session anymore. Use the tf.config API directly to set the parallelism at the start of the program:
    import tensorflow as tf

    # Configure the thread pools before any TensorFlow op runs;
    # changing them after initialization raises a RuntimeError.
    tf.config.threading.set_intra_op_parallelism_threads(2)
    tf.config.threading.set_inter_op_parallelism_threads(2)

    # Pin the model to the CPU explicitly.
    with tf.device('/CPU:0'):
        model = tf.keras.models.Sequential([...])
https://www.tensorflow.org/api_docs/python/tf/config/threading
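As a quick sanity check (my own addition, not part of the original answer), the matching getters in the same tf.config.threading module can confirm the settings took effect:

    import tensorflow as tf

    tf.config.threading.set_intra_op_parallelism_threads(2)
    tf.config.threading.set_inter_op_parallelism_threads(2)

    # Verify the configured thread-pool sizes.
    print(tf.config.threading.get_intra_op_parallelism_threads())  # 2
    print(tf.config.threading.get_inter_op_parallelism_threads())  # 2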