
Tensorflow 2: how to switch execution from GPU to CPU and back?

In tensorflow 1.X with standalone keras 2.X, I used to switch between training on the GPU and running inference on the CPU (which, for some reason, was much faster for my RNN models) with the following snippet:

from multiprocessing import cpu_count

import tensorflow as tf
import keras
from keras import backend as k

keras.backend.clear_session()

def set_session(gpus: int = 0):
    num_cores = cpu_count()

    config = tf.ConfigProto(
        intra_op_parallelism_threads=num_cores,
        inter_op_parallelism_threads=num_cores,
        allow_soft_placement=True,
        device_count={"CPU": 1, "GPU": gpus},
    )

    session = tf.Session(config=config)
    k.set_session(session)

This ConfigProto functionality is no longer available in tensorflow 2.0 (where I'm using the integrated tensorflow.keras). At startup it is possible to run tf.config.experimental.set_visible_devices() in order to e.g. disable the GPU, but any subsequent call to set_visible_devices results in RuntimeError: Visible devices cannot be modified after being initialized. Is there a way of re-initializing the visible devices, or is there another way of switching between the available devices?
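For reference, this is roughly what the failure described above looks like (a minimal sketch; the exact behavior depends on whether a GPU is actually present):

import tensorflow as tf

# Hide the GPU before any tensors are created; this only works once per process.
gpus = tf.config.experimental.list_physical_devices('GPU')
tf.config.experimental.set_visible_devices([], 'GPU')  # CPU-only from now on

# Creating any tensor initializes the runtime and freezes the device list ...
_ = tf.constant(1.0)

# ... so trying to restore the GPU afterwards fails:
try:
    tf.config.experimental.set_visible_devices(gpus, 'GPU')
except RuntimeError as e:
    print(e)  # "Visible devices cannot be modified after being initialized"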


People also ask

Can TensorFlow use both CPU and GPU?

TensorFlow supports running computations on a variety of device types, including CPU and GPU. They are represented with string identifiers, for example "/device:CPU:0" for the CPU of your machine.

Does TensorFlow 2 automatically use GPU?

If a TensorFlow operation has both CPU and GPU implementations, TensorFlow will automatically place the operation to run on a GPU device first. If you have more than one GPU, the GPU with the lowest ID will be selected by default. However, TensorFlow does not place operations into multiple GPUs automatically.
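A quick way to check this default placement yourself is to list the visible devices and turn on device-placement logging (a minimal sketch using the TF 2.x config API):

import tensorflow as tf

# Show which physical devices TensorFlow can see.
print(tf.config.experimental.list_physical_devices('CPU'))
print(tf.config.experimental.list_physical_devices('GPU'))

# Log the device each op is placed on.
tf.debugging.set_log_device_placement(True)

# With a GPU present, this matmul is placed on /device:GPU:0 by default.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 1.0], [1.0, 1.0]])
print(tf.matmul(a, b))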


1 Answer

You can use tf.device to explicitly set which device you want to use. For example:

import tensorflow as tf

model = tf.keras.Model(...)

# Run training on GPU
with tf.device('/gpu:0'):
    model.fit(...)

# Run inference on CPU
with tf.device('/cpu:0'):
    model.predict(...)

If you only have one CPU and one GPU, the names used above should work. Otherwise, device_lib.list_local_devices() can give you a list of your devices. This post gives a nice function for listing just the names, which I adapt here to also show CPUs:

from tensorflow.python.client import device_lib

def get_available_devices():
    local_device_protos = device_lib.list_local_devices()
    return [x.name for x in local_device_protos if x.device_type == 'GPU' or x.device_type == 'CPU']
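Putting the two together, one possible pattern (a hypothetical sketch that reuses get_available_devices and the model placeholder from above) is to pick a GPU when one is available and fall back to the CPU otherwise:

# Hypothetical usage: train on the first GPU if one is listed, otherwise on the CPU.
devices = get_available_devices()
gpu_names = [d for d in devices if 'GPU' in d]
train_device = gpu_names[0] if gpu_names else '/device:CPU:0'

with tf.device(train_device):
    model.fit(...)  # training runs on the chosen device

with tf.device('/device:CPU:0'):
    model.predict(...)  # inference stays on the CPU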