 

Does TensorFlow view all CPUs of one machine as ONE device?

From the experiments I have run, it seems like TensorFlow automatically uses all CPUs on one machine. Furthermore, it seems like TensorFlow refers to all CPUs as /cpu:0.

Am I right, that only the different GPUs of one machine get indexed and viewed as separate devices, but all the CPUs on one machine get viewed as a single device?

Is there any way a machine can have multiple CPU devices from TensorFlow's perspective?

Paul asked Aug 08 '16 18:08

People also ask

Can TensorFlow use multiple cores?

TensorFlow has the ability to execute a given operator using multiple threads ("intra-operator parallelisation"), as well as different operators in parallel ("inter-operator parallelisation").
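As a minimal sketch, both thread pools can be sized explicitly through the TF 2.x `tf.config.threading` API (the thread counts used here are arbitrary examples; both calls must run before any op executes):

```python
import tensorflow as tf

# Intra-op threads: used to parallelise the inside of a single op
# (e.g. one large matmul). Inter-op threads: used to run independent
# ops concurrently. Both must be configured before the first op runs.
tf.config.threading.set_intra_op_parallelism_threads(4)
tf.config.threading.set_inter_op_parallelism_threads(2)

print(tf.config.threading.get_intra_op_parallelism_threads())
print(tf.config.threading.get_inter_op_parallelism_threads())
```

Setting either value to 0 (the default) lets TensorFlow pick a count based on the available cores.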

Does TensorFlow use cpus?

All cores are wrapped in cpu:0, i.e., TensorFlow does indeed use multiple CPU cores by default.

Can TensorFlow use both CPU and GPU?

If a TensorFlow operation has both CPU and GPU implementations, TensorFlow will automatically place the operation to run on a GPU device first. If you have more than one GPU, the GPU with the lowest ID will be selected by default. However, TensorFlow does not place operations into multiple GPUs automatically.
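A short sketch of how that default placement interacts with an explicit `tf.device` annotation (the tensor values are arbitrary):

```python
import tensorflow as tf

# With no annotation, an op that has a GPU kernel is placed on GPU:0
# if a GPU is visible; otherwise it falls back to the CPU device.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.matmul(a, a)

# An explicit device scope overrides the default placement:
with tf.device('/cpu:0'):
    c = tf.matmul(a, a)

print(b.device)
print(c.device)
```

Calling `tf.debugging.set_log_device_placement(True)` at startup logs the chosen device for every op, which is handy for verifying placement.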


1 Answer

By default, all CPUs available to the process are aggregated under the cpu:0 device.

There's an answer by mrry here showing how to create logical devices like /cpu:1, /cpu:2.
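That answer predates TF 2.x; as a sketch, the equivalent in the current `tf.config` API splits the single physical CPU into multiple logical devices (the split into two devices here is an arbitrary example, and the configuration must be applied before TensorFlow initializes its runtime):

```python
import tensorflow as tf

# There is typically exactly one physical CPU entry.
phys_cpus = tf.config.list_physical_devices('CPU')

# Split it into two logical devices; this must happen before any op
# runs, because devices are fixed once the runtime is initialized.
tf.config.set_logical_device_configuration(
    phys_cpus[0],
    [tf.config.LogicalDeviceConfiguration(),
     tf.config.LogicalDeviceConfiguration()])

print(tf.config.list_logical_devices('CPU'))  # /cpu:0 and /cpu:1

# The logical devices can now be targeted separately:
with tf.device('/cpu:1'):
    x = tf.constant(1.0)
```

Note that these are purely logical devices sharing the same physical cores; the split is useful for testing multi-device code paths, not for performance isolation.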

There doesn't seem to be working functionality in TensorFlow to pin logical devices to specific physical cores or to make use of NUMA nodes.

A possible workaround is to use distributed TensorFlow with multiple processes on one machine, and use taskset on Linux to pin each process to specific cores.

Yaroslav Bulatov answered Sep 23 '22 17:09