
Can a model trained on GPU be used on CPU for inference, and vice versa?

Tags:

tensorflow

I was wondering: can a model trained on the GPU be used to run inference on the CPU, and vice versa? Thanks!

asked Dec 05 '16 17:12 by Pusheen_the_dev


1 Answer

You can do it as long as your model doesn't have explicit device assignments. That is, if your graph contains blocks like with tf.device('/gpu:0'), it will complain when you run it on a machine without a GPU.
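For context, here is a minimal TF1-style sketch of the kind of pinning that causes the problem (the graph itself is a made-up example):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Explicit device pinning: every op created inside this block is
# hard-wired to the first GPU. The assignment is saved with the
# graph, so a checkpoint exported from it will refuse to load on a
# CPU-only machine unless the device strings are cleared.
with tf.device('/gpu:0'):
    x = tf.constant([1.0, 2.0])
    y = x * 2.0

print(y.device)  # the op records its pinned device
```

Building the graph succeeds even without a GPU; the failure only surfaces when TensorFlow tries to place the ops at session run or restore time.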

In such cases, make sure your imported model doesn't carry explicit device assignments, for instance by passing the clear_devices argument to tf.train.import_meta_graph.

answered Oct 03 '22 01:10 by Yaroslav Bulatov