
Is it possible to train pytorch and tensorflow model together on one GPU?

I have a PyTorch model and a TensorFlow model, and I want to train them together on one GPU, following the process below: input --> PyTorch model --> output_pytorch --> TensorFlow model --> output_tensorflow --> PyTorch model.

Is it possible to do this? If the answer is yes, are there any problems I will encounter?

Thanks in advance.

asked Nov 07 '22 by J.Doe

1 Answer

I haven't done this myself, but it is possible, although implementing it can be a bit tricky. You can regard each network as a function; in some sense, you want to compose these functions to form your full network. To do that, feed the result of one network into the other, and then use the chain rule to compute the derivatives, relying on the automatic differentiation facilities of both packages.
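To make the chain-rule composition concrete, here is a minimal numeric sketch with two stand-in "networks" `f` and `g` (hypothetical toy functions, not real models): the gradient of the composed pipeline is just the product of each part's local derivative, which is exactly what composing the two frameworks' autograd engines has to produce.

```python
# Two stand-in "networks": f (e.g. the PyTorch part) and g (the TF part).
# f(x) = x**2 and g(u) = 3*u; the composed pipeline is y = g(f(x)) = 3*x**2.
def f(x):  return x ** 2
def df(x): return 2 * x   # f's local derivative

def g(u):  return 3 * u
def dg(u): return 3.0     # g's local derivative

x = 2.0
u = f(x)   # forward through the first network
y = g(u)   # forward through the second network

# Chain rule: dy/dx = dg/du * df/dx = 3 * 2x = 12 at x = 2.
dy_dx = dg(u) * df(x)
```

In the real setup, `df` and `dg` would come from each framework's backward pass rather than being written by hand.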

I think a good way to implement this would be to wrap the TF model as a PyTorch Function and use tf.gradients to compute the backward pass. Doing the gradient updates can get genuinely hard, because some variables live in TF's computation graph. You could mirror the TF variables as PyTorch Variables, turn them into placeholders in the TF computation graph, feed them in via feed_dict, and update them using PyTorch's mechanisms, but I think that would be very hard to get right. Instead, if you perform the updates inside the backward method of the Function, you might be able to get the job done (it is ugly, but it might work).
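A sketch of the wrapping idea, under assumptions: a plain NumPy function stands in for the TF graph here so the example is self-contained (in the real thing, `external_forward` would call `sess.run` on the TF output, and `external_backward` would run `tf.gradients(tf_y, tf_x, grad_ys=...)` with the upstream gradient fed in). The names `external_forward`, `external_backward`, and `ExternalFn` are hypothetical.

```python
import torch

# Stand-in for the TF forward pass: y = 3 * x**2.
def external_forward(x_np):
    return 3.0 * x_np ** 2

# Stand-in for the TF backward pass: dy/dx = 6x, scaled by the upstream grad.
def external_backward(x_np, grad_out_np):
    return grad_out_np * 6.0 * x_np

class ExternalFn(torch.autograd.Function):
    """Wraps an external (non-PyTorch) computation so PyTorch's autograd
    can backprop through it, as the answer suggests doing for a TF model."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        y = external_forward(x.detach().numpy())
        return torch.from_numpy(y)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        dx = external_backward(x.detach().numpy(), grad_out.numpy())
        return torch.from_numpy(dx)

x = torch.tensor([2.0], dtype=torch.float64, requires_grad=True)
y = ExternalFn.apply(x)
y.sum().backward()   # x.grad is now 6 * x = 12
```

Because the external step is opaque to PyTorch, everything before and after it in the PyTorch graph still gets correct gradients, which matches the input --> PyTorch --> TF --> PyTorch pipeline in the question.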

answered Nov 15 '22 by AmirHossein