 

TensorFlow 2.0 C++ - Load pre-trained model

Can someone give me a hint on how I can load a model, trained and exported in Python with Keras, using the C++ API of TensorFlow 2.0?

I can't find any information about that, only for TensorFlow versions < 2.

Kind regards

asked May 01 '20 by praetorianer777

People also ask

Does TensorFlow have Pretrained models?

Using pre-trained TensorFlow Lite models lets you add machine learning functionality to your mobile and edge device application quickly, without having to build and train a model. This guide helps you find and decide on trained models for use with TensorFlow Lite.

How do you use a pre-trained model?

Simply put, a pre-trained model is a model created by someone else to solve a similar problem. Instead of building a model from scratch, you use a model trained on another problem as a starting point. For example, if you want to build a self-learning car, you could start from an existing image-recognition model rather than training your own from scratch.

Can I use TensorFlow Lite with open source models?

There are a variety of already trained, open source models you can use immediately with TensorFlow Lite to accomplish many machine learning tasks.

How to use TensorFlow with C API?

Build the simplest model using Python and TensorFlow, export it as a model that can be read by the C API, then write a simple C program, compile it with gcc, and run it like a normal executable. The first step is getting the TensorFlow C API; as far as I know, there are two ways to get the C API header.

How do I save TensorFlow models?

There are different ways to save TensorFlow models depending on the API you're using. This guide uses tf.keras, a high-level API to build and train models in TensorFlow. For other approaches see the TensorFlow Save and Restore guide or Saving in eager. To demonstrate how to save and load weights, you'll use the MNIST dataset.


1 Answer

OK, I found a solution, but with other problems:

In Python you have to export it with:

tf.keras.models.save_model(model, 'model')

In C++ you have to load it with:

#include "tensorflow/cc/saved_model/loader.h"
#include "tensorflow/cc/saved_model/tag_constants.h"

// Load the SavedModel exported from Python; the "serve" tag selects the serving graph.
tensorflow::SavedModelBundle model;
tensorflow::Status status = tensorflow::LoadSavedModel(
    tensorflow::SessionOptions(),
    tensorflow::RunOptions(),
    "path/to/model/folder",
    {tensorflow::kSavedModelTagServe},
    &model);

Based on this post: Using Tensorflow checkpoint to restore model in C++

If I now try to set inputs and outputs it throws an error: "Could not find node with name 'outputlayer'" and "Invalid argument: Tensor input:0, specified in either feed_devices or fetch_devices was not in the Graph".

Does anybody have an idea what's wrong here?
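
For reference, the usual way around that kind of error (an assumption on my part, not verified against this exact model) is to feed and fetch the tensor names recorded in the SavedModel's SignatureDef instead of the Keras layer names. A minimal sketch, assuming the default "serving_default" signature and a single float input of shape (1, 4):

#include <iostream>
#include <string>
#include <vector>

#include "tensorflow/cc/saved_model/loader.h"
#include "tensorflow/cc/saved_model/tag_constants.h"
#include "tensorflow/core/framework/tensor.h"

int main() {
  tensorflow::SavedModelBundle model;
  tensorflow::Status status = tensorflow::LoadSavedModel(
      tensorflow::SessionOptions(), tensorflow::RunOptions(),
      "path/to/model/folder", {tensorflow::kSavedModelTagServe}, &model);
  if (!status.ok()) {
    std::cerr << status.ToString() << std::endl;
    return 1;
  }

  // Read the real tensor names from the "serving_default" signature
  // instead of guessing layer names like "outputlayer" or "input:0".
  const auto& signature = model.meta_graph_def.signature_def().at("serving_default");
  const std::string input_name = signature.inputs().begin()->second.name();
  const std::string output_name = signature.outputs().begin()->second.name();

  // Dummy input; adjust dtype and shape to whatever the model actually expects.
  tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1, 4}));
  input.flat<float>().setZero();

  // Run the session with the signature's tensor names.
  std::vector<tensorflow::Tensor> outputs;
  status = model.session->Run({{input_name, input}}, {output_name}, {}, &outputs);
  if (!status.ok()) {
    std::cerr << status.ToString() << std::endl;
    return 1;
  }
  std::cout << outputs[0].DebugString() << std::endl;
  return 0;
}

The exact names (typically something like serving_default_input_1:0 for the input and StatefulPartitionedCall:0 for the output, but that depends on the model) can also be inspected with saved_model_cli show --dir path/to/model/folder --all.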

answered Sep 22 '22 by praetorianer777