 

Tensorflow to Keras: import graph def error on Keras model

I have TensorFlow code for classifying images which I want to convert to Keras. But I'm running into trouble because the higher-level API doesn't expose everything I need. The problem I'm stuck on is:

#net = get_vgg_model() <- got tf.VGG16 model
net = tf.keras.applications.VGG16()


g1 = tf.Graph()
with tf.Session(graph=g1, config=config) as sess, g1.device('/cpu:0'):
    tf.import_graph_def(net['graph_def'], name='vgg')

This code gives the error:

Traceback (most recent call last):
  File "app.py", line 16, in <module>
    from modules.xvision import Xvision
  File "/app/modules/xvision.py", line 84, in <module>
    tf.import_graph_def(net['graph_def'], name='vgg')
TypeError: 'Model' object has no attribute '__getitem__'

Could someone help me with this graph?

asked Nov 27 '18 by Anna Jeanine

2 Answers

Getting the graph

You can get the graph from Keras with:

import keras.backend as K
K.get_session().graph

You can probably pass its GraphDef (graph.as_graph_def()) to import_graph_def, but I suspect it's already TensorFlow's default graph, since in the link below the creator of Keras says there is only one graph.

More in: https://github.com/keras-team/keras/issues/3223
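
For what it's worth, here is a minimal sketch (assuming TensorFlow 1.x and the tf.keras model from the question) of grabbing that graph and confirming it is the default one:

import tensorflow as tf

net = tf.keras.applications.VGG16()

# the graph Keras works in; normally this is TensorFlow's default graph
graph = tf.keras.backend.get_session().graph
print(graph is tf.get_default_graph())                 # typically True
print([op.name for op in graph.get_operations()[:5]])  # a few of the VGG16 op names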

Working suggestion

I don't know what you're trying to achieve, but if the idea is to use Keras regularly, you'd probably never need to grab the graph.

In Keras, once you've created your model with net = tf.keras.applications.VGG16(), you'd work with the Keras methods of that model, such as:

#compile for training
net.compile(optimizer=someKerasOptimizer, loss=someKerasLoss, metrics=[m1,m2])

#training
net.fit(trainingInputs, trainingTargets, epochs=..., batch_size=..., ...)    
net.fit_generator(someGeneratorThatLoadsBatches, steps_per_epoch=...., ....)

#predicting
net.predict(inputs)
net.predict_generator(someGeneratorThatLoadsInputImages, steps=howManyBatches)    
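
For instance, with the VGG16 model from the question a prediction needs no graph handling at all. A minimal sketch (the random array is just a stand-in for a real preprocessed image):

import numpy as np
import tensorflow as tf

net = tf.keras.applications.VGG16()                       # ImageNet weights, expects 224x224x3 input
dummy = np.random.rand(1, 224, 224, 3).astype("float32")  # stand-in for a real image batch
preds = net.predict(dummy)                                # shape (1, 1000) class probabilities
print(tf.keras.applications.vgg16.decode_predictions(preds, top=3))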

Accessing weights and layers would be done by:

layer = net.layers[index]
layer = net.get_layer('layer_name')

weights = layer.get_weights()
layer.set_weights(someWeightsList)

allWeights = net.get_weights()
net.set_weights(listWithAllWeights)
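
For example, with VGG16 you can fetch the first convolution by its published layer name and inspect its weight shapes:

layer = net.get_layer('block1_conv1')   # first conv of VGG16
kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)         # (3, 3, 3, 64) (64,)

net.set_weights(net.get_weights())      # round-trip of all weights, effectively a no-op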
answered Sep 18 '22 by Daniel Möller

If what you are trying to do is import a trained TensorFlow model into Keras, you first have to make sure that the name of each Keras model variable matches the corresponding variable in the TensorFlow model. To show this, I've created a simple model with a single input and a single output and no hidden layer. To deal with the naming issue, I created the model with Keras layers and used Keras to train it.

import numpy as np
import tensorflow as tf

inputs = tf.keras.layers.Input(shape=(1,), name="inputs")
outputs = tf.keras.layers.Dense(1, activation="linear", name="outputs")(inputs)
model = tf.keras.models.Model(inputs=inputs, outputs=outputs)

Now fit the network on a dummy dataset, then save the model using TensorFlow:

# train on a dummy dataset that follows y = 0.5*x + 3
model.compile(loss="mse", optimizer=tf.keras.optimizers.Adam(1e-1))
x = np.random.randn(1000) * 1000
y = x * .5 + 3
model.fit(x, y, epochs=20, batch_size=32)

# save with TensorFlow's Saver, using the session Keras is running in
sess = tf.keras.backend.get_session()
saver = tf.train.Saver()
meta_graph_def = tf.train.export_meta_graph(filename='./model.meta')
save_path = saver.save(sess, "./model.ckpt")
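
As an optional sanity check (tf.train.list_variables is a TF 1.x checkpoint utility), you can confirm that the checkpoint stores variables whose names come from the Keras layer names:

# expect names like 'outputs/kernel' and 'outputs/bias', plus the optimizer slots
print(tf.train.list_variables("./model.ckpt"))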

Now you can create the same model with Keras (in a fresh session) and load the weights with TensorFlow as follows.

# rebuild the exact same architecture with the same layer names
inputs = tf.keras.layers.Input(shape=(1,), name="inputs")
outputs = tf.keras.layers.Dense(1, activation="linear", name="outputs")(inputs)
model = tf.keras.models.Model(inputs=inputs, outputs=outputs)

# restore the TensorFlow checkpoint into the session Keras is using
sess = tf.keras.backend.get_session()
saver = tf.train.Saver()
saver.restore(sess, "./model.ckpt")

Now you can use your model for prediction, or whatever else you want.

print(model.predict([10, 2, 4, 5, 6]))
# [[8.000007 ]
# [4.0000067]
# [5.0000067]
# [5.5000067]
# [6.0000067]]
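
Since the dummy data follows y = 0.5*x + 3, you can also check the restored weights directly (a small verification sketch using the 'outputs' layer defined above):

kernel, bias = model.get_layer('outputs').get_weights()
print(kernel, bias)   # roughly [[0.5]] and [3.]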
answered Sep 17 '22 by Mitiku