I am trying to use Keras fine-tuning to develop an image classification application. I deployed the application to a web server, and image classification succeeds there.
However, when the application is used from two or more computers at the same time, the following error message appears and the application stops working.
TypeError: Cannot interpret feed_dict key as Tensor: Tensor Tensor("Placeholder:0", shape=(3, 3, 3, 64), dtype=float32) is not an element of this graph.
Here is my code for image classification.
from keras.applications.vgg19 import VGG19
from keras.layers import Dense, Dropout, Flatten, Input
from keras.models import Model, Sequential
from keras.preprocessing import image
import numpy as np
img_height, img_width = 224, 224
channels = 3
input_tensor = Input(shape=(img_height, img_width, channels))
vgg19 = VGG19(include_top=False, weights='imagenet', input_tensor=input_tensor)
fc = Sequential()
fc.add(Flatten(input_shape=vgg19.output_shape[1:]))
fc.add(Dense(256, activation='relu'))
fc.add(Dropout(0.5))
fc.add(Dense(3, activation='softmax'))
model = Model(inputs=vgg19.input, outputs=fc(vgg19.output))
model.load_weights({h5_file_path})
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
img = image.load_img({image_file_path}, target_size=(img_height, img_width))
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)
x = x / 255.0
pred = model.predict(x)[0]
How can I run this application from multiple computers at the same time?
Thank you for reading this post.
I found that there are a couple of workarounds, depending on the context:
Using the clear_session() function:
from keras import backend as K
Then call the following at the beginning of the function, or at the end after predicting all the data:
K.clear_session()
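For example, a per-request prediction function can clear the session and rebuild the model each time. A minimal sketch, assuming `build_model()` rebuilds the architecture from the question and `h5_file_path` is supplied by the caller (both names are illustrative, not from the original app):

```python
from keras import backend as K
from keras.applications.vgg19 import VGG19
from keras.layers import Dense, Dropout, Flatten, Input
from keras.models import Model, Sequential

def build_model():
    # Rebuild the same architecture as in the question.
    input_tensor = Input(shape=(224, 224, 3))
    vgg19 = VGG19(include_top=False, weights='imagenet', input_tensor=input_tensor)
    fc = Sequential()
    fc.add(Flatten(input_shape=vgg19.output_shape[1:]))
    fc.add(Dense(256, activation='relu'))
    fc.add(Dropout(0.5))
    fc.add(Dense(3, activation='softmax'))
    return Model(inputs=vgg19.input, outputs=fc(vgg19.output))

def predict_one(x, h5_file_path):
    K.clear_session()                 # start from a fresh default graph
    model = build_model()             # rebuild the graph in the current session
    model.load_weights(h5_file_path)
    pred = model.predict(x)[0]
    K.clear_session()                 # release the graph after predicting
    return pred
```

Rebuilding per request is slow, but it guarantees each request sees a consistent graph.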
Calling _make_predict_function():
After you load your trained model, call:
model._make_predict_function()
Disable threading:
If you are running a Django server, use this command:
python manage.py runserver --nothreading
For Flask, use this:
flask run --without-threads
If none of the above solutions works, check these links: keras issue#6462, keras issue#2397.