I am using Keras applications for transfer learning with ResNet50 and Inception V3, but when predicting I always get [[ 0.]].
The code below is for a binary classification problem. I have also tried VGG19 and VGG16 and they work fine; it's just ResNet and Inception that fail. The dataset is a 50/50 split, and I am only changing the model = applications.resnet50.ResNet50 line for each model.
Below is the code:
from keras import applications, optimizers
from keras import backend as K
from keras.callbacks import EarlyStopping
from keras.layers import Dense, Dropout, Flatten
from keras.models import Model
from keras.preprocessing.image import ImageDataGenerator

early_stopping = EarlyStopping(monitor='val_loss', patience=2)

img_width, img_height = 256, 256
train_data_dir = xxx
validation_data_dir = xxx
nb_train_samples = 14000
nb_validation_samples = 6000
batch_size = 16
epochs = 50

if K.image_data_format() == 'channels_first':
    input_shape = (3, img_width, img_height)
else:
    input_shape = (img_width, img_height, 3)

model = applications.resnet50.ResNet50(weights="imagenet", include_top=False, input_shape=input_shape)
#Freeze the layers which you don't want to train. Here I am freezing the first 5 layers.
for layer in model.layers[:5]:
    layer.trainable = False
#Adding custom Layers
x = model.output
x = Flatten()(x)
x = Dense(1024, activation="relu")(x)
x = Dropout(0.5)(x)
#x = Dense(1024, activation="relu")(x)
predictions = Dense(1, activation="sigmoid")(x)
# creating the final model
model_final = Model(inputs=model.input, outputs=predictions)
# compile the model
model_final.compile(loss = "binary_crossentropy", optimizer = optimizers.SGD(lr=0.0001, momentum=0.9), metrics=["accuracy"])
# Initiate the train and test generators with data augmentation
train_datagen = ImageDataGenerator(
    rescale=1. / 255,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True)
test_datagen = ImageDataGenerator(
    rescale=1. / 255,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True)
train_generator = train_datagen.flow_from_directory(
    train_data_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='binary')
validation_generator = test_datagen.flow_from_directory(
    validation_data_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='binary')
# Save the model according to the conditions
#checkpoint = ModelCheckpoint("vgg16_1.h5", monitor='val_acc', verbose=1, save_best_only=True, save_weights_only=False, mode='auto', period=1)
#early = EarlyStopping(monitor='val_acc', min_delta=0, patience=10, verbose=1, mode='auto')
model_final.fit_generator(
    train_generator,
    steps_per_epoch=nb_train_samples // batch_size,
    epochs=epochs,
    validation_data=validation_generator,
    validation_steps=nb_validation_samples // batch_size,
    callbacks=[early_stopping])
from keras.models import load_model
import numpy as np
from keras.preprocessing.image import img_to_array, load_img
#test_model = load_model('vgg16_1.h5')
img = load_img('testn7.jpg',False,target_size=(img_width,img_height))
x = img_to_array(img)
x = np.expand_dims(x, axis=0)
#preds = model_final.predict_classes(x)
prob = model_final.predict(x, verbose=0)
#print(preds)
print(prob)
Note that model_final.evaluate_generator(validation_generator, nb_validation_samples)
gives an expected accuracy of around 80%; it is only predict that always returns 0.
I just find it strange that VGG19 and VGG16 work fine but ResNet50 and Inception do not. Do these models require something else to work?
Any insight would be great.
Thanks in advance.
I was running into a similar problem. You are scaling all the RGB values from 0-255 down to 0-1 during training.
The same should be done at the time of prediction.
Try:
x = img_to_array(img)
x = x/255
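For reference, a minimal sketch of the full prediction step with that same rescaling applied. It reuses img_width, img_height, model_final and the test image 'testn7.jpg' from the code in the question:
import numpy as np
from keras.preprocessing.image import img_to_array, load_img

img = load_img('testn7.jpg', target_size=(img_width, img_height))
x = img_to_array(img)           # float32 array with values in [0, 255]
x = x / 255.                    # apply the same rescale=1./255 used by the training generator
x = np.expand_dims(x, axis=0)   # add the batch dimension: (1, height, width, 3)
prob = model_final.predict(x, verbose=0)
print(prob)                     # sigmoid probability in (0, 1)
The key point is that whatever preprocessing the training generator applies must be repeated on any image passed to predict, otherwise the inputs at inference time are on a very different scale than the ones the model was trained on.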