 

Memory Leak with Keras and TensorFlow 1.8.0

I need help narrowing down which part of this code causes the suspected memory leak.

I am using the latest Keras with TensorFlow 1.8.0 and Python 3.6.

Once the program starts, its memory usage gradually grows to gigabytes. I need help here.

I am using a VGG16 net to categorize images. I couldn't localize the problem that causes the memory leak.

Is it a TensorFlow bug, or is Python struggling with this kind of job?

The code is:

import fnmatch
import operator
import os
import random
import time

import cv2
import numpy as np
from keras import applications
from keras.preprocessing.image import img_to_array, load_img

target_size = (224, 224)  # assumed; not defined in the original snippet (VGG16's default input size)

class_labels = ['cc', '', 'cc', 'xx']

# single-image run (img_path and create_top_model are defined elsewhere)
image = load_img(img_path, target_size=target_size)

image_arr = img_to_array(image)  # convert from PIL Image to NumPy array
image_arr /= 255

image_arr = np.expand_dims(image_arr, axis=0)

model = applications.VGG16(include_top=False, weights='imagenet')

bottleneck_features = model.predict(image_arr)

model = create_top_model("softmax", bottleneck_features.shape[1:])

model.load_weights("res/_top_model_weights.h5")
numpy_horizontal_concat = cv2.imread(img_path)
xxx = 1
path = "/home/dataset/test"
listOfFiles = os.listdir(path)
random.shuffle(listOfFiles)
pattern = "*.jpg"
model = applications.VGG16(include_top=False, weights='imagenet')

for entry in listOfFiles:
    if fnmatch.fnmatch(entry, pattern):
        image = load_img(path + "/" + entry, target_size=target_size)
        start_time = time.time()

        image_arr = img_to_array(image)  # convert from PIL Image to NumPy array
        image_arr /= 255

        image_arr = np.expand_dims(image_arr, axis=0)

        bottleneck_features = model.predict(image_arr)

        # a new top model is built and its weights loaded on every iteration
        model2 = create_top_model("softmax", bottleneck_features.shape[1:])

        model2.load_weights("res/_top_model_weights.h5")

        predicted = model2.predict(bottleneck_features)
        decoded_predictions = dict(zip(class_labels, predicted[0]))
        decoded_predictions = sorted(decoded_predictions.items(), key=operator.itemgetter(1), reverse=True)
        elapsed_time = time.time() - start_time

        print()
        count = 1
        for key, value in decoded_predictions[:5]:
            print("{}. {}: {:8f}%".format(count, key, value * 100))
            print("time:  ", time.strftime("%H:%M:%S", time.gmtime(elapsed_time)), "  - ", elapsed_time)
            count += 1

        # OpenCV concat test
        # numpy_horizontal_concat = np.concatenate((mat_image, numpy_horizontal_concat), axis=0)

        hide_img = True
        # attempt to free memory by rebinding the names
        model2 = ""
        predicted = ""
        image_arr = ""
        image = ""
asked Dec 07 '22 by 2adnielsenx xx


1 Answer

Inside your for loop you build a new model with loaded weights. This model is built inside your TensorFlow session, which you never reset, so the session keeps accumulating models without ever deleting a single one.
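You can confirm this yourself by watching the default graph grow across iterations (a small diagnostic sketch for TF 1.x, not from the original answer):

import tensorflow as tf

# print inside the question's for loop, e.g. right after model2.load_weights(...);
# the op count keeps climbing because every iteration adds a whole top model to the graph
print("ops in default graph:", len(tf.get_default_graph().get_operations()))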

There are two possible solutions:

  1. Restructure your code so that you only have to load your models once, before the loop. That way your code will also get much faster.
  2. Reset your session:

I strongly recommend the first solution, but if that isn't possible, clear the Keras session at the end of each loop iteration:

from keras import backend as K
K.clear_session()
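
For the first solution, a minimal sketch of the restructured loop (reusing create_top_model, listOfFiles, path, pattern, and target_size from the question, and assuming 224x224 inputs, for which VGG16's bottleneck shape is (7, 7, 512)):

import fnmatch

import numpy as np
from keras import applications
from keras.preprocessing.image import img_to_array, load_img

# build both models exactly once, before the loop
feature_extractor = applications.VGG16(include_top=False, weights='imagenet')
top_model = create_top_model("softmax", (7, 7, 512))  # VGG16 bottleneck shape for 224x224 inputs
top_model.load_weights("res/_top_model_weights.h5")

for entry in listOfFiles:
    if fnmatch.fnmatch(entry, pattern):
        image = load_img(path + "/" + entry, target_size=target_size)
        image_arr = np.expand_dims(img_to_array(image) / 255, axis=0)

        bottleneck_features = feature_extractor.predict(image_arr)
        predicted = top_model.predict(bottleneck_features)  # no new model built per image

Since nothing new is added to the graph inside the loop, memory stays flat and there is no need to clear the session.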
answered Dec 10 '22 by dennis-w