
Loading Keras model from google cloud ml bucket

Using Keras on Google cloud ml:

Saving model from training:

from tensorflow.python.lib.io import file_io  # ships with TensorFlow

model.save('model.h5')

if cfg.cloud:
    # Copy model.h5 over to Google Cloud Storage
    with file_io.FileIO('model.h5', mode='rb') as input_f:
        with file_io.FileIO(data_folder + 'model.h5', mode='wb+') as output_f:
            output_f.write(input_f.read())

Note: I'm not saving to job_folder, because I need to read the model back later and don't want to keep track of the latest job directory (even though that's a good way to keep models apart).

Now I want to read it back in my next run:

f = file_io.FileIO(model_file, mode='rb')
model = load_model(f)
model.load_weights(f)

where model_file is passed as an argument in my job submission, pointing at

--model-file gs://$BUCKET_NAME/resources/model.h5

The Google Cloud ML job fails with:

TypeError: expected str, bytes or os.PathLike object, not FileIO

I tried a number of things, but my basic question is: what's the best practice for writing, and especially reading, models to and from GCS buckets?

asked Feb 04 '26 07:02 by user2427317


1 Answer

Finally I got it working, using the solution here:

loading saved keras model from gs to pydatalab

Thanks Tíarnán McGrath (I don't have enough reputation to upvote).

answered Feb 06 '26 00:02 by user2427317


