Deploying Keras Models via Google Cloud ML

I am looking to use Google Cloud ML to host my Keras models so that I can call the API and make some predictions. I am running into some issues from the Keras side of things.

So far I have been able to build a model using TensorFlow and deploy it on CloudML. In order for this to work I had to make some changes to my basic TF code. The changes are documented here: https://cloud.google.com/ml/docs/how-tos/preparing-models#code_changes

I have also been able to train a similar model using Keras. I can even save the model in the same export and export.meta format as I would get with TF.

import tensorflow as tf
from keras import backend as K

# save the graph and variables of the Keras session in TF checkpoint format
saver = tf.train.Saver()
session = K.get_session()
saver.save(session, 'export')

The part I am missing is: how do I add the placeholders for input and output to the graph I build in Keras?

asked by Matthew Jackson, Jan 31 '17


3 Answers

After training my model on Google Cloud ML Engine (check out this awesome tutorial), I named the input and output of my graph with

signature = predict_signature_def(inputs={'NAME_YOUR_INPUT': new_Model.input},
                                  outputs={'NAME_YOUR_OUTPUT': new_Model.output})

You can see the full export example below for an already trained Keras model 'model.h5'.

import keras.backend as K
import tensorflow as tf
from keras.models import load_model, Sequential
from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import tag_constants, signature_constants
from tensorflow.python.saved_model.signature_def_utils_impl import predict_signature_def

# reset session
K.clear_session()
sess = tf.Session()
K.set_session(sess)

# set the learning phase to 0 (inference mode) before loading the model
K.set_learning_phase(0)

# load model
model = load_model('model.h5')
config = model.get_config()
weights = model.get_weights()
new_Model = Sequential.from_config(config)
new_Model.set_weights(weights)

# export saved model
export_path = 'YOUR_EXPORT_PATH' + '/export'
builder = saved_model_builder.SavedModelBuilder(export_path)

signature = predict_signature_def(inputs={'NAME_YOUR_INPUT': new_Model.input},
                                  outputs={'NAME_YOUR_OUTPUT': new_Model.output})

with K.get_session() as sess:
    builder.add_meta_graph_and_variables(sess=sess,
                                         tags=[tag_constants.SERVING],
                                         signature_def_map={
                                             signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature})
    builder.save()

You can also see my full implementation.
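Once this SavedModel is deployed as a model version on Cloud ML Engine, you can send online prediction requests to it. Here is a minimal sketch using the Google API Python client; the project name, model name, and instance payload are placeholders, and the input key must match the name you chose in predict_signature_def:

from googleapiclient import discovery

# minimal sketch of an online prediction request; 'my-project' and 'my_model'
# are placeholders for your own project and deployed model
service = discovery.build('ml', 'v1')
name = 'projects/{}/models/{}'.format('my-project', 'my_model')

# the key must match the input name used in predict_signature_def above
body = {'instances': [{'NAME_YOUR_INPUT': [0.1, 0.2, 0.3]}]}

response = service.projects().predict(name=name, body=body).execute()
if 'error' in response:
    raise RuntimeError(response['error'])
print(response['predictions'])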


answered by Lausbert, Nov 18 '22


I found out that in order to use Keras on Google Cloud, you have to install it with a setup.py script placed in the same folder from which you run the gcloud command:

├── setup.py
└── trainer
    ├── __init__.py
    ├── cloudml-gpu.yaml
    ├── example5-keras.py

And in setup.py you put content such as:

from setuptools import setup, find_packages

setup(name='example5',
  version='0.1',
  packages=find_packages(),
  description='example to run keras on gcloud ml-engine',
  author='Fuyang Liu',
  author_email='[email protected]',
  license='MIT',
  install_requires=[
      'keras',
      'h5py'
  ],
  zip_safe=False)

Then you can start your job on gcloud like this:

export BUCKET_NAME=tf-learn-simple-sentiment
export JOB_NAME="example_5_train_$(date +%Y%m%d_%H%M%S)"
export JOB_DIR=gs://$BUCKET_NAME/$JOB_NAME
export REGION=europe-west1

gcloud ml-engine jobs submit training $JOB_NAME \
  --job-dir gs://$BUCKET_NAME/$JOB_NAME \
  --runtime-version 1.0 \
  --module-name trainer.example5-keras \
  --package-path ./trainer \
  --region $REGION \
  --config=trainer/cloudml-gpu.yaml \
  -- \
  --train-file gs://tf-learn-simple-sentiment/sentiment_set.pickle
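Note that the trainer module (example5-keras.py in this layout) has to accept the --train-file argument passed after the -- separator, and gcloud also forwards --job-dir to it. A minimal, hypothetical sketch of that argument handling (the actual training code is not shown here):

import argparse

# hypothetical argument handling for trainer/example5-keras.py
def get_args():
    parser = argparse.ArgumentParser()
    parser.add_argument('--train-file', required=True,
                        help='GCS path to the training data, e.g. a .pickle file')
    parser.add_argument('--job-dir', default=None,
                        help='GCS location for checkpoints and exports (forwarded by gcloud)')
    return parser.parse_args()

if __name__ == '__main__':
    args = get_args()
    print('training data:', args.train_file)
    print('job dir:', args.job_dir)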

To use a GPU, add a file such as cloudml-gpu.yaml to your module with the following content:

trainingInput:
  scaleTier: CUSTOM
  # standard_gpu provides 1 GPU. Change to complex_model_m_gpu for 4 GPUs
  masterType: standard_gpu
  runtimeVersion: "1.0"

answered by Fuyang Liu, Nov 18 '22


I don't know much about Keras. I consulted with some experts, and the following should work:

import json

import tensorflow as tf
from keras import backend as K

# Build the model first
model = ...    

# Declare the inputs and outputs for CloudML
inputs = dict(zip((layer.name for layer in model.input_layers),
                  (t.name for t in model.inputs)))
tf.add_to_collection('inputs', json.dumps(inputs))

outputs = dict(zip((layer.name for layer in model.output_layers),
                   (t.name for t in model.outputs)))
tf.add_to_collection('outputs', json.dumps(outputs))

# Fit/train the model
model.fit(...)

# Export the model
saver = tf.train.Saver()
session = K.get_session()
saver.save(session, 'export')

Some important points:

  • You have to call tf.add_to_collection after you create the model but before you ever call K.get_session(), fit(), etc.
  • Be sure to set the names of your input and output layers when you add them to the graph, because you'll need to refer to them when you send prediction requests (a rough sketch follows this list).
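As a rough illustration (my own hypothetical sketch, not part of the expert advice above), naming the layers with the Keras functional API could look like this; the shape and the names 'x' and 'y' are placeholders:

# hypothetical sketch: name the input and output layers explicitly so that
# the keys written into the 'inputs'/'outputs' collections are predictable
from keras.layers import Input, Dense
from keras.models import Model

x_in = Input(shape=(100,), name='x')                        # input layer named 'x'
hidden = Dense(64, activation='relu')(x_in)
y_out = Dense(1, activation='sigmoid', name='y')(hidden)    # output layer named 'y'

model = Model(inputs=x_in, outputs=y_out)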
answered by rhaertel80, Nov 18 '22