
How do I configure Tensorflow Serving to serve models from HDFS?

I'm attempting to serve TensorFlow models out of HDFS using the TensorFlow Serving project.

I'm running the tensorflow/serving Docker container, tag 1.10.1 (https://hub.docker.com/r/tensorflow/serving).

I can see the tensorflow/serving repo referencing Hadoop at https://github.com/tensorflow/serving/blob/628702e1de1fa3d679369e9546e7d74fa91154d3/tensorflow_serving/model_servers/BUILD#L341

"@org_tensorflow//tensorflow/core/platform/hadoop:hadoop_file_system"

This is a reference to

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/platform/hadoop/hadoop_file_system.cc

I have set the following environment variables:

  • HADOOP_HDFS_HOME pointing to my Hadoop home (/etc/hadoop in my case).
  • MODEL_BASE_PATH set to "hdfs://tensorflow/models".
  • MODEL_NAME set to the name of the model I wish to load.

I mount the Hadoop home directory into the Docker container and can verify the mount using docker exec.
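For concreteness, the container invocation described above can be sketched roughly as follows; the host paths and port mapping are assumptions from my setup, not verified values:

```shell
# Sketch of the docker run invocation described above.
# The host path /etc/hadoop and the published port are assumptions;
# adjust them to match your own cluster and client layout.
docker run -p 8501:8501 \
  -v /etc/hadoop:/etc/hadoop \
  -e HADOOP_HDFS_HOME=/etc/hadoop \
  -e MODEL_BASE_PATH=hdfs://tensorflow/models \
  -e MODEL_NAME=my_model \
  tensorflow/serving:1.10.1
```

Note that TensorFlow's HDFS support typically also requires libhdfs.so and a CLASSPATH built from `hadoop classpath --glob` to be visible inside the container.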

When I run the Docker container, I get the following in the logs:

tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:369] FileSystemStoragePathSource encountered a file-system access error: Could not find base path hdfs://tensorflow/models/my_model for servable my_model

I have found examples of TensorFlow reading training data from HDFS, but none of serving models from HDFS with TensorFlow Serving.

Can TensorFlow Serving serve models from HDFS? If so, how do you do this?

Asked Aug 28 '18 10:08 by troynt



1 Answer

In the BUILD file for model_servers, under the cc_test for get_model_status_impl_test, add the line @org_tensorflow//tensorflow/core/platform/hadoop:hadoop_file_system, as shown below:

cc_test(
    name = "get_model_status_impl_test",
    size = "medium",
    srcs = ["get_model_status_impl_test.cc"],
    data = [
        "//tensorflow_serving/servables/tensorflow/testdata:saved_model_half_plus_two_2_versions",
    ],
    deps = [
        ":get_model_status_impl",
        ":model_platform_types",
        ":platform_config_util",
        ":server_core",
        "//tensorflow_serving/apis:model_proto",
        "//tensorflow_serving/core:availability_preserving_policy",
        "//tensorflow_serving/core/test_util:test_main",
        "//tensorflow_serving/servables/tensorflow:saved_model_bundle_source_adapter_proto",
        "//tensorflow_serving/servables/tensorflow:session_bundle_config_proto",
        "//tensorflow_serving/servables/tensorflow:session_bundle_source_adapter_proto",
        "//tensorflow_serving/test_util",
        "@org_tensorflow//tensorflow/cc/saved_model:loader",
        "@org_tensorflow//tensorflow/cc/saved_model:signature_constants",
        "@org_tensorflow//tensorflow/contrib/session_bundle",
        "@org_tensorflow//tensorflow/core:test",
        "@org_tensorflow//tensorflow/core/platform/hadoop:hadoop_file_system",
    ],
)

I think this would solve your problem.
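Since the dependency change lives in the BUILD file, TensorFlow Serving has to be rebuilt from source for it to take effect. A minimal sketch, assuming a working Bazel checkout of the serving repo (the model name and base path are taken from the question above):

```shell
# Rebuild the model server so the Hadoop file system support is linked in.
bazel build //tensorflow_serving/model_servers:tensorflow_model_server

# Then point the freshly built binary at the HDFS base path.
bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server \
  --port=8500 \
  --model_name=my_model \
  --model_base_path=hdfs://tensorflow/models
```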

Ref: Fail to load the models from HDFS

Answered Oct 13 '22 01:10 by Angus Tay