Setting up TensorFlow Serving

On your production server, you need to install TensorFlow Serving and its prerequisites; the official setup instructions are available at https://tensorflow.github.io/serving/setup. We will use the standard TensorFlow Model Server provided by TensorFlow Serving to serve the model. First, we need to build tensorflow_model_server with the following command:

bazel build //tensorflow_serving/model_servers:tensorflow_model_server
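
This command is run from the root of the TensorFlow Serving source tree. If you followed the setup instructions above, you will already have the sources; otherwise, a minimal sketch for getting them looks like this (the clone URL is the official repository, but check the setup page for the current prerequisites):

git clone --recurse-submodules https://github.com/tensorflow/serving
cd serving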

Copy all the files from /home/ubuntu/models/pet_model on your training server to your production server; one way to do this with scp is sketched after the directory listing below. In our setup, we use /home/ubuntu/productions as the folder that stores all the production models. The productions folder will have the following structure:

- /home/ubuntu/productions/
-- 1
--- saved_model.pb
--- variables
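
For example, the copy could be done from the training server with scp, roughly as follows. The hostname production-server and the user ubuntu are placeholders, and we assume the exported pet_model folder already contains the numbered version subdirectory 1 shown above; if it does not, create /home/ubuntu/productions/1 and copy saved_model.pb and the variables folder into it:

# Run on the training server; replace the hostname and user with your own.
scp -r /home/ubuntu/models/pet_model/* ubuntu@production-server:/home/ubuntu/productions/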

We will use tmux to keep the model server running even after we disconnect from the production server. Let's install tmux with this command:

sudo apt-get install tmux

Start a new tmux session with this command:

tmux new -s serving

In the tmux session, change to the TensorFlow Serving source directory (where we ran the bazel build) and run the following command:

    bazel-bin/tensorflow_serving/model_servers/tensorflow_model_server --port=9000 --model_name=pet-model --model_base_path=/home/ubuntu/productions

The output of the console should look like this:

    2017-05-29 13:44:32.203153: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:274] Loading SavedModel: success. Took 537318 microseconds.
    2017-05-29 13:44:32.203243: I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: pet-model version: 1}
    2017-05-29 13:44:32.205543: I tensorflow_serving/model_servers/main.cc:298] Running ModelServer at 0.0.0.0:9000 ...  

As you can see, the model server is listening on host 0.0.0.0 and port 9000, and it has loaded version 1 of pet-model, which corresponds to the 1 subdirectory in /home/ubuntu/productions. In the next section, we will create a simple Python client to send an image to this server via gRPC.
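
Because the server runs inside the tmux session named serving, it will keep running after we log out of the production server. We can detach from the session with Ctrl + b followed by d, and reattach to it later with:

tmux attach -t serving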

You should also note that this setup serves the model on CPU only. Building TensorFlow Serving with GPU support is beyond the scope of this chapter. If you prefer serving with GPUs, you may want to read Appendix A, Advanced Installation, which explains how to build TensorFlow and TensorFlow Serving with GPU support.
