How to do it...

Here is how we proceed with the recipe:

  1. Download Dockerfile.devel from https://github.com/tensorflow/serving/blob/master/tensorflow_serving/tools/docker/Dockerfile.devel
  2. Build a container by running (note the trailing dot, which sets the build context to the current directory):
docker build --pull -t $USER/tensorflow-serving-devel -f Dockerfile.devel .
  3. Run the container:
docker run -it $USER/tensorflow-serving-devel
  4. Clone TensorFlow Serving, then configure and test the server:
git clone --recurse-submodules https://github.com/tensorflow/serving
cd serving/tensorflow
./configure
cd ..
bazel test tensorflow_serving/...
  5. Now let's see an example of saving a model so that the server can serve it. This step is inspired by the example used to build an MNIST trainer and serve the model (see https://github.com/tensorflow/serving/blob/master/tensorflow_serving/example/mnist_saved_model.py). The first step is to import the builder as saved_model_builder. Then the bulk of the work is done by SavedModelBuilder(), which saves a snapshot of the trained model to reliable storage. Note that here export_path is /tmp/mnist_model/ (a sketch of the two signatures referenced in the snippet follows it):
import os
import sys

from tensorflow.python.saved_model import builder as saved_model_builder
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants
from tensorflow.python.util import compat
...
# The export directory is versioned: <export_path_base>/<model_version>
export_path_base = sys.argv[-1]
export_path = os.path.join(
    compat.as_bytes(export_path_base),
    compat.as_bytes(str(FLAGS.model_version)))
print('Exporting trained model to %s' % export_path)
builder = saved_model_builder.SavedModelBuilder(export_path)
# Attach the session's graph and variables, tagged for serving, along with
# the signatures that describe the model's inputs and outputs
builder.add_meta_graph_and_variables(
    sess, [tag_constants.SERVING],
    signature_def_map={
        'predict_images':
            prediction_signature,
        signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
            classification_signature,
    },
    legacy_init_op=legacy_init_op)
builder.save()
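The prediction_signature and classification_signature used above are built from the graph's tensors with the SavedModel signature utilities. Here is a minimal sketch of how this can be done, modeled on mnist_saved_model.py; the placeholder tensors x, y, serialized_tf_example, prediction_classes, and prediction_scores are hypothetical stand-ins for the corresponding tensors in your trained graph:
import tensorflow as tf

from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import signature_def_utils
from tensorflow.python.saved_model import utils as saved_model_utils

# Hypothetical stand-ins for the real tensors of the trained MNIST graph
x = tf.placeholder(tf.float32, shape=[None, 784], name='x')
y = tf.placeholder(tf.float32, shape=[None, 10], name='y')
serialized_tf_example = tf.placeholder(tf.string, name='tf_example')
prediction_classes = tf.placeholder(tf.string, shape=[None, 10])
prediction_scores = tf.placeholder(tf.float32, shape=[None, 10])

# A generic predict signature: raw image batches in, score batches out
prediction_signature = signature_def_utils.build_signature_def(
    inputs={'images': saved_model_utils.build_tensor_info(x)},
    outputs={'scores': saved_model_utils.build_tensor_info(y)},
    method_name=signature_constants.PREDICT_METHOD_NAME)

# A classification signature: serialized tf.Examples in, labels/scores out
classification_signature = signature_def_utils.build_signature_def(
    inputs={
        signature_constants.CLASSIFY_INPUTS:
            saved_model_utils.build_tensor_info(serialized_tf_example)
    },
    outputs={
        signature_constants.CLASSIFY_OUTPUT_CLASSES:
            saved_model_utils.build_tensor_info(prediction_classes),
        signature_constants.CLASSIFY_OUTPUT_SCORES:
            saved_model_utils.build_tensor_info(prediction_scores)
    },
    method_name=signature_constants.CLASSIFY_METHOD_NAME)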
  6. The model can then be served with a simple command:
tensorflow_model_server --port=9000 --model_name=mnist --model_base_path=/tmp/mnist_model/
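Once the server is running, it can be queried over gRPC on port 9000. The following is a minimal client sketch (not part of the original example), assuming the grpcio and tensorflow-serving-api packages are installed and that the model exposes the predict_images signature with an images input and a scores output, as in the export code above; note that older versions of tensorflow-serving-api expose the stub through grpc.beta rather than prediction_service_pb2_grpc:
import grpc
import numpy
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Connect to the model server started above
channel = grpc.insecure_channel('localhost:9000')
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build a request against the 'mnist' model and its predict signature
request = predict_pb2.PredictRequest()
request.model_spec.name = 'mnist'  # matches --model_name above
request.model_spec.signature_name = 'predict_images'

# A dummy all-zeros 28x28 image, flattened to 784 floats
image = numpy.zeros((1, 784), dtype=numpy.float32)
request.inputs['images'].CopyFrom(
    tf.make_tensor_proto(image, shape=image.shape))

# Blocking call with a 10-second deadline
result = stub.Predict(request, 10.0)
print(result.outputs['scores'])  # 'scores' as defined in the signature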