Deploying a locally trained model is a supported use case; the instructions are essentially the same regardless of where you train:
To deploy a model version, you will need:
A TensorFlow SavedModel saved on Google Cloud Storage. You can obtain one as follows:
Unfortunately, TensorFlow for Poets does not show how to export a SavedModel (I've filed a feature request to address that). In the meantime, you can write a "converter" script like the one below (alternatively, you can do this at the end of training instead of saving out graph.pb and reading it back in):
import tensorflow as tf
from tensorflow.python.saved_model import builder as saved_model_builder

input_graph = 'graph.pb'
saved_model_dir = 'my_model'

with tf.Graph().as_default() as graph:
  # Read in the exported graph
  with tf.gfile.FastGFile(input_graph, 'rb') as f:
    graph_def = tf.GraphDef()
    graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')

  # CloudML Engine and early versions of TensorFlow Serving do
  # not currently support graphs without variables. Add a
  # prosthetic variable.
  dummy_var = tf.Variable(0)

  # Define SavedModel Signature (inputs and outputs)
  in_image = graph.get_tensor_by_name('DecodeJpeg/contents:0')
  inputs = {'image_bytes':
            tf.saved_model.utils.build_tensor_info(in_image)}

  out_classes = graph.get_tensor_by_name('final_result:0')
  outputs = {'prediction':
             tf.saved_model.utils.build_tensor_info(out_classes)}

  signature = tf.saved_model.signature_def_utils.build_signature_def(
      inputs=inputs,
      outputs=outputs,
      method_name='tensorflow/serving/predict'
  )

  # Save out the SavedModel. A session is needed to initialize the
  # prosthetic variable and to attach its value to the export.
  with tf.Session(graph=graph) as sess:
    sess.run(tf.global_variables_initializer())
    b = saved_model_builder.SavedModelBuilder(saved_model_dir)
    b.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={'predict_images': signature})
    b.save()
(Untested code, based on this codelab and this SO post.)
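Before copying the my_model directory to Google Cloud Storage, it can help to sanity-check the export locally. Here is a minimal, untested sketch that loads the SavedModel back into a fresh session and runs it on the raw bytes of a local JPEG; the file name test.jpg is a hypothetical placeholder:

import tensorflow as tf

saved_model_dir = 'my_model'  # directory written by the converter script above

with tf.Session(graph=tf.Graph()) as sess:
  # Load the SavedModel tagged for serving into a fresh graph.
  tf.saved_model.loader.load(
      sess, [tf.saved_model.tag_constants.SERVING], saved_model_dir)

  # Read raw JPEG bytes from a local test image (hypothetical file name).
  with tf.gfile.FastGFile('test.jpg', 'rb') as f:
    image_bytes = f.read()

  # Feed the bytes into the signature's input tensor and fetch the output.
  predictions = sess.run('final_result:0',
                         feed_dict={'DecodeJpeg/contents:0': image_bytes})
  print(predictions)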
If you would prefer the output to use string labels instead of integer indices, make the following change:
  # Load the label file, stripping the trailing newline from each line.
  label_lines = [line.rstrip() for line
                 in tf.gfile.GFile("retrained_labels.txt")]
  out_classes = graph.get_tensor_by_name('final_result:0')
  # final_result holds per-class scores; take the argmax to get an integer
  # class index, then look up the corresponding string label.
  out_labels = tf.gather(label_lines, tf.argmax(out_classes, axis=1))
  outputs = {'prediction':
             tf.saved_model.utils.build_tensor_info(out_labels)}
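Once a version is deployed, online prediction requests need to send the JPEG bytes base64-encoded under the image_bytes alias (input aliases ending in _bytes are decoded from a {"b64": ...} JSON object). A minimal sketch of building such a request body, again assuming a hypothetical local file test.jpg:

import base64
import json

with open('test.jpg', 'rb') as f:  # hypothetical local image
  image_bytes = f.read()

# Aliases ending in "_bytes" must be sent as {"b64": <base64 string>}.
request_body = {
    'instances': [
        {'image_bytes': {'b64': base64.b64encode(image_bytes).decode('utf-8')}}
    ]
}

with open('request.json', 'w') as f:
  json.dump(request_body, f)

The resulting JSON is the body of a projects.predict call to the Cloud ML Engine online prediction REST API.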