First, I downloaded a quantized model from the Mobilenet collection; it is included in Mobilenet_v1_1.0_224. Then I ran the following:
bazel-bin/tensorflow/contrib/lite/toco/toco \
> --input_files=Sample/mobilenet_v1_1.0_224/quantized_graph.pb \
> --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE \
> --output_file=Sample/mobilenet_v1_1.0_224/quantized_graph.tflite --inference_type=QUANTIZED_UINT8 \
> --input_shape=1,224,224,3 \
> --input_array=input \
> --output_array=MobilenetV1/Predictions/Reshape_1 \
> --mean_value=128 \
> --std_value=127
Here is the graph summary:
bazel-bin/tensorflow/tools/graph_transforms/summarize_graph --in_graph=Sample/mobilenet_v1_1.0_224/quantized_graph.pb
Found 1 possible inputs: (name=input, type=float(1), shape=[1,224,224,3])
No variables spotted.
Found 1 possible outputs: (name=MobilenetV1/Predictions/Reshape_1, op=Reshape)
Found 4227041 (4.23M) const parameters, 0 (0) variable parameters, and 0 control_edges
Op types used: 91 Const, 27 Add, 27 Relu6, 15 Conv2D, 13 DepthwiseConv2dNative, 13 Mul, 10 Dequantize, 2 Reshape, 1 Identity, 1 Placeholder, 1 BiasAdd, 1 AvgPool, 1 Softmax, 1 Squeeze
To use with tensorflow/tools/benchmark:benchmark_model try these arguments:
bazel run tensorflow/tools/benchmark:benchmark_model -- --graph=Sample/mobilenet_v1_1.0_224/quantized_graph.pb --show_flops --input_layer=input --input_layer_type=float --input_layer_shape=1,224,224,3 --output_layer=MobilenetV1/Predictions/Reshape_1
When I ran the conversion, I hit the following error:
140 operators, 232 arrays (0 quantized)
2018-03-01 23:12:03.374916: I tensorflow/contrib/lite/toco/graph_transformations/graph_transformations.cc:39] After general graph transformations pass 1: 63 operators, 152 arrays (1 quantized)
2018-03-01 23:12:03.376325: I tensorflow/contrib/lite/toco/graph_transformations/graph_transformations.cc:39] Before pre-quantization graph transformations: 63 operators, 152 arrays (1 quantized)
2018-03-01 23:12:03.377492: F tensorflow/contrib/lite/toco/tooling_util.cc:1272] Array MobilenetV1/MobilenetV1/Conv2d_0/Relu6, which is an input to the DepthwiseConv operator producing the output array MobilenetV1/MobilenetV1/Conv2d_1_depthwise/Relu6, is lacking min/max data, which is necessary for quantization. Either target a non-quantized output format,
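The error is truncated above, but it already names one way out: target a non-quantized output format. A sketch of the two invocations I would try (untested; this assumes the toco binary in this TensorFlow build accepts `--inference_type=FLOAT` and the `--default_ranges_min`/`--default_ranges_max` dummy-quantization flags, and paths are the same as in the original command):

```shell
# Option 1: give up on quantized inference and emit a float TFLite model.
bazel-bin/tensorflow/contrib/lite/toco/toco \
  --input_files=Sample/mobilenet_v1_1.0_224/quantized_graph.pb \
  --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE \
  --output_file=Sample/mobilenet_v1_1.0_224/float_graph.tflite \
  --inference_type=FLOAT \
  --input_shape=1,224,224,3 \
  --input_array=input \
  --output_array=MobilenetV1/Predictions/Reshape_1

# Option 2: keep QUANTIZED_UINT8 but supply dummy ranges for arrays that
# lack min/max data (accuracy will suffer; for pipeline testing only).
bazel-bin/tensorflow/contrib/lite/toco/toco \
  --input_files=Sample/mobilenet_v1_1.0_224/quantized_graph.pb \
  --input_format=TENSORFLOW_GRAPHDEF --output_format=TFLITE \
  --output_file=Sample/mobilenet_v1_1.0_224/quantized_graph.tflite \
  --inference_type=QUANTIZED_UINT8 \
  --input_shape=1,224,224,3 \
  --input_array=input \
  --output_array=MobilenetV1/Predictions/Reshape_1 \
  --mean_value=128 \
  --std_value=127 \
  --default_ranges_min=0 \
  --default_ranges_max=6
```

The range 0..6 here is a guess motivated by the Relu6 activations in the graph summary, not something the error output specifies.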
Thanks for your help.