I am trying to run a TensorFlow Lite model on a Raspberry Pi with a Coral TPU. The model is SSD MobileNet V2. After conversion, either fully quantized or with float I/O, it runs fine on the PC. However, when I run it on the Coral TPU I get a lot of wrong results. Typically they are false positives for class 0 (which maps to "person"). Can someone help me, or give me ideas on how to fix it?
TensorFlow version: 2.5.0
TensorFlow Lite version: 2.5.0
The steps I performed:
- Downloaded the model: http://download.tensorflow.org/models/object_detection/tf2/20200711/ssd_mobilenet_v2_fpnlite_320x320_coco17_tpu-8.tar.gz
- I changed the input resize layer to 320x320, but the results were the same as with the original 300x300.
- I converted the saved model to a TF Lite friendly format:
python3 object_detection/export_tflite_graph_tf2.py --pipeline_config_path /home/pawel/proj/net_models/ssd_mobilenet_v2_320x320_coco17_tpu-8-init/pipeline.config --trained_checkpoint_dir /home/pawel/proj/net_models/ssd_mobilenet_v2_320x320_coco17_tpu-8-init/checkpoint --output_directory /home/pawel/proj/net_models/ssd_mobilenet_v2_320x320_coco17_tpu-8-fixed-input
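To confirm the export produced the expected 320x320 input, the serving signature of the exported SavedModel can be inspected, for example with a sketch like this (it assumes the exporter wrote a saved_model/ subfolder with a serving_default signature under the --output_directory above):
import tensorflow as tf

# Sketch: inspect the exported SavedModel's serving signature.
saved_model_dir = "/home/pawel/proj/net_models/ssd_mobilenet_v2_320x320_coco17_tpu-8-fixed-input/saved_model"
loaded = tf.saved_model.load(saved_model_dir)
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)  # expected: float32 input of shape (1, 320, 320, 3)
print(infer.structured_outputs)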
- Converted the model to TF Lite format, with model_path pointing to the output of the previous step. I tried quantize set to True/False and the commented-out parts of the code below:
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(model_path)
if quantize:
    # converter.optimizations = [tf.lite.Optimize.DEFAULT]
    # converter.representative_dataset = representative_data_gen
    # converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    # converter.inference_input_type = tf.uint8
    # converter.inference_output_type = tf.uint8
    converter.representative_dataset = representative_data_gen
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8, tf.lite.OpsSet.SELECT_TF_OPS]
    converter.inference_input_type = tf.uint8
    converter.inference_output_type = tf.uint8
    converter.allow_custom_ops = True
    print(converter.experimental_new_quantizer)  # outputs True
    print(converter.experimental_new_converter)  # outputs True
else:
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
    converter.optimizations = []
tflite_model = converter.convert()
with open(lite_model_path, 'wb') as f:
    f.write(tflite_model)
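Before compiling for the Edge TPU, the quantized file can be sanity-checked on the PC with the plain TF Lite interpreter, for example with a sketch like this (it only feeds a dummy uint8 image; any calibration image could be substituted):
import numpy as np
import tensorflow as tf

# Sketch: load and run the quantized model on the PC before Edge TPU compilation.
interpreter = tf.lite.Interpreter(model_path=lite_model_path)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
print(input_details['dtype'], input_details['shape'], input_details['quantization'])

dummy = np.zeros(input_details['shape'], dtype=np.uint8)  # placeholder image
interpreter.set_tensor(input_details['index'], dummy)
interpreter.invoke()

for out in interpreter.get_output_details():
    print(out['name'], np.ravel(interpreter.get_tensor(out['index']))[:5])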
- The data provider, which uses the CocoDataSet code from the next step:
def representative_data_gen():
    from cocodataset import CocoDataSet
    coco = CocoDataSet(input_size, input_size)
    images = coco.get_calibration_dataset(500)
    for img in images:
        yield [img]
- Representative dataset: COCO 2017 validation data, 500 samples.
import json
import random

import cv2 as cv
import numpy as np

class CocoDataSet:
    ...
    def get_calibration_dataset(self, limit: int):
        with open(self.annotation_file, 'r') as f:
            annotations = json.load(f)
        image_info = annotations['images']
        random.shuffle(image_info)
        image_info = image_info[:limit]
        image_paths = []
        for img in image_info:
            image_path = self.image_dir + img['file_name']
            image_paths.append(image_path)
        print(f"{limit} images will be returned")
        images = []
        for i, path in enumerate(image_paths):
            print(f"Loading {i}/{len(image_paths)}: " + path)
            image = cv.imread(path)
            image = cv.cvtColor(image, cv.COLOR_BGR2RGB)
            tensor = np.zeros((self.input_height, self.input_width, 3), dtype=np.uint8)
            _, _, channel = tensor.shape
            h, w, _ = image.shape
            scale = min(self.input_width / w, self.input_height / h)
            w, h = int(w * scale), int(h * scale)
            image = cv.resize(image.copy(), (w, h), interpolation=cv.INTER_LINEAR)
            reshaped = image
            margin_x = (self.input_width - w) // 2
            margin_y = (self.input_height - h) // 2
            tensor[margin_y:h + margin_y, margin_x:w + margin_x] = reshaped
            tensor = np.expand_dims(tensor, axis=0)
            tensor = tensor.astype(np.float32) - 127.5
            tensor = tensor * 0.007843
            images.append(tensor)
        return images
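The preprocessing above maps pixels to roughly [-1, 1) via (x - 127.5) * 0.007843. A small sketch to cross-check that the converted model's input quantization parameters actually cover that range (lite_model_path is the quantized file written in the conversion step):
import tensorflow as tf

# Sketch: compare the calibration range with the input quantization
# parameters baked into the quantized model.
interpreter = tf.lite.Interpreter(model_path=lite_model_path)
interpreter.allocate_tensors()
scale, zero_point = interpreter.get_input_details()[0]['quantization']
# For a uint8 input calibrated on (x - 127.5) * 0.007843, something close to
# scale 0.0078 and zero_point 128 would be expected here.
print(f"input scale={scale}, zero_point={zero_point}")
print(f"covered float range: [{(0 - zero_point) * scale:.3f}, {(255 - zero_point) * scale:.3f}]")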
- Coral AI compilation:
edgetpu_compiler ssd_mobilenet_v2_coral.tflite
- Inference with Coral AI (RPi); the same code works fine with the SSD MobileNet V2 model provided with the Coral AI SDK:
x, y, scale = self.set_image_input(self.interpreter, region)
self.interpreter.invoke()
detection_boxes = self.get_output_tensor(self.interpreter, 0)
detection_classes = self.get_output_tensor(self.interpreter, 1, np.int)
detection_scores = self.get_output_tensor(self.interpreter, 2)
count = self.get_output_tensor(self.interpreter, 3, np.int)
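The creation of self.interpreter is omitted above; it is loaded with the Edge TPU delegate in the standard way, roughly as in this sketch (the filename is the one edgetpu_compiler produces from ssd_mobilenet_v2_coral.tflite):
import tflite_runtime.interpreter as tflite

# Sketch: the interpreter used above, loaded with the Edge TPU delegate on the RPi.
interpreter = tflite.Interpreter(
    model_path="ssd_mobilenet_v2_coral_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")])
interpreter.allocate_tensors()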
- Input image, scaled and centered:
def set_image_input(self, interpreter: tflite.Interpreter, image: np.ndarray) -> (int, int, float):
    self.did = self.did + 1
    width, height = (self.input_height, self.input_width)
    stretch = False
    if stretch:
        h, w, _ = (self.input_height, self.input_width, 1)
    else:
        h, w, _ = image.shape
    cv.imwrite(f"{self.logs_dir}/image{self.did}.png", image)
    scale = min(width / w, height / h)
    w, h = int(w * scale), int(h * scale)
    tensor = self.input_tensor(interpreter)
    tensor.fill(0)
    _, _, channel = tensor.shape
    image = cv.resize(image.copy(), (w, h), interpolation=cv.INTER_LINEAR)
    reshaped = image
    if tensor.dtype == np.float32:
        reshaped = reshaped * (1.0 / 255) - 1
    margin_x = (self.input_width - w) // 2
    margin_y = (self.input_height - h) // 2
    tensor[margin_y:h + margin_y, margin_x:w + margin_x] = reshaped
    return margin_x, margin_y, scale
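input_tensor is not shown above; it follows the usual pattern from the Coral examples, roughly like this sketch (so treat the exact implementation as an assumption):
def input_tensor(self, interpreter: tflite.Interpreter) -> np.ndarray:
    # Sketch: a writable view of the model's first input tensor with the
    # batch dimension stripped, i.e. shape (input_height, input_width, 3).
    tensor_index = interpreter.get_input_details()[0]['index']
    return interpreter.tensor(tensor_index)()[0]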
- Reading the output tensors:
def get_output_tensor(self, interpreter: tflite.Interpreter, index: int, result_type=np.float):
    output_details = interpreter.get_output_details()[index]
    quantization = output_details['quantization']
    dtype = output_details['dtype']
    tf_index = output_details['index']
    tensor = np.squeeze(interpreter.get_tensor(tf_index))
    if quantization != (0, 0):
        input_scale, input_zero_point = quantization
        tensor = (tensor.astype(np.float32) - input_zero_point) * input_scale
    if tensor.dtype != result_type:
        tensor = tensor.astype(result_type)
    return tensor
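For completeness, the outputs read above are consumed roughly like this (a short sketch; the 0.5 threshold is only an example value, and the boxes are assumed to be in the usual normalized [ymin, xmin, ymax, xmax] order of the TF Object Detection export):
# Sketch: iterate over the dequantized detections read above.
for i in range(int(count)):
    if detection_scores[i] < 0.5:  # example threshold, not a tuned value
        continue
    ymin, xmin, ymax, xmax = detection_boxes[i]
    print(f"class={int(detection_classes[i])} score={detection_scores[i]:.2f} "
          f"box=({xmin:.2f}, {ymin:.2f}, {xmax:.2f}, {ymax:.2f})")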
I noticed that when I run the conversion several times, the results differ slightly, since the representative data is drawn randomly from the set. With the Coral AI compiled model, which I run on the PC, the differences are even more noticeable.
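To rule the random calibration subset out as a factor, the shuffle could be seeded so that every conversion run uses the same 500 images (a minimal sketch; the seed value is arbitrary):
import random

# Sketch: seed the global RNG before conversion so that random.shuffle() in
# get_calibration_dataset() picks the same calibration images every run.
random.seed(42)  # arbitrary but fixed value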