I am loading a Keras model from a JSON file like this:
import json

with open(str(incoming_json_file), 'r') as fb:
    con = json.load(fb)
My Keras model is defined as follows:
{
"model": "Sequential",
"layers": [
{
"L1": "Conv2D(filters = 64, kernel_size=(2,2), strides=(2,2), padding='same', data_format='channels_last', activation='relu', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.), input_shape=(224,224,3))",
"L2": "MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same', data_format='channels_last')",
"L3": "Conv2D(filters = 64, kernel_size=(2,2), strides=(2,2), padding='same', data_format='channels_last', activation='relu', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.))",
"L4": "MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same', data_format='channels_last')",
"L5": "Conv2D(filters = 64, kernel_size=(2,2), strides=(2,2), padding='same', data_format='channels_last', activation='relu', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.))",
"L6": "Conv2D(filters = 64, kernel_size=(2,2), strides=(2,2), padding='same', data_format='channels_last', activation='relu', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.))",
"L7": "Conv2D(filters = 64, kernel_size=(2,2), strides=(2,2), padding='same', data_format='channels_last', activation='relu', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.))",
"L8": "MaxPooling2D(pool_size=(2,2), strides=(2,2), padding='same', data_format='channels_last')",
"L9": "Flatten()",
"L10": "Dense(4096, activation='softmax', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.))",
"L11": "Dropout(0.4)",
"L12": "Dense(2048, activation='softmax', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.))",
"L13": "Dropout(0.4)",
"L14": "Dense(1000, activation='softmax', use_bias=True, kernel_initializer='zeros', bias_initializer='zeros', kernel_regularizer=regularizers.l1(0.), bias_regularizer=regularizers.l1(0.), activity_regularizer=regularizers.l1(0.), kernel_constraint=max_norm(2.), bias_constraint=max_norm(2.))",
"L15": "Dropout(0.4)"
}
]
}
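The build step itself is not shown above; roughly, each layer string is instantiated in order and added to a Sequential model, along the lines of this sketch (assuming an eval-based build; the real code may differ):

# Sketch of the build step, assuming each layer string is eval'd in order.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from tensorflow.keras import regularizers          # referenced inside the layer strings
from tensorflow.keras.constraints import max_norm  # referenced inside the layer strings

model = Sequential()
for layer_str in con['layers'][0].values():  # 'con' is the dict loaded from the JSON file
    model.add(eval(layer_str))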
After loading the model and converting it to the .pb format, I am trying to convert the model protocol buffer to a tflite model like this:
mycmd = ('tflite_convert --inference_type=QUANTIZED_UINT8 '
         '--output_file={} --graph_def_file={} '
         '--input_arrays={} --output_arrays={} '
         '--quantize_weights=true --std_dev_values=1 --mean_values=0 '
         '--default_ranges_min=-2 --default_ranges_max=2').format(
             tflite_model_name, pb_model_path, str(input_name[0]), str(output_name[0]))
os.system(mycmd)
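For completeness, the .pb comes from freezing the Keras session graph. I have omitted that code, but it is along these lines (a sketch assuming the usual TF 1.x convert_variables_to_constants approach; model, pb_model_path, input_name and output_name are the same variables used in the command above):

# Sketch of the Keras-to-.pb step (assumed TF 1.x freeze approach; exact code omitted).
import os
import tensorflow as tf
from tensorflow.keras import backend as K

sess = K.get_session()
input_name = [model.input.op.name]    # passed as --input_arrays
output_name = [model.output.op.name]  # passed as --output_arrays
frozen = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), output_name)
tf.train.write_graph(frozen, os.path.dirname(pb_model_path),
                     os.path.basename(pb_model_path), as_text=False)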
but I get the following error:
2019-06-14 18:34:45.096579: F tensorflow/lite/toco/graph_transformations/resolve_tensorflow_switch.cc:98] Check failed: other_op->type == OperatorType::kMerge Found Mul as non-selected output from Switch, but only Merge supported.
Aborted (core dumped)
I tried doing something like this:
"L15": "Dropout(0.4, training=False)"
but that does not seem to work.
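As far as I can tell, training is an argument of the layer call, not of the Dropout constructor, so it could only be passed where the layer is applied to a tensor, e.g. in the functional API (sketch below, just to illustrate the point; this is not how my model is built):

# 'training' is a call-time argument, not a constructor argument, so it would
# have to be passed like this, which the string-based Sequential build above
# does not allow.
from tensorflow.keras.layers import Input, Dense, Dropout
from tensorflow.keras.models import Model

inp = Input(shape=(100,))
x = Dense(10)(inp)
x = Dropout(0.4)(x, training=False)  # force inference behaviour for this layer
m = Model(inp, x)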
Any idea how to get rid of this error? I came across this and tried to put the Dropout layers into test mode, but could not find a workable option.