I borrowed this example from the Sharing Variables tutorial:
def my_image_filter(input_images):
    with tf.variable_scope("conv1"):
        # Variables created here will be named "conv1/weights", "conv1/biases".
        relu1 = conv_relu(input_images, [5, 5, 32, 32], [32])
    with tf.variable_scope("conv2"):
        # Variables created here will be named "conv2/weights", "conv2/biases".
        return conv_relu(relu1, [5, 5, 32, 32], [32])
Suppose I trained these variables and saved all four of them (the weights and biases from the conv1 and conv2 layers) by passing a var_list to tf.train.Saver.
Now I want to restore them and use them twice:
with tf.variable_scope("image_filters") as scope:
    result1 = my_image_filter(image1)
    scope.reuse_variables()
    result2 = my_image_filter(image2)
But the variable names now carry the image_filters prefix (e.g. image_filters/conv1/weights), so the saver cannot restore them: Key image_filters/conv1/weights not found in checkpoint
How can I restore all the trained variables and reuse them multiple times?
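One approach I have been considering (not yet verified against my checkpoint) relies on the fact that tf.train.Saver also accepts a dict mapping the name a variable has *in the checkpoint* to the variable object in the current graph. Building that dict is plain string work, stripping the new scope prefix from each in-graph name; here is a minimal sketch without TensorFlow, where strings stand in for the actual variable objects:

```python
def build_restore_map(graph_variables, prefix="image_filters/"):
    """Map checkpoint names (without the scope prefix) to in-graph variables.

    `graph_variables` is a dict of {in-graph name: variable}; in real
    TensorFlow 1.x code it could be built from tf.global_variables().
    """
    return {
        name[len(prefix):] if name.startswith(prefix) else name: var
        for name, var in graph_variables.items()
    }

# Demo with plain strings standing in for tf.Variable objects:
graph_vars = {
    "image_filters/conv1/weights": "<var conv1/weights>",
    "image_filters/conv1/biases": "<var conv1/biases>",
}
restore_map = build_restore_map(graph_vars)
print(sorted(restore_map))  # ['conv1/biases', 'conv1/weights']
```

With such a map, one would then construct the saver as tf.train.Saver(restore_map) so its keys match the names stored in the checkpoint; whether this is the idiomatic fix is exactly what I am asking.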