I am using tf-slim to fine-tune a network, vgg16. I want to manipulate the gradients manually so I can apply a different learning rate to the last layer. But when I try opt.minimize(), or tf.gradients() together with opt.apply_gradients(), I get None as the loss value in the summary report.
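For reference, the per-layer learning-rate variant I am aiming for looks roughly like this. It is only a sketch: the fc8 scope name for the vgg16 final layer and the .01 learning rate are placeholders, while total_loss and global_step are the same tensors used in the snippets below.

trainable = tf.trainable_variables()
last_vars = [v for v in trainable if 'fc8' in v.op.name]      # guessed scope of the last layer
base_vars = [v for v in trainable if 'fc8' not in v.op.name]

# one gradient computation, then two optimizers applying different learning rates
grads = tf.gradients(total_loss, base_vars + last_vars)
base_grads = grads[:len(base_vars)]
last_grads = grads[len(base_vars):]

base_opt = tf.train.GradientDescentOptimizer(learning_rate=.001)
last_opt = tf.train.GradientDescentOptimizer(learning_rate=.01)   # placeholder value

train_op = tf.group(
    base_opt.apply_gradients(list(zip(base_grads, base_vars))),
    last_opt.apply_gradients(list(zip(last_grads, last_vars)), global_step=global_step))

When I pass this train_op to slim.learning.train instead of the one from create_train_op, I see the same None loss in the logs as with the minimize() version below.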
Why does this code path for the train_op work:
optimizer = tf.train.GradientDescentOptimizer(learning_rate=.001)
train_op = slim.learning.create_train_op(total_loss, optimizer,
                                         global_step=global_step)
slim.learning.train(train_op, log_dir,
                    init_fn=init_fn,
                    global_step=global_step,
                    number_of_steps=25,
                    save_summaries_secs=300,
                    save_interval_secs=600)
But creating the train_op by hand fails with the exception below (it looks as if total_loss is None):
trainable = tf.trainable_variables()
optimizer = tf.train.GradientDescentOptimizer(learning_rate=.001)
train_op = optimizer.minimize(total_loss, global_step=global_step)
# slim.learning.train(train_op, ...) is then called as above
# exception: appears that loss is None
--- Logging error ---
Traceback (most recent call last):
...
  File "/anaconda/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/slim/python/slim/learning.py", line 755, in train
    sess, train_op, global_step, train_step_kwargs)
  File "/anaconda/anaconda3/lib/python3.6/site-packages/tensorflow/contrib/slim/python/slim/learning.py", line 506, in train_step
    np_global_step, total_loss, time_elapsed)
  File "/anaconda/anaconda3/lib/python3.6/logging/__init__.py", line 338, in getMessage
    msg = msg % self.args
TypeError: must be real number, not NoneType
...
Message: 'global step %d: loss = %.4f (%.3f sec/step)'
Arguments: (29, None, 51.91366386413574)
What am I doing wrong here?