Per @user728291's suggestion, I was able to view the gradients in TensorBoard by using the optimize_loss function. The call syntax for optimize_loss is:
optimize_loss(
loss,
global_step,
learning_rate,
optimizer,
gradient_noise_scale=None,
gradient_multipliers=None,
clip_gradients=None,
learning_rate_decay_fn=None,
update_ops=None,
variables=None,
name=None,
summaries=None,
colocate_gradients_with_ops=False,
increment_global_step=True
)
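(A side note that is not part of the original answer: a hedged sketch of how a few of the optional arguments can be combined. The clip value below is purely illustrative, and to the best of my knowledge the summaries list accepts strings such as "loss", "learning_rate", "gradients", and "gradient_norm".)

# Illustrative sketch only: clip_gradients=5.0 clips the global gradient
# norm at 5.0, and summaries requests the loss plus gradient histograms/norms.
training_operation = tf.contrib.layers.optimize_loss(
    loss_operation, global_step, learning_rate=0.001, optimizer='Adam',
    clip_gradients=5.0,
    summaries=["loss", "gradients", "gradient_norm"])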
The function requires a global_step and depends on a few other imports, as follows:
from tensorflow.python.ops import variable_scope
from tensorflow.python.framework import dtypes
from tensorflow.python.ops import init_ops
global_step = variable_scope.get_variable( # this needs to be defined for tf.contrib.layers.optimize_loss()
"global_step", [],
trainable=False,
dtype=dtypes.int64,
initializer=init_ops.constant_initializer(0, dtype=dtypes.int64))
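(Also not in the original answer: newer TF 1.x releases ship a helper that creates an equivalent variable in one line; a minimal sketch, assuming tensorflow is imported as tf:)

import tensorflow as tf

# get_or_create_global_step() returns the existing "global_step" variable,
# or creates a non-trainable int64 one -- matching the manual definition above.
global_step = tf.train.get_or_create_global_step()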
Then replace your typical training operation
training_operation = optimizer.minimize(loss_operation)
with:
training_operation = tf.contrib.layers.optimize_loss(
loss_operation, global_step, learning_rate=rate, optimizer='Adam',
summaries=["gradients"])
Then add a merge statement for your summaries:
summary = tf.summary.merge_all()
Then, in your TensorFlow session, at the end of each run/epoch:
summary_writer = tf.summary.FileWriter(logdir_run_x, sess.graph)
summary_str = sess.run(summary, feed_dict=feed_dict)
summary_writer.add_summary(summary_str, i)
summary_writer.flush() # evidently this is needed sometimes or scalars will not show up on tensorboard.
where logdir_run_x is a different directory for each run. That way, when TensorBoard runs, you can look at each run separately. The gradients will appear under the Histograms tab, labeled OptimizeLoss. They show all the weights, all the biases, and the beta parameter as histograms.
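Putting the pieces together, here is a minimal end-to-end sketch. The toy linear model and random data are my assumptions, not part of the original code; only the optimize_loss/summary plumbing is the technique described above.

import numpy as np
import tensorflow as tf

# Hypothetical toy model: one linear layer with a squared-error loss.
x = tf.placeholder(tf.float32, [None, 2], name="x")
y = tf.placeholder(tf.float32, [None, 1], name="y")
w = tf.get_variable("w", [2, 1])
b = tf.get_variable("b", [1], initializer=tf.zeros_initializer())
loss_operation = tf.reduce_mean(tf.square(tf.matmul(x, w) + b - y))

global_step = tf.train.get_or_create_global_step()
training_operation = tf.contrib.layers.optimize_loss(
    loss_operation, global_step, learning_rate=0.001, optimizer='Adam',
    summaries=["gradients"])
summary = tf.summary.merge_all()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    summary_writer = tf.summary.FileWriter("logdir_run_0", sess.graph)
    for i in range(100):
        feed_dict = {x: np.random.randn(32, 2).astype(np.float32),
                     y: np.random.randn(32, 1).astype(np.float32)}
        # Run the train op and the merged summaries in a single call.
        _, summary_str = sess.run([training_operation, summary],
                                  feed_dict=feed_dict)
        summary_writer.add_summary(summary_str, i)
    summary_writer.flush()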
UPDATE: Using tf slim, there is another way that also works and is perhaps cleaner:
slim = tf.contrib.slim  # import needed for the slim API

optimizer = tf.train.AdamOptimizer(learning_rate=rate)
training_operation = slim.learning.create_train_op(
    loss_operation, optimizer, summarize_gradients=True)
By setting summarize_gradients=True, which is not the default, you will get gradient summaries for all weights. These will be viewable in TensorBoard under summarize_grads.
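For completeness, a sketch of how the slim variant drops into the same skeleton (my arrangement, not the original answer's code): create_train_op creates or reuses the global step internally, and the gradient histograms are picked up by the same tf.summary.merge_all() call.

import tensorflow as tf
slim = tf.contrib.slim

# Reuses loss_operation from the sketch above. create_train_op wires up
# the global step itself, so no manual global_step variable is needed.
optimizer = tf.train.AdamOptimizer(learning_rate=0.001)
training_operation = slim.learning.create_train_op(
    loss_operation, optimizer, summarize_gradients=True)
summary = tf.summary.merge_all()  # includes the summarize_grads histograms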