A context manager folds two related operations into one. For example:

with open('some_file', 'w') as opened_file:
    opened_file.write('Hola!')

The code above is equivalent to:

file = open('some_file', 'w')
try:
    file.write('Hola!')
finally:
    file.close()
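That try/finally pattern is exactly what the context-manager protocol packages up: `with` calls an object's __enter__ method on entry and its __exit__ method on exit. A minimal sketch of a hand-rolled stand-in for open (ManagedFile is a made-up name for illustration):

```python
class ManagedFile:
    """A toy context manager that mimics open(name, 'w')."""

    def __init__(self, name):
        self.name = name
        self.file = None

    def __enter__(self):
        # Runs at the start of the with block; the return value
        # is bound to the name after "as".
        self.file = open(self.name, 'w')
        return self.file

    def __exit__(self, exc_type, exc_value, traceback):
        # Runs when the block exits, even if the body raised.
        if self.file:
            self.file.close()

with ManagedFile('some_file') as opened_file:
    opened_file.write('Hola!')
```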

But at https://www.tensorflow.org/tutorials/eager/custom_training_walkthrough#define_the_loss_and_gradient_function I found:

def grad(model, inputs, targets):
  with tf.GradientTape() as tape:
    loss_value = loss(model, inputs, targets)
  return loss_value, tape.gradient(loss_value, model.trainable_variables)

What is this equivalent to?

1 Answer

I'm not a Python expert, but I believe `with` is driven by the __enter__ and __exit__ methods (https://book.pythontips.com/en/latest/context_managers.html). For tf.GradientTape, the __enter__ method is:

  def __enter__(self):
    """Enters a context inside which operations are recorded on this tape."""
    self._push_tape()
    return self

https://github.com/tensorflow/tensorflow/blob/r2.0/tensorflow/python/eager/backprop.py#L801-L804

and the __exit__ method is:

  def __exit__(self, typ, value, traceback):
    """Exits the recording context, no further operations are traced."""
    if self._recording:
      self._pop_tape()

https://github.com/tensorflow/tensorflow/blob/r2.0/tensorflow/python/eager/backprop.py#L806-L809
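As a side note, the with statement guarantees that __exit__ runs even when the body raises, which is what makes it interchangeable with try/finally. A quick plain-Python check (Probe is a made-up class for illustration):

```python
class Probe:
    """Records whether __exit__ was actually invoked."""

    def __init__(self):
        self.exited = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.exited = True
        return False  # returning False re-raises the exception

probe = Probe()
try:
    with probe:
        raise ValueError('boom')
except ValueError:
    pass

print(probe.exited)  # → True: __exit__ ran despite the exception
```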

So

with tf.GradientTape() as tape:
    loss_value = loss(model, inputs, targets)

roughly expands to:

tape = tf.GradientTape()
tape._push_tape()
loss_value = loss(model, inputs, targets)
tape._pop_tape()

(_push_tape and _pop_tape are private methods; the real with statement additionally guarantees that __exit__, and hence _pop_tape, runs even if loss raises.)
answered 2019-11-07T17:21:57.540