
https://www.tensorflow.org/api_docs/python/tf/hessians

The typical approach would be something like:

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = ...
    g = tape.gradient(loss, [vars])                   # first derivative
gg = tape_.gradient(loss, [tf.transpose(vars)])       # intended second derivative via the transpose

But of course, the transpose does not work like that on a tape.

tf.hessians has no example in the documentation. I think it may date back to TF 1.0.
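For reference, tf.hessians comes from the graph-mode API, so it can still be called through tf.compat.v1 with eager execution disabled. A minimal sketch, assuming a simple quadratic loss that is not from the question:

import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# Illustrative variable and loss (assumed, not the question's J): 0.5 * sum(r * x^2)
x = tf.get_variable("x", initializer=[1.0, 2.0, 3.0])
r = tf.constant([1.0, 2.0, 3.0])
loss = 0.5 * tf.reduce_sum(r * x * x)

# tf.hessians returns one Hessian per source in xs
hess = tf.hessians(loss, [x])[0]   # shape (3, 3), here diag(r)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(hess))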

Update: use tf.jacobian

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = J(a_orig, r)
    dJda = tape.jacobian(loss, [a_orig])[0]
s = tape_.jacobian(dJda, [a_orig])[0]

1 Answer


As listed in the update.

Use tf.jacobian instead of tf.gradient when differentiating a second time with respect to the same variables.

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = J(a_orig, r)
    # Jacobian of the scalar loss w.r.t. a_orig -> the gradient vector
    dJda = tape.jacobian(loss, [a_orig])[0]
# Jacobian of the gradient vector w.r.t. a_orig -> the Hessian
s = tape_.jacobian(dJda, [a_orig])[0]
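The reason tape.gradient is not enough here is that it sums over the components of a non-scalar target, so differentiating the gradient vector with it would only give the gradient of the gradient's sum; tape.jacobian keeps the full matrix. Below is a minimal self-contained sketch of the nested-tape pattern above; the quadratic J, a_orig and r are illustrative assumptions, not the question's actual objective:

import tensorflow as tf

# Stand-ins for the question's a_orig and r (assumed values for illustration)
a_orig = tf.Variable([1.0, 2.0, 3.0])
r = tf.constant([1.0, 2.0, 3.0])

def J(a, r):
    # Simple quadratic objective: 0.5 * sum(r * a^2), so the Hessian is diag(r)
    return 0.5 * tf.reduce_sum(r * a * a)

with tf.GradientTape() as tape_:
    with tf.GradientTape() as tape:
        loss = J(a_orig, r)
    # Jacobian of a scalar w.r.t. a_orig -> gradient vector, shape (3,)
    dJda = tape.jacobian(loss, [a_orig])[0]
# Jacobian of the gradient vector -> full Hessian, shape (3, 3)
s = tape_.jacobian(dJda, [a_orig])[0]

print(s)  # diag([1., 2., 3.])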
answered 2020-02-24T16:26:43.763