
In the current notebook tutorials (GPflow 2.0), every @tf.function decorator includes the option autograph=False, e.g. (https://gpflow.readthedocs.io/en/2.0.0-rc1/notebooks/advanced/gps_for_big_data.html ):

@tf.function(autograph=False)
def optimization_step(optimizer, model: gpflow.models.SVGP, batch):
    # Only watch the model's trainable variables, not every accessed variable.
    with tf.GradientTape(watch_accessed_variables=False) as tape:
        tape.watch(model.trainable_variables)
        objective = -model.elbo(*batch)
    # Compute gradients outside the tape context, then apply them.
    grads = tape.gradient(objective, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return objective

Does anyone know why this is done, or what the reasoning behind it is? As I understand it, autograph=True merely allows Python control flow to be converted into graph structures. Even if that feature isn't needed, is there any downside to setting/leaving it as True?
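For context, a minimal sketch (function and variable names hypothetical) of the kind of Python control flow that AutoGraph rewrites into graph ops when autograph=True (the default):

```python
import tensorflow as tf

@tf.function  # autograph=True by default
def clip_count(x):
    count = tf.constant(0)
    # A data-dependent Python while-loop: AutoGraph rewrites it into a
    # tf.while_loop so it can execute inside the graph.
    while tf.reduce_sum(x) > 1.0:
        x = x / 2.0
        count += 1
    return count
```

With autograph=False, the same loop would fail at tracing time, because a symbolic tensor cannot be evaluated as a Python bool; for straight-line code like the optimization step above, nothing needs converting either way.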

My guess is that it only adds a small overhead at graph compile time, which should be negligible. Is that wrong?

Thanks


1 Answer


The reason we set autograph to False in most of the wrapped objectives is that GPflow uses a multiple-dispatch Dispatcher that internally relies on generators. However, TensorFlow cannot deal with generator objects inside tf.function in autograph mode (see Capabilities and Limitations of AutoGraph), which leads to the following warnings:

WARNING:tensorflow:Entity <bound method Dispatcher.dispatch_iter of <dispatched sample_conditional>> appears to be a generator function. It will not be converted by AutoGraph.
WARNING: Entity <bound method Dispatcher.dispatch_iter of <dispatched sample_conditional>> appears to be a generator function. It will not be converted by AutoGraph.
WARNING:tensorflow:Entity <bound method Dispatcher.dispatch_iter of <dispatched conditional>> appears to be a generator function. It will not be converted by AutoGraph.
WARNING: Entity <bound method Dispatcher.dispatch_iter of <dispatched conditional>> appears to be a generator function. It will not be converted by AutoGraph.
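To make the limitation concrete, here is a pure-Python sketch of a generator-based dispatcher, loosely modelled on this pattern; all names are hypothetical and this is NOT GPflow's actual Dispatcher. The dispatch_iter method is a generator function, which is exactly what AutoGraph refuses to convert:

```python
class Dispatcher:
    """Hypothetical sketch of generator-based multiple dispatch."""

    def __init__(self):
        self.registry = {}  # maps a type to an implementation

    def register(self, key, fn):
        self.registry[key] = fn

    def dispatch_iter(self, key):
        # A generator: lazily yields matching implementations one by one.
        # AutoGraph cannot convert generator functions like this, hence
        # the "appears to be a generator function" warnings above.
        for registered_key, fn in self.registry.items():
            if issubclass(key, registered_key):
                yield fn

    def dispatch(self, key):
        # Take the first match, or None if nothing is registered.
        return next(self.dispatch_iter(key), None)

dispatcher = Dispatcher()
dispatcher.register(int, lambda x: "int-specific")
dispatcher.register(object, lambda x: "fallback")
dispatcher.dispatch(int)(0)    # → "int-specific"
dispatcher.dispatch(str)("x")  # → "fallback"
```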

We have known about this issue for a while but hadn't actually addressed it - thanks for bringing it to our attention again. I have just created a PR that fixes this, after which you will no longer need to set autograph to False. I hope the PR will be merged soon.

answered 2020-02-18T17:37:45.443