I have a function that is intended to build this architecture:
Input(x) -> o = My_ops(x, 128) -> o = slim.batch_norm(o)
So my function is:
def _build_block(self, x, name, is_training=True):
    with tf.variable_scope(name) as scope:
        o = my_ops(x, 256)
        batch_norm_params = {
            'decay': 0.9997,
            'epsilon': 1e-5,
            'scale': True,
            'updates_collections': tf.GraphKeys.UPDATE_OPS,
            'fused': None,  # Use fused batch norm if possible.
            'is_training': is_training
        }
        with slim.arg_scope([slim.batch_norm], **batch_norm_params) as bn:
            return slim.batch_norm(o)
Is this correct? Can I set is_training the way the function above does? If not, can you help me fix it?