Version: chainer 2.0.2. I'm using the Adam optimizer and training crashes with the error below. I traced it to this code in adam.py (fix1 == 0?):
@property
def lr(self):
    fix1 = 1. - math.pow(self.hyperparam.beta1, self.t)
    fix2 = 1. - math.pow(self.hyperparam.beta2, self.t)
    return self.hyperparam.alpha * math.sqrt(fix2) / fix1
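Before any update has run, self.t is 0, so math.pow(self.hyperparam.beta1, 0) == 1.0 and fix1 == 0.0; the final division then fails. A quick arithmetic check (a minimal sketch, using beta1 = 0.9, Chainer's default):

import math

beta1 = 0.9                     # Chainer's default Adam beta1
t = 0                           # optimizer.t before the first update
fix1 = 1. - math.pow(beta1, t)  # 1 - 0.9**0 == 0.0
print(fix1)                     # 0.0 -> the division in lr raises ZeroDivisionError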
Error log:
Traceback (most recent call last):
File "AU_rcnn/train.py", line 237, in <module>
main()
File "AU_rcnn/train.py", line 233, in main
trainer.run()
File "/root/anaconda3/lib/python3.6/site-packages/chainer/training/trainer.py", line 285, in run
initializer(self)
File "/root/anaconda3/lib/python3.6/site-packages/chainer/training/extensions/exponential_shift.py", line 48, in initialize
self._init = getattr(optimizer, self._attr)
File "/root/anaconda3/lib/python3.6/site-packages/chainer/optimizers/adam.py", line 121, in lr
return self.hyperparam.alpha * math.sqrt(fix2) / fix1
ZeroDivisionError: float division by zero
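The traceback shows the trigger: ExponentialShift.initialize calls getattr(optimizer, 'lr') when the trainer starts, i.e. before the first parameter update, while optimizer.t is still 0. A minimal sketch that reproduces the error without a trainer (assuming chainer 2.0.2; the Linear link is just a placeholder model):

import chainer

optimizer = chainer.optimizers.Adam()
optimizer.setup(chainer.links.Linear(2, 2))  # placeholder model
print(optimizer.t)   # 0: no update has run yet
print(optimizer.lr)  # ZeroDivisionError: float division by zero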