
I have the following setup:

learning_rate = 0.0004
optimizer = torch.optim.Adam(
    model.parameters(),
    lr=learning_rate, betas=(0.5, 0.999)
)

Is there a way to start decaying the learning rate from epoch 100 onward?

And is this good practice:

decayRate = 0.96
my_lr_scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer=optimizer, gamma=decayRate)

1 Answer

from torch.optim.lr_scheduler import MultiStepLR

# multiply the learning rate by gamma = 0.1 once epoch 100 is reached
scheduler = MultiStepLR(optimizer, milestones=[100], gamma=0.1)
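
As a minimal usage sketch (my own addition, not part of the original answer): the scheduler has to be stepped once per epoch so that the milestone is counted in epochs. The stand-in model and the empty epoch body below are placeholders for your actual training code.

import torch
from torch.optim.lr_scheduler import MultiStepLR

model = torch.nn.Linear(10, 1)  # stand-in model, for illustration only
optimizer = torch.optim.Adam(model.parameters(), lr=0.0004, betas=(0.5, 0.999))
scheduler = MultiStepLR(optimizer, milestones=[100], gamma=0.1)

for epoch in range(200):
    # ... forward/backward passes with optimizer.step() per batch would go here ...
    optimizer.step()                       # normally called once per batch, inside the loop above
    print(epoch, scheduler.get_last_lr())  # 0.0004 for epochs 0-99, 0.00004 afterwards
    scheduler.step()                       # advance the schedule once per epoch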

For more details, see the MultiStepLR documentation.
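
If the goal is specifically the 0.96 exponential decay from the question, but starting only at epoch 100, one option (an assumption on my part, not something the answer states) is LambdaLR with a multiplier that stays at 1.0 until the 100th epoch:

from torch.optim.lr_scheduler import LambdaLR

decayRate = 0.96
# keep the initial lr untouched for the first 100 epochs, then decay by 0.96 per epoch
delayed_exp_scheduler = LambdaLR(
    optimizer,
    lr_lambda=lambda epoch: 1.0 if epoch < 100 else decayRate ** (epoch - 100),
)

It is stepped once per epoch, exactly like the MultiStepLR above.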

answered 2020-12-08T14:11:46.037