I set my learning rate, but after training for a few epochs the accuracy stops improving:

optimizer = optim.Adam(model.parameters(), lr=1e-4)
n_epochs = 10
for i in range(n_epochs):
    # some training here

If I want to use step decay (reduce the learning rate by a factor of 10 every 5 epochs), how can I do that?
You can use the learning rate scheduler torch.optim.lr_scheduler.StepLR:

from torch.optim.lr_scheduler import StepLR
scheduler = StepLR(optimizer, step_size=5, gamma=0.1)

It decays the learning rate of each parameter group by gamma every step_size epochs; see the documentation here.
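Applied to your setup, a minimal sketch could look like the following (assuming model is your network and the loop body contains your usual forward/backward/optimizer.step() calls):

import torch.optim as optim
from torch.optim.lr_scheduler import StepLR

optimizer = optim.Adam(model.parameters(), lr=1e-4)  # model is assumed to be defined elsewhere
scheduler = StepLR(optimizer, step_size=5, gamma=0.1)  # multiply the lr by 0.1 every 5 epochs

n_epochs = 10
for i in range(n_epochs):
    # ... your training here: forward pass, loss.backward(), optimizer.step() ...
    scheduler.step()  # advance the schedule once per epoch, after the optimizer updates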
Example from the docs:
# Assuming optimizer uses lr = 0.05 for all groups
# lr = 0.05 if epoch < 30
# lr = 0.005 if 30 <= epoch < 60
# lr = 0.0005 if 60 <= epoch < 90
# ...
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()
Example:
import torch
import torch.optim as optim
optimizer = optim.SGD([torch.rand((2,2), requires_grad=True)], lr=0.1)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)
for epoch in range(1, 21):
    scheduler.step()
    print('Epoch-{0} lr: {1}'.format(epoch, optimizer.param_groups[0]['lr']))
    if epoch % 5 == 0:
        print()
Epoch-1 lr: 0.1
Epoch-2 lr: 0.1
Epoch-3 lr: 0.1
Epoch-4 lr: 0.1
Epoch-5 lr: 0.1
Epoch-6 lr: 0.010000000000000002
Epoch-7 lr: 0.010000000000000002
Epoch-8 lr: 0.010000000000000002
Epoch-9 lr: 0.010000000000000002
Epoch-10 lr: 0.010000000000000002
Epoch-11 lr: 0.0010000000000000002
Epoch-12 lr: 0.0010000000000000002
Epoch-13 lr: 0.0010000000000000002
Epoch-14 lr: 0.0010000000000000002
Epoch-15 lr: 0.0010000000000000002
Epoch-16 lr: 0.00010000000000000003
Epoch-17 lr: 0.00010000000000000003
Epoch-18 lr: 0.00010000000000000003
Epoch-19 lr: 0.00010000000000000003
Epoch-20 lr: 0.00010000000000000003
For more information on how to adjust the learning rate, see torch.optim.lr_scheduler, which provides several methods to adjust the learning rate based on the number of epochs.
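As an illustration, if the decay points are not evenly spaced, MultiStepLR from the same module can be used instead of StepLR; the milestone values below are chosen purely for illustration:

import torch
import torch.optim as optim
from torch.optim.lr_scheduler import MultiStepLR

optimizer = optim.SGD([torch.rand((2, 2), requires_grad=True)], lr=0.1)
# Decay the lr by a factor of 0.1 at epoch 3 and again at epoch 8 (arbitrary milestones)
scheduler = MultiStepLR(optimizer, milestones=[3, 8], gamma=0.1)

for epoch in range(1, 11):
    # ... training would go here ...
    scheduler.step()
    print('Epoch-{0} lr: {1}'.format(epoch, optimizer.param_groups[0]['lr']))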