
I am using LambdaLR as the learning-rate scheduler:

import torch
import torch.nn as nn
import matplotlib.pyplot as plt

model = torch.nn.Linear(2, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
lambda1 = lambda epoch: 0.99 ** epoch
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=lambda1, last_epoch=-1)

lrs = []


for i in range(2001):
    optimizer.step()
    lrs.append(optimizer.param_groups[0]["lr"])
    scheduler.step()

plt.plot(lrs)  

[Plot: the learning rate decays smoothly toward 0 over the 2000 epochs]

I am trying to set a minimum learning rate so that it does not decay to 0. How can I do that?


1 Answer


The new learning rate is always computed as:

LR_epoch = LR_initial * Lambda(epoch)

Here, the initial learning rate means the very first one you passed to the optimizer, not the most recent one.

That means we can write:

INITIAL_LEARNING_RATE = 0.01
your_min_lr = 0.0001

lambda1 = lambda epoch: max(0.99 ** epoch, your_min_lr / INITIAL_LEARNING_RATE)

Then, once `0.99 ** epoch` becomes too small, you get `your_min_lr` back, because `INITIAL_LEARNING_RATE * your_min_lr / INITIAL_LEARNING_RATE` equals just `your_min_lr`.
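As a quick sanity check, the formula `LR_epoch = LR_initial * Lambda(epoch)` can be evaluated in plain Python (without the optimizer) to confirm the clamped lambda floors the learning rate at `your_min_lr` instead of letting it decay toward 0:

```python
INITIAL_LEARNING_RATE = 0.01
your_min_lr = 0.0001

# Clamp the multiplicative factor so the effective lr never falls below your_min_lr
lambda1 = lambda epoch: max(0.99 ** epoch, your_min_lr / INITIAL_LEARNING_RATE)

# LambdaLR computes: lr_epoch = lr_initial * lambda1(epoch)
lrs = [INITIAL_LEARNING_RATE * lambda1(epoch) for epoch in range(2001)]

print(lrs[0])    # 0.01 -- at epoch 0 the factor is max(1.0, 0.01) = 1.0
print(min(lrs))  # ~0.0001 -- the decay bottoms out at your_min_lr
```

The same `lambda1` can be passed straight into `LambdaLR` in the question's code; only the lambda changes, the training loop stays the same.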

Answered on 2021-09-14T15:27:45.690