I am trying to write a custom learning rate scheduler for SGD in Keras that changes the learning rate based on the iteration. However, the LearningRateScheduler callback only accepts a function that takes the epoch. My learning rate function looks like this:
learning_rate = base_learning_rate x (1 + gamma x iteration)^(-power)
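For example, with base_learning_rate = 0.1, gamma = 0.001, and power = 2, iteration 100 would give 0.1 x (1 + 0.001 x 100)^(-2) = 0.1 x 1.1^(-2) ≈ 0.0826.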
This can be done by defining your own tf.keras.optimizers.schedules.LearningRateSchedule and passing it to the optimizer.
import tensorflow as tf

class Example(tf.keras.optimizers.schedules.LearningRateSchedule):
    def __init__(self, initial_learning_rate, gamma, power):
        self.initial_learning_rate = initial_learning_rate
        self.gamma = gamma
        self.power = power

    def __call__(self, step):
        # step arrives as an integer tensor; cast so the arithmetic stays in float
        step = tf.cast(step, tf.float32)
        return self.initial_learning_rate * tf.pow(step * self.gamma + 1, -self.power)

optimizer = tf.keras.optimizers.SGD(learning_rate=Example(0.1, 0.001, 2))
Reference: https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/schedules/LearningRateSchedule
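If it helps, here is a minimal sketch of passing the schedule to compile; the model architecture and loss below are placeholder assumptions, not part of the original answer:

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(1)
])
# the optimizer picks up the schedule and queries it once per training step
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=Example(0.1, 0.001, 2)),
              loss='mse')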
When you say "change the learning rate based on the iteration", do you mean you want to change it at the end of each batch? If so, you can do that with a custom callback. I have not tested this, but the code would be something like:
import tensorflow as tf
from tensorflow import keras

class LRA(keras.callbacks.Callback):
    def __init__(self, model, initial_learning_rate, gamma, power):
        super(LRA, self).__init__()
        self.initial_learning_rate = initial_learning_rate
        self.gamma = gamma
        self.power = power
        self.model = model  # model is your compiled model

    def on_train_begin(self, logs=None):
        # start training at the base learning rate
        tf.keras.backend.set_value(self.model.optimizer.lr,
                                   self.initial_learning_rate)

    def on_train_batch_end(self, batch, logs=None):
        # note: batch restarts at 0 at the start of every epoch
        lr = self.initial_learning_rate * ((batch + 1) * self.gamma + 1) ** (-self.power)
        tf.keras.backend.set_value(self.model.optimizer.lr, lr)
        # print('for ', batch, ' lr set to ', lr)  # remove comment if you want to see lr change
Let me know if this works; I have not tested it yet.
Before you run model.fit, include the code below:
initial_learning_rate = .001  # set to desired value
gamma =  # set to desired value
power =  # set to desired value
callbacks = [LRA(model=model, initial_learning_rate=initial_learning_rate, gamma=gamma, power=power)]
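Then pass the callbacks list to fit. A minimal sketch, assuming x_train and y_train are your training arrays (those names are placeholders, not from the answer):

history = model.fit(x_train, y_train,
                    epochs=10,
                    batch_size=32,
                    callbacks=callbacks)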