PyTorch: custom learning rate decay
def adjust_learning_rate(optimizer, epoch, initial_lr):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs."""
    # In the original snippet the initial LR came from argparse as args.lr;
    # it is passed in explicitly here so the function is self-contained.
    lr = initial_lr * (0.1 ** (epoch // 30))
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
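A minimal usage sketch, assuming a toy model and an SGD optimizer (both are placeholders I've added, not part of the original snippet):

import torch

model = torch.nn.Linear(10, 2)                            # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(90):
    adjust_learning_rate(optimizer, epoch, initial_lr=0.1)
    # ... run one epoch of training here ...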
PyTorch itself also provides learning rate scheduling: https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
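The decay rule above maps directly onto torch.optim.lr_scheduler.StepLR; a sketch (model and optimizer are again placeholders):

import torch
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(10, 2)                            # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)    # multiply LR by 0.1 every 30 epochs

for epoch in range(90):
    # ... train for one epoch ...
    optimizer.step()    # placeholder for the actual training step
    scheduler.step()    # advance the schedule once per epoch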
https://github.com/ncullen93/torchsample provides ready-made schedulers plus other callbacks, such as early stopping.
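I won't reproduce torchsample's callback API here; as a rough sketch of the early-stopping idea itself (the class and parameter names below are my own, not torchsample's):

class EarlyStopping:
    """Signal that training should stop once validation loss stops improving."""
    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # epochs to tolerate without improvement
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best = float('inf')
        self.bad_epochs = 0

    def step(self, val_loss):
        """Call once per epoch; returns True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience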
Personally, I'd recommend sticking with the PyTorch interface, since it is actively maintained.
