Custom learning-rate decay in PyTorch

def adjust_learning_rate(optimizer, epoch):
    """Sets the learning rate to the initial LR decayed by 10 every 30 epochs"""
    # args.lr is the initial learning rate, parsed from argparse elsewhere in the script
    lr = args.lr * (0.1 ** (epoch // 30))
    # An optimizer may hold several parameter groups; update the LR of each one
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

The code comes from https://github.com/pytorch/examples/blob/95d5fddfb578674e01802f1db1820d8ac1015f67/imagenet/main.py#L314
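The schedule itself can be sketched in pure Python (no PyTorch needed; `base_lr` here is a stand-in for `args.lr`, and `decay`/`step` are hypothetical parameter names added for illustration):

```python
def decayed_lr(base_lr, epoch, decay=0.1, step=30):
    # Initial LR multiplied by `decay` once every `step` epochs,
    # matching the adjust_learning_rate function above
    return base_lr * (decay ** (epoch // step))

# With base_lr = 0.1: epochs 0-29 use 0.1, epochs 30-59 use 0.01, and so on.
```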

PyTorch itself also provides built-in learning-rate schedulers: https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
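For example, `torch.optim.lr_scheduler.StepLR` reproduces the hand-rolled schedule above (decay by 10 every 30 epochs) without touching `param_groups` yourself; the model and training loop here are placeholders:

```python
import torch
from torch.optim.lr_scheduler import StepLR

model = torch.nn.Linear(4, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Multiply the LR by gamma=0.1 every step_size=30 epochs
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    # ... run one epoch of training, calling optimizer.step() per batch ...
    scheduler.step()  # call once per epoch, after the optimizer steps
```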

https://github.com/ncullen93/torchsample provides ready-made implementations of these, plus some other callback features such as early stopping.
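Early stopping itself is easy to sketch in plain Python. The following is a generic illustration, not torchsample's actual API; the class and parameter names are made up for this example:

```python
class EarlyStopper:
    """Stop training when the monitored loss hasn't improved for `patience` epochs."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float('inf')
        self.bad_epochs = 0

    def should_stop(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best = val_loss   # improvement: record it and reset the counter
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1   # no improvement this epoch
        return self.bad_epochs >= self.patience
```

In a training loop you would call `should_stop(val_loss)` once per epoch and `break` when it returns `True`.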

In my view it is best to use the official PyTorch API, since it is actively maintained.

posted @ 2019-10-21 11:02  e-yi