Nowcoder Solution | Implementing the Adam Optimization Algorithm
Problem
Adam is an optimization algorithm with an adaptive learning rate. Its computation proceeds as follows:
- Initialize: \[m_0 = 0, \quad v_0 = 0, \quad t = 0\]
- Compute the gradient: \[g_t = \nabla f(x_t)\]
- Update the first moment (momentum) and apply bias correction: \[m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t, \quad \hat{m}_t = m_t / (1 - \beta_1^t)\]
- Update the second moment (variance) and apply bias correction: \[v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2, \quad \hat{v}_t = v_t / (1 - \beta_2^t)\]
- Update the parameters: \[x_{t+1} = x_t - \eta \cdot \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)\]
The bias correction applied to the momentum and variance estimates avoids instability in the early iterations, and \(\epsilon\) is added to the denominator of the parameter update to avoid division by zero.
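To see what the bias correction does, here is a minimal sketch (an added illustration with assumed values, not part of the problem statement): with a constant gradient g = 1.0 and beta1 = 0.9, the raw moment m starts near zero, while the corrected estimate m_hat recovers the true gradient scale from the very first step.

beta1 = 0.9
g = 1.0          # assumed constant gradient, for illustration only
m = 0.0
for t in range(1, 4):
    m = beta1 * m + (1 - beta1) * g      # raw first moment, biased toward 0 early on
    m_hat = m / (1 - beta1**t)           # bias-corrected first moment
    print(f"t={t}: m={m:.3f}, m_hat={m_hat:.3f}")
# m is 0.100, 0.190, 0.271 over the first three steps, while m_hat stays at 1.000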
The reference solution is as follows:
import numpy as np

def adam_optimizer(f, grad, x0, learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8, num_iterations=10):
    # f is unused here but kept to match the required signature
    x = x0
    m = np.zeros_like(x)   # first moment estimate
    v = np.zeros_like(x)   # second moment estimate
    for t in range(1, num_iterations + 1):
        g = grad(x)                          # gradient at the current point
        m = beta1 * m + (1 - beta1) * g      # update first moment
        v = beta2 * v + (1 - beta2) * g**2   # update second moment
        m_hat = m / (1 - beta1**t)           # bias-corrected first moment
        v_hat = v / (1 - beta2**t)           # bias-corrected second moment
        x = x - learning_rate * m_hat / (np.sqrt(v_hat) + epsilon)  # parameter update
    return x
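A minimal usage sketch (the quadratic objective, gradient, starting point, and hyperparameters below are illustrative assumptions, not part of the original problem): minimizing f(x) = sum(x^2), whose gradient is 2x, should drive x toward the origin.

import numpy as np

# Assumed toy objective for demonstration: f(x) = sum(x^2), minimum at x = 0
f = lambda x: np.sum(x**2)
grad = lambda x: 2 * x

x0 = np.array([1.0, -2.0])
x_final = adam_optimizer(f, grad, x0, learning_rate=0.1, num_iterations=100)
print(x_final)   # both components should end up close to 0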
