Finding the Optimal Solution with BGD
BGD: Batch Gradient Descent
Univariate Linear Regression
In [21]:
import numpy as np
import matplotlib.pyplot as plt
In [22]:
X = np.random.rand(100, 1)  # 100 samples, one feature
w, b = np.random.randint(1, 10, size=2)  # ground-truth slope and intercept
Add Noise
In [23]:
y = w * X + b + np.random.randn(100, 1)  # add Gaussian noise
plt.scatter(X, y)
Out[23]:
<matplotlib.collections.PathCollection at 0x20a155907d0>
[scatter plot of the noisy data]
In [25]:
# append a column of ones so that b acts as the bias term
X = np.concatenate([X, np.full(shape=(100, 1), fill_value=1)], axis=1)
X[:10]
Out[25]:
array([[0.8878335 , 1.        ],
       [0.75736497, 1.        ],
       [0.26754809, 1.        ],
       [0.79427412, 1.        ],
       [0.75955024, 1.        ],
       [0.31909585, 1.        ],
       [0.44048177, 1.        ],
       [0.9956163 , 1.        ],
       [0.31906736, 1.        ],
       [0.67369927, 1.        ]])
Core Gradient Descent Formula
$\theta^{n+1} = \theta^{n} - \eta \, X^T(X\theta - y)$
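This update uses the analytic gradient of the squared-error loss $L(\theta)=\frac{1}{2}\lVert X\theta-y\rVert^2$. As a sanity check, the expression $X^T(X\theta - y)$ can be compared against a finite-difference gradient; this is a standalone sketch on synthetic data, unrelated to the cells above:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 2))
y = rng.random((100, 1))
theta = rng.standard_normal((2, 1))

def loss(th):
    # squared-error loss: 1/2 * ||X·th - y||^2
    r = X.dot(th) - y
    return 0.5 * float(r.T.dot(r))

# analytic gradient used in the update rule
g_analytic = X.T.dot(X.dot(theta) - y)

# central finite differences, one coordinate at a time
eps = 1e-6
g_numeric = np.zeros_like(theta)
for i in range(theta.size):
    e = np.zeros_like(theta)
    e.flat[i] = eps
    g_numeric.flat[i] = (loss(theta + e) - loss(theta - e)) / (2 * eps)

print(np.allclose(g_analytic, g_numeric, atol=1e-4))  # True: the two agree
```

Because the loss is quadratic, central differences match the analytic gradient up to rounding error.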
In [37]:
# number of training epochs
epoches = 10000
# initial learning rate; decayed by the schedule below
eta = 0.01
t0 = 5
t1 = 1000
theta = np.random.randn(2, 1)
theta

def learning_rate_schedule(t):
    return t0 / (t + t1)
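The schedule $\eta_t = t_0 / (t + t_1)$ is an inverse-time decay: it starts at $5/1000 = 0.005$ and shrinks toward zero, so later updates take smaller and smaller steps. A quick standalone look at a few values of the same function:

```python
t0, t1 = 5, 1000

def learning_rate_schedule(t):
    # inverse-time decay: large steps early, small steps late
    return t0 / (t + t1)

for t in (0, 100, 1000, 10000):
    print(t, learning_rate_schedule(t))
```

After 10000 iterations the step size has dropped by roughly a factor of eleven, which damps oscillation around the minimum.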
In [38]:
t = 0
for i in range(epoches):
    # gradient of 1/2 * ||X·theta - y||^2 with respect to theta
    g = X.T.dot(X.dot(theta) - y)
    theta = theta - eta * g
    t += 1
    eta = learning_rate_schedule(t)
print('true slope and intercept:', w, b)
print('BGD solution:', theta)
true slope and intercept: 6 3
BGD solution: [[5.97628571]
 [2.97962637]]
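For least squares, the value BGD converges to can be checked against the closed-form normal equation $\theta = (X^TX)^{-1}X^Ty$. This is a standalone sketch on freshly generated data; the true slope 6 and intercept 3 are assumptions chosen to mirror the run above:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.random((100, 1))
w_true, b_true = 6, 3
y = w_true * X + b_true + rng.standard_normal((100, 1))

# append the bias column, as in the cells above
Xb = np.concatenate([X, np.ones((100, 1))], axis=1)

# closed-form least-squares solution (normal equation)
theta_exact = np.linalg.inv(Xb.T.dot(Xb)).dot(Xb.T).dot(y)
print(theta_exact.ravel())
```

In practice `np.linalg.lstsq` is preferred over an explicit matrix inverse for numerical stability; the inverse is used here only to mirror the formula.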
Multivariate Linear Regression
In [3]:
import numpy as np
import matplotlib.pyplot as plt
X = np.random.rand(100, 8)  # 100 samples, eight features
w = np.random.randint(1, 10, size=(8, 1))  # ground-truth weights
b = np.random.randint(1, 10, size=1)       # ground-truth intercept
y = X.dot(w) + b + np.random.randn(100, 1)  # add Gaussian noise
# append a column of ones so that b acts as the bias term
X = np.concatenate([X, np.full(shape=(100, 1), fill_value=1)], axis=1)
# number of training epochs
epoches = 10000
eta = 0.01
t0 = 5
t1 = 1000
theta = np.random.randn(9, 1)
theta

def learning_rate_schedule(t):
    return t0 / (t + t1)
In [4]:
t = 0
for i in range(epoches):
    g = X.T.dot(X.dot(theta) - y)
    theta = theta - eta * g
    t += 1
    eta = learning_rate_schedule(t)
print('true slope and intercept:', w, b)
print('BGD solution:', theta)
true slope and intercept: [[7]
 [5]
 [7]
 [8]
 [7]
 [4]
 [5]
 [2]] [7]
BGD solution: [[7.89473488]
 [5.08938752]
 [7.03008708]
 [7.74292393]
 [7.16611818]
 [4.71807855]
 [4.73106898]
 [1.82156363]
 [6.45031039]]
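The same training loop appears twice above; it can be wrapped in a small helper so the univariate and multivariate cases share one implementation. This is a sketch, with one simplification: the decay schedule is applied from the very first step instead of starting from a fixed eta = 0.01:

```python
import numpy as np

def bgd(X, y, epoches=10000, t0=5, t1=1000, seed=0):
    # batch gradient descent for least squares, mirroring the loops above
    rng = np.random.default_rng(seed)
    theta = rng.standard_normal((X.shape[1], 1))
    for t in range(epoches):
        g = X.T.dot(X.dot(theta) - y)        # full-batch gradient
        theta = theta - (t0 / (t + t1)) * g  # inverse-time decayed step
    return theta

# usage on synthetic multivariate data (shapes mirror the cell above)
rng = np.random.default_rng(1)
X = rng.random((100, 8))
w = rng.integers(1, 10, size=(8, 1))
b = rng.integers(1, 10, size=1)
y = X.dot(w) + b + rng.standard_normal((100, 1))
X = np.concatenate([X, np.ones((100, 1))], axis=1)
theta = bgd(X, y)
print(theta.round(2))
```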
A multivariate model cannot be visualized with a 2D scatter plot, since the feature space has more than two dimensions.