Abstract:
1. Activation functions. Sigmoid, \[g(z) = \frac{1}{1 + e^{-z}},\] is used for binary classification (logistic regression), i.e. when the output layer is binary. ReLU is the most common hidden-layer activation. Read more
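The two activations named in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the post's own code; the function names `sigmoid` and `relu` are my labels.

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) activation: squashes z into (0, 1).
    Useful as the output activation for binary classification."""
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    """Rectified Linear Unit: max(0, z).
    The most common choice for hidden layers."""
    return np.maximum(0.0, z)

print(sigmoid(0.0))  # 0.5
print(relu(-3.0))    # 0.0
print(relu(2.5))     # 2.5
```

Both work elementwise on arrays as well as scalars, which is why they drop straight into vectorized layer computations.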
posted @ 2025-07-29 19:54 by 铁鼠
Abstract:
1. How to build layers manually (1×3 matrix): x = np.array([200.0, 17.0]); layer_1 = Dense(units=3, activation='sigmoid'); a1 = layer_1(x); layer_2 = Dense(u… Read more
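The truncated snippet follows the Keras `Dense` API. As a self-contained sketch that runs without TensorFlow, here is a minimal NumPy stand-in for a `Dense` layer with a sigmoid activation; the fixed zero weights are my assumption for reproducibility (Keras would initialize them randomly).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class Dense:
    """Minimal stand-in for keras.layers.Dense with sigmoid activation.
    W has shape (n_inputs, units); b has shape (units,)."""
    def __init__(self, units, W, b):
        self.units = units
        self.W = W
        self.b = b

    def __call__(self, x):
        # Affine transform followed by the activation
        return sigmoid(x @ self.W + self.b)

x = np.array([200.0, 17.0])  # note: np.array, not np.arrat
layer_1 = Dense(units=3, W=np.zeros((2, 3)), b=np.zeros(3))
a1 = layer_1(x)  # shape (3,); with zero weights, each unit outputs sigmoid(0) = 0.5
print(a1)
```

Calling the layer object on an input, as in `a1 = layer_1(x)`, mirrors how Keras layers are applied.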
posted @ 2025-07-29 15:29 by 铁鼠
