Gradient Descent (梯度下降)

Recap

Logistic Regression:

$\hat{y}=\sigma ( w^{T}x + b )$, $\sigma(z) = \dfrac{1}{1+e^{-z}}$
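
A minimal sketch of this forward computation, assuming NumPy; the helper names `sigmoid` and `predict` and the column-per-example layout of `X` are illustrative choices, not from the original:

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def predict(w, b, X):
    # y_hat = sigma(w^T x + b), vectorized over the columns of X
    # w: (n, 1) weights, b: scalar bias, X: (n, m) with one example per column
    return sigmoid(np.dot(w.T, X) + b)
```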

We want to find the parameters $w, b$ that minimize the cost function $J(w,b)$.
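
The post does not define $J$; in the standard logistic-regression setup it is the average cross-entropy loss over $m$ training examples:

$J(w,b) = -\dfrac{1}{m}\sum_{i=1}^{m}\left[\, y^{(i)}\log \hat{y}^{(i)} + \left(1-y^{(i)}\right)\log\left(1-\hat{y}^{(i)}\right) \right]$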

Gradient Descent

Let $\alpha$ be the learning rate; then repeat the following updates until convergence:

$w = w - \alpha \dfrac{\partial J(w,b)}{\partial w}$

$b = b - \alpha \dfrac{\partial J(w,b)}{\partial b}$
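
A minimal training-loop sketch of these update rules, assuming the cross-entropy cost above and the `sigmoid` helper defined earlier; the function name `gradient_descent`, the data layout, and `num_iters` are illustrative assumptions:

```python
def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    # X: (n, m) with one training example per column; y: (1, m) labels in {0, 1}
    n, m = X.shape
    w = np.zeros((n, 1))
    b = 0.0
    for _ in range(num_iters):
        y_hat = sigmoid(np.dot(w.T, X) + b)  # forward pass: y_hat = sigma(w^T X + b)
        dw = np.dot(X, (y_hat - y).T) / m    # dJ/dw for the cross-entropy cost
        db = np.sum(y_hat - y) / m           # dJ/db
        w = w - alpha * dw                   # w = w - alpha * dJ/dw
        b = b - alpha * db                   # b = b - alpha * dJ/db
    return w, b
```

With this vectorized form, each iteration updates all components of $w$ at once; $\alpha$ trades off convergence speed against the risk of overshooting the minimum.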

