Gradient Descent
$$J(\theta)=-\frac{1}{m} \sum_{i=1}^{m}\left[y^{(i)}\log\!\left(\sigma(X_{b}^{(i)}\theta)\right)+(1-y^{(i)})\log\!\left(1-\sigma(X_{b}^{(i)}\theta)\right)\right]$$
First, work out the derivative of the $\sigma$ function
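For reference, here is the short derivation of the sigmoid derivative that the gradient formula below relies on (a standard identity, written out since the original omits the intermediate step):

$$\sigma(t)=\frac{1}{1+e^{-t}},\qquad \sigma'(t)=\frac{e^{-t}}{(1+e^{-t})^{2}}=\sigma(t)\bigl(1-\sigma(t)\bigr)$$

Substituting this identity into the chain-rule expansion of $J(\theta)$ is what collapses the derivative into the simple form shown next.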
$$\frac{\partial J(\theta)}{\partial \theta_{j}}=\frac{1}{m} \sum_{i=1}^{m}\left(\sigma(X_{b}^{(i)}\theta)-y^{(i)}\right)X_{j}^{(i)}=\frac{1}{m} \sum_{i=1}^{m}\left(\widehat{y}^{(i)}-y^{(i)}\right)X_{j}^{(i)}$$
$\sigma(X_{b}^{(i)}\theta)$: multiplying row $i$ of $X_{b}$ by $\theta$ and applying $\sigma$ is exactly the logistic-regression prediction of the probability for sample $i$.
Once we can compute this gradient, we can search for the $\theta$ values by gradient descent.
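The cost and gradient above can be sketched directly in NumPy. This is a minimal illustration, not the notebook's own code: the names (`sigmoid`, `dJ`, `eta`), the toy dataset, and the stopping rule are all assumptions made for the example.

```python
import numpy as np

def sigmoid(t):
    """sigma(t) = 1 / (1 + e^{-t})."""
    return 1.0 / (1.0 + np.exp(-t))

def J(theta, X_b, y):
    """Cross-entropy cost J(theta) from the formula above."""
    p = sigmoid(X_b.dot(theta))
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def dJ(theta, X_b, y):
    """Gradient: (1/m) * X_b^T (sigma(X_b theta) - y), the vectorized
    form of dJ/d theta_j derived above."""
    return X_b.T.dot(sigmoid(X_b.dot(theta)) - y) / len(y)

def gradient_descent(X_b, y, initial_theta, eta=0.1, n_iters=1000, epsilon=1e-8):
    theta = initial_theta
    for _ in range(n_iters):
        last_theta = theta
        theta = theta - eta * dJ(theta, X_b, y)   # step against the gradient
        if abs(J(theta, X_b, y) - J(last_theta, X_b, y)) < epsilon:
            break
    return theta

# Hypothetical toy data: label is 1 when x1 + x2 > 0.
np.random.seed(0)
X = np.random.randn(100, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(float)
X_b = np.hstack([np.ones((len(X), 1)), X])     # prepend the bias column
theta = gradient_descent(X_b, y, np.zeros(X_b.shape[1]))
preds = (sigmoid(X_b.dot(theta)) >= 0.5).astype(float)
print("accuracy:", (preds == y).mean())
```

Note that each row of `X_b` carries a leading 1, so `X_b.dot(theta)` computes $X_{b}^{(i)}\theta$ for every sample at once, matching the per-row product described above.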