LR handles binary classification; Softmax regression extends it to multi-class problems. Given a training set $\{({x_1},{y_1}),({x_2},{y_2}),...,({x_m},{y_m})\}$ with labels ${y_i}\in\{1,2,...,K\}$, the probability that sample ${x_i}$ belongs to class $j$ is:
$$p({y_i}=j|{x_i};\theta)=\frac{{e^{\theta_j^T{x_i}}}}{{\sum\limits_{k=1}^K{e^{\theta_k^T{x_i}}}}}$$
The hypothesis function stacks these $K$ probabilities into a vector:
$$h_\theta({x_i})=\left[{\begin{array}{c}{p({y_i}=1|{x_i};\theta)}\\{p({y_i}=2|{x_i};\theta)}\\{...}\\{p({y_i}=K|{x_i};\theta)}\end{array}}\right]=\frac{1}{{\sum\limits_{k=1}^K{e^{\theta_k^T{x_i}}}}}\left[{\begin{array}{c}{e^{\theta_1^T{x_i}}}\\{e^{\theta_2^T{x_i}}}\\{...}\\{e^{\theta_K^T{x_i}}}\end{array}}\right]$$
To estimate the parameters ${\theta_1},{\theta_2},...,{\theta_K}$, minimize the negative log-likelihood loss:
$$L(\theta)=-\frac{1}{m}\left[\sum\limits_{i=1}^m{\sum\limits_{j=1}^K{I({y_i}=j)\log\frac{{e^{\theta_j^T{x_i}}}}{{\sum\limits_{k=1}^K{e^{\theta_k^T{x_i}}}}}}}\right],\quad I({y_i}=j)=\left\{{\begin{array}{ll}{1,}&{{y_i}=j}\\{0,}&{{y_i}\ne j}\end{array}}\right.$$
The gradient with respect to each ${\theta_j}$ is:
$$\frac{{\partial L(\theta)}}{{\partial{\theta_j}}}=-\frac{1}{m}\sum\limits_{i=1}^m{{x_i}\left(I({y_i}=j)-p({y_i}=j|{x_i};\theta)\right)}$$
The Softmax parameterization is redundant: subtracting the same vector $\psi$ from every ${\theta_j}$ leaves all the probabilities $p({y_i}=j|{x_i};\theta)$ unchanged, because the common factor $e^{-\psi^T{x_i}}$ cancels between numerator and denominator.
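To make the formulas above concrete, here is a minimal NumPy sketch of the hypothesis, the loss, and the gradient. All names (`softmax`, `h_theta`, `loss_and_grad`) and the row-wise data layout are illustrative assumptions, not from the original text, and labels are 0-based rather than the 1-based $\{1,...,K\}$ used above.

```python
import numpy as np

def softmax(Z):
    """Row-wise softmax. Subtracting the row max first exploits the
    shift invariance noted above and prevents exp() overflow."""
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def h_theta(Theta, X):
    """Hypothesis h_theta(x_i): one row of K class probabilities per sample.
    Theta is (K, d) with theta_j as its j-th row; X is (m, d)."""
    return softmax(X @ Theta.T)  # logits[i, j] = theta_j^T x_i

def loss_and_grad(Theta, X, y):
    """Negative log-likelihood L(theta) and its gradient.
    y holds integer labels in {0, ..., K-1}; returns (loss, (K, d) grad)."""
    m, K = X.shape[0], Theta.shape[0]
    P = h_theta(Theta, X)              # p(y_i = j | x_i; theta)
    Y = np.eye(K)[y]                   # one-hot rows encode I(y_i = j)
    loss = -np.sum(Y * np.log(P)) / m
    grad = -(Y - P).T @ X / m          # row j matches dL/dtheta_j above
    return loss, grad
```

With the gradient in hand, a plain gradient-descent update is simply `Theta -= lr * grad` for some step size `lr`.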

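The redundancy claim is also easy to check numerically: shifting every $\theta_j$ by the same vector $\psi$ leaves the hypothesis unchanged. The snippet below reuses `h_theta` from the sketch above; the toy shapes are arbitrary.

```python
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))        # m = 5 samples, d = 3 features
Theta = rng.normal(size=(4, 3))    # K = 4 classes
psi = rng.normal(size=3)

# theta_j -> theta_j - psi shifts every logit by the per-sample
# constant -psi^T x_i, which the softmax normalization cancels.
print(np.allclose(h_theta(Theta, X), h_theta(Theta - psi, X)))  # True
```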