L2 regularization

$$\frac{1}{N}\sum_{n=1}^{N}\log\bigl(1+\exp(-y_n W^T X_n)\bigr)+\lambda\left\|W\right\|_2^2$$
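As a concrete check, the objective above can be computed directly. A minimal sketch (the function name, argument names, and data below are illustrative, not from the source):

```python
import numpy as np

def l2_logistic_loss(W, X, y, lam):
    """L2-regularized logistic loss (illustrative helper).
    X: (N, d) data matrix, y: (N,) labels in {-1, +1},
    W: (d,) weight vector, lam: regularization strength.
    """
    margins = y * (X @ W)                        # y_n * W^T X_n
    # log(1 + exp(-m)) computed stably as logaddexp(0, -m)
    loss = np.mean(np.logaddexp(0.0, -margins))
    return loss + lam * np.dot(W, W)             # + lambda * ||W||_2^2
```

At `W = 0` every margin is zero, so the loss reduces to `log 2` regardless of the data, which gives a quick sanity check.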

### kernel logistic regression

LR with L2 regularization:

$$\min_W \frac{\lambda}{N} W^T W + \frac{1}{N}\sum_{n=1}^{N}\log\bigl(1+\exp(-y_n W^T X_n)\bigr)$$

$$W=\sum_{n=1}^{N}\beta_n X_n$$

$$\min_{\beta}\ \frac{\lambda}{N}\sum_{n=1}^{N}\sum_{m=1}^{N}\beta_n\beta_m K(X_n,X_m)+\frac{1}{N}\sum_{n=1}^{N}\log\Bigl(1+\exp\Bigl(-y_n\sum_{m=1}^{N}\beta_m K(X_m,X_n)\Bigr)\Bigr)$$
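The substituted objective depends on the data only through the Gram matrix. A minimal sketch of evaluating it, assuming an RBF kernel as an illustrative choice (all names below are hypothetical):

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix K[n, m] = exp(-gamma * ||X_n - X_m||^2) (illustrative kernel)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def kernel_lr_objective(beta, K, y, lam):
    """Objective in beta after substituting W = sum_n beta_n X_n.
    K: (N, N) Gram matrix, y: (N,) labels in {-1, +1}, lam: reg. strength."""
    N = len(y)
    scores = K @ beta                            # scores[n] = sum_m beta_m K(X_m, X_n)
    reg = (lam / N) * (beta @ K @ beta)          # (lambda/N) * beta^T K beta
    loss = np.mean(np.logaddexp(0.0, -y * scores))
    return reg + loss
```

Note that both the regularizer and the loss are evaluated without ever forming `W`, which is the point of the kernel trick.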

Why is the optimal W a linear combination of the X_n?
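The standard argument (the representer theorem) decomposes W into a component in the span of the data and an orthogonal remainder:

```latex
W = W_\parallel + W_\perp,\qquad
W_\parallel \in \operatorname{span}\{X_1,\dots,X_N\},\qquad
W_\perp^T X_n = 0 \ \ \forall n.
```

Since $W^T X_n = W_\parallel^T X_n$, the loss term is unaffected by $W_\perp$, while $W^T W = W_\parallel^T W_\parallel + W_\perp^T W_\perp > W_\parallel^T W_\parallel$ whenever $W_\perp \neq 0$. The regularizer therefore forces $W_\perp = 0$ at the optimum, so the optimal W lies in the span of the data.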

### Solving

Since K is positive semidefinite, the objective is convex in β, so standard convex optimizers apply: coordinate descent, gradient descent, and so on.
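As one concrete option, plain gradient descent on the β-objective can be sketched as follows (the text mentions coordinate descent, which would instead update one β_n at a time; all names and hyperparameters here are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def fit_kernel_lr(K, y, lam=0.1, lr=0.1, steps=1000):
    """Gradient descent on
        (lam/N) beta^T K beta + (1/N) sum_n log(1 + exp(-y_n (K beta)_n)).
    K: (N, N) symmetric Gram matrix, y: (N,) labels in {-1, +1}."""
    N = len(y)
    beta = np.zeros(N)
    for _ in range(steps):
        s = K @ beta
        # d/dbeta of the regularizer: (2 lam / N) K beta
        # d/dbeta of the loss:       -(1/N) K (y * sigmoid(-y * s))
        grad = (2.0 * lam / N) * s - (1.0 / N) * (K @ (y * sigmoid(-y * s)))
        beta -= lr * grad
    return beta
```

Predictions on the training points are then `sign(K @ beta)`; for a new point x one evaluates `sum_n beta_n K(X_n, x)`.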
