Thursday, August 30, 2018

CS231n: Linear classification: Support Vector Machine, Softmax

Two major components:
score function: maps the raw data to class scores.
loss function: quantifies the agreement between the predicted scores and the ground-truth labels.




2. Linear classification (Score Function)
f(x_i, W, b) = W x_i + b
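The score function can be sketched in NumPy for a single input; the sizes (D = 4 input dimensions, K = 3 classes) are illustrative, not from the notes:

```python
import numpy as np

# Minimal sketch of the linear score function f(x_i, W, b) = W x_i + b.
np.random.seed(0)
D, K = 4, 3
W = np.random.randn(K, D)   # weights, one row per class
b = np.random.randn(K)      # biases, one per class
x_i = np.random.randn(D)    # one flattened input (e.g. an image)

scores = W.dot(x_i) + b     # K raw class scores
print(scores.shape)         # (3,)
```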

SVM: Support Vector Machine (Loss Function)
L_i = \sum_{j \neq y_i} \max(0, w_j^T x_i - w_{y_i}^T x_i + \Delta)

The threshold at zero, \max(0, -), is often called the hinge loss.
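A minimal sketch of this per-example hinge loss, computed directly from the class scores; the score values and `delta` below are toy numbers, not from the notes:

```python
import numpy as np

def svm_loss_i(scores, y_i, delta=1.0):
    # Hinge loss for one example: sum over incorrect classes of
    # max(0, s_j - s_{y_i} + delta).
    margins = np.maximum(0, scores - scores[y_i] + delta)
    margins[y_i] = 0          # the correct class does not contribute
    return margins.sum()

scores = np.array([13.0, -7.0, 11.0])   # toy class scores
print(svm_loss_i(scores, y_i=0))        # 0.0: correct class wins by > delta
```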

Regularization (used to favor a simpler model so it does not overfit the training data):
R(W) = \sum_k \sum_l W_{k,l}^2

That is, the full Multiclass SVM loss becomes

L = \underbrace{\frac{1}{N} \sum_i L_i}_{\text{data loss}} + \underbrace{\lambda R(W)}_{\text{regularization loss}}



or, written out in full:
L = \frac{1}{N} \sum_i \sum_{j \neq y_i} \left[ \max(0, f(x_i; W)_j - f(x_i; W)_{y_i} + \Delta) \right] + \lambda \sum_k \sum_l W_{k,l}^2
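The full loss (mean hinge loss over N examples plus the L2 penalty) can be sketched as follows; `X`, `y`, `W`, and `lam` are illustrative toy values:

```python
import numpy as np

def full_svm_loss(X, y, W, delta=1.0, lam=0.1):
    # X: (N, D) data, y: (N,) labels, W: (K, D) weights.
    scores = X.dot(W.T)                              # (N, K) class scores
    correct = scores[np.arange(len(y)), y][:, None]  # (N, 1) correct-class scores
    margins = np.maximum(0, scores - correct + delta)
    margins[np.arange(len(y)), y] = 0                # zero out correct class
    data_loss = margins.sum() / len(y)               # mean over examples
    reg_loss = lam * np.sum(W * W)                   # lambda * sum of W^2
    return data_loss + reg_loss

X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([0, 1])
W = np.eye(2)
print(full_svm_loss(X, y, W))   # data loss is 0 here; only the reg term remains
```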

Binary Support Vector Machine

L_i = C \max(0, 1 - y_i w^T x_i) + R(W)
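The binary case, with labels y_i in {-1, +1}, can be sketched as below (the data-loss term only; `w`, `x_i`, and `C` are toy values):

```python
import numpy as np

def binary_svm_loss_i(w, x_i, y_i, C=1.0):
    # Binary hinge loss: C * max(0, 1 - y_i * w^T x_i), y_i in {-1, +1}.
    return C * max(0.0, 1.0 - y_i * w.dot(x_i))

w = np.array([0.5, -0.25])
print(binary_svm_loss_i(w, np.array([2.0, 0.0]), y_i=1))   # margin is 1.0 -> loss 0.0
```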










Multiclass Support Vector Machine loss

L_i = \sum_{j \neq y_i} \max(0, w_j^T x_i - w_{y_i}^T x_i + \Delta)

Softmax classifier (Loss Function)
L_i = -\log\left(\frac{e^{f_{y_i}}}{\sum_j e^{f_j}}\right) \quad \text{or equivalently} \quad L_i = -f_{y_i} + \log \sum_j e^{f_j}
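A sketch of the softmax loss for one example, using the second form with the usual max-shift for numerical stability (the shift is standard practice, not stated in the notes above); the score values are illustrative:

```python
import numpy as np

def softmax_loss_i(f, y_i):
    # L_i = -f_{y_i} + log sum_j e^{f_j}, shifted so the largest score is 0
    # to avoid overflow in exp().
    f = f - np.max(f)
    return -f[y_i] + np.log(np.sum(np.exp(f)))

f = np.array([3.0, 1.0, -2.0])
print(softmax_loss_i(f, y_i=0))   # small positive loss: class 0 dominates
```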

H(p, q) = -\sum_x p(x) \log q(x)
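A toy check connecting the two: with p the one-hot ground-truth distribution and q the softmax of the scores, the cross-entropy H(p, q) reduces to -log q[correct class], i.e. the softmax loss. All values below are illustrative:

```python
import numpy as np

def cross_entropy(p, q):
    # H(p, q) = -sum_x p(x) log q(x)
    return -np.sum(p * np.log(q))

f = np.array([2.0, 1.0, 0.0])
q = np.exp(f) / np.sum(np.exp(f))   # softmax probabilities
p = np.array([1.0, 0.0, 0.0])       # one-hot ground truth for class 0
print(cross_entropy(p, q))          # equals -log q[0]
```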



