Usage details of sklearn's SVM (SVC)

1. When the model is trained with clf = SVC(probability=False) and predict_proba is then called, the error is: AttributeError: predict_proba is not available when probability=False.

Parameter explanation: probability, boolean, optional, default=False.

It decides whether probability estimates are enabled. The parameter must be set before calling fit(); only then are the related methods predict_proba and predict_log_proba available.
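A minimal sketch of the two settings (the data and variable names here are illustrative, not from the example below): with probability=False the call raises the AttributeError quoted above, and refitting with probability=True makes predict_proba work.

import numpy as np
from sklearn.svm import SVC

X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])
y = np.array([0, 0, 1, 1])

clf = SVC(probability=False).fit(X, y)
try:
    clf.predict_proba(X)
except AttributeError as e:
    print(e)  # e.g. "predict_proba is not available when probability=False" (wording may vary by version)

clf = SVC(probability=True).fit(X, y)  # probability must be set before fit()
print(clf.predict_proba(X))            # now available, as is predict_log_proba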

2. The score() method returns the model's score on the given data; for classifiers this is the mean accuracy.
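As a quick check (a sketch with illustrative data, comparing against sklearn.metrics.accuracy_score), score() gives the same value as computing the accuracy of predict() by hand:

import numpy as np
from sklearn.metrics import accuracy_score
from sklearn.svm import SVC

X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])
y = np.array([0, 0, 1, 1])
clf = SVC().fit(X, y)

print(clf.score(X, y))                    # mean accuracy on (X, y)
print(accuracy_score(y, clf.predict(X)))  # same value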

#coding=utf-8
import pandas as pd
import xlrd
import os
import matplotlib.pyplot as plt
import numpy as np
from sklearn.svm import SVC

X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1], [-1, 1], [-1, 2], [1, -1], [1, -2]])
y = np.array([0, 0, 1, 1, 2, 2, 3, 3])
# y = np.array([1, 1, 2, 2, 3, 3, 4, 4])
# clf = SVC(decision_function_shape="ovr", probability=True)
clf = SVC(probability=True)
# clf = SVC(probability=False)
clf.fit(X, y)
print(clf.decision_function(X))


'''
For n classes trained one-vs-one, there are n*(n-1)/2 binary classifiers, and every pair of classes
defines its own separating surface. decision_function() then returns, for each sample, the signed
distance to each of those surfaces: with decision_function_shape='ovo' that is n*(n-1)/2 values per
sample (6 here), while the default 'ovr' aggregates them into n values (4 here).
This function is mainly useful for plotting. It is most intuitive in binary classification, where it
measures how far each point lies from the hyperplane, so the data, the hyperplane, and the margin
planes can be visualized (see the shape check right after this comment and the binary example at the end).
'''
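# A quick shape check (an illustrative addition, not part of the original script): with 4 classes,
# 'ovr' gives one column per class and 'ovo' one column per pair of classes.
clf_ovr = SVC(decision_function_shape='ovr').fit(X, y)
clf_ovo = SVC(decision_function_shape='ovo').fit(X, y)
print(clf_ovr.decision_function(X).shape)  # (8, 4) -> n values per sample
print(clf_ovo.decision_function(X).shape)  # (8, 6) -> n*(n-1)/2 values per sample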
print(clf.predict(X))
print(clf.predict_proba(X))  # class probabilities; the predicted class is the one with the highest probability
print(clf.score(X, y))       # mean accuracy on (X, y)
# drawing
plot_step = 0.02
x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
xx, yy = np.meshgrid(np.arange(x_min, x_max, plot_step),
                     np.arange(y_min, y_max, plot_step))

# Predict every point of the grid; the class boundaries in the plot are simply
# the borders between regions that receive different predicted labels.
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])
Z = Z.reshape(xx.shape)
cs = plt.contourf(xx, yy, Z, cmap=plt.cm.Paired)
plt.axis("tight")

class_names = "ABCD"
plot_colors = "rybg"
for i, n, c in zip(range(4), class_names, plot_colors):
    idx = np.where(y == i)[0]  # indices of the samples in class i
    plt.scatter(X[idx, 0], X[idx, 1],
                c=c, cmap=plt.cm.Paired,
                label="Class %s" % n)
plt.xlim(x_min, x_max)
plt.ylim(y_min, y_max)
plt.legend(loc='upper right')
plt.xlabel('x')
plt.ylabel('y')
plt.title('Decision Boundary')
plt.show()
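To make the earlier remark about decision_function concrete for the binary case, here is a small, self-contained sketch (separate from the script above; the data and variable names are illustrative). It evaluates the signed distances on a grid and draws the hyperplane (level 0) and the two margin lines (levels -1 and +1) of a linear SVC:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.svm import SVC

# Two linearly separable classes
Xb = np.array([[-2, -1], [-1, -2], [-1, -1], [1, 1], [1, 2], [2, 1]])
yb = np.array([0, 0, 0, 1, 1, 1])
clf2 = SVC(kernel='linear').fit(Xb, yb)

# decision_function on a grid: 0 is the hyperplane, +/-1 are the margins
gx, gy = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
Z2 = clf2.decision_function(np.c_[gx.ravel(), gy.ravel()]).reshape(gx.shape)

plt.contour(gx, gy, Z2, levels=[-1, 0, 1], linestyles=['--', '-', '--'], colors='k')
plt.scatter(Xb[:, 0], Xb[:, 1], c=yb, cmap=plt.cm.Paired)
plt.title('Hyperplane and margins from decision_function')
plt.show()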
