
I am trying to use different SVM kernels (such as rbf, poly and linear).

I am using the iris dataset available online, whose shape is 150 × 4, so I dropped the 4th feature and it is now 150 × 3. Note that each class contains 50 samples with 3 features, in order of appearance:

class1 = iris[:50, :], class2 = iris[50:100, :], class3 = iris[100:150, :]
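
For reference, a minimal sketch of this set-up, assuming the data is loaded with scikit-learn's `load_iris` (which returns the 150 samples grouped by class):

    import numpy as np
    from sklearn import datasets

    iris = datasets.load_iris().data[:, :3]   # 150 x 4 -> drop the 4th feature -> 150 x 3
    class1, class2, class3 = iris[:50, :], iris[50:100, :], iris[100:150, :]   # 50 samples per class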

I have already plotted this for the 'linear' kernel, but I don't know how to do it with the other kernels. I have been searching for days and haven't found anything I could understand or use.

These are the two surfaces that separate the different classes:

    z_linear = lambda x, y: (-clf.intercept_[0] - clf.coef_[0][0] * x - clf.coef_[0][1] * y) / clf.coef_[0][2]
    w_linear = lambda x, y: (-clf.intercept_[2] - clf.coef_[2][0] * x - clf.coef_[2][1] * y) / clf.coef_[2][2]

[Figure: decision boundary of the SVM with linear kernel]
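
For completeness, a minimal sketch of how those two planes can be drawn together with the data (assuming `iris` is the 150 × 3 array above, `clf` is the fitted linear-kernel SVC behind the lambdas, and the axis ranges are only illustrative):

    import numpy as np
    import matplotlib.pyplot as plt
    from mpl_toolkits.mplot3d import Axes3D  # only needed on older matplotlib versions

    fig = plt.figure()
    ax = fig.add_subplot(projection='3d')

    # grid over the first two features; the plane equations give the third coordinate
    xx, yy = np.meshgrid(np.linspace(4, 8, 20), np.linspace(2, 4.5, 20))
    ax.plot_surface(xx, yy, z_linear(xx, yy), alpha=0.3)   # plane built from clf.coef_[0]
    ax.plot_surface(xx, yy, w_linear(xx, yy), alpha=0.3)   # plane built from clf.coef_[2]

    # the three classes, 50 samples each
    ax.scatter(iris[:50, 0],     iris[:50, 1],     iris[:50, 2],     label='class1')
    ax.scatter(iris[50:100, 0],  iris[50:100, 1],  iris[50:100, 2],  label='class2')
    ax.scatter(iris[100:150, 0], iris[100:150, 1], iris[100:150, 2], label='class3')
    ax.legend()
    plt.show()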

Now I need to plot the 3 classes together with the surfaces that separate them when using the other kernels (i.e. 'rbf' and 'poly' with degree=3).


2 Answers


I think what you need to do is plot a non-linear surface that separates those points. That is essentially what the RBF/poly kernels do: they find a non-linear hypersurface that separates the classes (a minimal sketch is given after the links below).

Follow these two links: https://scikit-learn.org/0.18/auto_examples/svm/plot_iris.html

https://jakevdp.github.io/PythonDataScienceHandbook/05.07-support-vector-machines.html
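
To make this concrete, here is a minimal sketch (not taken from the linked pages) of one possible approach: fit an RBF SVM on the three iris features, evaluate `decision_function` on a regular 3-D grid, and draw its zero isosurface. It assumes scikit-image is installed for `marching_cubes`, and uses a binary class-0-vs-rest split so `decision_function` returns a single value per point:

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn import datasets
    from sklearn.svm import SVC
    from skimage import measure   # scikit-image, assumed to be installed

    iris = datasets.load_iris()
    X = iris.data[:, :3]          # keep only the first three features, as in the question
    y = iris.target

    # binary example (class 0 vs. the rest) so decision_function returns one value per point
    clf = SVC(kernel='rbf', gamma='scale').fit(X, y == 0)

    # evaluate the decision function on a regular 3-D grid covering the data
    mins, maxs = X.min(axis=0) - 0.5, X.max(axis=0) + 0.5
    n = 40
    xs, ys, zs = [np.linspace(mins[i], maxs[i], n) for i in range(3)]
    gx, gy, gz = np.meshgrid(xs, ys, zs, indexing='ij')
    vals = clf.decision_function(np.c_[gx.ravel(), gy.ravel(), gz.ravel()]).reshape(gx.shape)

    # the zero level set of the decision function is the separating surface
    verts, faces, _, _ = measure.marching_cubes(vals, level=0.0)
    verts = mins + verts * (maxs - mins) / (n - 1)   # voxel indices -> data coordinates

    fig = plt.figure()
    ax = fig.add_subplot(projection='3d')
    ax.plot_trisurf(verts[:, 0], verts[:, 1], faces, verts[:, 2], alpha=0.3)
    ax.scatter(X[:, 0], X[:, 1], X[:, 2], c=y)
    plt.show()

The same idea works for a 'poly' kernel (e.g. `SVC(kernel='poly', degree=3)`); only the fitted classifier changes, the grid evaluation and isosurface extraction stay the same.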

Answered 2021-01-11T08:56:24.040
import numpy as np
import matplotlib.pyplot as plt
from sklearn.svm import SVC


def draw_line(coef, intercept, mi, ma):
    # for the separating hyperplane ax+by+c=0, the weights are [a, b] and the intercept is c
    # to draw the hyperplane we create two points on it:
    # 1. ((-b*mi-c)/a, mi)  i.e. ax+by+c=0 ==> x = (-by-c)/a, with y set to the minimum value mi
    # 2. ((-b*ma-c)/a, ma)  i.e. ax+by+c=0 ==> x = (-by-c)/a, with y set to the maximum value ma
    points = np.array([[(-coef[1]*mi - intercept)/coef[0], mi],
                       [(-coef[1]*ma - intercept)/coef[0], ma]])
    plt.plot(points[:, 0], points[:, 1])



def svm_margin(c):
    ratios = [(100,2), (100, 20), (100, 40), (100, 80)]   # (number of positive, number of negative) points
    plt.figure(figsize=(20,5))
    for j,i in enumerate(ratios):
        plt.subplot(1, 4, j+1)
        X_p=np.random.normal(0,0.05,size=(i[0],2))
        X_n=np.random.normal(0.13,0.02,size=(i[1],2))
       
        y_p=np.array([1]*i[0]).reshape(-1,1)
        y_n=np.array([0]*i[1]).reshape(-1,1)
        
        X=np.vstack((X_p,X_n))
        y=np.vstack((y_p,y_n))
        
        plt.scatter(X_p[:,0],X_p[:,1],color='yellow')
        plt.scatter(X_n[:,0],X_n[:,1],color='red')
    
        ### SVM with a linear kernel
        clf = SVC(kernel='linear', C=c)
        clf.fit(X, y.ravel())           # ravel: SVC expects a 1-D label array
        coefficient = clf.coef_[0]
        intercept = clf.intercept_[0]   # scalar intercept of the separating hyperplane
        margin = 1 / (np.sqrt(np.sum(clf.coef_ ** 2)))   # margin width = 1 / ||w||
        draw_line(coefficient,intercept,min(X[:,1]),max(X[:,1]))
        ### the parallel margin hyperplanes have intercept (intercept +/- 1), since margin * ||w|| = 1
        draw_line(coefficient, intercept - margin * np.sqrt(np.sum(clf.coef_ ** 2)), min(X[:,1]), max(X[:,1]))
        draw_line(coefficient, intercept + margin * np.sqrt(np.sum(clf.coef_ ** 2)), min(X[:,1]), max(X[:,1]))
        ###https://scikit-learn.org/stable/auto_examples/svm/plot_svm_margin.html
        plt.scatter(X[clf.support_][:,0],X[clf.support_][:,1],facecolors='none',edgecolors='k')
    plt.suptitle('SVM Margin Hyperplane For C = ' + str(c))
    plt.show()
    
svm_margin(0.001)
svm_margin(1)
svm_margin(100)

Try to extend this to a 3-dimensional system; the output generated by the above code is shown below for reference:

[Figure: SVM margin hyperplane plots produced by the code above for C = 0.001, 1 and 100]
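
One possible way to extend `draw_line` to three features, as suggested above, is to draw the separating plane (and, if needed, the margin planes at intercept ± 1) with `plot_surface`. The helper below is only an illustrative sketch, not part of the original code:

    import numpy as np
    import matplotlib.pyplot as plt

    def draw_plane(ax, coef, intercept, xlim, ylim):
        # for the separating plane a*x + b*y + c*z + d = 0 with weights [a, b, c] and intercept d,
        # solve for z over a grid of (x, y) values and draw it as a surface
        xx, yy = np.meshgrid(np.linspace(*xlim, 20), np.linspace(*ylim, 20))
        zz = (-intercept - coef[0] * xx - coef[1] * yy) / coef[2]
        ax.plot_surface(xx, yy, zz, alpha=0.3)

    # usage with a linear SVC fitted on 3 features (e.g. the 150 x 3 iris data from the question):
    # fig = plt.figure(); ax = fig.add_subplot(projection='3d')
    # draw_plane(ax, clf.coef_[0], clf.intercept_[0], (4, 8), (2, 4.5))        # separating plane
    # draw_plane(ax, clf.coef_[0], clf.intercept_[0] - 1, (4, 8), (2, 4.5))    # margin plane
    # draw_plane(ax, clf.coef_[0], clf.intercept_[0] + 1, (4, 8), (2, 4.5))    # margin plane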

Answered 2021-01-20T18:43:39.307