
I have a 584 x 100 dataset: each sample has 584 features (100 training vectors in total). I am using LibSVM from Java. ((1) trainX is 584 x 100; (2) biny is an array holding +1 for the first class and -1 for the second class; (3) LinearSVMNormVector is the model's resulting weight vector w.) Below is my code -

        // scale the training data to [0, 1], feature by feature (row by row of trainX)
        double[][] trainX_scale = new double[trainX.length][trainX[0].length];
        for (int i = 0; i < trainX.length; i++) {
            double min = Double.MAX_VALUE;
            double max = Double.MIN_VALUE;
            for (int inner = 0; inner < trainX[i].length; inner++) {
                if (trainX[i][inner] < min)
                    min = trainX[i][inner];
                if (trainX[i][inner] > max)
                    max = trainX[i][inner];
            }
            double difference = max - min; // caution: zero for a constant feature (division by zero below)
            for (int inner = 0; inner < trainX[i].length; inner++) {
                trainX_scale[i][inner] = (trainX[i][inner] - min)/ difference;
            }
        }

        // prepare the svm_node matrix: one row per training sample, one column per feature
        svm_node[][] SVM_node_Train = new svm_node[trainX[0].length][trainX.length];

        for (int p = 0; p < trainX[0].length; p++) {
            for (int q = 0; q < trainX.length; q++) {
                SVM_node_Train[p][q] = new svm_node();
                SVM_node_Train[p][q].index = q; // note: the libsvm convention is 1-based, ascending feature indices
                SVM_node_Train[p][q].value = trainX_scale[q][p];
            }
        }

        double[] biny_SVM = new double[biny.length]; // copy labels into a libsvm-compatible double array
        for (int p = 0; p < biny.length; p++) {
            biny_SVM[p] = biny[p];
        }

        svm_problem SVM_Prob = new svm_problem();
        SVM_Prob.l = trainX[0].length;
        SVM_Prob.x = SVM_node_Train;
        SVM_Prob.y = biny_SVM;

        svm_parameter SVM_Param = new svm_parameter();
        SVM_Param.svm_type = svm_parameter.C_SVC; // 0 = C-SVC
        SVM_Param.kernel_type = svm_parameter.RBF; // 2 = Gaussian (RBF) kernel
        SVM_Param.cache_size = 100;
        SVM_Param.eps = 0.0000001;
        SVM_Param.C = 1.0;
        SVM_Param.gamma = 0.5;

        svm_model SVM_Model = new svm_model(); // note: this hand-built model is never used below
        SVM_Model.param = SVM_Param;
        SVM_Model.l = trainX[0].length;
        SVM_Model.nr_class = 2;
        SVM_Model.SV = SVM_node_Train;
        //SVM_Model.label = biny;

        // String check =svm.svm_check_parameter(SVM_Prob, SVM_Param); //
        // System.out.println(check);

        double[] target = new double[biny.length]; // holds the cross-validation predictions
        Arrays.fill(target, 0.0); // redundant: Java zero-initializes arrays
        svm.svm_cross_validation(SVM_Prob, SVM_Param, 2, target);

        // train the classifier
        svm_model test_model = svm.svm_train(SVM_Prob, SVM_Param);

        /********** get the training results of libsvm **********/

        //double[][] weights1 = test_model.sv_coef;

        double Bias = test_model.rho[0];
        int NumberOfSupportVectors = svm.svm_get_nr_sv(test_model);

        int[] SupportVectorIDs = new int[NumberOfSupportVectors];
        svm.svm_get_sv_indices(test_model, SupportVectorIDs);
        svm_node[][] SV = test_model.SV;
        double[][] SupportVectors = new double[SV.length][SV[0].length];
        for (int ii = 0; ii < SV.length; ii++) {
            for (int jj = 0; jj < SV[0].length; jj++) {
                SupportVectors[ii][jj] = SV[ii][jj].value;
            }
        }
        double[] SupportVectorWeights = test_model.sv_coef[0];
        double[] LinearSVMNormVector = new double[SupportVectors[0].length];
        for (int ii = 0; ii < SupportVectors[0].length; ii++) {
            for (int jj = 0; jj < SupportVectors.length; jj++) {
                LinearSVMNormVector[ii] += SupportVectors[jj][ii] * SupportVectorWeights[jj];
            }
        }
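(For reference, the final loop above just accumulates w[i] = sum_j sv_coef[j] * SV[j][i]. A minimal standalone sketch of that computation, with hypothetical values and no libsvm dependency:)

```java
public class NormVectorSketch {
    // w[i] = sum over support vectors j of coef[j] * supportVectors[j][i],
    // mirroring the LinearSVMNormVector loop above.
    static double[] normVector(double[][] supportVectors, double[] coef) {
        double[] w = new double[supportVectors[0].length];
        for (int i = 0; i < w.length; i++) {
            for (int j = 0; j < supportVectors.length; j++) {
                w[i] += supportVectors[j][i] * coef[j];
            }
        }
        return w;
    }

    public static void main(String[] args) {
        double[][] sv = {{1.0, 0.0}, {0.0, 1.0}}; // hypothetical support vectors
        double[] coef = {0.5, -0.5};              // hypothetical alpha_j * y_j values
        double[] w = normVector(sv, coef);
        System.out.println(w[0] + " " + w[1]);    // prints 0.5 -0.5
    }
}
```

(This interpretation of w as a single weight vector only makes sense for a linear kernel, not the RBF kernel used above.)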

With this code, the output of svm_train looks like this -

    optimization finished, #iter = 25
    nu = 0.9999999995725399
    obj = -24.999999987969172, rho = 1.1534070678518276E-10
    nSV = 50, nBSV = 26
    Total nSV = 50
    *
    optimization finished, #iter = 25
    nu = 0.9999999998014489
    obj = -24.999999994976864, rho = -4.654032538963752E-10
    nSV = 50, nBSV = 28
    Total nSV = 50
    *
    optimization finished, #iter = 50
    nu = 0.9999999994269334
    obj = -49.999999961945335, rho = -4.303699855872079E-10
    nSV = 100, nBSV = 56
    Total nSV = 100

If the number of support vectors is 100, how can the number of bounded support vectors be 56? I am a bit confused. Can someone tell me why this classifier is not working?

Thanks!


1 Answer


I think your classifier is working correctly. However, you seem to be conflating binary classes with multi-dimensional features. Even though you only have two classes, your features are 584-dimensional, and you are classifying them with a Gaussian (RBF) kernel.

The kernel lets your features be treated as if by a linear model: it allows a decision boundary in a high-dimensional space. With high-dimensional features, many support vectors can end up bounded.
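As a side note on "bounded": a bounded support vector is one whose dual coefficient sits at the box limit (|alpha_i| = C), while free support vectors have 0 < |alpha_i| < C. A minimal sketch of counting them (with hypothetical coefficient values, not taken from your model):

```java
public class BoundedSvCount {
    // Count support vectors whose |sv_coef| has hit the box constraint C (bounded SVs).
    static int countBounded(double[] svCoef, double C) {
        int bounded = 0;
        for (double a : svCoef) {
            if (Math.abs(Math.abs(a) - C) < 1e-12) bounded++;
        }
        return bounded;
    }

    public static void main(String[] args) {
        double[] coef = {1.0, -1.0, 0.3, -0.7, 1.0}; // hypothetical alpha_i * y_i values
        System.out.println(countBounded(coef, 1.0)); // prints 3
    }
}
```

So nBSV = 56 with nSV = 100 simply means 56 of the 100 support vectors are at the C = 1.0 limit; that is perfectly consistent.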

So that is why I think your classifier is doing fine.
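For illustration, the Gaussian (RBF) kernel you selected with kernel_type = 2 computes K(x, z) = exp(-gamma * ||x - z||^2); a minimal standalone sketch:

```java
public class RbfKernelSketch {
    // RBF kernel value between two feature vectors: exp(-gamma * squared Euclidean distance).
    static double rbf(double[] x, double[] z, double gamma) {
        double sq = 0.0;
        for (int i = 0; i < x.length; i++) {
            double d = x[i] - z[i];
            sq += d * d;
        }
        return Math.exp(-gamma * sq);
    }

    public static void main(String[] args) {
        double[] a = {0.0, 0.0};
        double[] b = {1.0, 1.0};
        System.out.println(rbf(a, a, 0.5)); // identical points: prints 1.0
        System.out.println(rbf(a, b, 0.5)); // exp(-1.0), roughly 0.3679
    }
}
```

Because every pair of points gets a nonzero similarity, the decision function lives in an implicit high-dimensional space, and there is no single weight vector w to read off as in the linear case.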

I hope this helps with your question. I am also still learning, so if anything seems strange or unclear, please feel free to let me know.

Answered 2013-11-30T01:55:56.610