
I am using hyperopt to search for the best parameters of an SVM classifier, but hyperopt says the best 'kernel' is '0'. {'kernel': '0'} is obviously not a valid kernel.

Does anyone know whether this is my mistake or a bug in hyperopt?

The code is below.

from hyperopt import fmin, tpe, hp, rand
import numpy as np
from sklearn.metrics import accuracy_score
from sklearn import svm
from sklearn.cross_validation import StratifiedKFold

parameter_space_svc = {
   'C':hp.loguniform("C", np.log(1), np.log(100)),
   'kernel':hp.choice('kernel',['rbf','poly']),
   'gamma': hp.loguniform("gamma", np.log(0.001), np.log(0.1)),    
}

from sklearn import datasets
iris = datasets.load_digits()

train_data = iris.data
train_target = iris.target

count = 0

def function(args):
  print(args)
  score_avg = 0
  skf = StratifiedKFold(train_target, n_folds=3, shuffle=True, random_state=1)
  for train_idx, test_idx in skf:
    train_X = iris.data[train_idx]
    train_y = iris.target[train_idx]
    test_X = iris.data[test_idx]
    test_y = iris.target[test_idx]
    clf = svm.SVC(**args)
    clf.fit(train_X,train_y)
    prediction = clf.predict(test_X)
    score = accuracy_score(test_y, prediction)
    score_avg += score

  score_avg /= len(skf)
  global count
  count = count + 1
  print("round %s" % str(count),score_avg)
  return -score_avg

best = fmin(function, parameter_space_svc, algo=tpe.suggest, max_evals=100)
print("best estimate parameters",best)

The output is below.

best estimate parameters {'C': 13.271912841932233, 'gamma': 0.0017394328334592358, 'kernel': 0}

1 Answer


First of all, you are using sklearn.cross_validation, which has been deprecated since version 0.18, so please update your code to use sklearn.model_selection instead.
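
For example, the cross-validation loop inside your objective function would look roughly like this with the newer API. This is only a sketch that keeps your variable names: n_folds becomes n_splits, and the labels are passed to split() instead of the constructor.

from sklearn.model_selection import StratifiedKFold

def function(args):
  score_avg = 0
  # n_splits replaces n_folds; the targets now go to split(), not the constructor
  skf = StratifiedKFold(n_splits=3, shuffle=True, random_state=1)
  for train_idx, test_idx in skf.split(train_data, train_target):
    clf = svm.SVC(**args)
    clf.fit(train_data[train_idx], train_target[train_idx])
    prediction = clf.predict(train_data[test_idx])
    score_avg += accuracy_score(train_target[test_idx], prediction)
  return -score_avg / skf.get_n_splits()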

Now, coming to the main question: the best returned by fmin always contains the index of the chosen option for any parameter defined with hp.choice.

So in your case, 'kernel': 0 means that the first value ('rbf') was selected as the best value for kernel.

See this related issue in the hyperopt issue tracker for confirmation of that behaviour.
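
You can also see the behaviour with a tiny standalone space. The names toy_space/toy_best and the constant objective below are only for illustration; the point is that fmin reports the position in the choice list, not the value itself.

from hyperopt import fmin, tpe, hp

toy_space = {'kernel': hp.choice('kernel', ['rbf', 'poly'])}
# the objective does not matter here; we only care about how 'best' is encoded
toy_best = fmin(lambda args: 0.0, toy_space, algo=tpe.suggest, max_evals=5)
print(toy_best)                             # something like {'kernel': 0}
print(['rbf', 'poly'][toy_best['kernel']])  # the value that index points to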

To get the original values back from best, use the space_eval() function like this:

from hyperopt import space_eval
space_eval(parameter_space_svc, best)

Output:
{'C': 13.271912841932233, 'gamma': 0.0017394328334592358, 'kernel': 'rbf'}
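
If you then want to refit a final classifier on all of the data with those decoded parameters, a minimal sketch would be (best_params and final_clf are just illustrative names):

from hyperopt import space_eval
from sklearn import svm

best_params = space_eval(parameter_space_svc, best)
final_clf = svm.SVC(**best_params)
final_clf.fit(train_data, train_target)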