
I am using recursive feature elimination with cross-validation (RFECV) as the feature selection technique inside GridSearchCV.

My code is as follows.

from sklearn.model_selection import train_test_split, StratifiedKFold, GridSearchCV
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFECV

X = df[my_features_all]
y = df['gold_standard']

x_train, x_test, y_train, y_test = train_test_split(X, y, random_state=0)

k_fold = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

clf = RandomForestClassifier(random_state=42, class_weight="balanced")

rfecv = RFECV(estimator=clf, step=1, cv=k_fold, scoring='roc_auc')

param_grid = {'estimator__n_estimators': [200, 500],
              'estimator__max_features': ['auto', 'sqrt', 'log2'],
              'estimator__max_depth': [3, 4, 5]
              }

CV_rfc = GridSearchCV(estimator=rfecv, param_grid=param_grid, cv=k_fold, scoring='roc_auc', verbose=10, n_jobs=5)
CV_rfc.fit(x_train, y_train)
print("Finished feature selection and parameter tuning")

Now I want to get the optimal number of features and the selected features from the code above.

To do that, I ran the following code.

#feature selection results
print("Optimal number of features : %d" % rfecv.n_features_)
features=list(X.columns[rfecv.support_])
print(features)

However, I get the following error: AttributeError: 'RFECV' object has no attribute 'n_features_'.

Is there another way to get these details?

I'm happy to provide more details if needed.


1 Answer


The rfecv object you pass to GridSearchCV is not fitted by it. It is cloned first, and those clones are then fitted to the data and evaluated for all the different hyperparameter combinations.
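To illustrate the point, here is a minimal sketch (assuming a recent scikit-learn release where check_is_fitted can be called with just the estimator) showing that the original rfecv object is still unfitted after the search:

from sklearn.utils.validation import check_is_fitted
from sklearn.exceptions import NotFittedError

try:
    # rfecv is the original object passed to GridSearchCV, not one of its clones
    check_is_fitted(rfecv)
    print("rfecv was fitted")
except NotFittedError:
    print("rfecv itself was never fitted; GridSearchCV only fitted clones of it")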

So, to access the selected features, you need to go through the best_estimator_ attribute of the GridSearchCV:

CV_rfc.fit(x_train, y_train)
print("Finished feature selection and parameter tuning")

print("Optimal number of features : %d" % CV_rfc.best_estimator_.n_features_)
features = list(X.columns[CV_rfc.best_estimator_.support_])
print(features)
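If you also want the tuned hyperparameters and a held-out score, a short sketch along these lines should work (best_params_, best_score_ and score come straight from GridSearchCV):

# Best hyperparameter combination found by the grid search
print(CV_rfc.best_params_)

# Mean cross-validated ROC AUC of that combination
print(CV_rfc.best_score_)

# ROC AUC of the refitted best estimator on the held-out test set
print(CV_rfc.score(x_test, y_test))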
answered 2019-04-12 14:09