
I am using the LightGBM package.

I have successfully created a tree plot using "create_tree_digraph", but I am having some trouble understanding the result.

The leaf nodes contain a "leaf_value". I don't know what it means. Could someone please help me understand it? Thanks. :)

I used the example code from here: https://www.analyticsvidhya.com/blog/2017/06/which-algorithm-takes-the-crown-light-gbm-vs-xgboost/

#importing standard libraries 
import numpy as np 
import pandas as pd 
from pandas import Series 
import graphviz

import lightgbm as lgb 

#loading our training dataset 'adult.csv' with name 'data' using pandas 
data=pd.read_csv('./adult.csv',header=None) 

#Assigning names to the columns 
data.columns=['age','workclass','fnlwgt','education','education-num','marital_Status','occupation','relationship','race','sex','capital_gain','capital_loss','hours_per_week','native_country','Income'] 

# Label Encoding our target variable 
from sklearn.preprocessing import LabelEncoder
l=LabelEncoder()
l.fit(data.Income) 

data.Income=Series(l.transform(data.Income))  #label encoding our target variable  

#One hot encoding two of the categorical features (the remaining categorical columns are dropped below) 
one_hot_workclass=pd.get_dummies(data.workclass) 
one_hot_education=pd.get_dummies(data.education) 

#removing categorical features 
data.drop(['workclass','education','marital_Status','occupation','relationship','race','sex','native_country'],axis=1,inplace=True)  

#Merging one hot encoded features with our dataset 'data' 
data=pd.concat([data,one_hot_workclass,one_hot_education],axis=1) 

#Here our target variable is 'Income' with values as 1 or 0.  
#Separating our data into features dataset x and our target dataset y 
x=data.drop('Income',axis=1) 
y=data.Income 

#Imputing missing values in our target variable 
y.fillna(y.mode()[0],inplace=True) 

#Now splitting our dataset into test and train 
from sklearn.model_selection import train_test_split 
x_train,x_test,y_train,y_test=train_test_split(x,y,test_size=.3)

train_data=lgb.Dataset(x_train,label=y_train)

#setting parameters for lightgbm
param = {'num_leaves':150, 'objective':'binary','max_depth':3,'learning_rate':.05,'max_bin':200}
param['metric'] = ['auc', 'binary_logloss']

#training our model using light gbm
num_round=50
lgbm=lgb.train(param,train_data,num_round)

graph = lgb.create_tree_digraph(lgbm)
graph.render(view=True)

Then I applied the "create_tree_digraph" function.

(image: the rendered tree diagram, with a leaf_value shown in each leaf node)


1 Answer


Those are the raw prediction values before the sigmoid function is applied, so they are not yet probabilities. One thing to note, though: your image shows only a single tree out of the whole model, so its values will not match the model's actual output (unless your model consists of just that one tree).
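You can check this with the code from the question itself. A minimal sketch, reusing lgbm and x_test from above (raw_score=True on Booster.predict and scipy.special.expit are standard LightGBM and SciPy APIs):

import numpy as np
from scipy.special import expit  # the sigmoid: 1 / (1 + exp(-x))

# With raw_score=True, predict() returns, for each sample, the sum of the
# leaf_value of the leaf it lands in, accumulated over all 50 trees.
raw = lgbm.predict(x_test, raw_score=True)

# Passing those raw scores through the sigmoid reproduces the usual
# probability output of predict() for objective='binary'.
prob = lgbm.predict(x_test)
print(np.allclose(expit(raw), prob))  # True

# To plot a tree other than the first one, pass tree_index:
# graph = lgb.create_tree_digraph(lgbm, tree_index=5)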

The image shows what it looks like when the sigmoid is applied to the leaf values before creating the plot.
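If you want to reproduce that yourself, here is a sketch that reads the leaf values of the first tree out of dump_model() and passes them through the sigmoid. The traversal is deliberately minimal: in the dumped tree, a node is a leaf exactly when it carries a 'leaf_value' key.

from scipy.special import expit

# dump_model() returns the whole model as a dict; each entry of 'tree_info'
# describes one tree as nested 'left_child' / 'right_child' dicts.
tree0 = lgbm.dump_model()['tree_info'][0]['tree_structure']

def collect_leaf_values(node, out):
    if 'leaf_value' in node:           # leaves carry 'leaf_value'
        out.append(node['leaf_value'])
    else:                              # internal nodes carry two children
        collect_leaf_values(node['left_child'], out)
        collect_leaf_values(node['right_child'], out)
    return out

leaves = collect_leaf_values(tree0, [])
print(leaves)                        # the raw values shown in the digraph
print([expit(v) for v in leaves])    # the same leaves after the sigmoid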

Answered 2018-05-09T15:04:15.713