
I have a binary classification problem: how can I obtain the SHAP contributions of the variables in a ranger model?

Sample data:

library(ranger)
library(tidyverse)

# Binary Dataset
df <- iris
df$Target <- if_else(df$Species == "setosa",1,0)
df$Species <- NULL

# Train Ranger Model
model <- ranger(
  x = df %>%  select(-Target),
  y = df %>%  pull(Target))

I have tried several libraries (DALEX, shapr, fastshap, shapper), but I have not found a solution with any of them.

I would like to get results like those of SHAPforxgboost for xgboost, i.e.:

  • shap.values, whose output is the SHAP contribution of each variable
  • shap.plot.summary

1 Answer


Good morning! From what I have found, you can use fastshap with ranger() as follows:

library(fastshap)
library(ranger)
library(tidyverse)
data(iris)
# Binary Dataset
df <- iris
df$Target <- if_else(df$Species == "setosa",1,0)
df$Species <- NULL
x <- df %>%  select(-Target)
# Train Ranger Model
model <- ranger(
  x = df %>%  select(-Target),
  y = df %>%  pull(Target))
# Prediction wrapper: fastshap requires a function that returns a plain
# numeric vector of predictions, one per row of newdata
pfun <- function(object, newdata) {
  predict(object, data = newdata)$predictions
}
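Before computing Shapley values, it can be worth checking that the wrapper behaves the way fastshap expects (one numeric prediction per row). A quick sanity-check sketch, using the `model`, `x`, and `pfun` defined above:

```r
# The wrapper should return a numeric vector with one prediction per row of x
p <- pfun(model, x)
stopifnot(is.numeric(p), length(p) == nrow(x))
```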

# Compute fast (approximate) Shapley values using 10 Monte Carlo repetitions
system.time({  # estimate run time
  set.seed(5038)
  shap <- fastshap::explain(model, X = x, pred_wrapper = pfun, nsim = 10)
})

# Load required packages
library(ggplot2)
theme_set(theme_bw())

# Aggregate Shapley values into a variable-importance measure
# (mean absolute Shapley value per feature, matching the plot label below)
shap_imp <- data.frame(
  Variable = names(shap),
  Importance = apply(shap, MARGIN = 2, FUN = function(x) mean(abs(x)))
)

Then, for example, for variable importance you can do the following:

# Plot Shap-based variable importance
ggplot(shap_imp, aes(reorder(Variable, Importance), Importance)) +
  geom_col() +
  coord_flip() +
  xlab("") +
  ylab("mean(|Shapley value|)")


Additionally, if you want explanations for individual predictions, the following is possible:

# Plot individual explanations
expl <- fastshap::explain(model, X = x, pred_wrapper = pfun, nsim = 10, newdata = x[1L, ])
autoplot(expl, type = "contribution")
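Besides "contribution" plots for a single observation, fastshap's autoplot also supports Shapley dependence plots, which show how a feature's SHAP values vary with the feature's values. A minimal sketch, reusing the `shap` and `x` objects from above and picking `Petal.Width` as an example feature:

```r
# Shapley dependence plot: SHAP value vs. feature value for one predictor
autoplot(shap, type = "dependence", feature = "Petal.Width", X = x)
```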

All of this information, and more, can be found here: https://bgreenwell.github.io/fastshap/articles/fastshap.html. Check the link to resolve your doubts! :)


Answered 2020-12-01 at 09:05