
How to use treeshap based on mlr3 framework? #44

Closed
invain1218 opened this issue Feb 8, 2025 · 2 comments
invain1218 commented Feb 8, 2025

Hello treeshap team,

I am trying to use the {treeshap} package to compute SHAP values, but I ran into an error when applying it to a model trained with the mlr3 framework. I am unsure how to pass an mlr3 learner to {treeshap}.

Could you please provide guidance or an example of how to integrate Treeshap with mlr3 models? Any assistance would be greatly appreciated.

Thank you in advance for your support 😊

library(shapviz)
library(kernelshap)
library(mlr3)
library(mlr3verse)
library(mlr3learners)

library(treeshap)
library(xgboost)
data <- fifa20$data[colnames(fifa20$data) != 'work_rate']
target <- fifa20$target


data$target = target
tsk = mlr3::TaskRegr$new(id="dd", backend = data, target = "target")
as.data.table(lrn())
learner = lrn("regr.xgboost")
mm = resample(
  task = tsk,
  learner = learner,
  resampling = rsmp("cv", folds=3),
  store_backends = TRUE,
  store_models = TRUE
)
unified <- unify(mm$learners[[1]], data)

Error in unify.default(mm$learners[[1]], data) :
  Provided model is not of type supported by treeshap.

The model object is

> mm$learners[[1]]
<LearnerRegrXgboost:regr.xgboost>: Extreme Gradient Boosting
* Model: xgb.Booster
* Parameters: nrounds=1000, nthread=1, verbose=0
* Validate: NULL
* Packages: mlr3, mlr3learners, xgboost
* Predict Types:  [response]
* Feature Types: logical, integer, numeric
* Properties: hotstart_forward, importance, internal_tuning, missings,
  validation, weights 
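For context: mlr3 learners wrap the fitted model rather than being one, which is why `unify()` falls through to `unify.default`. A possible workaround (a sketch, not confirmed in this thread; it assumes `LearnerRegrXgboost` stores the raw `xgb.Booster` in its `$model` slot and that the reference data must exclude the target column, as at training time) is to unify the underlying booster instead of the learner wrapper:

```r
# Sketch: pass the underlying xgb.Booster, not the mlr3 learner wrapper,
# to treeshap's unify().
library(treeshap)

X <- data[, colnames(data) != "target"]     # features only, as used in training
booster <- mm$learners[[1]]$model           # raw xgb.Booster inside the mlr3 learner
unified <- unify(booster, X)                # should now dispatch to the xgboost method
shaps <- treeshap(unified, X)               # SHAP values for the rows of X
```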
mayer79 (Contributor) commented Feb 8, 2025

Just a comment: XGBoost is shipped with its own TreeSHAP implementation, so you don't need {treeshap} here.
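As a sketch of that suggestion (again assuming the fitted `xgb.Booster` sits in the learner's `$model` slot), xgboost's own `predict()` returns per-feature SHAP values when called with `predcontrib = TRUE`:

```r
# Sketch: xgboost's native TreeSHAP via predict(..., predcontrib = TRUE).
# The result has one column per feature plus a final "BIAS" (baseline) column,
# and the row sums equal the raw model predictions.
library(xgboost)

booster <- mm$learners[[1]]$model
X <- as.matrix(data[, colnames(data) != "target"])
shap <- predict(booster, newdata = X, predcontrib = TRUE)
```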

mayer79 closed this as completed Feb 11, 2025