Feature explainability for deepar estimator #2617
alishametkari asked this question in Q&A (unanswered)
I am using DeepAREstimator with multivariate features, and I want to find the feature importance (SHAP values) of each feature at each time step. I have tried the shap package, but it does not support GluonTS estimators. How can I get feature importance in this case? Does GluonTS have its own implementation for computing SHAP values? If so, how do I use it?
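For reference, the kind of workaround I have been exploring is to wrap the trained predictor in a plain NumPy function and hand that to `shap.KernelExplainer`, which is model-agnostic. The sketch below is only an illustration of that idea, not a confirmed GluonTS feature: `predictor`, `history_target`, `start`, and `window_to_explain` are placeholders standing in for a trained DeepAR predictor and the corresponding series from my own dataset, and the frequency, prediction length, and feature count are made-up values.

```python
import numpy as np
import pandas as pd
import shap
from gluonts.dataset.common import ListDataset

freq = "H"               # assumed frequency
prediction_length = 24   # assumed; must match the trained estimator
num_features = 3         # assumed number of dynamic real features

# Placeholders for pieces that come from my own pipeline:
# predictor       -> trained DeepAR predictor, e.g. from estimator.train(...)
# history_target  -> observed target history of the series
# start           -> start period of the series
history_target = np.random.randn(168)            # placeholder history
start = pd.Period("2021-01-01", freq=freq)       # placeholder start
horizon = len(history_target) + prediction_length

def predict_fn(flat_feats: np.ndarray) -> np.ndarray:
    """Turn a (n_samples, num_features * horizon) matrix back into
    feat_dynamic_real arrays and return one scalar forecast summary per row,
    so KernelExplainer can perturb the feature values."""
    outputs = []
    for row in flat_feats:
        feats = row.reshape(num_features, horizon)
        ds = ListDataset(
            [{"start": start, "target": history_target, "feat_dynamic_real": feats}],
            freq=freq,
        )
        forecast = next(iter(predictor.predict(ds)))
        outputs.append(float(forecast.mean.mean()))  # or forecast.mean[t] for one step
    return np.array(outputs)

# Background data and the window to explain, flattened to 2-D as SHAP expects.
background = np.zeros((1, num_features * horizon))
window_to_explain = np.random.randn(1, num_features * horizon)  # placeholder

explainer = shap.KernelExplainer(predict_fn, background)
shap_values = explainer.shap_values(window_to_explain, nsamples=100)
```

This assumes the estimator was trained with dynamic real features enabled. To get importance per time step rather than for the averaged forecast, I imagine `predict_fn` could return `forecast.mean[t]` for a fixed step `t` and the explainer could be run once per step, but I am not sure whether this is the intended approach or whether GluonTS offers something built in.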