Feb. 28, 2022, 3:51 p.m. | /u/baigyaanik

Machine Learning www.reddit.com

One limitation of the Shapley value pointed out in the following post/paper is that, for non-experts in game theory, its theoretical interpretation is neither intuitive nor particularly useful.

[https://www.reddit.com/r/MachineLearning/comments/t10cuy/r_the_shapley_value_in_machine_learning/](https://www.reddit.com/r/MachineLearning/comments/t10cuy/r_the_shapley_value_in_machine_learning/)

My goal is to understand which input features are most important to a neural network's predictions (an LSTM in my case). The SHAP Python library makes it easy to compute SHAP values. Since my inputs are time series, I get a SHAP value for every time step of every input feature. I can …
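For context, a minimal sketch of this kind of workflow (not the poster's actual code): computing SHAP values for a Keras LSTM on input of shape (samples, timesteps, features), then collapsing the per-time-step values into one importance score per feature. The model, data shapes, and variable names here are hypothetical, and GradientExplainer is assumed because it tends to handle recurrent layers more robustly than DeepExplainer.

```python
import numpy as np
import shap
import tensorflow as tf

# Hypothetical data: 500 sequences, 20 time steps, 8 features.
X_train = np.random.rand(500, 20, 8).astype("float32")
y_train = np.random.rand(500, 1).astype("float32")

# Small LSTM regressor as a stand-in for the poster's model.
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(20, 8)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=1, verbose=0)

# Reference samples the explainer integrates over.
background = X_train[:100]
explainer = shap.GradientExplainer(model, background)

# shap_values has the same shape as the explained samples: (n, timesteps, features).
shap_values = explainer.shap_values(X_train[:50])
if isinstance(shap_values, list):  # some SHAP versions return a list per output
    shap_values = shap_values[0]

# One importance score per feature: mean absolute SHAP value
# over both the sample and time-step axes.
feature_importance = np.abs(shap_values).mean(axis=(0, 1))
print(feature_importance)  # shape: (features,)
```

Averaging absolute SHAP values over time steps is just one aggregation choice; keeping the time axis instead gives a per-time-step attribution map for each feature.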

game theory, machine learning, Shapley values
