[D] How can I interpret Shapley values if I don't have a background in game theory?
Feb. 28, 2022, 3:51 p.m. | /u/baigyaanik
Machine Learning www.reddit.com
[https://www.reddit.com/r/MachineLearning/comments/t10cuy/r_the_shapley_value_in_machine_learning/](https://www.reddit.com/r/MachineLearning/comments/t10cuy/r_the_shapley_value_in_machine_learning/)
My goal is to understand which input features are most important to a neural network's predictions (an LSTM in my case). The SHAP Python library makes it easy to calculate SHAP values. Since my inputs are time series, I get a SHAP value for every time step of every input feature. I can …
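One common way to handle the per-timestep attributions described above is to aggregate them into a single importance score per feature. The sketch below assumes a hypothetical SHAP output array of shape `(n_samples, n_timesteps, n_features)` (as one would get from an explainer such as `shap.GradientExplainer` applied to an LSTM); the values here are random placeholders purely to illustrate the aggregation, not a real model's attributions.

```python
import numpy as np

# Hypothetical SHAP values for a time-series model:
# shape = (n_samples, n_timesteps, n_features).
# In practice this array would come from a SHAP explainer;
# random values are used here only to demonstrate the aggregation step.
rng = np.random.default_rng(0)
shap_values = rng.normal(size=(100, 20, 4))  # 100 samples, 20 steps, 4 features

# Mean absolute SHAP value per feature, averaged over samples and time steps.
# Taking the absolute value first keeps positive and negative attributions
# from cancelling each other out.
feature_importance = np.abs(shap_values).mean(axis=(0, 1))  # shape: (4,)

# Rank features from most to least important.
ranking = np.argsort(feature_importance)[::-1]
print(feature_importance, ranking)
```

Whether averaging over time steps is appropriate depends on the question: it answers "which feature matters overall", while keeping the time axis (e.g. averaging only over samples) answers "when does each feature matter".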