Change Detection for Local Explainability in Evolving Data Streams. (arXiv:2209.02764v1 [cs.LG])
Sept. 8, 2022, 1:11 a.m. | Johannes Haug, Alexander Braun, Stefan Zürn, Gjergji Kasneci
cs.LG updates on arXiv.org arxiv.org
As complex machine learning models are increasingly used in sensitive
applications like banking, trading or credit scoring, there is a growing demand
for reliable explanation mechanisms. Local feature attribution methods have
become a popular technique for post-hoc and model-agnostic explanations.
However, attribution methods typically assume a stationary environment in which
the predictive model has been trained and remains stable. As a result, it is
often unclear how local attributions behave in realistic, constantly evolving
settings such as streaming and online …
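The abstract is truncated, so the paper's own method is not shown here. As a purely illustrative sketch of the problem it describes, the toy example below uses a simple occlusion-style attribution (replace one feature with a baseline value and measure the change in the prediction; not the paper's technique) and compares mean attribution magnitudes between two stream windows. When the underlying model drifts, the attributions drift with it, which is the kind of non-stationary behavior the abstract points at. All names and the detection threshold are assumptions for this sketch.

```python
import numpy as np

def local_attribution(predict, x, baseline):
    # Occlusion-style attribution (illustrative, not the paper's method):
    # the change in the prediction when feature j is replaced by its
    # baseline value.
    base_pred = predict(x)
    attr = np.zeros(len(x))
    for j in range(len(x)):
        x_masked = x.copy()
        x_masked[j] = baseline[j]
        attr[j] = base_pred - predict(x_masked)
    return attr

def attribution_change(window_a, window_b):
    # Compare mean attribution *magnitudes* between two sliding windows;
    # a large value suggests the explanations have drifted.
    mag_a = np.abs(window_a).mean(axis=0)
    mag_b = np.abs(window_b).mean(axis=0)
    return float(np.max(np.abs(mag_a - mag_b)))

rng = np.random.default_rng(0)
baseline = np.zeros(2)

predict_old = lambda x: x @ np.array([2.0, 0.0])  # model before drift
predict_new = lambda x: x @ np.array([0.0, 2.0])  # model after drift

X = rng.normal(size=(200, 2))
attrs_old = np.array([local_attribution(predict_old, x, baseline) for x in X[:100]])
attrs_new = np.array([local_attribution(predict_new, x, baseline) for x in X[100:]])

# Within a stationary segment the attribution profile is stable;
# across the drift point it changes sharply.
stable = attribution_change(attrs_old[:50], attrs_old[50:])
drift = attribution_change(attrs_old, attrs_new)
print(stable, drift)
```

In a real stream, `attribution_change` would be computed incrementally per window and fed to a change detector; here the point is only that a stationary-model assumption baked into the attribution pipeline fails silently once the model evolves.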
Jobs in AI, ML, Big Data
Artificial Intelligence – Bioinformatic Expert
@ University of Texas Medical Branch | Galveston, TX
Lead Developer (AI)
@ Cere Network | San Francisco, US
Research Engineer
@ Allora Labs | Remote
Ecosystem Manager
@ Allora Labs | Remote
Founding AI Engineer, Agents
@ Occam AI | New York
AI Engineer Intern, Agents
@ Occam AI | US