March 12, 2024, 4:44 a.m. | Kyra Gan, Esmaeil Keyvanshokooh, Xueqing Liu, Susan Murphy

cs.LG updates on arXiv.org

arXiv:2305.18511v2 Announce Type: replace
Abstract: Contextual bandit algorithms are commonly used in digital health to recommend personalized treatments. However, to ensure the effectiveness of the treatments, patients are often requested to take actions that have no immediate benefit to them, which we refer to as pro-treatment actions. In practice, clinicians have a limited budget to encourage patients to take these actions and collect additional information. We introduce a novel optimization and learning algorithm to address this problem. This algorithm effectively …
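The paper's own algorithm is only summarized above, so the sketch below is not its method. It is a minimal, illustrative budgeted contextual bandit in the spirit of the problem described: a LinUCB-style learner in which one arm corresponds to a costly pro-treatment request that draws down a fixed clinician budget. The class name `BudgetedLinUCB`, the `costs` vector, and all parameter values are assumptions made for illustration.

```python
import numpy as np

class BudgetedLinUCB:
    """Illustrative sketch only: LinUCB with a hard budget on costly arms.

    This is NOT the algorithm from arXiv:2305.18511; it merely shows how a
    budget constraint on pro-treatment actions could enter arm selection.
    """

    def __init__(self, n_arms, dim, budget, costs, alpha=1.0):
        self.A = [np.eye(dim) for _ in range(n_arms)]      # per-arm design matrices
        self.b = [np.zeros(dim) for _ in range(n_arms)]    # per-arm reward sums
        self.budget = budget                               # remaining budget for costly arms
        self.costs = costs                                 # cost of pulling each arm (0 if free)
        self.alpha = alpha                                 # exploration strength

    def select(self, context):
        scores = []
        for a in range(len(self.A)):
            if self.costs[a] > self.budget:
                scores.append(-np.inf)                     # arm no longer affordable
                continue
            A_inv = np.linalg.inv(self.A[a])
            theta = A_inv @ self.b[a]                      # ridge-regression estimate
            bonus = self.alpha * np.sqrt(context @ A_inv @ context)
            scores.append(theta @ context + bonus)         # optimistic value estimate
        return int(np.argmax(scores))

    def update(self, arm, context, reward):
        self.A[arm] += np.outer(context, context)
        self.b[arm] += reward * context
        self.budget -= self.costs[arm]                     # charge the budget for costly arms


# Example usage with made-up data: arm 1 is the pro-treatment request.
bandit = BudgetedLinUCB(n_arms=2, dim=3, budget=5.0, costs=[0.0, 1.0])
rng = np.random.default_rng(0)
for _ in range(20):
    x = rng.normal(size=3)
    arm = bandit.select(x)
    reward = rng.normal(loc=0.5 if arm == 1 else 0.2)
    bandit.update(arm, x, reward)
```

Once the budget is exhausted, the costly arm is simply masked out of the argmax; the actual paper addresses this trade-off with a dedicated optimization and learning procedure rather than a hard mask.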

