April 5, 2024, 4:42 a.m. | Ou Deng, Qun Jin

cs.LG updates on arXiv.org

arXiv:2308.12388v3 Announce Type: replace
Abstract: Addressing missing data in complex datasets like Electronic Health Records (EHR) is critical for ensuring accurate analysis and decision-making in healthcare. This paper proposes Structural Equation Modeling (SEM) enhanced with the Self-Attention method (SESA), an innovative approach for data imputation in EHR. SESA innovates beyond traditional SEM-based methods by incorporating self-attention mechanisms, enhancing the model's adaptability and accuracy across diverse EHR datasets. This enhancement allows SESA to dynamically adjust and optimize imputation processes, overcoming the …
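The abstract describes imputation driven by self-attention over EHR features. As an illustrative sketch only (not the paper's SESA implementation, whose details are not given here), the PyTorch snippet below shows one way a self-attention layer could refine baseline-imputed values while leaving observed entries untouched; all class, function, and parameter names are hypothetical.

```python
# Illustrative sketch only: a generic self-attention imputer for tabular data.
# This is NOT the authors' SESA implementation; it merely demonstrates how a
# self-attention layer could refine initial (e.g., mean- or SEM-style)
# imputations of missing EHR features. All names and hyperparameters are
# hypothetical.
import torch
import torch.nn as nn


class AttentionImputer(nn.Module):
    def __init__(self, d_model: int = 32, n_heads: int = 4):
        super().__init__()
        # Each feature (column) is embedded as a token so attention can
        # model dependencies between features of the same record.
        self.embed = nn.Linear(1, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.readout = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # x:    (batch, n_features), missing entries pre-filled by a baseline.
        # mask: (batch, n_features), True where the value was observed.
        tokens = self.embed(x.unsqueeze(-1))           # (B, F, d_model)
        attended, _ = self.attn(tokens, tokens, tokens)
        refined = self.readout(attended).squeeze(-1)   # (B, F)
        # Keep observed values; replace only the missing ones.
        return torch.where(mask, x, refined)


if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(8, 10)
    mask = torch.rand(8, 10) > 0.3             # ~30% of entries missing
    x_filled = torch.where(mask, x, x.mean())  # crude baseline fill
    out = AttentionImputer()(x_filled, mask)
    print(out.shape)                           # torch.Size([8, 10])
```

In practice the attention module would be trained (e.g., by masking observed values and minimizing reconstruction error), but training is omitted here for brevity.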

