Aug. 10, 2023, 4:44 a.m. | Yogesh Kumar, Alexander Ilin, Henri Salo, Sangita Kulathinal, Maarit K. Leinonen, Pekka Marttinen

cs.LG updates on arXiv.org

The application of Transformer neural networks to Electronic Health Records
(EHR) is challenging due to the distinct, multidimensional sequential structure
of EHR data, which often leads to underperformance compared to simpler linear
models. As a result, the advantages of Transformers, such as efficient transfer
learning and improved scalability, are not fully exploited in EHR applications.
To overcome these challenges, we introduce SANSformer, a novel attention-free
sequential model designed with inductive biases tailored to the unique
characteristics of EHR data.


Our …
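The abstract describes SANSformer only as an attention-free sequential model and does not specify its internals, so the following is purely a hedged, minimal sketch of what one attention-free sequence block could look like (a gMLP-style learned token-mixing layer in place of self-attention); the class name, dimensions, and design are hypothetical and not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionFreeBlock(nn.Module):
    """Illustrative attention-free sequence block (hypothetical, not the
    SANSformer architecture): tokens are mixed with a learned linear map
    over the sequence axis instead of self-attention."""

    def __init__(self, seq_len: int, dim: int, hidden: int = 256):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.channel_in = nn.Linear(dim, hidden)
        # Token mixing across time steps (e.g. embedded EHR visits).
        self.token_mix = nn.Linear(seq_len, seq_len)
        self.channel_out = nn.Linear(hidden, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        residual = x
        h = F.gelu(self.channel_in(self.norm(x)))
        # Apply the sequence-axis linear map by transposing to
        # (batch, hidden, seq_len) and back.
        h = self.token_mix(h.transpose(1, 2)).transpose(1, 2)
        return residual + self.channel_out(h)


# Usage sketch: a batch of 8 patient histories, 32 visits, 64-dim embeddings.
block = AttentionFreeBlock(seq_len=32, dim=64)
out = block(torch.randn(8, 32, 64))
print(out.shape)  # torch.Size([8, 32, 64])
```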

