SANSformers: Self-Supervised Forecasting in Electronic Health Records with Attention-Free Models. (arXiv:2108.13672v3 [cs.LG] UPDATED)
cs.LG updates on arXiv.org arxiv.org
The application of Transformer neural networks to Electronic Health Records
(EHR) is challenging due to the distinct, multidimensional sequential structure
of EHR data, often leading to underperformance when compared to simpler linear
models. As a result, the advantages of Transformers, such as efficient transfer
learning and improved scalability, are not fully exploited in EHR applications.
To overcome these challenges, we introduce SANSformer, a novel attention-free
sequential model designed with inductive biases tailored to the
unique characteristics of EHR data.
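The abstract does not specify the internal architecture, but "attention-free sequential model" suggests replacing the input-dependent attention map with a fixed learned mixing over sequence positions, as in MLP-Mixer or gMLP style blocks. The sketch below is a hypothetical illustration of that general idea, not the actual SANSformer design; all names and dimensions are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch of an attention-free token-mixing block
# (in the spirit of gMLP / MLP-Mixer; the real SANSformer
# architecture is not detailed in this abstract).

rng = np.random.default_rng(0)

def attention_free_block(x, w_mix, w_in, w_out):
    """Mix information across time steps without attention.

    x     : (seq_len, d_model) sequence of visit embeddings
    w_mix : (seq_len, seq_len) learned mixing over positions
    w_in  : (d_model, d_hidden) channel projection
    w_out : (d_hidden, d_model) channel projection
    """
    # Token mixing: a learned matrix replaces the
    # input-dependent attention map.
    mixed = w_mix @ x                  # (seq_len, d_model)
    # Channel MLP with a ReLU nonlinearity.
    h = np.maximum(mixed @ w_in, 0.0)  # (seq_len, d_hidden)
    return x + h @ w_out               # residual connection

seq_len, d_model, d_hidden = 8, 16, 32
x = rng.normal(size=(seq_len, d_model))
w_mix = rng.normal(size=(seq_len, seq_len)) / np.sqrt(seq_len)
w_in = rng.normal(size=(d_model, d_hidden)) / np.sqrt(d_model)
w_out = rng.normal(size=(d_hidden, d_model)) / np.sqrt(d_hidden)

y = attention_free_block(x, w_mix, w_in, w_out)
print(y.shape)  # (8, 16)
```

Because the mixing matrix is fixed at inference time, the block avoids the quadratic, input-dependent attention computation while still letting every position influence every other.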
Our …