May 19, 2022, 7:06 a.m. | /u/MrLeylo

r/MachineLearning (www.reddit.com)

I have been researching the SOTA in ML over EHR data ([e.g.](https://arxiv.org/ftp/arxiv/papers/2107/2107.09951.pdf)), and it seems that most of the relevant approaches are still based on LSTMs or GRUs. Nowadays, when sequences are mostly handled by transformers, why do you think EHR modeling is still at this point? I have some possible reasons:

* Transformers generally need lots of data, which may be hard to come by in a healthcare context (privacy constraints)
* Transformer FC layers may be a big drawback … (see the sketch after this list)
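
To make the second point concrete, here is a rough sketch (assuming PyTorch; the hidden size and head count are illustrative, not taken from any specific EHR paper) comparing the parameter count of a single LSTM layer against a single transformer encoder layer, where the FC (feed-forward) block accounts for most of the extra parameters:

```python
import torch.nn as nn

hidden = 256  # illustrative hidden size, not from any specific EHR paper

# Single-layer LSTM over inputs of size `hidden`
lstm = nn.LSTM(input_size=hidden, hidden_size=hidden, num_layers=1)

# Single transformer encoder layer with the usual 4x FC expansion
enc = nn.TransformerEncoderLayer(d_model=hidden, nhead=8,
                                 dim_feedforward=4 * hidden)

count = lambda m: sum(p.numel() for p in m.parameters())
print(f"LSTM layer:        {count(lstm):,} parameters")
print(f"Transformer layer: {count(enc):,} parameters")
```

With these settings the transformer layer's feed-forward block alone holds roughly half a million parameters, about as many as the entire LSTM layer, so on small clinical cohorts that extra capacity may just translate into overfitting.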

