Gluformer: Transformer-Based Personalized Glucose Forecasting with Uncertainty Quantification. (arXiv:2209.04526v1 [cs.LG])
Sept. 13, 2022, 1:11 a.m. | Renat Sergazinov, Mohammadreza Armandpour, Irina Gaynanova
cs.LG updates on arXiv.org
Deep learning models achieve state-of-the-art results in predicting blood
glucose trajectories, with a wide range of architectures being proposed.
However, the adoption of such models in clinical practice is slow, largely
due to the lack of uncertainty quantification for the provided predictions.
In this work, we propose to model the future glucose trajectory, conditioned
on the past, as an infinite mixture of basis distributions (e.g., Gaussian,
Laplace). This change allows us to learn the uncertainty and predict more
accurately …
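The abstract is truncated, but its core idea, representing a predicted glucose value as a mixture of basis distributions rather than a single point estimate, can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, component parameters, and example values below are hypothetical, and we use a finite Gaussian mixture for simplicity:

```python
import numpy as np

def gaussian_mixture_logpdf(y, weights, means, scales):
    """Log-density of a scalar observation y under a Gaussian mixture.

    weights, means, scales are 1-D arrays of length K (one entry per
    mixture component); weights must sum to 1. Predicting such a mixture
    instead of a single value is what lets a forecaster express
    uncertainty, e.g. multiple plausible glucose outcomes.
    """
    # Per-component Gaussian log-density of y
    comp = (
        -0.5 * np.log(2 * np.pi)
        - np.log(scales)
        - 0.5 * ((y - means) / scales) ** 2
    )
    # Log-sum-exp over components, weighted by mixture probabilities,
    # for a numerically stable mixture log-likelihood
    return float(np.logaddexp.reduce(np.log(weights) + comp))

# Hypothetical example: two equally weighted components, one centered at
# 100 mg/dL and one at 140 mg/dL, each with a 10 mg/dL standard deviation
w = np.array([0.5, 0.5])
mu = np.array([100.0, 140.0])
sd = np.array([10.0, 10.0])
ll = gaussian_mixture_logpdf(120.0, w, mu, sd)
```

Training a forecaster then amounts to minimizing the negative of this log-likelihood over observed future values, so the model learns both where the trajectory goes and how uncertain that prediction is.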