Data-Efficient Sleep Staging with Synthetic Time Series Pretraining
March 14, 2024, 4:42 a.m. | Niklas Grieger, Siamak Mehrkanoon, Stephan Bialonski
cs.LG updates on arXiv.org
Abstract: Analyzing electroencephalographic (EEG) time series can be challenging, especially with deep neural networks, due to the large variability among human subjects and often small datasets. To address these challenges, various strategies, such as self-supervised learning, have been suggested, but they typically rely on extensive empirical datasets. Inspired by recent advances in computer vision, we propose a pretraining task termed "frequency pretraining" to pretrain a neural network for sleep staging by predicting the frequency content of …
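The abstract is truncated, but the core idea of "frequency pretraining" is stated: train the network on synthetic time series whose frequency content is known by construction, so that labels are free. Below is a minimal sketch of what such a pretraining task could look like, assuming synthetic signals are built as random sums of sinusoids and the network predicts which frequency bands are present. All specifics here (sampling rate, band edges, network size, training loop) are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
import torch
import torch.nn as nn

FS = 100          # sampling rate in Hz (illustrative)
N_SAMPLES = 3000  # one 30 s epoch at 100 Hz
N_BINS = 20       # number of frequency bins to predict (illustrative)

def make_synthetic_batch(batch_size, rng):
    """Generate random sums of sinusoids; the multi-label target
    marks which frequency bins appear in each signal."""
    t = np.arange(N_SAMPLES) / FS
    x = np.zeros((batch_size, N_SAMPLES), dtype=np.float32)
    y = np.zeros((batch_size, N_BINS), dtype=np.float32)
    bin_edges = np.linspace(0.5, 30.0, N_BINS + 1)  # EEG-like 0.5-30 Hz range
    for i in range(batch_size):
        active = rng.choice(N_BINS, size=rng.integers(1, 5), replace=False)
        for b in active:
            f = rng.uniform(bin_edges[b], bin_edges[b + 1])
            phase = rng.uniform(0.0, 2 * np.pi)
            x[i] += np.sin(2 * np.pi * f * t + phase)
        x[i] += 0.1 * rng.standard_normal(N_SAMPLES).astype(np.float32)
        y[i, active] = 1.0
    return torch.from_numpy(x).unsqueeze(1), torch.from_numpy(y)

encoder = nn.Sequential(  # small 1-D CNN feature extractor
    nn.Conv1d(1, 16, kernel_size=7, stride=2), nn.ReLU(),
    nn.Conv1d(16, 32, kernel_size=7, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
head = nn.Linear(32, N_BINS)  # pretraining head, discarded afterwards
model = nn.Sequential(encoder, head)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()  # multi-label frequency prediction
rng = np.random.default_rng(0)

for step in range(200):  # pretraining uses only synthetic data
    x, y = make_synthetic_batch(64, rng)
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# After pretraining, `encoder` would be fine-tuned on real EEG
# for the downstream sleep-staging task.
```

The appeal of this setup, as the abstract frames it, is data efficiency: the pretraining labels cost nothing to produce, so the limited human-annotated EEG is needed only for fine-tuning.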