Self-Supervised Contrastive Pre-Training For Time Series via Time-Frequency Consistency. (arXiv:2206.08496v1 [cs.LG])
Web: http://arxiv.org/abs/2206.08496
June 20, 2022, 1:10 a.m. | Xiang Zhang, Ziyuan Zhao, Theodoros Tsiligkaridis, Marinka Zitnik
cs.LG updates on arXiv.org
Pre-training on time series poses a unique challenge due to the potential
mismatch between pre-training and target domains, such as shifts in temporal
dynamics, fast-evolving trends, and long-range and short-cyclic effects, which
can lead to poor downstream performance. While domain adaptation methods can
mitigate these shifts, most methods need examples directly from the target
domain, making them suboptimal for pre-training. To address this challenge,
methods need to accommodate target domains with different temporal dynamics and
be capable of doing …
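The abstract above is cut off, but the title names the paper's core idea: representations computed from a series' time-domain view and from its frequency-domain view should agree. Below is a minimal sketch of one such time-frequency contrastive objective, using a symmetric InfoNCE loss, toy linear encoders, and an FFT magnitude spectrum as the frequency view; all of these choices are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def tf_consistency_loss(x, time_enc, freq_enc, temperature=0.1):
        # x: (batch, length) real-valued series; the encoders map each view
        # into a shared embedding space.
        z_t = F.normalize(time_enc(x), dim=1)               # time-domain embedding
        spec = torch.fft.rfft(x, dim=1).abs()               # magnitude spectrum as the frequency view
        z_f = F.normalize(freq_enc(spec), dim=1)            # frequency-domain embedding
        logits = z_t @ z_f.T / temperature                  # pairwise cosine similarities
        targets = torch.arange(x.size(0), device=x.device)  # matching views sit on the diagonal
        # Symmetric InfoNCE: each sample's time view should select its own
        # frequency view, and vice versa.
        return 0.5 * (F.cross_entropy(logits, targets) +
                      F.cross_entropy(logits.T, targets))

    # Toy usage with illustrative (hypothetical) encoder sizes.
    L, D = 128, 64
    time_enc = torch.nn.Linear(L, D)
    freq_enc = torch.nn.Linear(L // 2 + 1, D)  # rfft of length L yields L//2 + 1 bins
    x = torch.randn(32, L)
    loss = tf_consistency_loss(x, time_enc, freq_enc)

Pulling the two views of the same sample together while pushing apart mismatched pairs requires no labels and no examples from the target domain, which is what makes an objective of this style usable for pre-training.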
Latest AI/ML/Big Data Jobs
Machine Learning Researcher - Saalfeld Lab
@ Howard Hughes Medical Institute - Chevy Chase, MD | Ashburn, Virginia
Project Director, Machine Learning in US Health
@ ideas42.org | Remote, US
Data Science Intern
@ NannyML | Remote
Machine Learning Engineer NLP/Speech
@ Play.ht | Remote
Research Scientist, 3D Reconstruction
@ Yembo | Remote, US
Clinical Assistant or Associate Professor of Management Science and Systems
@ University at Buffalo | Buffalo, NY