April 9, 2024, 4:42 a.m. | Zhiyu Liang, Chen Liang, Zheng Liang, Hongzhi Wang, Bo Zheng

cs.LG updates on arXiv.org

arXiv:2404.05057v1 Announce Type: new
Abstract: Unsupervised (a.k.a. Self-supervised) representation learning (URL) has emerged as a new paradigm for time series analysis, because it has the ability to learn generalizable time series representation beneficial for many downstream tasks without using labels that are usually difficult to obtain. Considering that existing approaches have limitations in the design of the representation encoder and the learning objective, we have proposed Contrastive Shapelet Learning (CSL), the first URL method that learns the general-purpose shapelet-based representation …

