April 9, 2024, 4:42 a.m. | Zhiyu Liang, Chen Liang, Zheng Liang, Hongzhi Wang, Bo Zheng

cs.LG updates on arXiv.org

arXiv:2404.05057v1 Announce Type: new
Abstract: Unsupervised (a.k.a. self-supervised) representation learning (URL) has emerged as a new paradigm for time series analysis because it can learn generalizable time series representations that benefit many downstream tasks without requiring labels, which are usually difficult to obtain. Considering that existing approaches are limited in the design of the representation encoder and the learning objective, we propose Contrastive Shapelet Learning (CSL), the first URL method that learns the general-purpose shapelet-based representation …

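The announced method couples a shapelet-based encoder with a contrastive learning objective. The sketch below is only a minimal illustration of that general idea, assuming PyTorch, univariate series, a min-distance shapelet transform as the encoder, jitter augmentation, and an NT-Xent loss; these choices are assumptions for illustration and are not taken from the paper's actual CSL design.

```python
# Minimal sketch of shapelet-based contrastive representation learning.
# Assumptions (not from arXiv:2404.05057): PyTorch, univariate series,
# a min-distance shapelet transform as the encoder, jitter augmentation,
# and the NT-Xent loss as the contrastive objective.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ShapeletEncoder(nn.Module):
    """Embed a series as its minimum distances to a bank of learnable shapelets."""

    def __init__(self, num_shapelets: int = 32, shapelet_len: int = 16):
        super().__init__()
        self.shapelet_len = shapelet_len
        # Learnable shapelets, shape (num_shapelets, shapelet_len).
        self.shapelets = nn.Parameter(0.1 * torch.randn(num_shapelets, shapelet_len))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, series_len) -> sliding windows (batch, num_windows, shapelet_len)
        windows = x.unfold(dimension=1, size=self.shapelet_len, step=1)
        # Squared distance of every window to every shapelet, then min over windows.
        dists = ((windows.unsqueeze(2) - self.shapelets) ** 2).sum(dim=-1)
        return dists.min(dim=1).values  # (batch, num_shapelets)


def nt_xent(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """NT-Xent loss: two augmented views of the same series are positives."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)       # (2B, D)
    sim = (z @ z.t()) / temperature                          # cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))               # exclude self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


if __name__ == "__main__":
    encoder = ShapeletEncoder()
    optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
    x = torch.randn(8, 128)                    # unlabeled batch of time series
    view1 = x + 0.05 * torch.randn_like(x)     # simple jitter augmentations
    view2 = x + 0.05 * torch.randn_like(x)
    optimizer.zero_grad()
    loss = nt_xent(encoder(view1), encoder(view2))
    loss.backward()
    optimizer.step()
    print(f"contrastive loss: {loss.item():.4f}")
```

In this toy setup the representation of a series is its vector of minimum distances to the learned shapelets, and the contrastive loss pulls together the representations of two noisy views of the same series; the paper's actual encoder, augmentations, and objective differ.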
