Feb. 23, 2024, 5:48 a.m. | Chenxi Sun, Hongyan Li, Yaliang Li, Shenda Hong

cs.CL updates on arXiv.org

arXiv:2308.08241v2 Announce Type: replace
Abstract: This work summarizes two ways to accomplish Time-Series (TS) tasks in today's Large Language Model (LLM) context: LLM-for-TS (model-centric), which designs and trains a fundamental large model, or fine-tunes a pre-trained LLM, for TS data; and TS-for-LLM (data-centric), which converts TS into a model-friendly representation so that a pre-trained LLM can handle TS data. Given the lack of data, limited resources, semantic context requirements, and so on, this work focuses on TS-for-LLM, where we aim to activate LLM's …
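
To make the data-centric TS-for-LLM idea concrete, here is a minimal, hypothetical sketch of one common realization: a small trainable encoder maps a raw time series into "soft token" embeddings that live in a frozen LLM's hidden space, so the pre-trained model can consume TS data without being retrained. This is not the paper's actual method; the `PatchEncoder` name, the patch length, the hidden size, and the stand-in frozen transformer are all illustrative assumptions.

```python
# Hedged sketch of the TS-for-LLM (data-centric) idea from the abstract:
# encode a time series into embeddings in a frozen LLM's hidden space.
# All names and sizes here are illustrative assumptions, not the paper's.
import torch
import torch.nn as nn

class PatchEncoder(nn.Module):
    """Maps a univariate series (B, T) to LLM-space embeddings (B, N, d_model).

    Splits the series into fixed-length patches and projects each patch into
    the (frozen) LLM's hidden dimension -- one way to make TS data
    "model-friendly" for a text-pretrained model.
    """
    def __init__(self, patch_len: int = 16, d_model: int = 768):
        super().__init__()
        self.patch_len = patch_len
        self.proj = nn.Linear(patch_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T = x.shape
        n = T // self.patch_len  # number of whole patches
        patches = x[:, : n * self.patch_len].reshape(B, n, self.patch_len)
        return self.proj(patches)  # (B, n, d_model) soft TS "tokens"

d_model = 768
# Stand-in for a frozen pre-trained LLM body (assumption: any transformer
# that accepts d_model-sized input embeddings would play this role).
llm_body = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=12, batch_first=True),
    num_layers=2,
)
for p in llm_body.parameters():
    p.requires_grad = False  # the LLM stays frozen; only the TS encoder trains

encoder = PatchEncoder(patch_len=16, d_model=d_model)
series = torch.randn(4, 256)   # batch of 4 series, 256 time steps each
ts_tokens = encoder(series)    # (4, 16, 768) embeddings in LLM space
hidden = llm_body(ts_tokens)   # frozen model processes the TS embeddings
print(hidden.shape)            # torch.Size([4, 16, 768])
```

In this framing, only the lightweight TS encoder is trained (for example, with an objective that aligns TS embeddings to the LLM's text embedding space), which matches the abstract's motivation: scarce TS data and limited resources argue against training or fine-tuning the large model itself.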
