May 27, 2024, 4:42 a.m. | Jingzhe Shi, Qinwei Ma, Huan Ma, Lei Li

cs.LG updates on arXiv.org

arXiv:2405.15124v1 Announce Type: new
Abstract: A scaling law rewarding larger datasets, more complex models, and finer data granularity has been observed across many areas of deep learning. However, studies on time series forecasting have cast doubt on the scaling behavior of deep learning methods in this setting: while more training data improves performance, more capable models do not always outperform less capable ones, and longer input horizons can hurt performance for some models. We propose a theory for scaling law for …
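As a point of reference for the scaling behavior the abstract discusses, scaling-law studies commonly model test loss as a power law in dataset size N, i.e. L(N) = a · N^(-b), which becomes linear in log-log space. The sketch below (an illustration of that standard form, not the paper's proposed theory; the function name and constants are hypothetical) fits such a curve to synthetic loss values:

```python
import numpy as np

# Illustrative sketch (not this paper's theory): scaling-law work often
# models test loss versus dataset size N as a power law,
#     L(N) = a * N**(-b),
# which is linear in log-log space: log L = log a - b * log N.

def fit_power_law(n, loss):
    """Fit L(N) = a * N**(-b) by least squares in log-log space."""
    slope, intercept = np.polyfit(np.log(n), np.log(loss), 1)
    return np.exp(intercept), -slope  # (a, b)

# Synthetic losses generated from a known power law (a=2.0, b=0.3).
n = np.array([1e3, 1e4, 1e5, 1e6])
loss = 2.0 * n ** -0.3

a, b = fit_power_law(n, loss)
print(a, b)  # recovers a ≈ 2.0, b ≈ 0.3
```

The abstract's point is precisely that for time series forecasting such clean monotone fits can break down, e.g. loss may not decrease monotonically in model capacity or input horizon.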

