April 16, 2024, 6:45 a.m. | /u/SeawaterFlows

Machine Learning www.reddit.com

**Paper**: [https://arxiv.org/abs/2402.02592](https://arxiv.org/abs/2402.02592)

**Code**: [https://github.com/SalesforceAIResearch/uni2ts](https://github.com/SalesforceAIResearch/uni2ts)

**Models**: [https://huggingface.co/collections/Salesforce/moirai-10-r-models-65c8d3a94c51428c300e0742](https://huggingface.co/collections/Salesforce/moirai-10-r-models-65c8d3a94c51428c300e0742)

**Dataset**: [https://huggingface.co/datasets/Salesforce/lotsa\_data](https://huggingface.co/datasets/Salesforce/lotsa_data)

**Blog post**: [https://blog.salesforceairesearch.com/moirai/](https://blog.salesforceairesearch.com/moirai/)

**Abstract**:

>Deep learning for time series forecasting has traditionally operated within a one-model-per-dataset framework, limiting its potential to leverage the game-changing impact of large pre-trained models. The concept of *universal forecasting*, emerging from pre-training on a vast collection of time series datasets, envisions a single Large Time Series Model capable of addressing diverse downstream forecasting tasks. However, constructing such a model poses unique challenges specific to time series data: …

