March 14, 2024, 10:04 a.m. | /u/shchur

r/MachineLearning (www.reddit.com)

Paper: [https://arxiv.org/abs/2403.07815](https://arxiv.org/abs/2403.07815)

Code: [https://github.com/amazon-science/chronos-forecasting](https://github.com/amazon-science/chronos-forecasting)

Model weights: [https://huggingface.co/collections/amazon/chronos-models-65f1791d630a8d57cb718444](https://huggingface.co/collections/amazon/chronos-models-65f1791d630a8d57cb718444)

Abstract:

>We introduce Chronos, a simple yet effective framework for **pretrained probabilistic time series models**. Chronos tokenizes time series values using scaling and quantization into a fixed vocabulary and trains existing transformer-based language model architectures on these tokenized time series via the cross-entropy loss. We pretrained Chronos models based on the T5 family (ranging from 20M to 710M parameters) on a large collection of publicly available datasets, complemented by a synthetic dataset that …
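
The abstract's core recipe (scale the series, quantize values into a fixed token vocabulary, then train a language model with cross-entropy) is straightforward to sketch. Below is a minimal Python illustration assuming mean scaling and uniform binning; the bin count, clipping range, and function names are illustrative choices of mine, not necessarily the paper's exact configuration.

```python
import numpy as np

def tokenize_series(values, num_bins=4094, low=-15.0, high=15.0):
    """Mean-scale a univariate series, then quantize it into a fixed
    vocabulary of bin tokens (a sketch of the scaling + quantization idea)."""
    values = np.asarray(values, dtype=np.float64)
    # Scale by the mean absolute value so series of very different
    # magnitudes map onto the same shared vocabulary.
    scale = np.mean(np.abs(values))
    if scale == 0.0:
        scale = 1.0
    scaled = values / scale
    # Uniform bin edges over a fixed range; out-of-range values are clipped.
    edges = np.linspace(low, high, num_bins + 1)
    tokens = np.clip(np.digitize(scaled, edges) - 1, 0, num_bins - 1)
    return tokens, scale

def detokenize(tokens, scale, num_bins=4094, low=-15.0, high=15.0):
    """Map bin tokens back to real values via bin centers, undoing the scaling."""
    edges = np.linspace(low, high, num_bins + 1)
    centers = (edges[:-1] + edges[1:]) / 2.0
    return centers[tokens] * scale

# Example: round-trip a short series through the tokenizer.
series = np.array([12.0, 15.0, 14.0, 18.0, 21.0])
tokens, scale = tokenize_series(series)
print(tokens)                      # integer token IDs, usable as LM inputs
print(detokenize(tokens, scale))   # approximate reconstruction of the series
```

Once a series is a sequence of token IDs, any off-the-shelf transformer language model can be trained on it autoregressively with the usual cross-entropy loss, and sampled token trajectories mapped back through `detokenize` yield probabilistic forecasts.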
