Aug. 17, 2022, 3:29 p.m. | /u/fedegarzar

r/MachineLearning · www.reddit.com


[Image from the original post](https://preview.redd.it/put2itbz1bi91.png?width=920&format=png&auto=webp&s=10c905054f14214d1caaaf7765dc5693efad4a14)

History tends to repeat itself. But FB-Prophet's [tainted memory](https://www.reddit.com/r/MachineLearning/comments/syx41w/p_beware_of_false_fbprophets_introducing_the/) is still fresh and should serve as a warning not to repeat the same mistakes.

This post compares Neural-Prophet's performance with Exponential Smoothing (ETS), a half-century-old forecasting method that is part of every practitioner's toolkit.
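For reference, fitting ETS takes only a few lines with the statsforecast library used in the comparison. This is a minimal sketch on a toy series; the `AirPassengersDF` sample data and the 12-month season length are illustrative choices, not the benchmark setup:

```python
from statsforecast import StatsForecast
from statsforecast.models import AutoETS
from statsforecast.utils import AirPassengersDF  # toy monthly series for illustration

# Any long-format frame with (unique_id, ds, y) columns works the same way.
df = AirPassengersDF

# ETS with a 12-month seasonal period; the model form is selected automatically.
sf = StatsForecast(models=[AutoETS(season_length=12)], freq="M")
ets_fcst = sf.forecast(df=df, h=12)  # forecast 12 months ahead
print(ets_fcst.head())
```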

Our [comparison](https://github.com/Nixtla/statsforecast/blob/main/experiments/neuralprophet/README.md) covers the Tourism, M3, M4, ERCOT, and ETTm2 datasets, following the authors' recommended hyperparameter and network configuration settings. Despite Neural-Prophet's [marked improvement](https://arxiv.org/abs/2111.15397) over its unreliable predecessor, its errors are still 30 percent larger than ETS's …
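A hedged sketch of the NeuralProphet side of such a comparison, using the library's defaults rather than the per-dataset settings from the linked experiments:

```python
from neuralprophet import NeuralProphet
from statsforecast.utils import AirPassengersDF  # same toy series as above

# NeuralProphet expects a plain (ds, y) frame.
df = AirPassengersDF[["ds", "y"]]

m = NeuralProphet()  # default network configuration
m.fit(df, freq="M")

# Forecast the same 12-month horizon as the ETS sketch above.
future = m.make_future_dataframe(df, periods=12)
np_fcst = m.predict(future)
print(np_fcst[["ds", "yhat1"]].tail(12))  # 'yhat1' holds the point forecast
```

Scoring both forecasts against a held-out year with the same error metric mirrors, in miniature, the aggregated evaluation reported in the linked repo.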

Tags: facebook, machinelearning, prophet
