Time-series Generation by Contrastive Imitation. (arXiv:2311.01388v1 [stat.ML])
cs.LG updates on arXiv.org
Consider learning a generative model for time-series data. The sequential
setting poses a unique challenge: Not only should the generator capture the
conditional dynamics of (stepwise) transitions, but its open-loop rollouts
should also preserve the joint distribution of (multi-step) trajectories. On
one hand, autoregressive models trained by maximum likelihood estimation (MLE)
allow learning and computing explicit transition distributions, but suffer from
compounding error during
rollouts. On the other hand, adversarial models based on GAN training alleviate
such exposure bias, but transitions are implicit …
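The compounding-error problem the abstract describes can be illustrated with a minimal sketch. This is not the paper's method; it is a toy scalar example (assumed dynamics `x_{t+1} = a * x_t`, with a slightly mis-estimated coefficient `a_hat` standing in for imperfect one-step training). Under teacher forcing the model always conditions on the true history, so its one-step error stays bounded; in an open-loop rollout the model feeds back its own predictions, and the error compounds:

```python
import numpy as np

# Toy illustration (not from the paper): true dynamics x_{t+1} = a_true * x_t,
# learned model uses a slightly wrong coefficient a_hat.
a_true, a_hat = 0.9, 0.95  # assumed values for illustration
x0, T = 1.0, 20

# Ground-truth trajectory.
true_traj = [x0]
for _ in range(T):
    true_traj.append(a_true * true_traj[-1])

# Teacher-forced (one-step) prediction error: model conditions on true states.
one_step_err = [abs(a_hat * xt - a_true * xt) for xt in true_traj[:-1]]

# Open-loop rollout: model conditions on its own previous predictions.
rollout = [x0]
for _ in range(T):
    rollout.append(a_hat * rollout[-1])
rollout_err = [abs(r, ) if False else abs(r - t) for r, t in zip(rollout, true_traj)]

print(f"one-step error at t=1:  {one_step_err[0]:.4f}")
print(f"one-step error at t={T}: {one_step_err[-1]:.4f}")
print(f"rollout error at t=1:   {rollout_err[1]:.4f}")
print(f"rollout error at t={T}:  {rollout_err[-1]:.4f}")
```

The per-step model error is identical in both cases, yet the rollout error grows with horizon while the teacher-forced error does not, which is the exposure-bias gap that adversarial (GAN-style) sequence training tries to close.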