Web: http://arxiv.org/abs/2110.15949

Jan. 14, 2022, 2:11 a.m. | Christian Gumbsch, Martin V. Butz, Georg Martius

cs.LG updates on arXiv.org

A common approach to prediction and planning in partially observable domains
is to use recurrent neural networks (RNNs), which ideally develop and maintain
a latent memory about hidden, task-relevant factors. We hypothesize that many
of these hidden factors in the physical world are constant over time, changing
only sparsely. To study this hypothesis, we propose Gated $L_0$ Regularized
Dynamics (GateL0RD), a novel recurrent architecture that incorporates the
inductive bias to maintain stable, sparsely changing latent states. The bias is
implemented by means of a novel internal gating function and a …
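The core idea — a latent state that stays piecewise constant and is updated only when an internal gate opens, with an $L_0$-style cost on the number of state changes — can be illustrated with a minimal sketch. This is not the authors' implementation: the gate here is a simple deterministic threshold, and all names (`Wg`, `Wc`, `theta`) are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def gated_step(h, x, Wg, Wc, theta=0.0):
    """One recurrent step with a binary update gate.

    The gate g is 1 only where the gating pre-activation exceeds a
    threshold, so the latent state h changes sparsely and otherwise
    stays constant. Wg, Wc, and theta are illustrative placeholders,
    not the paper's parameterization.
    """
    z = np.concatenate([h, x])
    g = (np.tanh(Wg @ z) > theta).astype(float)  # binary open/closed gate
    c = np.tanh(Wc @ z)                          # candidate new latent state
    return g * c + (1.0 - g) * h                 # update only where the gate opens

def l0_penalty(gates):
    """L0-style cost: count of open gates, i.e. of latent state changes."""
    return float(np.count_nonzero(gates))
```

Because `tanh` is bounded by 1, setting `theta` above 1 keeps every gate closed, so the latent state is provably constant across steps — the degenerate extreme of the sparsity bias. In GateL0RD itself the gate is learned, and the $L_0$ regularizer trades prediction accuracy against the number of gate openings.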
