April 1, 2024, 4:41 a.m. | Debdipta Goswami

cs.LG updates on arXiv.org (arxiv.org)

arXiv:2403.19806v1 Announce Type: new
Abstract: This paper proposes a novel and interpretable recurrent neural-network structure using the echo-state network (ESN) paradigm for time-series prediction. While traditional ESNs perform well for dynamical-systems prediction, they need a large dynamic reservoir, which increases computational complexity, and they lack interpretability for discerning the contributions of different input combinations to the output. Here, a systematic reservoir architecture is developed using smaller parallel reservoirs driven by different input combinations, known as features, and then they …
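
For context, below is a minimal single-reservoir ESN sketch for one-step-ahead time-series prediction in plain NumPy. It is not the paper's method: the paper replaces one large reservoir with several smaller parallel reservoirs, each driven by a different input combination (feature). The reservoir size, spectral radius, ridge penalty, washout length, and toy signal here are illustrative assumptions.

```python
# Minimal echo-state network sketch (standard single reservoir, not the
# parallel-reservoir architecture proposed in the paper). All hyperparameters
# below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_inputs, n_reservoir, spectral_radius=0.9, input_scale=0.5):
    """Random input and recurrent weights, rescaled toward the echo-state property."""
    W_in = input_scale * rng.uniform(-1, 1, size=(n_reservoir, n_inputs))
    W = rng.uniform(-1, 1, size=(n_reservoir, n_reservoir))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, inputs):
    """Drive the reservoir with the input sequence and collect its states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Ridge regression from reservoir states to targets (the only trained part)."""
    X = states
    Y = np.atleast_2d(targets).T if targets.ndim == 1 else targets
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

# One-step-ahead prediction of a toy scalar series.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t) + 0.5 * np.sin(3 * t)

W_in, W = make_reservoir(n_inputs=1, n_reservoir=200)
states = run_reservoir(W_in, W, series[:-1])
W_out = train_readout(states[100:], series[101:])  # discard a 100-step washout
pred = states[100:] @ W_out                        # predicted next values
```

In the paper's setting, one would presumably run several such smaller reservoirs in parallel, each on a different feature (input combination), and combine their states in the readout, which is what gives the architecture its interpretability with respect to input contributions.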

